Config - vcspull.config¶
Configuration functionality for vcspull.
-
vcspull.config.expand_dir(dir_, cwd=<bound method Path.cwd of <class 'pathlib.Path'>>)¶
Return path with environmental variables and tilde ~ expanded.
- Parameters:
dir_ (pathlib.Path) – Directory path to expand
cwd (pathlib.Path, optional) – Current working dir (used to resolve relative paths). Defaults to pathlib.Path.cwd().
- Returns:
Absolute directory path
- Return type:
pathlib.Path
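As a rough illustration of the expansion above, here is a stdlib-only sketch; `expand_dir_sketch` is a hypothetical re-implementation, not vcspull's actual code:

```python
import os
import pathlib

def expand_dir_sketch(dir_, cwd=pathlib.Path.cwd):
    """Illustrative sketch: expand $VARS and ~, then absolutize."""
    # Expand environment variables and the home tilde in the raw string.
    path = pathlib.Path(os.path.expandvars(str(dir_))).expanduser()
    if not path.is_absolute():
        # cwd may be a callable (like the pathlib.Path.cwd default)
        # or a concrete Path; resolve relative paths against it.
        base = cwd() if callable(cwd) else cwd
        path = base / path
    return path

print(expand_dir_sketch("src", cwd=pathlib.Path("/tmp")))  # /tmp/src
```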
-
vcspull.config.normalize_config_file_path(path, cwd=<bound method Path.cwd of <class 'pathlib.Path'>>)¶
Return absolute config file path without resolving symlinks.
Symlink entry names are preserved intact so that downstream operations (e.g. atomic writes) can resolve them as needed, while the logical path is used for display and identity.
- Parameters:
path (pathlib.Path) – Config file path to normalize.
cwd (pathlib.Path, optional) – Current working dir (used to resolve relative paths). Defaults to
pathlib.Path.cwd().
- Returns:
Absolute config file path with symlink names preserved.
- Return type:
pathlib.Path
Examples
>>> normalize_config_file_path(pathlib.Path("~/cfg.yaml")).name
'cfg.yaml'
>>> normalize_config_file_path(
...     pathlib.Path("configs/vcspull.yaml"),
...     cwd=pathlib.Path("/tmp/project"),
... )
PosixPath('.../configs/vcspull.yaml')
-
vcspull.config._validate_worktrees_config(worktrees_raw, repo_name)¶
Validate and normalize worktrees configuration.
- Parameters:
worktrees_raw (Any) – Raw worktrees configuration from YAML/JSON.
repo_name (str) – Name of the parent repository (for error messages).
- Returns:
Validated list of worktree configurations.
- Return type:
list of dict
- Raises:
VCSPullException – If the worktrees configuration is invalid.
Examples
Valid configuration with a tag:
>>> from vcspull.config import _validate_worktrees_config
>>> config = [{"dir": "../v1", "tag": "v1.0.0"}]
>>> result = _validate_worktrees_config(config, "myrepo")
>>> len(result)
1
>>> result[0]["dir"]
'../v1'
>>> result[0]["tag"]
'v1.0.0'
Valid configuration with a branch:
>>> config = [{"dir": "../dev", "branch": "develop"}]
>>> result = _validate_worktrees_config(config, "myrepo")
>>> result[0]["branch"]
'develop'
Valid configuration with a commit:
>>> config = [{"dir": "../fix", "commit": "abc123"}]
>>> result = _validate_worktrees_config(config, "myrepo")
>>> result[0]["commit"]
'abc123'
Error: worktrees must be a list:
>>> _validate_worktrees_config("not-a-list", "myrepo")
Traceback (most recent call last):
    ...
vcspull.exc.VCSPullException: ...worktrees must be a list, got str
Error: worktree entry must be a dict:
>>> _validate_worktrees_config(["not-a-dict"], "myrepo")
Traceback (most recent call last):
    ...
vcspull.exc.VCSPullException: ...must be a dict, got str
Error: missing required 'dir' field:
>>> _validate_worktrees_config([{"tag": "v1.0.0"}], "myrepo")
Traceback (most recent call last):
    ...
vcspull.exc.VCSPullException: ...missing required 'dir' field
Error: no ref type specified:
>>> _validate_worktrees_config([{"dir": "../wt"}], "myrepo")
Traceback (most recent call last):
    ...
vcspull.exc.VCSPullException: ...must specify one of: tag, branch, or commit
Error: empty ref value:
>>> _validate_worktrees_config([{"dir": "../wt", "tag": ""}], "myrepo")
Traceback (most recent call last):
    ...
vcspull.exc.VCSPullException: ...empty ref value...
Error: multiple refs specified:
>>> _validate_worktrees_config(
...     [{"dir": "../wt", "tag": "v1", "branch": "main"}], "myrepo"
... )
Traceback (most recent call last):
    ...
vcspull.exc.VCSPullException: ...cannot specify multiple refs...
-
vcspull.config.extract_repos(config, cwd=<bound method Path.cwd of <class 'pathlib.Path'>>)¶
Return expanded configuration.
End-user configuration permits inline shorthand; this expands each entry to an identical format for parsing.
- Parameters:
config (dict) – the raw config to expand
cwd (pathlib.Path) – current working dir (for deciphering relative paths)
- Returns:
List of normalized repository information
- Return type:
list
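The sketch below illustrates what expansion means: a string-URL shorthand entry becomes a full dict. The output keys (`name`, `path`, `url`) are assumptions for illustration; vcspull's real ConfigDict fields may differ.

```python
import pathlib

def extract_repos_sketch(config):
    """Hypothetical expansion: string shorthand -> full repo dicts."""
    repos = []
    for workspace, entries in config.items():
        root = pathlib.Path(workspace).expanduser()
        for name, value in entries.items():
            # Shorthand form  "name": "git+url"  becomes a dict entry.
            entry = {"repo": value} if isinstance(value, str) else dict(value)
            repos.append({"name": name, "path": root / name, "url": entry["repo"]})
    return repos

repos = extract_repos_sketch(
    {"~/code/": {"flask": "git+https://github.com/pallets/flask.git"}}
)
print(repos[0]["name"])  # flask
```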
-
vcspull.config.find_home_config_files(filetype=None)¶
Return configs of .vcspull.{yaml,json} in user's home directory.
The returned path preserves the logical home entry name so callers keep the config type implied by .yaml or .json even when the file is a symlink.
- Parameters:
filetype (list of str, optional) – File types to search for (default ["json", "yaml"])
- Returns:
Absolute paths to discovered config files
- Return type:
list of pathlib.Path
Examples
>>> find_home_config_files()
[]
-
vcspull.config.find_config_files(path=None, match=None, filetype=None, include_home=False)¶
Return config files from a directory, filtered by match pattern. Not recursive.
- Parameters:
path (pathlib.Path, optional) – directory to search for config files
match (str, optional) – filename pattern to match
filetype (list of str, optional) – file types to search for (default ["json", "yaml"])
include_home (bool, optional) – also include configs from the user's home directory. Defaults to False.
- Raises:
LoadConfigRepoConflict – two configs have the same path and name but different repo URLs.
- Returns:
list of absolute paths to config files.
- Return type:
list of pathlib.Path
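A stdlib-only sketch of a non-recursive scan like the one described above; the function name and the `match`/`filetypes` defaults are assumptions for illustration:

```python
import pathlib
import tempfile

def find_config_files_sketch(path, match="*", filetypes=("yaml", "json")):
    """Hypothetical non-recursive scan for '<match>.<ext>' config files."""
    found = []
    for ext in filetypes:
        # glob() does not descend into subdirectories, so this is not recursive.
        found.extend(pathlib.Path(path).glob(f"{match}.{ext}"))
    return sorted(found)

with tempfile.TemporaryDirectory() as tmp:
    (pathlib.Path(tmp) / "vcspull.yaml").write_text("~/code/: {}\n", encoding="utf-8")
    (pathlib.Path(tmp) / "notes.txt").write_text("ignored\n", encoding="utf-8")
    names = [p.name for p in find_config_files_sketch(tmp)]

print(names)  # ['vcspull.yaml']
```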
-
vcspull.config.load_configs(files, cwd=<bound method Path.cwd of <class 'pathlib.Path'>>, *, merge_duplicates=True)¶
Return repos from a list of files.
- Parameters:
files (list of pathlib.Path) – paths to config files
cwd (pathlib.Path) – current working dir (passed down to extract_repos())
merge_duplicates (bool, optional) – merge duplicate workspace-root sections. Defaults to True.
- Returns:
expanded config dict items
- Return type:
list of dict
-
vcspull.config.detect_duplicate_repos(config1, config2)¶
Return duplicate repos where repo_dir is the same but vcs differs.
- Parameters:
config1 (list[ConfigDict])
config2 (list[ConfigDict])
- Returns:
List of duplicate tuples
- Return type:
list of tuple
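A minimal sketch of the pairing logic, assuming each ConfigDict carries "path" and "vcs" keys (the exact field names are assumptions):

```python
def detect_duplicate_repos_sketch(config1, config2):
    """Hypothetical duplicate detection: same path, different vcs."""
    # Index the first config by repo path for O(1) lookups.
    by_path = {str(r.get("path")): r for r in config1}
    dupes = []
    for repo in config2:
        other = by_path.get(str(repo.get("path")))
        if other is not None and other.get("vcs") != repo.get("vcs"):
            dupes.append((other, repo))
    return dupes

a = [{"name": "app", "path": "/tmp/app", "vcs": "git"}]
b = [{"name": "app", "path": "/tmp/app", "vcs": "hg"}]
dupes = detect_duplicate_repos_sketch(a, b)
print(len(dupes))  # 1
```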
-
vcspull.config.in_dir(config_dir=None, extensions=None)¶
Return a list of configs in config_dir.
-
vcspull.config.filter_repos(config, path=None, vcs_url=None, name=None)¶
Return a list of repos from an (expanded) config file.
path, vcs_url and name all support fnmatch patterns.
- Parameters:
config (list of dict) – expanded config to filter
path (str, optional) – fnmatch pattern for repo path
vcs_url (str, optional) – fnmatch pattern for repo URL
name (str, optional) – fnmatch pattern for repo name
- Returns:
Repos
- Return type:
list of dict
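The fnmatch-based filtering can be sketched with the stdlib alone; the dict keys ("path", "url", "name") are assumptions for illustration:

```python
import fnmatch

def filter_repos_sketch(config, path=None, vcs_url=None, name=None):
    """Hypothetical fnmatch filtering over expanded repo dicts."""
    repos = config
    if path is not None:
        repos = [r for r in repos if fnmatch.fnmatch(str(r.get("path", "")), path)]
    if vcs_url is not None:
        repos = [r for r in repos if fnmatch.fnmatch(str(r.get("url", "")), vcs_url)]
    if name is not None:
        repos = [r for r in repos if fnmatch.fnmatch(str(r.get("name", "")), name)]
    return repos

repos = [
    {"name": "flask", "path": "/home/u/code/flask", "url": "git+https://github.com/pallets/flask.git"},
    {"name": "django", "path": "/home/u/code/django", "url": "git+https://github.com/django/django.git"},
]
matched = filter_repos_sketch(repos, name="fla*")
print([r["name"] for r in matched])  # ['flask']
```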
-
vcspull.config.is_config_file(filename, extensions=None)¶
Return True if file has a valid config file type.
- Parameters:
filename (str) – filename to check (e.g. mysession.json).
extensions (list or str) – filetypes to check (e.g. ['.yaml', '.json']).
- Returns:
True if the file has a valid config file type
- Return type:
bool
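The check amounts to a suffix test; a sketch under the assumption that the default extensions are .yml, .yaml, and .json:

```python
def is_config_file_sketch(filename, extensions=None):
    """Hypothetical suffix check against allowed config extensions."""
    if extensions is None:
        extensions = [".yml", ".yaml", ".json"]
    if isinstance(extensions, str):
        # A single extension may be passed as a bare string.
        extensions = [extensions]
    return any(filename.endswith(ext) for ext in extensions)

print(is_config_file_sketch("mysession.json"))  # True
print(is_config_file_sketch("notes.txt"))       # False
```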
-
vcspull.config._atomic_write(target, content)¶
Write content to a file atomically via temp-file-then-rename.
If target is a symbolic link the write goes through the symlink: the temporary file is created next to the resolved destination and the rename replaces the resolved path, leaving the symlink intact.
- Parameters:
target (pathlib.Path) – Destination file path (may be a symlink)
content (str) – Content to write
- Return type:
None
Examples
>>> import pathlib, tempfile
>>> with tempfile.TemporaryDirectory() as tmp:
...     p = pathlib.Path(tmp) / "test.txt"
...     _atomic_write(p, "hello")
...     p.read_text(encoding="utf-8")
'hello'
Symlinks are preserved — the real target is updated:
>>> with tempfile.TemporaryDirectory() as tmp:
...     real = pathlib.Path(tmp) / "real.txt"
...     _ = real.write_text("old", encoding="utf-8")
...     link = pathlib.Path(tmp) / "link.txt"
...     link.symlink_to(real)
...     _atomic_write(link, "new")
...     link.is_symlink(), link.read_text(encoding="utf-8")
(True, 'new')
-
vcspull.config.save_config_yaml(config_file_path, data)¶
Save configuration data to a YAML file.
- Parameters:
config_file_path (pathlib.Path) – Path to the configuration file to write
data (dict) – Configuration data to save
- Return type:
None
Examples
>>> import pathlib, tempfile
>>> with tempfile.TemporaryDirectory() as tmp:
...     p = pathlib.Path(tmp) / "cfg.yaml"
...     save_config_yaml(p, {"~/code/": {"myrepo": "git+https://example.com/repo.git"}})
...     "myrepo" in p.read_text(encoding="utf-8")
True
-
vcspull.config.save_config_json(config_file_path, data)¶
Save configuration data to a JSON file.
- Parameters:
config_file_path (pathlib.Path) – Path to the configuration file to write
data (dict) – Configuration data to save
- Return type:
None
Examples
>>> import json, pathlib, tempfile
>>> with tempfile.TemporaryDirectory() as tmp:
...     p = pathlib.Path(tmp) / "cfg.json"
...     save_config_json(p, {"~/code/": {"myrepo": "git+https://example.com/repo.git"}})
...     loaded = json.loads(p.read_text(encoding="utf-8"))
...     "~/code/" in loaded
True
-
vcspull.config.save_config(config_file_path, data)¶
Save configuration data, dispatching by file extension.
- Parameters:
config_file_path (pathlib.Path) – Path to the configuration file to write
data (dict) – Configuration data to save
- Return type:
None
Examples
>>> import pathlib, tempfile, json
>>> with tempfile.TemporaryDirectory() as tmp:
...     p = pathlib.Path(tmp) / "test.json"
...     save_config(p, {"~/code/": {"repo": {"repo": "git+https://x"}}})
...     loaded = json.loads(p.read_text(encoding="utf-8"))
...     loaded["~/code/"]["repo"]["repo"]
'git+https://x'
>>> with tempfile.TemporaryDirectory() as tmp:
...     p = pathlib.Path(tmp) / "test.yaml"
...     save_config(p, {"~/code/": {"repo": {"repo": "git+https://x"}}})
...     "repo" in p.read_text(encoding="utf-8")
True
-
vcspull.config.save_config_yaml_with_items(config_file_path, items)¶
Persist configuration data while preserving duplicate top-level sections.
Unlike save_config_yaml(), which loses duplicate keys when given a plain dict, this function accepts ordered (label, data) pairs so that two workspace-root entries with the same key are each serialised as a separate YAML block.
- Parameters:
config_file_path (pathlib.Path) – Destination config file (may be a symlink; the real target is updated).
items (list of tuple[str, Any]) – Ordered (section_label, section_data) pairs. Each pair becomes one YAML document block in the output.
- Return type:
None
Examples
>>> import pathlib, tempfile
>>> with tempfile.TemporaryDirectory() as tmp:
...     p = pathlib.Path(tmp) / "cfg.yaml"
...     save_config_yaml_with_items(p, [
...         ("~/code/", {"flask": "git+https://github.com/pallets/flask.git"}),
...         ("~/code/", {"django": "git+https://github.com/django/django.git"}),
...     ])
...     content = p.read_text(encoding="utf-8")
...     "flask" in content and "django" in content
True
Valid operation names for pin checking.
-
vcspull.config.is_pinned_for_op(entry, op)¶
Return True if the repo config entry is pinned for op.
Examples
Global pin applies to all ops:
>>> is_pinned_for_op({"repo": "git+x", "options": {"pin": True}}, "import")
True
>>> is_pinned_for_op({"repo": "git+x", "options": {"pin": True}}, "fmt")
True
Per-op pin is scoped:
>>> entry = {"repo": "git+x", "options": {"pin": {"import": True}}}
>>> is_pinned_for_op(entry, "import")
True
>>> is_pinned_for_op(entry, "fmt")
False
allow_overwrite: false is shorthand for pin: {import: true} (guards against --sync):
>>> entry2 = {"repo": "git+x", "options": {"allow_overwrite": False}}
>>> is_pinned_for_op(entry2, "import")
True
>>> is_pinned_for_op(entry2, "add")
False
Plain string entries and entries without options are never pinned:
>>> is_pinned_for_op("git+x", "import")
False
>>> is_pinned_for_op({"repo": "git+x"}, "import")
False
Explicit false is not pinned:
>>> is_pinned_for_op({"repo": "git+x", "options": {"pin": False}}, "import")
False
String values for pin (not bool) are ignored — not pinned:
>>> is_pinned_for_op({"repo": "git+x", "options": {"pin": "true"}}, "import")
False
Invalid op raises ValueError:
>>> is_pinned_for_op(
...     {"repo": "git+x"}, "bogus"
... )
Traceback (most recent call last):
    ...
ValueError: Unknown op: 'bogus'
-
vcspull.config.get_pin_reason(entry)¶
Return the human-readable pin reason from a repo config entry.
Non-string values are coerced to str() so callers can safely interpolate the result into log messages.
Examples
>>> entry = {"repo": "git+x", "options": {"pin": True, "pin_reason": "pinned"}}
>>> get_pin_reason(entry)
'pinned'
>>> get_pin_reason({"repo": "git+x"}) is None
True
>>> get_pin_reason("git+x") is None
True
Non-string pin_reason is coerced:
>>> get_pin_reason({"repo": "git+x", "options": {"pin_reason": 42}})
'42'
-
class vcspull.config.MergeAction¶
Bases: Enum
Action for resolving a duplicate workspace-root repo conflict.
-
vcspull.config._classify_merge_action(existing_entry, incoming_entry)¶
Classify the merge conflict resolution action.
- Parameters:
existing_entry (Any) – The entry already stored (first occurrence).
incoming_entry (Any) – The duplicate entry being merged in.
- Return type:
MergeAction
Examples
Neither pinned — keep existing (first-occurrence-wins):
>>> _classify_merge_action({"repo": "git+a"}, {"repo": "git+b"})
<MergeAction.KEEP_EXISTING: 'keep_existing'>
Incoming is pinned — incoming wins:
>>> _classify_merge_action(
...     {"repo": "git+a"},
...     {"repo": "git+b", "options": {"pin": True}},
... )
<MergeAction.KEEP_INCOMING: 'keep_incoming'>
Existing is pinned — existing wins regardless:
>>> _classify_merge_action(
...     {"repo": "git+a", "options": {"pin": True}},
...     {"repo": "git+b"},
... )
<MergeAction.KEEP_EXISTING: 'keep_existing'>
Both pinned — first-occurrence-wins:
>>> _classify_merge_action(
...     {"repo": "git+a", "options": {"pin": True}},
...     {"repo": "git+b", "options": {"pin": True}},
... )
<MergeAction.KEEP_EXISTING: 'keep_existing'>
-
vcspull.config.merge_duplicate_workspace_root_entries(label, occurrences)¶
Merge duplicate entries for a single workspace root.
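The merge rules documented for _classify_merge_action can be sketched as a simplified first-occurrence-wins merge with a pin override; the helper below handles only the global boolean pin and its function names are hypothetical:

```python
def is_pinned(entry):
    """Minimal pin check (global bool pin only; a simplification)."""
    return isinstance(entry, dict) and entry.get("options", {}).get("pin") is True

def merge_entries_sketch(occurrences):
    """Hypothetical merge of duplicate sections for one workspace root.

    First occurrence wins, unless a later occurrence is pinned while
    the stored one is not (mirroring the classification rules above).
    """
    merged = {}
    for section in occurrences:
        for name, entry in section.items():
            if name not in merged:
                merged[name] = entry
            elif is_pinned(entry) and not is_pinned(merged[name]):
                merged[name] = entry
    return merged

merged = merge_entries_sketch([
    {"flask": {"repo": "git+a"}},
    {"flask": {"repo": "git+b", "options": {"pin": True}}},
])
print(merged["flask"]["repo"])  # git+b
```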
-
vcspull.config.merge_duplicate_workspace_roots(config_data, duplicate_roots)¶
Merge duplicate workspace root sections captured during load.
-
vcspull.config.canonicalize_workspace_path(label, *, cwd=None)¶
Convert a workspace root label to an absolute canonical path.
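A minimal stdlib sketch of the conversion, assuming canonicalization means tilde expansion plus absolutizing against cwd (the function name is hypothetical):

```python
import pathlib

def canonicalize_sketch(label, *, cwd=None):
    """Hypothetical: expand ~ and absolutize a workspace label against cwd."""
    base = pathlib.Path(cwd) if cwd is not None else pathlib.Path.cwd()
    path = pathlib.Path(label).expanduser()
    # Relative labels are anchored at cwd; absolute ones pass through.
    return path if path.is_absolute() else (base / path)

print(canonicalize_sketch("projects/", cwd="/srv"))  # /srv/projects
```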
-
vcspull.config.workspace_root_label(workspace_path, *, cwd=None, home=None, preserve_cwd_label=True)¶
Create a normalized label for a workspace root path.
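A sketch of the labelling idea, assuming normalization folds the home prefix back into a "~/" label with a trailing slash (the function name and exact label shape are assumptions):

```python
import pathlib

def workspace_root_label_sketch(workspace_path, *, home=None):
    """Hypothetical labelling: fold the home prefix back into "~/"."""
    home = pathlib.Path(home) if home is not None else pathlib.Path.home()
    path = pathlib.Path(workspace_path)
    try:
        # Paths under home get the portable "~/..." label.
        rel = path.relative_to(home)
        return "~/" + rel.as_posix() + "/"
    except ValueError:
        # Paths outside home keep their absolute form.
        return path.as_posix() + "/"

print(workspace_root_label_sketch("/home/u/code", home="/home/u"))  # ~/code/
```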
-
vcspull.config.normalize_workspace_roots(config_data, *, cwd=None, home=None, preserve_cwd_label=True)¶
Normalize workspace root labels and merge duplicate sections.