From: Daan De Meyer
Date: Tue, 30 Jan 2024 13:50:42 +0000 (+0100)
Subject: Rework package manager caching
X-Git-Tag: v21~77^2~2
X-Git-Url: http://git.ipfire.org/cgi-bin/gitweb.cgi?a=commitdiff_plain;h=ed41b60dbd2c07faffe55454e5ee44fe203f6db3;p=thirdparty%2Fmkosi.git

Rework package manager caching

Currently, CacheDirectory= is used for all caching: both incremental images and the package manager cache. While this works great for incremental images, there are a few issues with using CacheDirectory= for all package manager caching:

- By default we don't do any caching; it has to be explicitly enabled by setting CacheDirectory=mkosi.cache or by telling users to create mkosi.cache manually. New users may be frustrated when their image builds download everything again on subsequent builds.

- By keeping package manager caches per individual mkosi project, we unnecessarily download multiple copies of the same repository metadata and packages.

- When using incremental images, if the post-install or finalize scripts install extra packages, those packages can trigger repository metadata updates, which results in the image being built from two repository metadata snapshots: one from when the incremental image was built, the other from when the new packages were installed using the refreshed repository metadata. Even if the scripts don't trigger repository metadata updates themselves, the repository metadata could already have been updated during another image build because the cache is shared.

- When using base trees, any image using the base tree might trigger a repository metadata update as well, resulting in the same issue, where the image is built from multiple different snapshots of the repository metadata.

To fix the first two issues, we introduce a new setting, PackageCacheDirectory=, and default it to either a system or per-user cache directory depending on how mkosi is invoked.
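The default selection described here can be sketched as a small helper. This is illustrative, not mkosi's actual code: the system path is an assumption, while the distribution~release~architecture subdirectory naming mirrors the package_cache_dir_or_default() method introduced later in this patch.

```python
import os
from pathlib import Path


def default_package_cache_dir(distribution: str, release: str, architecture: str,
                              invoked_as_root: bool) -> Path:
    """Sketch of the PackageCacheDirectory= default: a shared system-wide
    location when invoked as root, otherwise a per-user cache directory.
    The system path below is an assumption for illustration only."""
    if invoked_as_root:
        base = Path("/var/cache/mkosi")  # assumed system cache location
    else:
        base = Path(os.environ.get("XDG_CACHE_HOME", Path.home() / ".cache")) / "mkosi"
    # Keyed on distribution, release and architecture so different images
    # sharing the cache cannot clash (mirrors package_cache_dir_or_default()).
    return base / f"{distribution}~{release}~{architecture}"
```

The key point is that the default is shared across projects, so two mkosi projects building the same distribution release reuse each other's downloads.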
This makes sure we cache by default and use a shared package manager cache directory, so that we do not unnecessarily download duplicate copies of repository metadata and packages.

To fix the two remaining issues, we need to make sure we only sync repository metadata once for each image. We opt to do this at the start of each image build and configure the package manager commands to not do any metadata syncing by default. To make sure the repository metadata snapshot stays available for extension images and for incremental images, we copy the repository metadata from the shared cache into the image itself at /mkosi/cache/. This makes sure that even if the repository metadata in the shared cache is refreshed by another image build, the old snapshot remains available for incremental builds and for images intended to be used as base trees.

To make sure the actual packages downloaded during the image build are still written into the shared package cache, in finalize_package_manager_mounts() we bind mount the relevant directories from the shared package cache instead of from /mkosi/cache/ in the image, so that other image builds can take advantage of the downloaded packages.

The /mkosi/ directory is removed from the image at the end of each image build before packaging up the result, unless we're building a directory or tar image and CleanPackageMetadata= is explicitly disabled.

The initial sync we do at the start of each image build operates on the shared package cache directory, so that repository metadata is only refreshed once and can be reused by other image builds.

Because package managers now avoid automatic syncing by default, we have to rework the local package repositories slightly to make sure the local package repository is still synced whenever it is updated. We get rid of the localrepo() functions and opt to again write the repo definitions for the local package repository inline, to keep things simple and localized.
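The advisory BSD lock taken on the cache subdirectory while syncing or copying (described below) can be approximated with flock(2) via Python's fcntl module. This is a sketch of an mkosi.util.flock-style context manager, assuming BSD advisory lock semantics as stated in this commit message, not mkosi's actual implementation.

```python
import contextlib
import fcntl
import os
from collections.abc import Iterator
from pathlib import Path


@contextlib.contextmanager
def flock(path: Path) -> Iterator[int]:
    """Take an advisory BSD lock (flock(2)) on a directory so that two mkosi
    instances syncing or copying the same package cache subdirectory wait on
    each other instead of racing."""
    fd = os.open(path, os.O_CLOEXEC | os.O_RDONLY)
    try:
        fcntl.flock(fd, fcntl.LOCK_EX)  # blocks until any other holder releases it
        yield fd
    finally:
        os.close(fd)  # closing the file descriptor drops the lock
```

In the patch below, sync_repository_metadata() takes such a lock on both the "cache" and "lib" subdirectories before calling the distribution's sync().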
To prevent pacman from writing packages from the local package repository to the shared package cache directory, we configure the local repository itself as an additional read-only cache directory. This makes sure that pacman will read cached packages from that directory but won't write any new packages to it. For zypper, we disable "keeppackages" for the local package repository to prevent its packages from being cached. For dnf, we don't mount in any directory from the shared package cache for the mkosi-packages repository, to make sure it stays local to the image. Apt doesn't support any mechanism that would allow us to prevent packages from the local repository from being cached, so we allow these to be written to the shared package cache.

We also take the opportunity to rename the mkosi-packages repo to the mkosi repo, and rename the accompanying config files as well.

Because pacman does not employ any sort of cache key for its repository metadata, when using the default shared package cache directory we use a subdirectory based on the distribution, release and architecture that we're building for, to prevent conflicts in the cache directory when different pacman-based distributions use the same repo identifiers.

To avoid issues when two instances of mkosi operate on the same package cache directory, we take an advisory BSD lock on the cache subdirectory that we're going to sync or copy.

When building the default initrd, we make sure it uses the same repository metadata snapshot as the associated image.
---

diff --git a/NEWS.md b/NEWS.md
index 2dd6de269..8357ff6c9 100644
--- a/NEWS.md
+++ b/NEWS.md
@@ -1,6 +1,6 @@
 # mkosi Changelog
 
-## v20.3
+## v21
 
 - We now handle unmerged-usr systems correctly
 - Builtin configs (`mkosi-initrd`, `mkosi-tools`) can now be included
@@ -18,6 +18,18 @@
 - Added `MicrocodeHost=` setting to only include the CPU specific
   microcode for the current host system.
- The kernel-install plugin now only includes the CPU specific microcode +- Introduced `PackageCacheDirectory=` to set the directory for package + manager caches. This setting defaults to a suitable location in the + system or user directory depending on how mkosi is invoked. + `CacheDirectory=` is only used for incremental cached images now. +- Repository metadata is now synced once at the start of each image + build and never during an image build. Each image includes a snapshot + of the repository metadata in `/mkosi` so that incremental images and + extension images can reuse the same snapshot. When building an image + intended to be used with `BaseTrees=`, disable `CleanPackageMetadata=` + to make sure the repository metadata in `/mkosi` is not cleaned up, + otherwise any extension images using this image as their base tree + will not be able to install additional packages. ## v20.2 diff --git a/kernel-install/50-mkosi.install b/kernel-install/50-mkosi.install index aeb110f95..477a1f7cd 100644 --- a/kernel-install/50-mkosi.install +++ b/kernel-install/50-mkosi.install @@ -133,7 +133,7 @@ def main() -> None: "--format", str(format), "--output", output, "--workspace-dir=/var/tmp", - "--cache-dir=/var", + "--package-cache-dir=/var", "--output-dir", context.staging_area, "--extra-tree", f"/usr/lib/modules/{context.kernel_version}:/usr/lib/modules/{context.kernel_version}", "--extra-tree=/usr/lib/firmware:/usr/lib/firmware", diff --git a/mkosi/__init__.py b/mkosi/__init__.py index fcf6a86d2..582e782e3 100644 --- a/mkosi/__init__.py +++ b/mkosi/__init__.py @@ -48,7 +48,7 @@ from mkosi.config import ( ) from mkosi.context import Context from mkosi.distributions import Distribution -from mkosi.installer import clean_package_manager_metadata, finalize_package_manager_mounts +from mkosi.installer import clean_package_manager_metadata from mkosi.kmod import gen_required_kernel_modules, process_kernel_modules from mkosi.log import ARG_DEBUG, complete_step, die, 
log_notice, log_step from mkosi.manifest import Manifest @@ -67,6 +67,8 @@ from mkosi.tree import copy_tree, move_tree, rmtree from mkosi.types import PathString from mkosi.user import CLONE_NEWNS, INVOKING_USER, become_root, unshare from mkosi.util import ( + flatten, + flock, format_rlimit, make_executable, one_zero, @@ -463,7 +465,7 @@ def run_prepare_scripts(context: Context, build: bool) -> None: "--ro-bind", script, "/work/prepare", "--ro-bind", cd, "/work/scripts", "--bind", context.root, context.root, - *finalize_package_manager_mounts(context), + *context.config.distribution.package_manager(context.config).mounts(context), "--chdir", "/work/src", ], scripts=hd, @@ -545,7 +547,7 @@ def run_build_scripts(context: Context) -> None: if context.config.build_dir else [] ), - *finalize_package_manager_mounts(context), + *context.config.distribution.package_manager(context.config).mounts(context), "--chdir", "/work/src", ], scripts=hd, @@ -610,7 +612,7 @@ def run_postinst_scripts(context: Context) -> None: "--ro-bind", cd, "/work/scripts", "--bind", context.root, context.root, "--bind", context.staging, "/work/out", - *finalize_package_manager_mounts(context), + *context.config.distribution.package_manager(context.config).mounts(context), "--chdir", "/work/src", ], scripts=hd, @@ -671,7 +673,7 @@ def run_finalize_scripts(context: Context) -> None: "--ro-bind", cd, "/work/scripts", "--bind", context.root, context.root, "--bind", context.staging, "/work/out", - *finalize_package_manager_mounts(context), + *context.config.distribution.package_manager(context.config).mounts(context), "--chdir", "/work/src", ], scripts=hd, @@ -1498,7 +1500,8 @@ def build_default_initrd(context: Context) -> Path: "--cache-only", str(context.config.cache_only), "--output-dir", str(context.workspace / "initrd"), *(["--workspace-dir", str(context.config.workspace_dir)] if context.config.workspace_dir else []), - "--cache-dir", str(context.cache_dir), + *(["--cache-dir", 
str(context.config.cache_dir)] if context.config.cache_dir else []), + *(["--package-cache-dir", str(context.config.package_cache_dir)] if context.config.package_cache_dir else []), *(["--local-mirror", str(context.config.local_mirror)] if context.config.local_mirror else []), "--incremental", str(context.config.incremental), "--acl", str(context.config.acl), @@ -1554,7 +1557,16 @@ def build_default_initrd(context: Context) -> Path: complete_step("Building default initrd"), setup_workspace(args, config) as workspace, ): - build_image(Context(args, config, workspace=workspace, resources=context.resources)) + build_image( + Context( + args, + config, + workspace=workspace, + resources=context.resources, + # Re-use the repository metadata snapshot from the main image for the initrd. + package_cache_dir=context.package_cache_dir, + ) + ) return config.output_dir / config.output @@ -2932,6 +2944,41 @@ def setup_workspace(args: Args, config: Config) -> Iterator[Path]: raise +def copy_package_manager_state(context: Context) -> None: + if have_cache(context.config) or context.config.base_trees: + return + + subdir = context.config.distribution.package_manager(context.config).subdir(context.config) + + for d in ("cache", "lib"): + src = context.config.package_cache_dir_or_default() / d / subdir + if not src.exists(): + continue + + caches = context.config.distribution.package_manager(context.config).cache_subdirs(src) if d == "cache" else [] + + with tempfile.TemporaryDirectory() as tmp: + os.chmod(tmp, 0o755) + + # cp doesn't support excluding directories but we can imitate it by bind mounting an empty directory over + # the directories we want to exclude. 
+ exclude = flatten(["--ro-bind", tmp, os.fspath(p)] for p in caches) + + dst = context.root / "mkosi" / d / subdir + with umask(~0o755): + dst.mkdir(parents=True, exist_ok=True) + + with flock(src): + copy_tree( + src, dst, + tools=context.config.tools(), + preserve=False, + sandbox=context.sandbox( + options=["--ro-bind", src, src, "--bind", dst.parent, dst.parent, *exclude] + ), + ) + + def build_image(context: Context) -> None: manifest = Manifest(context.config) if context.config.manifest_format else None @@ -2941,6 +2988,11 @@ def build_image(context: Context) -> None: install_base_trees(context) cached = reuse_cache(context) + # The repository metadata is copied into the image root directory to ensure it remains static and available + # when using the image to build system extensions. This has to be ordered after setup() as cache keys might + # depend on config files created by the distribution's setup() method. + copy_package_manager_state(context) + context.config.distribution.setup(context) install_package_directories(context) @@ -3431,6 +3483,7 @@ def finalize_default_tools(args: Args, config: Config, *, resources: Path) -> Co *(["--output-dir", str(config.output_dir)] if config.output_dir else []), *(["--workspace-dir", str(config.workspace_dir)] if config.workspace_dir else []), *(["--cache-dir", str(config.cache_dir)] if config.cache_dir else []), + *(["--package-cache-dir", str(config.package_cache_dir)] if config.package_cache_dir else []), "--incremental", str(config.incremental), "--acl", str(config.acl), *([f"--package={package}" for package in config.tools_tree_packages]), @@ -3513,17 +3566,63 @@ def run_clean(args: Args, config: Config) -> None: with complete_step(f"Clearing out build directory of {config.name()} image…"): rmtree(*config.build_dir.iterdir()) - if remove_package_cache and config.cache_dir and config.cache_dir.exists() and any(config.cache_dir.iterdir()): + if ( + remove_package_cache and + config.package_cache_dir and + 
config.package_cache_dir.exists() and + any(config.package_cache_dir.iterdir()) + ): with complete_step(f"Clearing out package cache of {config.name()} image…"): rmtree( *( - config.cache_dir / p / d - for p in ("cache", "lib") - for d in ("apt", "dnf", "libdnf5", "pacman", "zypp") + config.package_cache_dir / d / config.distribution.package_manager(config).subdir(config) + for d in ("cache", "lib") ), ) +@contextlib.contextmanager +def rchown_package_manager_dirs(config: Config) -> Iterator[None]: + try: + yield + finally: + if INVOKING_USER.is_regular_user(): + with complete_step("Fixing ownership of package manager cache directory"): + subdir = config.distribution.package_manager(config).subdir(config) + for d in ("cache", "lib"): + INVOKING_USER.rchown(config.package_cache_dir_or_default() / d / subdir) + + +def sync_repository_metadata(args: Args, config: Config, *, resources: Path) -> None: + if have_cache(config) or config.cache_only or config.base_trees: + return + + with ( + complete_step(f"Syncing package manager metadata for {config.name()} image"), + prepend_to_environ_path(config), + rchown_package_manager_dirs(config), + setup_workspace(args, config) as workspace, + ): + context = Context( + args, + config, + workspace=workspace, + resources=resources, + package_cache_dir=config.package_cache_dir_or_default(), + ) + + install_package_manager_trees(context) + context.config.distribution.setup(context) + + subdir = context.config.distribution.package_manager(config).subdir(config) + + with ( + flock(context.config.package_cache_dir_or_default() / "cache" / subdir), + flock(context.config.package_cache_dir_or_default() / "lib" / subdir), + ): + context.config.distribution.sync(context) + + def run_build(args: Args, config: Config, *, resources: Path) -> None: check_inputs(config) @@ -3547,14 +3646,28 @@ def run_build(args: Args, config: Config, *, resources: Path) -> None: for p in ( config.output_dir, config.cache_dir, + 
config.package_cache_dir_or_default(), config.build_dir, config.workspace_dir, ): if p and not p.exists(): INVOKING_USER.mkdir(p) + subdir = config.distribution.package_manager(config).subdir(config) + + for d in ("cache", "lib"): + src = config.package_cache_dir_or_default() / d / subdir + INVOKING_USER.mkdir(src) + + sync_repository_metadata(args, config, resources=resources) + + src = config.package_cache_dir_or_default() / "cache" / subdir + for p in config.distribution.package_manager(config).cache_subdirs(src): + INVOKING_USER.mkdir(p) + with ( acl_toggle_build(config, INVOKING_USER.uid), + rchown_package_manager_dirs(config), setup_workspace(args, config) as workspace, ): build_image(Context(args, config, workspace=workspace, resources=resources)) diff --git a/mkosi/config.py b/mkosi/config.py index 51b8985bc..f8277066a 100644 --- a/mkosi/config.py +++ b/mkosi/config.py @@ -1171,6 +1171,7 @@ class Config: output_dir: Optional[Path] workspace_dir: Optional[Path] cache_dir: Optional[Path] + package_cache_dir: Optional[Path] build_dir: Optional[Path] image_id: Optional[str] image_version: Optional[str] @@ -1301,6 +1302,12 @@ class Config: return Path("/var/tmp") + def package_cache_dir_or_default(self) -> Path: + return ( + self.package_cache_dir or + (INVOKING_USER.cache_dir() / f"{self.distribution}~{self.release}~{self.architecture}") + ) + def tools(self) -> Path: return self.tools_tree or Path("/") @@ -1724,7 +1731,15 @@ SETTINGS = ( section="Output", parse=config_make_path_parser(required=False), paths=("mkosi.cache",), - help="Package cache path", + help="Incremental cache directory", + ), + ConfigSetting( + dest="package_cache_dir", + metavar="PATH", + name="PackageCacheDirectory", + section="Output", + parse=config_make_path_parser(required=False), + help="Package cache directory", ), ConfigSetting( dest="build_dir", @@ -3420,6 +3435,7 @@ def summary(config: Config) -> str: Output Directory: {config.output_dir_or_cwd()} Workspace Directory: 
{config.workspace_dir_or_default()} Cache Directory: {none_to_none(config.cache_dir)} + Package Cache Directory: {none_to_default(config.package_cache_dir)} Build Directory: {none_to_none(config.build_dir)} Image ID: {config.image_id} Image Version: {config.image_version} diff --git a/mkosi/context.py b/mkosi/context.py index 43aff6d09..2220ed10d 100644 --- a/mkosi/context.py +++ b/mkosi/context.py @@ -14,11 +14,20 @@ from mkosi.util import flatten, umask class Context: """State related properties.""" - def __init__(self, args: Args, config: Config, *, workspace: Path, resources: Path) -> None: + def __init__( + self, + args: Args, + config: Config, + *, + workspace: Path, + resources: Path, + package_cache_dir: Optional[Path] = None, + ) -> None: self.args = args self.config = config self.workspace = workspace self.resources = resources + self.package_cache_dir = package_cache_dir or (self.root / "mkosi") with umask(~0o755): # Using a btrfs subvolume as the upperdir in an overlayfs results in EXDEV so make sure we create @@ -38,7 +47,6 @@ class Context: (self.pkgmngr / "var/log").mkdir(parents=True) self.packages.mkdir() self.install_dir.mkdir(exist_ok=True) - self.cache_dir.mkdir(parents=True, exist_ok=True) @property def root(self) -> Path: @@ -56,10 +64,6 @@ class Context: def packages(self) -> Path: return self.workspace / "packages" - @property - def cache_dir(self) -> Path: - return self.config.cache_dir or (self.workspace / "cache") - @property def install_dir(self) -> Path: return self.workspace / "dest" diff --git a/mkosi/distributions/__init__.py b/mkosi/distributions/__init__.py index 80a408f0a..03aeed8b9 100644 --- a/mkosi/distributions/__init__.py +++ b/mkosi/distributions/__init__.py @@ -76,6 +76,10 @@ class DistributionInstaller: def createrepo(cls, context: "Context") -> None: raise NotImplementedError + @classmethod + def sync(cls, context: "Context") -> None: + raise NotImplementedError + class Distribution(StrEnum): # Please consult 
docs/distribution-policy.md and contact one @@ -157,6 +161,9 @@ class Distribution(StrEnum): def createrepo(self, context: "Context") -> None: return self.installer().createrepo(context) + def sync(self, context: "Context") -> None: + return self.installer().sync(context) + def installer(self) -> type[DistributionInstaller]: modname = str(self).replace('-', '_') mod = importlib.import_module(f"mkosi.distributions.{modname}") diff --git a/mkosi/distributions/arch.py b/mkosi/distributions/arch.py index e97b4b89b..7038db5f9 100644 --- a/mkosi/distributions/arch.py +++ b/mkosi/distributions/arch.py @@ -43,6 +43,10 @@ class Installer(DistributionInstaller): def setup(cls, context: Context) -> None: Pacman.setup(context, cls.repositories(context)) + @classmethod + def sync(cls, context: Context) -> None: + Pacman.sync(context) + @classmethod def install(cls, context: Context) -> None: cls.install_packages(context, ["filesystem"], apivfs=False) @@ -52,7 +56,7 @@ class Installer(DistributionInstaller): Pacman.invoke( context, "--sync", - ["--refresh", "--needed", "--assume-installed", "initramfs"], + ["--needed", "--assume-installed", "initramfs"], packages, apivfs=apivfs, ) @@ -66,9 +70,6 @@ class Installer(DistributionInstaller): if context.config.local_mirror: yield Pacman.Repository("core", context.config.local_mirror) else: - if context.want_local_repo(): - yield Pacman.localrepo() - if context.config.architecture == Architecture.arm64: url = f"{context.config.mirror or 'http://mirror.archlinuxarm.org'}/$arch/$repo" else: diff --git a/mkosi/distributions/centos.py b/mkosi/distributions/centos.py index 899fdbadf..9288237fd 100644 --- a/mkosi/distributions/centos.py +++ b/mkosi/distributions/centos.py @@ -75,6 +75,10 @@ class Installer(DistributionInstaller): Dnf.setup(context, cls.repositories(context)) (context.pkgmngr / "etc/dnf/vars/stream").write_text(f"{context.config.release}-stream\n") + @classmethod + def sync(cls, context: Context) -> None: + Dnf.sync(context) 
+ @classmethod def install(cls, context: Context) -> None: # Make sure glibc-minimal-langpack is installed instead of glibc-all-langpacks. @@ -228,9 +232,6 @@ class Installer(DistributionInstaller): yield from cls.repository_variants(context, "AppStream") return - if context.want_local_repo(): - yield Dnf.localrepo() - yield from cls.repository_variants(context, "BaseOS") yield from cls.repository_variants(context, "AppStream") yield from cls.repository_variants(context, "extras") diff --git a/mkosi/distributions/custom.py b/mkosi/distributions/custom.py index 3306c29ea..89f0ea38b 100644 --- a/mkosi/distributions/custom.py +++ b/mkosi/distributions/custom.py @@ -17,6 +17,10 @@ class Installer(DistributionInstaller): def setup(cls, context: Context) -> None: pass + @classmethod + def sync(cls, context: Context) -> None: + pass + @classmethod def install(cls, context: Context) -> None: pass diff --git a/mkosi/distributions/debian.py b/mkosi/distributions/debian.py index b3b39b0c4..6d4a232b4 100644 --- a/mkosi/distributions/debian.py +++ b/mkosi/distributions/debian.py @@ -57,9 +57,6 @@ class Installer(DistributionInstaller): ) return - if context.want_local_repo(): - yield Apt.localrepo(context) - mirror = context.config.mirror or "http://deb.debian.org/debian" signedby = "/usr/share/keyrings/debian-archive-keyring.gpg" @@ -110,6 +107,10 @@ class Installer(DistributionInstaller): def createrepo(cls, context: Context) -> None: Apt.createrepo(context) + @classmethod + def sync(cls, context: Context) -> None: + Apt.sync(context) + @classmethod def install(cls, context: Context) -> None: # Instead of using debootstrap, we replicate its core functionality here. Because dpkg does not have @@ -142,8 +143,6 @@ class Installer(DistributionInstaller): (context.root / d).symlink_to(f"usr/{d}") (context.root / f"usr/{d}").mkdir(parents=True, exist_ok=True) - Apt.invoke(context, "update", apivfs=False) - # Next, we invoke apt-get install to download all the essential packages. 
With DPkg::Pre-Install-Pkgs, # we specify a shell command that will receive the list of packages that will be installed on stdin. # By configuring Debug::pkgDpkgPm=1, apt-get install will not actually execute any dpkg commands, so @@ -168,12 +167,17 @@ class Installer(DistributionInstaller): # then extracting the tar file into the chroot. for deb in essential: - with ( - # The deb paths will be in the form of "/var/cache/apt/" so we transform them to the corresponding - # path in mkosi's package cache directory. - open(context.cache_dir / Path(deb).relative_to("/var"), "rb") as i, - tempfile.NamedTemporaryFile() as o - ): + # If a deb path is in the form of "/var/cache/apt/", we transform it to the corresponding path in + # mkosi's package cache directory. If it's relative to /work/packages, we transform it to the corresponding + # path in mkosi's local package repository. Otherwise, we use the path as is. + if Path(deb).is_relative_to("/var/cache"): + path = context.config.package_cache_dir_or_default() / Path(deb).relative_to("/var") + elif Path(deb).is_relative_to("/work/packages"): + path = context.packages / Path(deb).relative_to("/work/packages") + else: + path = Path(deb) + + with open(path, "rb") as i, tempfile.NamedTemporaryFile() as o: run(["dpkg-deb", "--fsys-tarfile", "/dev/stdin"], stdin=i, stdout=o, sandbox=context.sandbox()) extract_tar( Path(o.name), context.root, @@ -201,7 +205,6 @@ class Installer(DistributionInstaller): with umask(~0o644): policyrcd.write_text("#!/bin/sh\nexit 101\n") - Apt.invoke(context, "update", apivfs=False) Apt.invoke(context, "install", packages, apivfs=apivfs) install_apt_sources(context, cls.repositories(context, local=False)) diff --git a/mkosi/distributions/fedora.py b/mkosi/distributions/fedora.py index 97cfe4f2a..6bfaf9eeb 100644 --- a/mkosi/distributions/fedora.py +++ b/mkosi/distributions/fedora.py @@ -53,6 +53,10 @@ class Installer(DistributionInstaller): def setup(cls, context: Context) -> None: 
Dnf.setup(context, cls.repositories(context), filelists=False) + @classmethod + def sync(cls, context: Context) -> None: + Dnf.sync(context) + @classmethod def install(cls, context: Context) -> None: cls.install_packages(context, ["filesystem"], apivfs=False) @@ -78,9 +82,6 @@ class Installer(DistributionInstaller): yield RpmRepository("fedora", f"baseurl={context.config.local_mirror}", gpgurls) return - if context.want_local_repo(): - yield Dnf.localrepo() - if context.config.release == "eln": mirror = context.config.mirror or "https://odcs.fedoraproject.org/composes/production/latest-Fedora-ELN/compose" for repo in ("Appstream", "BaseOS", "Extras", "CRB"): diff --git a/mkosi/distributions/mageia.py b/mkosi/distributions/mageia.py index ea6e790f5..c1c11bcfd 100644 --- a/mkosi/distributions/mageia.py +++ b/mkosi/distributions/mageia.py @@ -6,7 +6,6 @@ from collections.abc import Iterable, Sequence from mkosi.config import Architecture from mkosi.context import Context from mkosi.distributions import Distribution, fedora, join_mirror -from mkosi.installer.dnf import Dnf from mkosi.installer.rpm import RpmRepository, find_rpm_gpgkey from mkosi.log import die @@ -51,9 +50,6 @@ class Installer(fedora.Installer): yield RpmRepository("core-release", f"baseurl={context.config.local_mirror}", gpgurls) return - if context.want_local_repo(): - yield Dnf.localrepo() - if context.config.mirror: url = f"baseurl={join_mirror(context.config.mirror, 'distrib/$releasever/$basearch/media/core/')}" yield RpmRepository("core-release", f"{url}/release", gpgurls) diff --git a/mkosi/distributions/openmandriva.py b/mkosi/distributions/openmandriva.py index aba616448..de2e2ade5 100644 --- a/mkosi/distributions/openmandriva.py +++ b/mkosi/distributions/openmandriva.py @@ -6,7 +6,6 @@ from collections.abc import Iterable, Sequence from mkosi.config import Architecture from mkosi.context import Context from mkosi.distributions import Distribution, fedora, join_mirror -from mkosi.installer.dnf 
import Dnf from mkosi.installer.rpm import RpmRepository, find_rpm_gpgkey from mkosi.log import die @@ -57,9 +56,6 @@ class Installer(fedora.Installer): yield RpmRepository("main-release", f"baseurl={context.config.local_mirror}", gpgurls) return - if context.want_local_repo(): - yield Dnf.localrepo() - url = f"baseurl={join_mirror(mirror, '$releasever/repository/$basearch/main')}" yield RpmRepository("main-release", f"{url}/release", gpgurls) yield RpmRepository("main-updates", f"{url}/updates", gpgurls) diff --git a/mkosi/distributions/opensuse.py b/mkosi/distributions/opensuse.py index 17b3e3b8b..e2e1fbc34 100644 --- a/mkosi/distributions/opensuse.py +++ b/mkosi/distributions/opensuse.py @@ -64,6 +64,13 @@ class Installer(DistributionInstaller): else: Dnf.setup(context, cls.repositories(context)) + @classmethod + def sync(cls, context: Context) -> None: + if find_binary("zypper", root=context.config.tools()): + Zypper.sync(context) + else: + Dnf.sync(context) + @classmethod def install(cls, context: Context) -> None: cls.install_packages(context, ["filesystem", "distribution-release"], apivfs=False) @@ -90,9 +97,6 @@ class Installer(DistributionInstaller): def repositories(cls, context: Context) -> Iterable[RpmRepository]: zypper = find_binary("zypper", root=context.config.tools()) - if context.want_local_repo(): - yield Zypper.localrepo() if zypper else Dnf.localrepo() - release = context.config.release if release == "leap": release = "stable" diff --git a/mkosi/distributions/ubuntu.py b/mkosi/distributions/ubuntu.py index 8c7d1e6a3..2f89e1251 100644 --- a/mkosi/distributions/ubuntu.py +++ b/mkosi/distributions/ubuntu.py @@ -36,9 +36,6 @@ class Installer(debian.Installer): ) return - if context.want_local_repo(): - yield Apt.localrepo(context) - if context.config.architecture in (Architecture.x86, Architecture.x86_64): mirror = context.config.mirror or "http://archive.ubuntu.com/ubuntu" else: diff --git a/mkosi/installer/__init__.py 
b/mkosi/installer/__init__.py index 2505376b9..d36fd076d 100644 --- a/mkosi/installer/__init__.py +++ b/mkosi/installer/__init__.py @@ -1,21 +1,61 @@ # SPDX-License-Identifier: LGPL-2.1+ +import os from pathlib import Path -from mkosi.config import ConfigFeature +from mkosi.config import Config, ConfigFeature, OutputFormat from mkosi.context import Context from mkosi.run import find_binary from mkosi.sandbox import finalize_crypto_mounts -from mkosi.tree import rmtree +from mkosi.tree import move_tree, rmtree from mkosi.types import PathString from mkosi.util import flatten class PackageManager: + @classmethod + def subdir(cls, config: Config) -> Path: + raise NotImplementedError + + @classmethod + def cache_subdirs(cls, cache: Path) -> list[Path]: + raise NotImplementedError + @classmethod def scripts(cls, context: Context) -> dict[str, list[PathString]]: raise NotImplementedError + @classmethod + def mounts(cls, context: Context) -> list[PathString]: + mounts: list[PathString] = [ + *(["--ro-bind", m, m] if (m := context.config.local_mirror) else []), + *finalize_crypto_mounts(tools=context.config.tools()), + "--bind", context.packages, "/work/packages", + ] + + subdir = context.config.distribution.package_manager(context.config).subdir(context.config) + + for d in ("cache", "lib"): + src = context.package_cache_dir / d / subdir + mounts += ["--bind", src, Path("/var") / d / subdir] + + # If we're not operating on the configured package cache directory, we're operating on a snapshot of the + # repository metadata in the image root directory. To make sure any downloaded packages are still cached in + # the configured package cache directory in this scenario, we mount in the relevant directories from the + # configured package cache directory. 
+ if d == "cache" and context.package_cache_dir != context.config.package_cache_dir_or_default(): + caches = context.config.distribution.package_manager(context.config).cache_subdirs(src) + mounts += flatten( + [ + "--bind", + os.fspath(context.config.package_cache_dir_or_default() / d / subdir / p.relative_to(src)), + Path("/var") / d / subdir / p.relative_to(src), + ] + for p in caches + ) + + return mounts + def clean_package_manager_metadata(context: Context) -> None: """ @@ -24,6 +64,25 @@ def clean_package_manager_metadata(context: Context) -> None: Try them all regardless of the distro: metadata is only removed if the package manager is not present in the image. """ + if ( + context.package_cache_dir.is_relative_to(context.root) and + not context.config.overlay and ( + context.config.clean_package_metadata != ConfigFeature.disabled or + context.config.output_format not in (OutputFormat.directory, OutputFormat.tar) + ) + ): + # Instead of removing the package cache directory from the image, we move it to the workspace so it stays + # available for later steps and is automatically removed along with the workspace when the build finishes. 
+ context.package_cache_dir = move_tree( + context.package_cache_dir, context.workspace / "package-cache-dir", + tools=context.config.tools(), + sandbox=context.sandbox( + options=[ + "--bind", context.package_cache_dir.parent, context.package_cache_dir.parent, + "--bind", context.workspace, context.workspace, + ], + ), + ) if context.config.clean_package_metadata == ConfigFeature.disabled: return @@ -37,28 +96,3 @@ def clean_package_manager_metadata(context: Context) -> None: if always or not find_binary(tool, root=context.root): rmtree(*(context.root / p for p in paths), sandbox=context.sandbox(options=["--bind", context.root, context.root])) - - -def finalize_package_manager_mounts(context: Context) -> list[PathString]: - from mkosi.installer.dnf import Dnf - - mounts: list[PathString] = [ - *(["--ro-bind", m, m] if (m := context.config.local_mirror) else []), - *finalize_crypto_mounts(tools=context.config.tools()), - "--bind", context.packages, "/work/packages", - ] - - mounts += flatten( - ["--bind", context.cache_dir / d, Path("/var") / d] - for d in ( - "lib/apt", - "cache/apt", - f"cache/{Dnf.subdir(context.config)}", - f"lib/{Dnf.subdir(context.config)}", - "cache/pacman/pkg", - "cache/zypp", - ) - if (context.cache_dir / d).exists() - ) - - return mounts diff --git a/mkosi/installer/apt.py b/mkosi/installer/apt.py index 97af70f4a..a498aaad0 100644 --- a/mkosi/installer/apt.py +++ b/mkosi/installer/apt.py @@ -1,10 +1,12 @@ # SPDX-License-Identifier: LGPL-2.1+ import textwrap from collections.abc import Iterable, Sequence +from pathlib import Path from typing import NamedTuple, Optional +from mkosi.config import Config from mkosi.context import Context -from mkosi.installer import PackageManager, finalize_package_manager_mounts +from mkosi.installer import PackageManager from mkosi.mounts import finalize_ephemeral_source_mounts from mkosi.run import find_binary, run from mkosi.sandbox import apivfs_cmd @@ -32,6 +34,14 @@ class Apt(PackageManager): """ ) + 
@classmethod + def subdir(cls, config: Config) -> Path: + return Path("apt") + + @classmethod + def cache_subdirs(cls, cache: Path) -> list[Path]: + return [cache / "archives"] + @classmethod def scripts(cls, context: Context) -> dict[str, list[PathString]]: return { @@ -55,13 +65,12 @@ class Apt(PackageManager): (context.pkgmngr / "etc/apt/preferences.d").mkdir(exist_ok=True, parents=True) (context.pkgmngr / "etc/apt/sources.list.d").mkdir(exist_ok=True, parents=True) - # TODO: Drop once apt 2.5.4 is widely available. with umask(~0o755): + # TODO: Drop once apt 2.5.4 is widely available. (context.root / "var/lib/dpkg").mkdir(parents=True, exist_ok=True) (context.root / "var/lib/dpkg/status").touch() - (context.cache_dir / "lib/apt").mkdir(exist_ok=True, parents=True) - (context.cache_dir / "cache/apt").mkdir(exist_ok=True, parents=True) + (context.package_cache_dir / "lib/apt/lists/partial").mkdir(parents=True, exist_ok=True) # We have a special apt.conf outside of pkgmngr dir that only configures "Dir::Etc" that we pass to APT_CONFIG # to tell apt it should read config files from /etc/apt in case this is overridden by distributions. 
This is @@ -144,18 +153,19 @@ class Apt(PackageManager): operation: str, packages: Sequence[str] = (), *, + options: Sequence[str] = (), apivfs: bool = True, mounts: Sequence[PathString] = (), ) -> None: with finalize_ephemeral_source_mounts(context.config) as sources: run( - cls.cmd(context, "apt-get") + [operation, *sort_packages(packages)], + cls.cmd(context, "apt-get") + [operation, *options, *sort_packages(packages)], sandbox=( context.sandbox( network=True, options=[ "--bind", context.root, context.root, - *finalize_package_manager_mounts(context), + *cls.mounts(context), *sources, *mounts, "--chdir", "/work/src", @@ -165,20 +175,44 @@ class Apt(PackageManager): env=context.config.environment, ) + @classmethod + def sync(cls, context: Context) -> None: + cls.invoke(context, "update") @classmethod def createrepo(cls, context: Context) -> None: with (context.packages / "Packages").open("wb") as f: - run(["dpkg-scanpackages", context.packages], - stdout=f, sandbox=context.sandbox(options=["--ro-bind", context.packages, context.packages])) + run( + ["dpkg-scanpackages", "."], + stdout=f, + sandbox=context.sandbox( + options=[ + "--ro-bind", context.packages, context.packages, + "--chdir", context.packages, + ], + ), + ) + (context.pkgmngr / "etc/apt/sources.list.d").mkdir(parents=True, exist_ok=True) + (context.pkgmngr / "etc/apt/sources.list.d/mkosi-local.sources").write_text( + textwrap.dedent( + """\ + Enabled: yes + Types: deb + URIs: file:///work/packages + Suites: ./ + Trusted: yes + """ + ) + ) - @classmethod - def localrepo(cls, context: Context) -> Repository: - return cls.Repository( - types=("deb",), - url="file:///work/packages", - suite=context.config.release, - components=("main",), - signedby=None, + cls.invoke( + context, + "update", + options=[ + "-o", "Dir::Etc::sourcelist=sources.list.d/mkosi-local.sources", + "-o", "Dir::Etc::sourceparts=-", + "-o", "APT::Get::List-Cleanup=0", + ], + apivfs=False, ) diff --git a/mkosi/installer/dnf.py 
b/mkosi/installer/dnf.py index 6a7ca56ae..9cf9ad735 100644 --- a/mkosi/installer/dnf.py +++ b/mkosi/installer/dnf.py @@ -1,11 +1,11 @@ # SPDX-License-Identifier: LGPL-2.1+ import textwrap -from collections.abc import Iterable +from collections.abc import Iterable, Sequence from pathlib import Path from mkosi.config import Config from mkosi.context import Context -from mkosi.installer import PackageManager, finalize_package_manager_mounts +from mkosi.installer import PackageManager from mkosi.installer.rpm import RpmRepository, fixup_rpmdb_location, rpm_cmd, setup_rpm from mkosi.log import ARG_DEBUG from mkosi.mounts import finalize_ephemeral_source_mounts @@ -28,6 +28,14 @@ class Dnf(PackageManager): def subdir(cls, config: Config) -> Path: return Path("libdnf5" if cls.executable(config) == "dnf5" else "dnf") + @classmethod + def cache_subdirs(cls, cache: Path) -> list[Path]: + return [ + p / "packages" + for p in cache.iterdir() + if p.is_dir() and "-" in p.name and "mkosi" not in p.name + ] + @classmethod def scripts(cls, context: Context) -> dict[str, list[PathString]]: return { @@ -37,11 +45,8 @@ class Dnf(PackageManager): @classmethod def setup(cls, context: Context, repositories: Iterable[RpmRepository], filelists: bool = True) -> None: - (context.pkgmngr / "etc/dnf/vars").mkdir(exist_ok=True, parents=True) - (context.pkgmngr / "etc/yum.repos.d").mkdir(exist_ok=True, parents=True) - - (context.cache_dir / "cache" / cls.subdir(context.config)).mkdir(exist_ok=True, parents=True) - (context.cache_dir / "lib" / cls.subdir(context.config)).mkdir(exist_ok=True, parents=True) + (context.pkgmngr / "etc/dnf/vars").mkdir(parents=True, exist_ok=True) + (context.pkgmngr / "etc/yum.repos.d").mkdir(parents=True, exist_ok=True) config = context.pkgmngr / "etc/dnf/dnf.conf" @@ -64,17 +69,12 @@ class Dnf(PackageManager): [{repo.id}] name={repo.id} {repo.url} - gpgcheck={int(repo.gpgcheck)} + gpgcheck=1 enabled={int(repo.enabled)} """ ) ) - if repo.metadata_expire is not None: 
- f.write(f"metadata_expire={repo.metadata_expire}\n") - if repo.priority is not None: - f.write(f"priority={repo.priority}\n") - if repo.sslcacert: f.write(f"sslcacert={repo.sslcacert}\n") if repo.sslclientcert: @@ -122,9 +122,12 @@ class Dnf(PackageManager): opt = "--enable-repo" if dnf.endswith("dnf5") else "--enablerepo" cmdline += [f"{opt}={repo}" for repo in context.config.repositories] - # TODO: this breaks with a local, offline repository created with 'createrepo' - if context.config.cache_only and not context.config.local_mirror: + if context.config.cache_only: cmdline += ["--cacheonly"] + else: + cmdline += ["--setopt=metadata_expire=never"] + if dnf == "dnf5": + cmdline += ["--setopt=cacheonly=metadata"] if not context.config.architecture.is_native(): cmdline += [f"--forcearch={context.config.distribution.architecture(context.config.architecture)}"] @@ -144,16 +147,23 @@ class Dnf(PackageManager): return cmdline @classmethod - def invoke(cls, context: Context, operation: str, packages: Iterable[str], apivfs: bool = True) -> None: + def invoke( + cls, + context: Context, + operation: str, + packages: Iterable[str] = (), + options: Sequence[str] = (), + apivfs: bool = True, + ) -> None: with finalize_ephemeral_source_mounts(context.config) as sources: run( - cls.cmd(context) + [operation, *sort_packages(packages)], + cls.cmd(context) + [operation, *options, *sort_packages(packages)], sandbox=( context.sandbox( network=True, options=[ "--bind", context.root, context.root, - *finalize_package_manager_mounts(context), + *cls.mounts(context), *sources, "--chdir", "/work/src", ], @@ -171,18 +181,35 @@ class Dnf(PackageManager): if any(p.name.startswith(prefix) for prefix in ("dnf", "hawkey", "yum")): p.unlink() + @classmethod + def sync(cls, context: Context, options: Sequence[str] = ()) -> None: + cls.invoke( + context, + "makecache", + options=[ + "--refresh", + *(["--setopt=cacheonly=none"] if cls.executable(context.config) == "dnf5" else []), + *options, + 
], + apivfs=False, + ) + @classmethod def createrepo(cls, context: Context) -> None: run(["createrepo_c", context.packages], sandbox=context.sandbox(options=["--bind", context.packages, context.packages])) - @classmethod - def localrepo(cls) -> RpmRepository: - return RpmRepository( - id="mkosi-packages", - url="baseurl=file:///work/packages", - gpgcheck=False, - gpgurls=(), - metadata_expire=0, - priority=50, + (context.pkgmngr / "etc/yum.repos.d/mkosi-local.repo").write_text( + textwrap.dedent( + """\ + [mkosi] + name=mkosi + baseurl=file:///work/packages + gpgcheck=0 + metadata_expire=never + priority=50 + """ + ) ) + + cls.sync(context, options=["--disablerepo=*", "--enablerepo=mkosi"]) diff --git a/mkosi/installer/pacman.py b/mkosi/installer/pacman.py index 0e138f67f..2dc4dd369 100644 --- a/mkosi/installer/pacman.py +++ b/mkosi/installer/pacman.py @@ -1,11 +1,13 @@ # SPDX-License-Identifier: LGPL-2.1+ +import shutil import textwrap from collections.abc import Iterable, Sequence from pathlib import Path from typing import NamedTuple +from mkosi.config import Config from mkosi.context import Context -from mkosi.installer import PackageManager, finalize_package_manager_mounts +from mkosi.installer import PackageManager from mkosi.mounts import finalize_ephemeral_source_mounts from mkosi.run import run from mkosi.sandbox import apivfs_cmd @@ -19,10 +21,31 @@ class Pacman(PackageManager): id: str url: str + @classmethod + def subdir(cls, config: Config) -> Path: + return Path("pacman") + + @classmethod + def cache_subdirs(cls, cache: Path) -> list[Path]: + return [cache / "pkg"] + @classmethod def scripts(cls, context: Context) -> dict[str, list[PathString]]: return {"pacman": apivfs_cmd(context.root) + cls.cmd(context)} + @classmethod + def mounts(cls, context: Context) -> list[PathString]: + return [ + *super().mounts(context), + # pacman reuses the same directory for the sync databases and the local database containing the list of + # installed packages. 
The former should go in the cache directory, the latter should go in the image, so we + # bind mount the local directory from the image to make sure that happens. + "--bind", context.root / "var/lib/pacman/local", "/var/lib/pacman/local", + # pacman writes downloaded packages to the first writable cache directory. We don't want it to write to our + # local repository directory so we expose it as a read-only directory to pacman. + "--ro-bind", context.packages, "/var/cache/pacman/mkosi", + ] + @classmethod def setup(cls, context: Context, repositories: Iterable[Repository]) -> None: if context.config.repository_key_check: @@ -32,11 +55,10 @@ class Pacman(PackageManager): # will be no signatures sig_level = "Never" - # Create base layout for pacman and pacman-key with umask(~0o755): - (context.root / "var/lib/pacman").mkdir(exist_ok=True, parents=True) + (context.root / "var/lib/pacman/local").mkdir(parents=True, exist_ok=True) - (context.cache_dir / "cache/pacman/pkg").mkdir(parents=True, exist_ok=True) + (context.pkgmngr / "etc/mkosi-local.conf").touch() config = context.pkgmngr / "etc/pacman.conf" if config.exists(): @@ -52,6 +74,10 @@ class Pacman(PackageManager): SigLevel = {sig_level} LocalFileSigLevel = Optional ParallelDownloads = 5 + Architecture = {context.config.distribution.architecture(context.config.architecture)} + + # This has to go first so that our local repository always takes precedence over any other ones. + Include = /etc/mkosi-local.conf """ ) ) @@ -83,6 +109,10 @@ class Pacman(PackageManager): "pacman", "--root", context.root, "--logfile=/dev/null", + "--dbpath=/var/lib/pacman", + # Make sure pacman looks at our local repository first by putting it as the first cache directory. We mount + # it read-only so the second directory will still be used for writing new cache entries.
+ "--cachedir=/var/cache/pacman/mkosi", "--cachedir=/var/cache/pacman/pkg", "--hookdir", context.root / "etc/pacman.d/hooks", "--arch", context.config.distribution.architecture(context.config.architecture), @@ -107,7 +137,7 @@ class Pacman(PackageManager): network=True, options=[ "--bind", context.root, context.root, - *finalize_package_manager_mounts(context), + *cls.mounts(context), *sources, "--chdir", "/work/src", ], @@ -117,15 +147,27 @@ class Pacman(PackageManager): ) @classmethod - def createrepo(cls, context: Context, *, force: bool = False) -> None: - run( - [ - "repo-add", - context.packages / "mkosi-packages.db.tar", - *sorted(context.packages.glob("*.pkg.tar*"), key=lambda p: GenericVersion(Path(p).name)), - ] - ) + def sync(cls, context: Context) -> None: + cls.invoke(context, "--sync", ["--refresh"], apivfs=False) @classmethod - def localrepo(cls) -> Repository: - return cls.Repository(id="mkosi-packages", url="file:///work/packages") + def createrepo(cls, context: Context) -> None: + run(["repo-add", "--quiet", context.packages / "mkosi.db.tar", + *sorted(context.packages.glob("*.pkg.tar*"), key=lambda p: GenericVersion(Path(p).name))]) + + (context.pkgmngr / "etc/mkosi-local.conf").write_text( + textwrap.dedent( + """\ + [mkosi] + Server = file:///i/dont/exist + SigLevel = Never + Usage = Install Search Upgrade + """ + ) + ) + + # pacman can't sync a single repository, so we go behind its back and do it ourselves. + shutil.move( + context.packages / "mkosi.db.tar", + context.package_cache_dir / "lib/pacman/sync/mkosi.db" + ) diff --git a/mkosi/installer/rpm.py b/mkosi/installer/rpm.py index a0eb68272..f388382e9 100644 --- a/mkosi/installer/rpm.py +++ b/mkosi/installer/rpm.py @@ -16,13 +16,10 @@ class RpmRepository(NamedTuple): id: str url: str gpgurls: tuple[str, ...] 
- gpgcheck: bool = True enabled: bool = True sslcacert: Optional[Path] = None sslclientkey: Optional[Path] = None sslclientcert: Optional[Path] = None - metadata_expire: Optional[int] = None - priority: Optional[int] = None def find_rpm_gpgkey(context: Context, key: str) -> Optional[str]: diff --git a/mkosi/installer/zypper.py b/mkosi/installer/zypper.py index e56269a4b..fd34efe7b 100644 --- a/mkosi/installer/zypper.py +++ b/mkosi/installer/zypper.py @@ -2,10 +2,11 @@ import hashlib import textwrap from collections.abc import Iterable, Sequence +from pathlib import Path -from mkosi.config import yes_no +from mkosi.config import Config, yes_no from mkosi.context import Context -from mkosi.installer import PackageManager, finalize_package_manager_mounts +from mkosi.installer import PackageManager from mkosi.installer.rpm import RpmRepository, fixup_rpmdb_location, rpm_cmd, setup_rpm from mkosi.mounts import finalize_ephemeral_source_mounts from mkosi.run import run @@ -15,6 +16,14 @@ from mkosi.util import sort_packages class Zypper(PackageManager): + @classmethod + def subdir(cls, config: Config) -> Path: + return Path("zypp") + + @classmethod + def cache_subdirs(cls, cache: Path) -> list[Path]: + return [cache / "packages"] + @classmethod def scripts(cls, context: Context) -> dict[str, list[PathString]]: return { @@ -27,8 +36,6 @@ class Zypper(PackageManager): config = context.pkgmngr / "etc/zypp/zypp.conf" config.parent.mkdir(exist_ok=True, parents=True) - (context.cache_dir / "cache/zypp").mkdir(exist_ok=True, parents=True) - # rpm.install.excludedocs can only be configured in zypp.conf so we append # to any user provided config file. Let's also bump the refresh delay to # the same default as dnf which is 48 hours. 
@@ -59,17 +66,14 @@ class Zypper(PackageManager): [{repo.id}-{key}] name={repo.id} {repo.url} - gpgcheck={int(repo.gpgcheck)} + gpgcheck=1 enabled={int(repo.enabled)} - autorefresh=1 + autorefresh=0 keeppackages=1 """ ) ) - if repo.priority is not None: - f.write(f"priority={repo.priority}\n") - for i, url in enumerate(repo.gpgurls): f.write("gpgkey=" if i == 0 else len("gpgkey=") * " ") f.write(f"{url}\n") @@ -89,6 +93,7 @@ class Zypper(PackageManager): "--cache-dir=/var/cache/zypp", "--gpg-auto-import-keys" if context.config.repository_key_check else "--no-gpg-checks", "--non-interactive", + "--no-refresh", ] @classmethod @@ -109,7 +114,7 @@ class Zypper(PackageManager): network=True, options=[ "--bind", context.root, context.root, - *finalize_package_manager_mounts(context), + *cls.mounts(context), *sources, "--chdir", "/work/src", ], @@ -120,17 +125,27 @@ class Zypper(PackageManager): fixup_rpmdb_location(context) + @classmethod + def sync(cls, context: Context) -> None: + cls.invoke(context, "refresh", apivfs=False) + @classmethod def createrepo(cls, context: Context) -> None: run(["createrepo_c", context.packages], sandbox=context.sandbox(options=["--bind", context.packages, context.packages])) - @classmethod - def localrepo(cls) -> RpmRepository: - return RpmRepository( - id="mkosi-packages", - url="baseurl=file:///work/packages", - gpgcheck=False, - gpgurls=(), - priority=50, + (context.pkgmngr / "etc/zypp/repos.d/mkosi-local.repo").write_text( + textwrap.dedent( + """\ + [mkosi] + name=mkosi + baseurl=file:///work/packages + gpgcheck=0 + autorefresh=0 + keeppackages=0 + priority=50 + """ + ) ) + + cls.invoke(context, "refresh", ["mkosi"], apivfs=False) diff --git a/mkosi/resources/mkosi.md b/mkosi/resources/mkosi.md index 1ca1e63ca..5476228b3 100644 --- a/mkosi/resources/mkosi.md +++ b/mkosi/resources/mkosi.md @@ -695,10 +695,17 @@ boolean argument: either `1`, `yes`, or `true` to enable, or `0`, `no`, `CacheDirectory=`, `--cache-dir=` -: Takes a path to a 
directory to use as package cache for the - distribution package manager used. If this option is not used, but a - `mkosi.cache/` directory is found in the local directory it is - automatically used for this purpose. +: Takes a path to a directory to use as the incremental cache directory + for the incremental images produced when the `Incremental=` option is + enabled. If this option is not used, but a `mkosi.cache/` directory is + found in the local directory it is automatically used for this + purpose. + +`PackageCacheDirectory=`, `--package-cache-dir=` + +: Takes a path to a directory to use as the package cache directory for + the distribution package manager used. If unset, a suitable directory + in the user's home directory or on the system is used. `BuildDirectory=`, `--build-dir=` @@ -911,8 +918,8 @@ boolean argument: either `1`, `yes`, or `true` to enable, or `0`, `no`, distribution instead of installing the distribution from scratch. Only extra packages are installed on top of the ones already installed in the base trees. Note that for this to work properly, the base image - still needs to contain the package manager metadata (see - `CleanPackageMetadata=`). + still needs to contain the package manager metadata by setting + `CleanPackageMetadata=no` (see `CleanPackageMetadata=`). : Instead of a directory, a tar file or a disk image may be provided. In this case it is unpacked into the OS tree. This mode of operation @@ -975,11 +982,19 @@ boolean argument: either `1`, `yes`, or `true` to enable, or `0`, `no`, `CleanPackageMetadata=`, `--clean-package-metadata=` -: Enable/disable removal of package manager databases at the end of - installation. Can be specified as `true`, `false`, or `auto` (the - default). With `auto`, files will be removed if the respective - package manager executable is *not* present at the end of the - installation. +: Enable/disable removal of package manager databases and repository + metadata in `/mkosi` at the end of installation.
Can be specified as + `true`, `false`, or `auto` (the default). With `auto`, package manager + databases will be removed if the respective package manager executable + is *not* present at the end of the installation. + +: Note that when not building a tar or directory image, the repository + metadata in `/mkosi` is always removed, regardless of this setting as + it is only useful for building extensions using `BaseTrees=`. + +: Note that when set to `auto`, repository metadata in `/mkosi` is + removed regardless of whether the respective package manager + executable is present or not. `PrepareScripts=`, `--prepare-script=` diff --git a/mkosi/user.py b/mkosi/user.py index 585b51cfc..83f34612e 100644 --- a/mkosi/user.py +++ b/mkosi/user.py @@ -62,6 +62,11 @@ class INVOKING_USER: run(["mkdir", "--parents", path], user=user, group=group) return path + @classmethod + def rchown(cls, path: Path) -> None: + if cls.is_regular_user() and path.is_relative_to(INVOKING_USER.home()) and path.exists(): + run(["chown", "--recursive", f"{INVOKING_USER.uid}:{INVOKING_USER.gid}", path]) + def read_subrange(path: Path) -> int: uid = str(os.getuid()) diff --git a/tests/test_json.py b/tests/test_json.py index b6dc04a28..aede03b29 100644 --- a/tests/test_json.py +++ b/tests/test_json.py @@ -182,6 +182,7 @@ def test_config() -> None: "Output": "outfile", "OutputDirectory": "/your/output/here", "Overlay": true, + "PackageCacheDirectory": "/a/b/c", "PackageDirectories": [], "PackageManagerTrees": [ { @@ -356,6 +357,7 @@ def test_config() -> None: output_dir = Path("/your/output/here"), output_format = OutputFormat.uki, overlay = True, + package_cache_dir = Path("/a/b/c"), package_directories = [], package_manager_trees = [ConfigTree(Path("/foo/bar"), None)], packages = [],
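The new `PackageCacheDirectory=` documentation above says that, when unset, a suitable per-user or system-wide directory is picked depending on how mkosi is invoked. As a rough sketch of that default resolution, the following could work; the function name `default_package_cache_dir` and the exact paths are illustrative assumptions, not the actual `Config.package_cache_dir_or_default()` implementation from this patch:

```python
import os
from pathlib import Path


def default_package_cache_dir() -> Path:
    """Illustrative sketch of a shared package cache default: a system-wide
    directory when running as root, a per-user XDG cache directory otherwise.

    The real logic lives in mkosi's Config.package_cache_dir_or_default();
    the paths used here are assumptions for illustration only.
    """
    if os.getuid() == 0:
        # Invoked as root: share the cache system-wide across all projects.
        return Path("/var/cache/mkosi")
    # Invoked as a regular user: honor XDG_CACHE_HOME, fall back to ~/.cache.
    cache_home = os.environ.get("XDG_CACHE_HOME", "")
    base = Path(cache_home) if cache_home else Path.home() / ".cache"
    return base / "mkosi"
```

Either way, the point of the rework is that this directory is shared across all mkosi projects, so repository metadata and packages are downloaded once instead of once per project.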