Initial commit: DarkForge Linux — Phases 0-12
Complete from-scratch Linux distribution targeting AMD Ryzen 9 9950X3D + NVIDIA RTX 5090 on ASUS ROG CROSSHAIR X870E HERO.

Deliverables:
- dpack: custom package manager in Rust (3,800 lines)
- TOML package parser, dependency resolver, build sandbox
- CRUX Pkgfile and Gentoo ebuild converters
- Shared library conflict detection
- 124 package definitions across 4 repos (core/extra/desktop/gaming)
- 34 toolchain bootstrap scripts (LFS 13.0 adapted for Zen 5)
- Linux 6.19.8 kernel config (hardware-specific, fully commented)
- SysVinit init system with rc.d service scripts
- Live ISO builder (UEFI-only, squashfs+xorriso)
- Interactive installer (GPT partitioning, EFISTUB boot)
- Integration test checklist (docs/TESTING.md)

No systemd. No bootloader. No display manager. Kernel boots via EFISTUB → auto-login → dwl Wayland compositor.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
src/dpack/.gitignore (vendored, new file, +1)
@@ -0,0 +1 @@
/target/
src/dpack/Cargo.lock (generated, new file, +2212)
File diff suppressed because it is too large.
src/dpack/Cargo.toml (new file, +42)
@@ -0,0 +1,42 @@
[package]
name = "dpack"
version = "0.1.0"
edition = "2021"
description = "DarkForge Linux package manager — between CRUX pkgutils and Gentoo emerge"
license = "MIT"
authors = ["Danny"]

[dependencies]
# TOML parsing for package definitions and database
toml = "0.8"
serde = { version = "1", features = ["derive"] }
serde_json = "1"

# CLI argument parsing
clap = { version = "4", features = ["derive"] }

# Error handling
anyhow = "1"
thiserror = "2"

# File operations and checksums
sha2 = "0.10"
walkdir = "2"

# HTTP for source downloads
reqwest = { version = "0.12", features = ["blocking", "rustls-tls"], default-features = false }

# Logging
log = "0.4"
env_logger = "0.11"

# Colorized terminal output
colored = "2"

# Regex for converter modules
regex = "1"

[dev-dependencies]
tempfile = "3"
assert_cmd = "2"
predicates = "3"
src/dpack/README.md (new file, +198)
@@ -0,0 +1,198 @@
# dpack — DarkForge Package Manager

A source-based package manager for DarkForge Linux, positioned between CRUX's `pkgutils` and Gentoo's `emerge` in complexity. Written in Rust.

## Features

- **TOML package definitions** — clean, readable package recipes
- **Dependency resolution** — topological sort with circular dependency detection
- **Build sandboxing** — bubblewrap (bwrap) isolation with PID/network namespaces
- **Installed package database** — file-based TOML tracking in `/var/lib/dpack/db/`
- **Full build orchestration** — download → checksum → extract → sandbox build → stage → commit → register
- **CRUX Pkgfile converter** — convert CRUX ports to dpack format
- **Gentoo ebuild converter** — best-effort conversion of Gentoo ebuilds (handles ~80% of cases)
- **Shared library conflict detection** — ELF binary scanning via readelf/objdump
- **Reverse dependency tracking** — warns before removing packages that others depend on

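The dependency-resolution feature above (topological sort with circular dependency detection) can be sketched with a minimal, std-only variant of Kahn's algorithm. This is an illustration, not dpack's actual resolver API; the name `topo_sort` is hypothetical.

```rust
use std::collections::HashMap;

/// Return an install order in which every package comes after its
/// dependencies, or Err(names) for packages stuck in a cycle.
fn topo_sort<'a>(deps: &HashMap<&'a str, Vec<&'a str>>) -> Result<Vec<&'a str>, Vec<&'a str>> {
    // Number of not-yet-installed dependencies per package (Kahn's algorithm).
    let mut pending: HashMap<&str, usize> =
        deps.iter().map(|(k, v)| (*k, v.len())).collect();
    let mut order = Vec::new();

    while !pending.is_empty() {
        // Pick any package whose dependencies are all satisfied.
        let next = match pending.iter().find(|&(_, &n)| n == 0).map(|(k, _)| *k) {
            Some(name) => name,
            // No zero-dependency package left: the remainder forms a cycle.
            None => return Err(pending.keys().copied().collect()),
        };
        pending.remove(next);
        order.push(next);
        // Installing `next` unblocks everything that depends on it.
        for (pkg, n) in pending.iter_mut() {
            if deps[pkg].contains(&next) {
                *n -= 1;
            }
        }
    }
    Ok(order)
}

fn main() {
    let mut deps = HashMap::new();
    deps.insert("zlib", vec![]);
    deps.insert("openssl", vec!["zlib"]);
    deps.insert("curl", vec!["openssl", "zlib"]);

    let order = topo_sort(&deps).expect("no cycles");
    let pos = |name| order.iter().position(|&p| p == name).unwrap();
    assert!(pos("zlib") < pos("openssl"));
    assert!(pos("openssl") < pos("curl"));
    println!("{}", order.join(" -> "));
}
```

A cyclic input (e.g. `a → b → a`) returns `Err` listing the offending packages instead of looping forever.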
## Requirements

- Rust 1.75+ (build)
- Linux (runtime — uses Linux namespaces for sandboxing)
- bubblewrap (`bwrap`) for sandboxed builds (optional, falls back to direct execution)
- `curl` or `wget` for source downloads
- `tar` for source extraction
- `readelf` or `objdump` for shared library scanning

## Building

```bash
cd src/dpack
cargo build --release
```

The binary is at `target/release/dpack`. Install it:

```bash
sudo install -m755 target/release/dpack /usr/local/bin/
```

## Usage

```bash
# Install a package (resolves deps, builds in sandbox, installs, updates db)
dpack install zlib

# Install multiple packages
dpack install openssl curl git

# Remove a package (warns about reverse deps, removes files)
dpack remove zlib

# Upgrade packages (compares installed vs repo versions)
dpack upgrade                # upgrade all outdated packages
dpack upgrade openssl git    # upgrade specific packages

# Search for packages
dpack search compression

# Show package info
dpack info zlib

# List installed packages
dpack list

# Check for file conflicts and shared library issues
dpack check

# Convert foreign package formats
dpack convert /path/to/Pkgfile                           # CRUX → dpack TOML (stdout)
dpack convert /path/to/curl-8.19.0.ebuild -o curl.toml   # Gentoo → dpack TOML (file)
```

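The reverse-dependency warning that `dpack remove` performs boils down to scanning the installed database for packages whose runtime deps include the removal target. A std-only sketch (the function name `reverse_deps` is illustrative, not dpack's API):

```rust
use std::collections::HashMap;

/// Given each installed package's runtime deps, list the packages that
/// would break if `target` were removed.
fn reverse_deps<'a>(installed: &HashMap<&'a str, Vec<&'a str>>, target: &str) -> Vec<&'a str> {
    let mut dependents: Vec<&str> = installed
        .iter()
        .filter(|(_, deps)| deps.iter().any(|d| *d == target))
        .map(|(name, _)| *name)
        .collect();
    dependents.sort(); // stable order for display
    dependents
}

fn main() {
    let mut installed = HashMap::new();
    installed.insert("zlib", vec![]);
    installed.insert("openssl", vec!["zlib"]);
    installed.insert("curl", vec!["openssl", "zlib"]);

    let blockers = reverse_deps(&installed, "zlib");
    assert_eq!(blockers, vec!["curl", "openssl"]);
    println!("removing zlib would break: {}", blockers.join(", "));
}
```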
## Configuration

dpack reads its configuration from `/etc/dpack.conf` (TOML format). If the file doesn't exist, sensible defaults are used.

Example `/etc/dpack.conf`:

```toml
[flags]
cflags = "-march=znver5 -O2 -pipe -fomit-frame-pointer"
cxxflags = "-march=znver5 -O2 -pipe -fomit-frame-pointer"
ldflags = "-Wl,-O1,--as-needed"
makeflags = "-j32"

[paths]
db_dir = "/var/lib/dpack/db"
repo_dir = "/var/lib/dpack/repos"
source_dir = "/var/cache/dpack/sources"
build_dir = "/var/tmp/dpack/build"

[sandbox]
enabled = true
allow_network = false
bwrap_path = "/usr/bin/bwrap"

[[repos]]
name = "core"
path = "/var/lib/dpack/repos/core"
priority = 0

[[repos]]
name = "extra"
path = "/var/lib/dpack/repos/extra"
priority = 10

[[repos]]
name = "desktop"
path = "/var/lib/dpack/repos/desktop"
priority = 20

[[repos]]
name = "gaming"
path = "/var/lib/dpack/repos/gaming"
priority = 30
```

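When a package exists in more than one repo, the repo with the lowest `priority` value wins (so `core` at 0 shadows `extra` at 10). A std-only sketch of that lookup rule, with repo contents as plain lists for illustration (real dpack checks for `<repo>/<name>/<name>.toml` on disk):

```rust
/// Pick the providing repo for a package: lowest `priority` number wins.
fn find_repo<'a>(repos: &'a [(&'a str, u32, Vec<&'a str>)], pkg: &str) -> Option<&'a str> {
    let mut candidates: Vec<_> = repos
        .iter()
        .filter(|(_, _, pkgs)| pkgs.iter().any(|p| *p == pkg))
        .collect();
    // Lower priority value = higher precedence (core=0 beats extra=10).
    candidates.sort_by_key(|(_, prio, _)| *prio);
    candidates.first().map(|(name, _, _)| *name)
}

fn main() {
    let repos = vec![
        ("extra", 10, vec!["curl", "git"]),
        ("core", 0, vec!["zlib", "curl"]),
    ];
    // curl exists in both; core wins because its priority value is lower.
    assert_eq!(find_repo(&repos, "curl"), Some("core"));
    assert_eq!(find_repo(&repos, "git"), Some("extra"));
    assert_eq!(find_repo(&repos, "nginx"), None);
}
```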
## Package Definition Format

Package definitions are TOML files stored at `<repo>/<name>/<name>.toml`:

```toml
[package]
name = "zlib"
version = "1.3.1"
description = "Compression library implementing the deflate algorithm"
url = "https://zlib.net/"
license = "zlib"

[source]
url = "https://zlib.net/zlib-${version}.tar.xz"
sha256 = "38ef96b8dfe510d42707d9c781877914792541133e1870841463bfa73f883e32"

[dependencies]
run = []
build = ["gcc", "make"]

[dependencies.optional]
static = { description = "Build static library", default = true }
minizip = { description = "Build minizip utility", deps = [] }

[build]
system = "autotools"
configure = "./configure --prefix=/usr"
make = "make"
install = "make DESTDIR=${PKG} install"

[build.flags]
cflags = ""   # empty = use global defaults
ldflags = ""
```

### Variables available in build commands

- `${PKG}` — staging directory (DESTDIR)
- `${version}` — package version (expanded in the source URL)

### Build systems

The `system` field is a hint: `autotools`, `cmake`, `meson`, `cargo`, or `custom`.

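The `${version}` substitution described above is plain string replacement, and the cached filename is whatever follows the last `/` in the expanded URL. A std-only sketch (`expand_source_url` is an illustrative name, not dpack's API):

```rust
/// Expand `${version}` in a source URL template and derive the filename
/// the tarball would be cached under.
fn expand_source_url(template: &str, version: &str) -> (String, String) {
    let url = template.replace("${version}", version);
    // Everything after the final '/' is the cache filename.
    let filename = url.rsplit('/').next().unwrap_or("source.tar.gz").to_string();
    (url, filename)
}

fn main() {
    let (url, file) = expand_source_url("https://zlib.net/zlib-${version}.tar.xz", "1.3.1");
    assert_eq!(url, "https://zlib.net/zlib-1.3.1.tar.xz");
    assert_eq!(file, "zlib-1.3.1.tar.xz");
    println!("{url} -> {file}");
}
```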
## Running Tests

```bash
cargo test
```

Tests cover: TOML parsing, dependency resolution (simple, diamond, circular), database operations (register, unregister, persistence, file ownership, conflicts), and converter parsing.

## Architecture

```
src/
├── main.rs          # CLI (clap) — install, remove, upgrade, search, info, list, convert, check
├── lib.rs           # Library re-exports
├── config/
│   ├── mod.rs       # Module root
│   ├── package.rs   # PackageDefinition TOML structs + parsing + validation
│   └── global.rs    # DpackConfig (flags, paths, sandbox, repos)
├── resolver/
│   ├── mod.rs       # DependencyGraph, topological sort, reverse deps
│   └── solib.rs     # Shared library conflict detection (ELF scanning)
├── sandbox/
│   └── mod.rs       # BuildSandbox (bubblewrap + direct backends)
├── converter/
│   ├── mod.rs       # Format auto-detection
│   ├── crux.rs      # CRUX Pkgfile parser
│   └── gentoo.rs    # Gentoo ebuild parser
├── db/
│   └── mod.rs       # PackageDb (file-based TOML, installed tracking)
└── build/
    └── mod.rs       # BuildOrchestrator (download → build → install pipeline)
```

## Repository

```
git@git.dannyhaslund.dk:danny8632/dpack.git
```
src/dpack/src/build/mod.rs (new file, +339)
@@ -0,0 +1,339 @@
//! Package build orchestration.
//!
//! Coordinates the full install pipeline:
//! 1. Resolve dependencies (via `resolver`)
//! 2. Download source tarball
//! 3. Verify SHA256 checksum
//! 4. Extract source
//! 5. Build in sandbox (via `sandbox`)
//! 6. Collect installed files from staging
//! 7. Commit files to the live filesystem
//! 8. Update the package database (via `db`)

use anyhow::{bail, Context, Result};
use sha2::{Digest, Sha256};
use std::io::Read;
use std::path::{Path, PathBuf};
use std::time::{SystemTime, UNIX_EPOCH};

use crate::config::{DpackConfig, PackageDefinition};
use crate::db::{InstalledPackage, PackageDb};
use crate::resolver::{DependencyGraph, ResolvedPackage};
use crate::sandbox::{self, BuildSandbox};

/// Orchestrate the full install of one or more packages.
pub struct BuildOrchestrator {
    config: DpackConfig,
    db: PackageDb,
}

impl BuildOrchestrator {
    /// Create a new orchestrator with the given config and database.
    pub fn new(config: DpackConfig, db: PackageDb) -> Self {
        Self { config, db }
    }

    /// Install packages by name. Resolves deps, builds, installs.
    pub fn install(&mut self, package_names: &[String]) -> Result<()> {
        log::info!("Resolving dependencies for: {:?}", package_names);

        // Load all repos
        let mut all_packages = std::collections::HashMap::new();
        for repo in &self.config.repos {
            let repo_pkgs = DependencyGraph::load_repo(&repo.path)?;
            all_packages.extend(repo_pkgs);
        }

        let installed_versions = self.db.installed_versions();
        let graph = DependencyGraph::new(all_packages.clone(), installed_versions);

        let plan = graph.resolve(package_names, &std::collections::HashMap::new())?;

        if plan.build_order.is_empty() {
            println!("All requested packages are already installed.");
            return Ok(());
        }

        // Report the plan
        if !plan.already_installed.is_empty() {
            println!("Already installed: {}", plan.already_installed.join(", "));
        }

        println!("Build order ({} packages):", plan.build_order.len());
        for (i, pkg) in plan.build_order.iter().enumerate() {
            let marker = if pkg.build_only { " [build-only]" } else { "" };
            println!("  {}. {}-{}{}", i + 1, pkg.name, pkg.version, marker);
        }
        println!();

        // Build each package in order
        for resolved in &plan.build_order {
            let pkg_def = all_packages.get(&resolved.name).with_context(|| {
                format!("Package '{}' disappeared from repo", resolved.name)
            })?;

            self.build_and_install(pkg_def, resolved)?;
        }

        println!("All packages installed successfully.");
        Ok(())
    }

    /// Build and install a single package.
    fn build_and_install(
        &mut self,
        pkg: &PackageDefinition,
        resolved: &ResolvedPackage,
    ) -> Result<()> {
        let ident = pkg.ident();
        println!(">>> Building {}", ident);

        // Step 1: Download source
        let source_path = self.download_source(pkg)?;

        // Step 2: Verify checksum
        self.verify_checksum(&source_path, &pkg.source.sha256)?;

        // Step 3: Extract source
        let build_dir = self.config.paths.build_dir.join(&ident);
        let staging_dir = self.config.paths.build_dir.join(format!("{}-staging", ident));

        // Clean any previous attempt
        let _ = std::fs::remove_dir_all(&build_dir);
        let _ = std::fs::remove_dir_all(&staging_dir);

        self.extract_source(&source_path, &build_dir)?;

        // Step 4: Locate the actual source directory
        // (tarballs often have a top-level dir; patch application will hook in here)
        let actual_build_dir = find_source_dir(&build_dir)?;

        // Step 5: Build in sandbox
        let sandbox = BuildSandbox::new(&self.config, pkg, &actual_build_dir, &staging_dir)?;

        sandbox.run_build(pkg)?;

        // Step 6: Collect installed files
        let staged_files = sandbox::collect_staged_files(&staging_dir)?;

        if staged_files.is_empty() {
            log::warn!("No files were installed by {} — is the install step correct?", ident);
        }

        // Step 7: Commit files to the live filesystem
        self.commit_staged_files(&staging_dir)?;

        // Step 8: Update database
        let size = calculate_dir_size(&staging_dir);
        let record = InstalledPackage {
            name: pkg.package.name.clone(),
            version: pkg.package.version.clone(),
            description: pkg.package.description.clone(),
            run_deps: pkg.effective_run_deps(&resolved.features),
            build_deps: pkg.effective_build_deps(&resolved.features),
            features: resolved.features.clone(),
            files: staged_files,
            installed_at: SystemTime::now()
                .duration_since(UNIX_EPOCH)
                .unwrap()
                .as_secs(),
            repo: "core".to_string(), // TODO: track actual repo
            size,
        };

        self.db.register(record)?;

        // Cleanup
        let _ = std::fs::remove_dir_all(&build_dir);
        let _ = std::fs::remove_dir_all(&staging_dir);

        println!(">>> {} installed successfully", ident);
        Ok(())
    }

    /// Download the source tarball to the source cache.
    fn download_source(&self, pkg: &PackageDefinition) -> Result<PathBuf> {
        let url = pkg.expanded_source_url();
        let filename = url.rsplit('/').next().unwrap_or("source.tar.gz");
        let dest = self.config.paths.source_dir.join(filename);

        std::fs::create_dir_all(&self.config.paths.source_dir)?;

        if dest.exists() {
            log::info!("Source already cached: {}", dest.display());
            return Ok(dest);
        }

        log::info!("Downloading: {}", url);

        // Use curl/wget via subprocess for now — avoids pulling in reqwest
        // at build time for the bootstrap phase
        let status = std::process::Command::new("curl")
            .args(["-fLo", &dest.to_string_lossy(), &url])
            .status()
            .or_else(|_| {
                std::process::Command::new("wget")
                    .args(["-O", &dest.to_string_lossy(), &url])
                    .status()
            })
            .context("Neither curl nor wget available for downloading")?;

        if !status.success() {
            bail!("Download failed for: {}", url);
        }

        Ok(dest)
    }

    /// Verify the SHA256 checksum of a file.
    fn verify_checksum(&self, path: &Path, expected: &str) -> Result<()> {
        log::info!("Verifying checksum: {}", path.display());

        let mut file = std::fs::File::open(path)
            .with_context(|| format!("Failed to open: {}", path.display()))?;

        let mut hasher = Sha256::new();
        let mut buffer = [0u8; 8192];

        loop {
            let n = file.read(&mut buffer)?;
            if n == 0 {
                break;
            }
            hasher.update(&buffer[..n]);
        }

        let actual = format!("{:x}", hasher.finalize());

        if actual != expected {
            bail!(
                "Checksum mismatch for {}: expected {}, got {}",
                path.display(),
                expected,
                actual
            );
        }

        log::info!("Checksum verified: {}", actual);
        Ok(())
    }

    /// Extract a source tarball into the build directory.
    fn extract_source(&self, tarball: &Path, build_dir: &Path) -> Result<()> {
        std::fs::create_dir_all(build_dir)?;

        let tarball_str = tarball.to_string_lossy();
        let build_str = build_dir.to_string_lossy();

        // Determine tar flags based on extension
        let tar_flags = if tarball_str.ends_with(".tar.xz") || tarball_str.ends_with(".txz") {
            "-xJf"
        } else if tarball_str.ends_with(".tar.gz") || tarball_str.ends_with(".tgz") {
            "-xzf"
        } else if tarball_str.ends_with(".tar.bz2") || tarball_str.ends_with(".tbz2") {
            "-xjf"
        } else if tarball_str.ends_with(".tar.zst") {
            "--zstd -xf"
        } else {
            "-xf"
        };

        let status = std::process::Command::new("tar")
            // Split on whitespace so multi-token flag sets ("--zstd -xf")
            // are passed as separate arguments, not one opaque string
            .args(tar_flags.split_whitespace())
            .arg(&*tarball_str)
            .arg("-C")
            .arg(&*build_str)
            .status()
            .context("Failed to run tar")?;

        if !status.success() {
            bail!("Failed to extract: {}", tarball.display());
        }

        Ok(())
    }

    /// Copy staged files from the staging directory to the live filesystem root.
    fn commit_staged_files(&self, staging_dir: &Path) -> Result<()> {
        if !staging_dir.exists() {
            return Ok(());
        }

        // Walk the staging tree and copy each file to its target location
        for entry in walkdir::WalkDir::new(staging_dir)
            .min_depth(1)
            .into_iter()
            .filter_map(|e| e.ok())
        {
            let rel = entry.path().strip_prefix(staging_dir)?;
            let target = Path::new("/").join(rel);

            if entry.file_type().is_dir() {
                std::fs::create_dir_all(&target).ok();
            } else if entry.file_type().is_file() {
                if let Some(parent) = target.parent() {
                    std::fs::create_dir_all(parent).ok();
                }
                std::fs::copy(entry.path(), &target).with_context(|| {
                    format!(
                        "Failed to install file: {} -> {}",
                        entry.path().display(),
                        target.display()
                    )
                })?;
            }
        }

        Ok(())
    }

    /// Get a reference to the database.
    pub fn db(&self) -> &PackageDb {
        &self.db
    }

    /// Get a mutable reference to the database.
    pub fn db_mut(&mut self) -> &mut PackageDb {
        &mut self.db
    }
}

/// Find the actual source directory inside the extraction directory.
/// Tarballs usually contain a top-level directory (e.g., `zlib-1.3.1/`).
fn find_source_dir(build_dir: &Path) -> Result<PathBuf> {
    let entries: Vec<_> = std::fs::read_dir(build_dir)?
        .filter_map(|e| e.ok())
        .filter(|e| e.file_type().map_or(false, |t| t.is_dir()))
        .collect();

    if entries.len() == 1 {
        Ok(entries[0].path())
    } else {
        // No single top-level directory — use the build dir itself
        Ok(build_dir.to_path_buf())
    }
}

/// Calculate the total size of files in a directory.
fn calculate_dir_size(dir: &Path) -> u64 {
    walkdir::WalkDir::new(dir)
        .into_iter()
        .filter_map(|e| e.ok())
        .filter(|e| e.file_type().is_file())
        .map(|e| e.metadata().map_or(0, |m| m.len()))
        .sum()
}
src/dpack/src/config/global.rs (new file, +265)
@@ -0,0 +1,265 @@
//! Global dpack configuration (`/etc/dpack.conf`).
//!
//! Controls compiler flags, repository paths, sandbox settings, and other
//! system-wide package manager behavior.

use anyhow::{Context, Result};
use serde::{Deserialize, Serialize};
use std::path::{Path, PathBuf};

/// Default configuration file location
pub const DEFAULT_CONFIG_PATH: &str = "/etc/dpack.conf";

/// Default package database location
pub const DEFAULT_DB_PATH: &str = "/var/lib/dpack/db";

/// Default repository root
pub const DEFAULT_REPO_PATH: &str = "/var/lib/dpack/repos";

/// Default source download cache
pub const DEFAULT_SOURCE_DIR: &str = "/var/cache/dpack/sources";

/// Default build directory
pub const DEFAULT_BUILD_DIR: &str = "/var/tmp/dpack/build";

/// Global dpack configuration.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct DpackConfig {
    /// Compiler and linker flags
    #[serde(default)]
    pub flags: GlobalFlags,

    /// Paths for repos, database, sources, build directory
    #[serde(default)]
    pub paths: PathConfig,

    /// Sandbox configuration
    #[serde(default)]
    pub sandbox: SandboxConfig,

    /// Repository configuration
    #[serde(default)]
    pub repos: Vec<RepoConfig>,
}

/// Global compiler flags — applied to all packages unless overridden.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GlobalFlags {
    /// C compiler flags (e.g., "-march=znver5 -O2 -pipe -fomit-frame-pointer")
    pub cflags: String,

    /// C++ compiler flags (defaults to same as cflags)
    pub cxxflags: String,

    /// Linker flags (e.g., "-Wl,-O1,--as-needed")
    pub ldflags: String,

    /// Make flags (e.g., "-j32")
    pub makeflags: String,
}

impl Default for GlobalFlags {
    fn default() -> Self {
        // DarkForge defaults — Zen 5 optimized
        Self {
            cflags: "-march=znver5 -O2 -pipe -fomit-frame-pointer".to_string(),
            cxxflags: "-march=znver5 -O2 -pipe -fomit-frame-pointer".to_string(),
            ldflags: "-Wl,-O1,--as-needed".to_string(),
            makeflags: "-j32".to_string(),
        }
    }
}

/// File system paths used by dpack.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PathConfig {
    /// Path to the installed package database
    pub db_dir: PathBuf,

    /// Path to the package repository definitions
    pub repo_dir: PathBuf,

    /// Path to cache downloaded source tarballs
    pub source_dir: PathBuf,

    /// Path for build sandboxes / staging areas
    pub build_dir: PathBuf,
}

impl Default for PathConfig {
    fn default() -> Self {
        Self {
            db_dir: PathBuf::from(DEFAULT_DB_PATH),
            repo_dir: PathBuf::from(DEFAULT_REPO_PATH),
            source_dir: PathBuf::from(DEFAULT_SOURCE_DIR),
            build_dir: PathBuf::from(DEFAULT_BUILD_DIR),
        }
    }
}

/// Build sandbox configuration.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SandboxConfig {
    /// Enable sandboxing (mount/PID/net namespaces via bubblewrap)
    pub enabled: bool,

    /// Allow network access during build (some packages need it)
    pub allow_network: bool,

    /// Path to bubblewrap binary
    pub bwrap_path: PathBuf,
}

impl Default for SandboxConfig {
    fn default() -> Self {
        Self {
            enabled: true,
            allow_network: false,
            bwrap_path: PathBuf::from("/usr/bin/bwrap"),
        }
    }
}

/// A package repository definition.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct RepoConfig {
    /// Repository name (e.g., "core", "extra", "gaming")
    pub name: String,

    /// Path to the repository directory containing package definitions
    pub path: PathBuf,

    /// Priority (lower = higher priority for conflict resolution)
    #[serde(default)]
    pub priority: u32,
}

impl DpackConfig {
    /// Load configuration from a TOML file.
    pub fn from_file(path: &Path) -> Result<Self> {
        let content = std::fs::read_to_string(path)
            .with_context(|| format!("Failed to read config: {}", path.display()))?;
        toml::from_str(&content).context("Failed to parse dpack configuration")
    }

    /// Load from the default location, or return defaults if not found.
    pub fn load_default() -> Self {
        let path = Path::new(DEFAULT_CONFIG_PATH);
        if path.exists() {
            Self::from_file(path).unwrap_or_else(|e| {
                log::warn!("Failed to load {}: {}, using defaults", path.display(), e);
                Self::default()
            })
        } else {
            Self::default()
        }
    }

    /// Get the effective CFLAGS for a package (package override or global default).
    pub fn effective_cflags<'a>(&'a self, pkg_override: &'a str) -> &'a str {
        if pkg_override.is_empty() {
            &self.flags.cflags
        } else {
            pkg_override
        }
    }

    /// Get the effective LDFLAGS for a package.
    pub fn effective_ldflags<'a>(&'a self, pkg_override: &'a str) -> &'a str {
        if pkg_override.is_empty() {
            &self.flags.ldflags
        } else {
            pkg_override
        }
    }

    /// Find a package definition across all configured repos.
    /// Returns the first match by repo priority.
    pub fn find_package(&self, name: &str) -> Option<PathBuf> {
        let mut repos = self.repos.clone();
        repos.sort_by_key(|r| r.priority);

        for repo in &repos {
            let pkg_path = repo.path.join(name).join(format!("{}.toml", name));
            if pkg_path.exists() {
                return Some(pkg_path);
            }
        }
        None
    }
}

impl Default for DpackConfig {
    fn default() -> Self {
        Self {
            flags: GlobalFlags::default(),
            paths: PathConfig::default(),
            sandbox: SandboxConfig::default(),
            repos: vec![
                RepoConfig {
                    name: "core".to_string(),
                    path: PathBuf::from("/var/lib/dpack/repos/core"),
                    priority: 0,
                },
                RepoConfig {
                    name: "extra".to_string(),
                    path: PathBuf::from("/var/lib/dpack/repos/extra"),
                    priority: 10,
                },
                RepoConfig {
                    name: "desktop".to_string(),
                    path: PathBuf::from("/var/lib/dpack/repos/desktop"),
                    priority: 20,
                },
                RepoConfig {
                    name: "gaming".to_string(),
                    path: PathBuf::from("/var/lib/dpack/repos/gaming"),
                    priority: 30,
                },
            ],
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_default_config() {
        let config = DpackConfig::default();
        assert_eq!(config.flags.cflags, "-march=znver5 -O2 -pipe -fomit-frame-pointer");
        assert_eq!(config.flags.makeflags, "-j32");
        assert!(config.sandbox.enabled);
        assert!(!config.sandbox.allow_network);
        assert_eq!(config.repos.len(), 4);
        assert_eq!(config.repos[0].name, "core");
    }

    #[test]
    fn test_effective_cflags_default() {
        let config = DpackConfig::default();
        assert_eq!(
            config.effective_cflags(""),
            "-march=znver5 -O2 -pipe -fomit-frame-pointer"
        );
    }

    #[test]
    fn test_effective_cflags_override() {
        let config = DpackConfig::default();
        assert_eq!(
            config.effective_cflags("-march=znver5 -O3 -pipe"),
            "-march=znver5 -O3 -pipe"
        );
    }

    #[test]
    fn test_config_roundtrip() {
        let config = DpackConfig::default();
        let toml_str = toml::to_string_pretty(&config).unwrap();
        let reparsed: DpackConfig = toml::from_str(&toml_str).unwrap();
        assert_eq!(reparsed.flags.cflags, config.flags.cflags);
        assert_eq!(reparsed.repos.len(), config.repos.len());
    }
}
src/dpack/src/config/mod.rs (new file, +19)
@@ -0,0 +1,19 @@
//! Configuration and package definition parsing.
//!
//! Handles reading `.toml` package definition files and the global dpack
//! configuration. The package definition format is documented in CLAUDE.md §dpack.
//!
//! # Package Definition Format
//!
//! Package definitions are TOML files with these sections:
//! - `[package]` — name, version, description, URL, license
//! - `[source]` — download URL and SHA256 checksum
//! - `[dependencies]` — runtime, build, and optional dependencies
//! - `[build]` — configure, make, and install commands
//! - `[build.flags]` — per-package compiler flag overrides

pub mod package;
pub mod global;

pub use package::PackageDefinition;
pub use global::DpackConfig;
364
src/dpack/src/config/package.rs
Normal file
364
src/dpack/src/config/package.rs
Normal file
@@ -0,0 +1,364 @@
|
||||
//! Package definition structs and TOML parsing.
|
||||
//!
|
||||
//! A `.toml` package definition describes how to download, build, and install
|
||||
//! a single software package. This module defines the Rust structs that map
|
||||
//! to the TOML schema, plus loading/validation logic.

use anyhow::{Context, Result};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::path::Path;

/// Top-level package definition — the entire contents of a `.toml` file.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PackageDefinition {
    pub package: PackageMetadata,
    pub source: SourceInfo,
    pub dependencies: Dependencies,
    pub build: BuildInstructions,
}

/// The `[package]` section — basic metadata.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PackageMetadata {
    /// Package name (must be unique within a repository)
    pub name: String,

    /// Package version (semver or upstream version string)
    pub version: String,

    /// Short description of the package
    pub description: String,

    /// Upstream project URL
    pub url: String,

    /// License identifier (SPDX preferred)
    pub license: String,

    /// Optional epoch for version comparison when upstream resets versions
    #[serde(default)]
    pub epoch: u32,

    /// Package revision (for repackaging without upstream version change)
    #[serde(default = "default_revision")]
    pub revision: u32,
}

fn default_revision() -> u32 {
    1
}

/// The `[source]` section — where to get the source code.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SourceInfo {
    /// Download URL. May contain `${version}`, which is expanded at runtime.
    pub url: String,

    /// SHA256 checksum of the source tarball
    pub sha256: String,

    /// Optional: additional source files or patches to download
    #[serde(default)]
    pub patches: Vec<PatchInfo>,
}

/// A patch to apply to the source before building.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PatchInfo {
    /// Download URL for the patch
    pub url: String,

    /// SHA256 checksum of the patch file
    pub sha256: String,

    /// Strip level for `patch -p<N>` (default: 1)
    #[serde(default = "default_strip")]
    pub strip: u32,
}

fn default_strip() -> u32 {
    1
}

/// The `[dependencies]` section — what this package needs.
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct Dependencies {
    /// Runtime dependencies (must be installed for the package to function)
    #[serde(default)]
    pub run: Vec<String>,

    /// Build-time dependencies (only needed during compilation)
    #[serde(default)]
    pub build: Vec<String>,

    /// Optional features — maps feature name to its definition
    #[serde(default)]
    pub optional: HashMap<String, OptionalDep>,
}

/// An optional dependency / feature flag (inspired by Gentoo USE flags).
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OptionalDep {
    /// Human-readable description of what this feature does
    pub description: String,

    /// Whether this feature is enabled by default
    #[serde(default)]
    pub default: bool,

    /// Additional runtime dependencies required by this feature
    #[serde(default)]
    pub deps: Vec<String>,

    /// Additional build-time dependencies required by this feature
    #[serde(default)]
    pub build_deps: Vec<String>,
}

/// The `[build]` section — how to compile and install.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct BuildInstructions {
    /// Configure command (e.g., `./configure --prefix=/usr`).
    /// May be empty for packages that don't need configuration.
    #[serde(default)]
    pub configure: String,

    /// Build command (e.g., `make`)
    #[serde(default = "default_make")]
    pub make: String,

    /// Install command (e.g., `make DESTDIR=${PKG} install`)
    pub install: String,

    /// Optional: commands to run before configure (e.g., autoreconf, patching)
    #[serde(default)]
    pub prepare: String,

    /// Optional: commands to run after install (e.g., cleanup, stripping)
    #[serde(default)]
    pub post_install: String,

    /// Optional: custom test command
    #[serde(default)]
    pub check: String,

    /// Per-package compiler flag overrides
    #[serde(default)]
    pub flags: BuildFlags,

    /// Build system type hint (autotools, cmake, meson, cargo, custom)
    #[serde(default)]
    pub system: BuildSystem,
}

fn default_make() -> String {
    "make".to_string()
}

/// Per-package compiler flag overrides.
/// Empty strings mean "use global defaults from dpack.conf".
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct BuildFlags {
    #[serde(default)]
    pub cflags: String,

    #[serde(default)]
    pub cxxflags: String,

    #[serde(default)]
    pub ldflags: String,

    #[serde(default)]
    pub makeflags: String,
}

/// Hint for the build system used by this package.
#[derive(Debug, Clone, Serialize, Deserialize, Default, PartialEq)]
#[serde(rename_all = "lowercase")]
pub enum BuildSystem {
    #[default]
    Autotools,
    Cmake,
    Meson,
    Cargo,
    Custom,
}

impl PackageDefinition {
    /// Load a package definition from a `.toml` file.
    pub fn from_file(path: &Path) -> Result<Self> {
        let content = std::fs::read_to_string(path)
            .with_context(|| format!("Failed to read package file: {}", path.display()))?;
        Self::from_str(&content)
    }

    /// Parse a package definition from a TOML string.
    pub fn from_str(content: &str) -> Result<Self> {
        let pkg: Self = toml::from_str(content)
            .context("Failed to parse package definition TOML")?;
        pkg.validate()?;
        Ok(pkg)
    }

    /// Serialize this definition back to TOML.
    pub fn to_toml(&self) -> Result<String> {
        toml::to_string_pretty(self).context("Failed to serialize package definition")
    }

    /// Validate the package definition for correctness.
    fn validate(&self) -> Result<()> {
        anyhow::ensure!(!self.package.name.is_empty(), "Package name cannot be empty");
        anyhow::ensure!(!self.package.version.is_empty(), "Package version cannot be empty");
        anyhow::ensure!(!self.source.url.is_empty(), "Source URL cannot be empty");
        anyhow::ensure!(
            self.source.sha256.len() == 64
                && self.source.sha256.chars().all(|c| c.is_ascii_hexdigit()),
            "SHA256 checksum must be exactly 64 hex characters, got: '{}'",
            self.source.sha256
        );
        anyhow::ensure!(!self.build.install.is_empty(), "Install command cannot be empty");

        // Validate optional dep names don't contain spaces or special chars
        for name in self.dependencies.optional.keys() {
            anyhow::ensure!(
                name.chars().all(|c| c.is_alphanumeric() || c == '_' || c == '-'),
                "Optional dependency name '{}' contains invalid characters",
                name
            );
        }

        Ok(())
    }

    /// Expand `${version}` in the source URL.
    pub fn expanded_source_url(&self) -> String {
        self.source.url.replace("${version}", &self.package.version)
    }

    /// Get all runtime dependencies, including those from enabled optional features.
    pub fn effective_run_deps(&self, enabled_features: &[String]) -> Vec<String> {
        let mut deps = self.dependencies.run.clone();
        for feature in enabled_features {
            if let Some(opt) = self.dependencies.optional.get(feature) {
                deps.extend(opt.deps.clone());
            }
        }
        deps
    }

    /// Get all build dependencies, including those from enabled optional features.
    pub fn effective_build_deps(&self, enabled_features: &[String]) -> Vec<String> {
        let mut deps = self.dependencies.build.clone();
        for feature in enabled_features {
            if let Some(opt) = self.dependencies.optional.get(feature) {
                deps.extend(opt.build_deps.clone());
            }
        }
        deps
    }

    /// Get the list of default-enabled features.
    pub fn default_features(&self) -> Vec<String> {
        self.dependencies
            .optional
            .iter()
            .filter(|(_, v)| v.default)
            .map(|(k, _)| k.clone())
            .collect()
    }

    /// Full identifier: "name-version"
    pub fn ident(&self) -> String {
        format!("{}-{}", self.package.name, self.package.version)
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    const SAMPLE_TOML: &str = r#"
[package]
name = "zlib"
version = "1.3.1"
description = "Compression library implementing the deflate algorithm"
url = "https://zlib.net/"
license = "zlib"

[source]
url = "https://zlib.net/zlib-${version}.tar.xz"
sha256 = "38ef96b8dfe510d42707d9c781877914792541133e1870841463bfa73f883e32"

[dependencies]
run = []
build = ["gcc", "make"]

[dependencies.optional]
static = { description = "Build static library", default = true }
minizip = { description = "Build minizip utility", deps = [] }

[build]
configure = "./configure --prefix=/usr"
make = "make"
install = "make DESTDIR=${PKG} install"
"#;

    #[test]
    fn test_parse_zlib() {
        let pkg = PackageDefinition::from_str(SAMPLE_TOML).unwrap();
        assert_eq!(pkg.package.name, "zlib");
        assert_eq!(pkg.package.version, "1.3.1");
        assert_eq!(pkg.package.license, "zlib");
        assert_eq!(pkg.dependencies.build, vec!["gcc", "make"]);
        assert!(pkg.dependencies.optional.contains_key("static"));
        assert!(pkg.dependencies.optional.contains_key("minizip"));
    }

    #[test]
    fn test_expanded_source_url() {
        let pkg = PackageDefinition::from_str(SAMPLE_TOML).unwrap();
        assert_eq!(
            pkg.expanded_source_url(),
            "https://zlib.net/zlib-1.3.1.tar.xz"
        );
    }

    #[test]
    fn test_default_features() {
        let pkg = PackageDefinition::from_str(SAMPLE_TOML).unwrap();
        let defaults = pkg.default_features();
        assert!(defaults.contains(&"static".to_string()));
        assert!(!defaults.contains(&"minizip".to_string()));
    }

    #[test]
    fn test_effective_deps() {
        let pkg = PackageDefinition::from_str(SAMPLE_TOML).unwrap();
        let run_deps = pkg.effective_run_deps(&["minizip".to_string()]);
        // minizip has empty deps, so run_deps should still be empty
        assert!(run_deps.is_empty());
    }

    #[test]
    fn test_invalid_sha256() {
        let bad_toml = SAMPLE_TOML.replace(
            "38ef96b8dfe510d42707d9c781877914792541133e1870841463bfa73f883e32",
            "bad",
        );
        assert!(PackageDefinition::from_str(&bad_toml).is_err());
    }

    #[test]
    fn test_empty_name() {
        let bad_toml = SAMPLE_TOML.replace("name = \"zlib\"", "name = \"\"");
        assert!(PackageDefinition::from_str(&bad_toml).is_err());
    }

    #[test]
    fn test_roundtrip_toml() {
        let pkg = PackageDefinition::from_str(SAMPLE_TOML).unwrap();
        let serialized = pkg.to_toml().unwrap();
        let reparsed = PackageDefinition::from_str(&serialized).unwrap();
        assert_eq!(pkg.package.name, reparsed.package.name);
        assert_eq!(pkg.package.version, reparsed.package.version);
    }
}
432
src/dpack/src/converter/crux.rs
Normal file
@@ -0,0 +1,432 @@
//! CRUX Pkgfile converter.
//!
//! Parses CRUX `Pkgfile` format (bash-like syntax) and emits a dpack
//! `PackageDefinition`. Handles the common patterns:
//! - Variable assignments: `name=`, `version=`, `release=`, `source=()`
//! - Comment metadata: `# Description:`, `# URL:`, `# Depends on:`
//! - Build function: `build() { ... }`
//!
//! CRUX Pkgfile format reference:
//! - Variables are plain bash assignments
//! - `source=()` is a bash array of URLs (may span multiple lines)
//! - `build()` contains the full build logic
//! - Dependencies are in comments, not formal fields
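//!
//! A minimal Pkgfile of the shape this module handles (illustrative, not a
//! real port):
//!
//! ```text
//! # Description: Example library
//! # URL: https://example.org/
//! # Depends on: gcc
//!
//! name=foo
//! version=1.0
//! release=1
//!
//! source=(https://example.org/$name-$version.tar.xz)
//!
//! build() {
//!     cd $name-$version
//!     ./configure --prefix=/usr
//!     make
//!     make DESTDIR=$PKG install
//! }
//! ```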

use anyhow::Result;
use regex::Regex;
use std::collections::HashMap;

use crate::config::package::*;

/// Parse a CRUX Pkgfile string into a dpack PackageDefinition.
pub fn parse_pkgfile(content: &str) -> Result<PackageDefinition> {
    let mut name = String::new();
    let mut version = String::new();
    let mut release = 1u32;
    let mut description = String::new();
    let mut url = String::new();
    let mut _maintainer = String::new();
    let mut depends: Vec<String> = Vec::new();
    let mut optional_deps: Vec<String> = Vec::new();

    // --- Extract comment metadata ---
    for line in content.lines() {
        let trimmed = line.trim();
        if let Some(desc) = trimmed.strip_prefix("# Description:") {
            description = desc.trim().to_string();
        } else if let Some(u) = trimmed.strip_prefix("# URL:") {
            url = u.trim().to_string();
        } else if let Some(m) = trimmed.strip_prefix("# Maintainer:") {
            _maintainer = m.trim().to_string();
        } else if let Some(d) = trimmed.strip_prefix("# Depends on:") {
            depends = d
                .split([',', ' '])
                .map(|s| s.trim().to_string())
                .filter(|s| !s.is_empty())
                .collect();
        } else if let Some(o) = trimmed.strip_prefix("# Optional:") {
            optional_deps = o
                .split([',', ' '])
                .map(|s| s.trim().to_string())
                .filter(|s| !s.is_empty())
                .collect();
        }
    }

    // --- Extract variable assignments ---
    // name=value (with or without quotes)
    let var_re = Regex::new(r#"^(\w+)=["']?([^"'\n]*)["']?\s*$"#).unwrap();

    for line in content.lines() {
        let trimmed = line.trim();
        if let Some(caps) = var_re.captures(trimmed) {
            let key = caps.get(1).unwrap().as_str();
            let val = caps.get(2).unwrap().as_str().trim();
            match key {
                "name" => name = val.to_string(),
                "version" => version = val.to_string(),
                "release" => release = val.parse().unwrap_or(1),
                _ => {}
            }
        }
    }

    // --- Extract source array ---
    let source_urls = extract_source_array(content);

    // --- Extract build function ---
    let build_body = extract_build_function(content);

    // --- Parse build commands from the build function ---
    let (configure_cmd, make_cmd, install_cmd, prepare_cmd) =
        parse_build_commands(&build_body, &name, &version);

    // --- Expand source URL (replace $name, $version, ${name}, ${version}) ---
    let primary_source = source_urls.first().cloned().unwrap_or_default();

    let expanded_url = expand_crux_vars(&primary_source, &name, &version);

    // Convert to template URL (replace version back with ${version})
    let template_url = expanded_url.replace(&version, "${version}");

    // --- Build the PackageDefinition ---
    let mut optional_map = HashMap::new();
    for opt in &optional_deps {
        optional_map.insert(
            opt.clone(),
            OptionalDep {
                description: format!("Optional: {} support", opt),
                default: false,
                deps: vec![opt.clone()],
                build_deps: vec![],
            },
        );
    }

    Ok(PackageDefinition {
        package: PackageMetadata {
            name: name.clone(),
            version: version.clone(),
            description,
            url,
            license: String::new(), // CRUX doesn't track license in Pkgfile
            epoch: 0,
            revision: release,
        },
        source: SourceInfo {
            // Placeholder; the real tarball checksum must be filled in by hand.
            sha256: "0".repeat(64),
            url: template_url,
            patches: vec![],
        },
        dependencies: Dependencies {
            run: depends,
            build: vec![], // CRUX doesn't distinguish build vs runtime deps
            optional: optional_map,
        },
        build: BuildInstructions {
            configure: configure_cmd,
            make: make_cmd,
            install: install_cmd,
            prepare: prepare_cmd,
            post_install: String::new(),
            check: String::new(),
            flags: BuildFlags::default(),
            system: detect_build_system(&build_body),
        },
    })
}

/// Extract the `source=()` array from a Pkgfile.
/// Handles single-line and multi-line arrays.
fn extract_source_array(content: &str) -> Vec<String> {
    let mut sources = Vec::new();
    let mut in_source = false;
    let mut source_text = String::new();

    for line in content.lines() {
        let trimmed = line.trim();

        if trimmed.starts_with("source=") || trimmed.starts_with("source =") {
            in_source = true;
            // Take everything after `source=`
            let after_eq = trimmed.splitn(2, '=').nth(1).unwrap_or("");
            source_text.push_str(after_eq);
            if after_eq.contains(')') {
                in_source = false;
            }
        } else if in_source {
            source_text.push(' ');
            source_text.push_str(trimmed);
            if trimmed.contains(')') {
                in_source = false;
            }
        }
    }

    // Strip parens and parse individual URLs
    let cleaned = source_text
        .trim_start_matches('(')
        .trim_end_matches(')')
        .trim();

    for url in cleaned.split_whitespace() {
        let u = url.trim().to_string();
        if !u.is_empty() {
            sources.push(u);
        }
    }

    sources
}

/// Extract the build() function body from a Pkgfile.
fn extract_build_function(content: &str) -> String {
    let mut in_build = false;
    let mut brace_depth = 0;
    let mut body = String::new();

    for line in content.lines() {
        let trimmed = line.trim();

        if !in_build && (trimmed.starts_with("build()") || trimmed.starts_with("build ()")) {
            in_build = true;
            // Count braces on this line
            for ch in trimmed.chars() {
                match ch {
                    '{' => brace_depth += 1,
                    '}' => brace_depth -= 1,
                    _ => {}
                }
            }
            continue;
        }

        if in_build {
            for ch in trimmed.chars() {
                match ch {
                    '{' => brace_depth += 1,
                    '}' => brace_depth -= 1,
                    _ => {}
                }
            }

            if brace_depth <= 0 {
                break;
            }

            body.push_str(trimmed);
            body.push('\n');
        }
    }

    body
}

/// Parse configure/make/install commands from the build function body.
fn parse_build_commands(
    body: &str,
    _name: &str,
    _version: &str,
) -> (String, String, String, String) {
    let mut configure = String::new();
    let mut make = String::new();
    let mut install = String::new();
    let mut prepare = String::new();

    let mut continuation = String::new();

    for line in body.lines() {
        let trimmed = line.trim();

        // Handle line continuations
        if trimmed.ends_with('\\') {
            continuation.push_str(&trimmed[..trimmed.len() - 1]);
            continuation.push(' ');
            continue;
        }

        let full_line = if !continuation.is_empty() {
            let result = format!("{}{}", continuation, trimmed);
            continuation.clear();
            result
        } else {
            trimmed.to_string()
        };

        let fl = full_line.trim();

        // Detect configure-like commands
        if fl.starts_with("./configure")
            || fl.starts_with("../configure")
            || fl.starts_with("cmake")
            || fl.starts_with("meson setup")
            || fl.starts_with("meson ")
        {
            // Replace $PKG with ${PKG} for dpack template
            configure = fl.replace("$PKG", "${PKG}");
        }
        // Detect install commands
        else if (fl.contains("DESTDIR=") && fl.contains("install"))
            || fl.starts_with("make install")
            || fl.starts_with("make DESTDIR")
            || fl.starts_with("meson install")
            || fl.starts_with("DESTDIR=")
            || (fl.starts_with("ninja -C") && fl.contains("install"))
        {
            install = fl.replace("$PKG", "${PKG}");
        }
        // Detect make/build commands
        else if fl == "make"
            || fl.starts_with("make -")
            || (fl.starts_with("make ") && !fl.contains("install"))
        {
            make = fl.to_string();
        } else if fl.starts_with("meson compile")
            || (fl.starts_with("ninja") && !fl.contains("install"))
        {
            make = fl.to_string();
        }
        // Detect prepare steps (patching, sed, autoreconf)
        else if fl.starts_with("sed ") || fl.starts_with("patch ") || fl.starts_with("autoreconf") {
            if !prepare.is_empty() {
                prepare.push_str(" && ");
            }
            prepare.push_str(fl);
        }
    }

    // Default make if not found
    if make.is_empty() {
        make = "make".to_string();
    }

    // Default install if not found
    if install.is_empty() {
        install = "make DESTDIR=${PKG} install".to_string();
    }

    (configure, make, install, prepare)
}

/// Expand CRUX variables in a string ($name, $version, ${name}, ${version}).
fn expand_crux_vars(s: &str, name: &str, version: &str) -> String {
    s.replace("$name", name)
        .replace("${name}", name)
        .replace("$version", version)
        .replace("${version}", version)
}

/// Detect the build system from the build function body.
fn detect_build_system(body: &str) -> BuildSystem {
    if body.contains("meson setup") || body.contains("meson compile") {
        BuildSystem::Meson
    } else if body.contains("cmake") || body.contains("CMakeLists") {
        BuildSystem::Cmake
    } else if body.contains("cargo build") || body.contains("cargo install") {
        BuildSystem::Cargo
    } else if body.contains("./configure") || body.contains("../configure") {
        BuildSystem::Autotools
    } else {
        BuildSystem::Custom
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    const SAMPLE_PKGFILE: &str = r#"# Description: Compression library
# URL: https://zlib.net/
# Maintainer: Danny, danny@example.com
# Depends on: gcc

name=zlib
version=1.3.1
release=1

source=(https://zlib.net/$name-$version.tar.xz)

build() {
    cd $name-$version

    ./configure --prefix=/usr

    make
    make DESTDIR=$PKG install
}
"#;

    #[test]
    fn test_parse_simple_pkgfile() {
        let pkg = parse_pkgfile(SAMPLE_PKGFILE).unwrap();
        assert_eq!(pkg.package.name, "zlib");
        assert_eq!(pkg.package.version, "1.3.1");
        assert_eq!(pkg.package.description, "Compression library");
        assert_eq!(pkg.package.url, "https://zlib.net/");
        assert_eq!(pkg.dependencies.run, vec!["gcc"]);
        assert_eq!(pkg.build.configure, "./configure --prefix=/usr");
        assert_eq!(pkg.build.install, "make DESTDIR=${PKG} install");
    }

    #[test]
    fn test_source_url_expansion() {
        let pkg = parse_pkgfile(SAMPLE_PKGFILE).unwrap();
        let expanded = pkg.expanded_source_url();
        assert_eq!(expanded, "https://zlib.net/zlib-1.3.1.tar.xz");
    }

    const COMPLEX_PKGFILE: &str = r#"# Description: A tool for transferring files with URL syntax
# URL: https://curl.haxx.se
# Maintainer: CRUX System Team
# Depends on: libnghttp2 openssl zstd
# Optional: brotli c-ares libpsl

name=curl
version=8.19.0
release=1

source=(https://curl.haxx.se/download/$name-$version.tar.xz)

build() {
    cd $name-$version

    sed -i 's|/usr/share/curl|/etc/ssl/certs|' lib/url.c

    ./configure \
        --prefix=/usr \
        --enable-ipv6 \
        --with-openssl \
        --with-nghttp2 \
        --disable-ldap

    make
    make DESTDIR=$PKG install
}
"#;

    #[test]
    fn test_parse_complex_pkgfile() {
        let pkg = parse_pkgfile(COMPLEX_PKGFILE).unwrap();
        assert_eq!(pkg.package.name, "curl");
        assert_eq!(pkg.package.version, "8.19.0");
        assert_eq!(
            pkg.dependencies.run,
            vec!["libnghttp2", "openssl", "zstd"]
        );
        assert!(pkg.dependencies.optional.contains_key("brotli"));
        assert!(pkg.dependencies.optional.contains_key("c-ares"));
        assert!(pkg.build.configure.contains("--with-openssl"));
        assert!(pkg.build.prepare.contains("sed"));
    }

    #[test]
    fn test_detect_meson_build_system() {
        let body = "meson setup build --prefix=/usr\nmeson compile -C build\nDESTDIR=$PKG meson install -C build";
        assert_eq!(detect_build_system(body), BuildSystem::Meson);
    }

    #[test]
    fn test_detect_cmake_build_system() {
        let body = "cmake -B build -DCMAKE_INSTALL_PREFIX=/usr\nmake -C build\nmake -C build DESTDIR=$PKG install";
        assert_eq!(detect_build_system(body), BuildSystem::Cmake);
    }
}
570
src/dpack/src/converter/gentoo.rs
Normal file
@@ -0,0 +1,570 @@
//! Gentoo ebuild converter.
//!
//! Parses Gentoo `.ebuild` files and emits dpack `PackageDefinition` TOML.
//! This is a best-effort converter — ebuilds can be extraordinarily complex
//! (eclasses, slot deps, multilib, conditional USE deps). We handle the
//! common 80% and flag the rest for manual review.
//!
//! What we extract:
//! - DESCRIPTION, HOMEPAGE, SRC_URI, LICENSE
//! - IUSE (USE flags → dpack optional deps)
//! - RDEPEND, DEPEND, BDEPEND (dependencies)
//! - src_configure/src_compile/src_install phase functions
//!
//! What requires manual review:
//! - Complex eclass-dependent logic
//! - Multilib builds (inherit multilib-minimal)
//! - Slot dependencies and subslots
//! - REQUIRED_USE constraints
//! - Conditional dependency atoms with nested logic
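//!
//! The extracted subset, on an illustrative (not real) ebuild:
//!
//! ```text
//! DESCRIPTION="Example tool"
//! HOMEPAGE="https://example.org/"
//! SRC_URI="https://example.org/${P}.tar.xz"
//! LICENSE="MIT"
//! IUSE="ssl zstd"
//! RDEPEND="sys-libs/zlib"
//!
//! src_configure() {
//!     econf --disable-static
//! }
//! ```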
|
||||
|
||||
use anyhow::Result;
|
||||
use regex::Regex;
|
||||
use std::collections::HashMap;
|
||||
|
||||
use crate::config::package::*;
|
||||
|
||||
/// Warnings generated during conversion that require manual review.
|
||||
#[derive(Debug, Default)]
|
||||
pub struct ConversionWarnings {
|
||||
pub warnings: Vec<String>,
|
||||
}
|
||||
|
||||
impl ConversionWarnings {
|
||||
fn warn(&mut self, msg: impl Into<String>) {
|
||||
self.warnings.push(msg.into());
|
||||
}
|
||||
}
|
||||
|
||||
/// Parse a Gentoo ebuild string into a dpack PackageDefinition.
|
||||
///
|
||||
/// The filename is needed to extract name and version (Gentoo convention:
|
||||
/// `<name>-<version>.ebuild`).
|
||||
pub fn parse_ebuild(content: &str, filename: &str) -> Result<PackageDefinition> {
|
||||
let mut warnings = ConversionWarnings::default();
|
||||
|
||||
// Extract name and version from filename
|
||||
// Format: <name>-<version>.ebuild (e.g., curl-8.19.0.ebuild)
|
||||
let (name, version) = parse_ebuild_filename(filename)?;
|
||||
|
||||
// Extract simple variables
|
||||
let description = extract_var(content, "DESCRIPTION").unwrap_or_default();
|
||||
let homepage = extract_var(content, "HOMEPAGE").unwrap_or_default();
|
||||
let license = extract_var(content, "LICENSE").unwrap_or_default();
|
||||
let src_uri = extract_var(content, "SRC_URI").unwrap_or_default();
|
||||
let iuse = extract_var(content, "IUSE").unwrap_or_default();
|
||||
|
||||
// Check for eclasses that need manual review
|
||||
let inherits = extract_var(content, "inherit").unwrap_or_default();
|
||||
if inherits.contains("multilib-minimal") || inherits.contains("meson-multilib") {
|
||||
warnings.warn("Package uses multilib — may need separate 32-bit build definitions");
|
||||
}
|
||||
if inherits.contains("cargo") {
|
||||
warnings.warn("Package uses Rust cargo eclass — Rust crate deps may need manual handling");
|
||||
}
|
||||
if inherits.contains("git-r3") {
|
||||
warnings.warn("Package fetches from git — needs a release tarball URL instead");
|
||||
}
|
||||
|
||||
// Parse USE flags into optional dependencies
|
||||
let optional_deps = parse_use_flags(&iuse);
|
||||
|
||||
// Parse dependencies
|
||||
let rdepend = extract_multiline_var(content, "RDEPEND");
|
||||
let depend = extract_multiline_var(content, "DEPEND");
|
||||
let bdepend = extract_multiline_var(content, "BDEPEND");
|
||||
|
||||
let run_deps = parse_dep_atoms(&rdepend, &mut warnings);
|
||||
let build_deps = parse_dep_atoms(&bdepend, &mut warnings);
|
||||
|
||||
// If DEPEND is different from RDEPEND, merge its unique entries into build_deps
|
||||
let depend_parsed = parse_dep_atoms(&depend, &mut warnings);
|
||||
let extra_build_deps: Vec<String> = depend_parsed
|
||||
.into_iter()
|
||||
.filter(|d| !run_deps.contains(d) && !build_deps.contains(d))
|
||||
.collect();
|
||||
|
||||
let mut all_build_deps = build_deps;
|
||||
all_build_deps.extend(extra_build_deps);
|
||||
|
||||
// Parse build phase functions
|
||||
let configure_cmd = extract_phase_function(content, "src_configure");
|
||||
let compile_cmd = extract_phase_function(content, "src_compile");
|
||||
let install_cmd = extract_phase_function(content, "src_install");
|
||||
let prepare_cmd = extract_phase_function(content, "src_prepare");
|
||||
let test_cmd = extract_phase_function(content, "src_test");
|
||||
|
||||
// Determine build system from eclasses and configure commands
|
||||
let build_system = if inherits.contains("meson") {
|
||||
BuildSystem::Meson
|
||||
} else if inherits.contains("cmake") {
|
||||
BuildSystem::Cmake
|
||||
} else if inherits.contains("cargo") {
|
||||
BuildSystem::Cargo
|
||||
} else if inherits.contains("autotools") || configure_cmd.contains("econf") {
|
||||
BuildSystem::Autotools
|
||||
} else {
|
||||
BuildSystem::Custom
|
||||
};
|
||||
|
||||
// Convert econf to ./configure
|
||||
let configure_converted = convert_phase_to_commands(&configure_cmd, &build_system);
|
||||
let make_converted = convert_phase_to_commands(&compile_cmd, &build_system);
|
||||
let install_converted = convert_phase_to_commands(&install_cmd, &build_system);
|
||||
let prepare_converted = convert_phase_to_commands(&prepare_cmd, &build_system);
|
||||
let check_converted = convert_phase_to_commands(&test_cmd, &build_system);
|
||||
|
||||
// Parse SRC_URI into a clean URL
|
||||
let source_url = parse_src_uri(&src_uri, &name, &version);
|
||||
|
||||
// Check REQUIRED_USE for constraints
|
||||
let required_use = extract_multiline_var(content, "REQUIRED_USE");
|
||||
if !required_use.is_empty() {
|
||||
warnings.warn(format!(
|
||||
"REQUIRED_USE constraints exist — validate feature combinations: {}",
|
||||
required_use.chars().take(200).collect::<String>()
|
||||
));
|
||||
}
|
||||
|
||||
// Build the PackageDefinition
|
||||
let mut pkg = PackageDefinition {
|
||||
package: PackageMetadata {
|
||||
name: name.clone(),
|
||||
version: version.clone(),
|
||||
description,
|
||||
url: homepage,
|
||||
license,
|
||||
epoch: 0,
|
||||
revision: 1,
|
||||
},
|
||||
source: SourceInfo {
|
||||
url: source_url,
|
||||
sha256: "FIXME_CHECKSUM".repeat(4)[..64].to_string(),
            patches: vec![],
        },
        dependencies: Dependencies {
            run: run_deps,
            build: all_build_deps,
            optional: optional_deps,
        },
        build: BuildInstructions {
            configure: configure_converted,
            make: make_converted,
            install: if install_converted.is_empty() {
                "make DESTDIR=${PKG} install".to_string()
            } else {
                install_converted
            },
            prepare: prepare_converted,
            post_install: String::new(),
            check: check_converted,
            flags: BuildFlags::default(),
            system: build_system,
        },
    };

    // Append warnings as comments in the TOML output.
    // We do this by adding a note to the description.
    if !warnings.warnings.is_empty() {
        let warning_text = warnings
            .warnings
            .iter()
            .map(|w| format!(" # REVIEW: {}", w))
            .collect::<Vec<_>>()
            .join("\n");
        pkg.package.description = format!(
            "{}\n# --- Conversion warnings (manual review needed) ---\n{}",
            pkg.package.description, warning_text
        );
    }

    Ok(pkg)
}

/// Parse an ebuild filename into `(name, version)`.
/// Convention: `<name>-<version>.ebuild`
fn parse_ebuild_filename(filename: &str) -> Result<(String, String)> {
    let stem = filename.strip_suffix(".ebuild").unwrap_or(filename);

    // Find the version part: it starts at the first `-` followed by a digit
    let re = Regex::new(r"^(.+?)-(\d.*)$").unwrap();

    if let Some(caps) = re.captures(stem) {
        let name = caps.get(1).unwrap().as_str().to_string();
        let version = caps.get(2).unwrap().as_str().to_string();
        Ok((name, version))
    } else {
        anyhow::bail!("Cannot parse name/version from ebuild filename: {}", filename);
    }
}

/// Extract a single-line variable assignment from ebuild content.
fn extract_var(content: &str, var_name: &str) -> Option<String> {
    let re = Regex::new(&format!(
        r#"(?m)^{}=["']([^"']*?)["']\s*$"#,
        regex::escape(var_name)
    ))
    .ok()?;

    re.captures(content)
        .and_then(|caps| caps.get(1))
        .map(|m| m.as_str().to_string())
}

/// Extract a multi-line variable (handles quoted values that span lines).
fn extract_multiline_var(content: &str, var_name: &str) -> String {
    let mut result = String::new();
    let mut in_var = false;
    let mut quote_char = '"';

    for line in content.lines() {
        let trimmed = line.trim();

        if !in_var {
            // Match: VARNAME="value or VARNAME='value
            let pattern = format!("{}=", var_name);
            if trimmed.starts_with(&pattern) {
                let after_eq = &trimmed[pattern.len()..];
                if after_eq.starts_with('"') {
                    quote_char = '"';
                    let inner = &after_eq[1..];
                    if inner.ends_with('"') {
                        // Single-line value
                        result = inner[..inner.len() - 1].to_string();
                        return result;
                    }
                    result.push_str(inner);
                    result.push('\n');
                    in_var = true;
                } else if after_eq.starts_with('\'') {
                    quote_char = '\'';
                    let inner = &after_eq[1..];
                    if inner.ends_with('\'') {
                        result = inner[..inner.len() - 1].to_string();
                        return result;
                    }
                    result.push_str(inner);
                    result.push('\n');
                    in_var = true;
                }
            }
        } else if trimmed.ends_with(quote_char) {
            // Closing quote (possibly on a line of its own)
            result.push_str(&trimmed[..trimmed.len() - 1]);
            in_var = false;
        } else {
            result.push_str(trimmed);
            result.push('\n');
        }
    }

    result.trim().to_string()
}

/// Parse an IUSE string into an optional dependency map.
fn parse_use_flags(iuse: &str) -> HashMap<String, OptionalDep> {
    let mut map = HashMap::new();

    for flag in iuse.split_whitespace() {
        let (name, default) = if let Some(stripped) = flag.strip_prefix('+') {
            (stripped.to_string(), true)
        } else if let Some(stripped) = flag.strip_prefix('-') {
            (stripped.to_string(), false)
        } else {
            (flag.to_string(), false)
        };

        // Skip internal/system flags
        if name.starts_with("cpu_flags_")
            || name.starts_with("video_cards_")
            || name.starts_with("python_")
            || name == "test"
            || name == "doc"
        {
            continue;
        }

        map.insert(
            name.clone(),
            OptionalDep {
                description: format!("Enable {} support", name),
                default,
                deps: vec![], // Would need USE-conditional dep analysis to fill
                build_deps: vec![],
            },
        );
    }

    map
}

/// Parse Gentoo dependency atoms into a flat list of package names.
///
/// Handles:
/// - Simple atoms: `dev-libs/openssl`
/// - Versioned: `>=dev-libs/openssl-1.0.2`
/// - USE-conditional: `ssl? ( dev-libs/openssl )`
/// - Slot: `dev-libs/openssl:0=`
///
/// Strips category prefixes and version constraints for dpack format.
fn parse_dep_atoms(deps: &str, warnings: &mut ConversionWarnings) -> Vec<String> {
    let mut result = Vec::new();
    let atom_re = Regex::new(
        r"(?:>=|<=|~|=)?([a-zA-Z0-9_-]+/[a-zA-Z0-9_.+-]+?)(?:-\d[^\s\[\]:]*)?(?:\[.*?\])?(?::[\w/=*]*)?(?:\s|$)"
    ).unwrap();

    for caps in atom_re.captures_iter(deps) {
        if let Some(m) = caps.get(1) {
            let full_atom = m.as_str();
            // Strip the category prefix (e.g., "dev-libs/openssl" -> "openssl")
            let pkg_name = full_atom
                .rsplit('/')
                .next()
                .unwrap_or(full_atom)
                .to_string();

            // Skip virtual packages
            if full_atom.starts_with("virtual/") {
                continue;
            }

            if !result.contains(&pkg_name) {
                result.push(pkg_name);
            }
        }
    }

    // Detect complex constructs we can't fully parse
    if deps.contains("^^") || deps.contains("||") {
        warnings.warn("Complex dependency logic (^^ or ||) detected — manual review needed");
    }
    if deps.contains("${MULTILIB_USEDEP}") {
        warnings.warn("Multilib dependencies detected — 32-bit builds may be needed");
    }

    result
}

/// Extract a phase function body (e.g., src_configure, src_install).
fn extract_phase_function(content: &str, func_name: &str) -> String {
    let mut in_func = false;
    let mut brace_depth = 0;
    let mut body = String::new();

    for line in content.lines() {
        let trimmed = line.trim();

        if !in_func {
            // Match: func_name() { or func_name () {
            if trimmed.starts_with(func_name) && trimmed.contains('{') {
                in_func = true;
                for ch in trimmed.chars() {
                    match ch {
                        '{' => brace_depth += 1,
                        '}' => brace_depth -= 1,
                        _ => {}
                    }
                }
                if brace_depth <= 0 {
                    break;
                }
                continue;
            }
        }

        if in_func {
            for ch in trimmed.chars() {
                match ch {
                    '{' => brace_depth += 1,
                    '}' => {
                        brace_depth -= 1;
                        if brace_depth <= 0 {
                            return body.trim().to_string();
                        }
                    }
                    _ => {}
                }
            }

            body.push_str(trimmed);
            body.push('\n');
        }
    }

    body.trim().to_string()
}

/// Convert Gentoo eclass helper calls to plain shell commands.
fn convert_phase_to_commands(body: &str, _build_system: &BuildSystem) -> String {
    if body.is_empty() {
        return String::new();
    }

    let mut result = body.to_string();

    // Replace common Gentoo helpers
    result = result.replace("econf ", "./configure ");
    result = result.replace("econf\n", "./configure\n");
    result = result.replace("emake ", "make ");
    result = result.replace("emake\n", "make\n");
    result = result.replace("${ED}", "${PKG}");
    result = result.replace("${D}", "${PKG}");
    result = result.replace("${FILESDIR}", "./files");
    result = result.replace("${WORKDIR}", ".");
    result = result.replace("${S}", ".");
    result = result.replace("${P}", "${name}-${version}");
    result = result.replace("${PV}", "${version}");
    result = result.replace("${PN}", "${name}");

    // Replace einstall
    result = result.replace("einstall", "make DESTDIR=${PKG} install");

    // Remove Gentoo-specific calls that have no equivalent
    let remove_patterns = [
        "default",
        "eapply_user",
        "multilib_src_configure",
        "multilib_src_compile",
        "multilib_src_install",
    ];
    for pattern in &remove_patterns {
        result = result
            .lines()
            .filter(|l| !l.trim().starts_with(pattern))
            .collect::<Vec<_>>()
            .join("\n");
    }

    result.trim().to_string()
}

/// Parse SRC_URI into a clean download URL.
fn parse_src_uri(src_uri: &str, name: &str, version: &str) -> String {
    // SRC_URI can have multiple entries, USE conditionals, and mirror:// prefixes.
    // Take the first real URL.
    for token in src_uri.split_whitespace() {
        if token.starts_with("http://") || token.starts_with("https://") || token.starts_with("mirror://") {
            let url = token
                .replace("mirror://sourceforge", "https://downloads.sourceforge.net")
                .replace("mirror://gnu", "https://ftp.gnu.org/gnu")
                .replace("mirror://gentoo", "https://distfiles.gentoo.org/distfiles");

            // Replace name/version occurrences with template vars
            let templated = url
                .replace(&format!("{}-{}", name, version), "${name}-${version}")
                .replace(version, "${version}")
                .replace(name, "${name}");

            return templated;
        }
    }

    // If no URL was found, return a placeholder
    format!("https://FIXME/{}-{}.tar.xz", name, version)
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_parse_ebuild_filename() {
        let (name, version) = parse_ebuild_filename("curl-8.19.0.ebuild").unwrap();
        assert_eq!(name, "curl");
        assert_eq!(version, "8.19.0");
    }

    #[test]
    fn test_parse_ebuild_filename_complex() {
        let (name, version) = parse_ebuild_filename("qt6-base-6.8.0.ebuild").unwrap();
        assert_eq!(name, "qt6-base");
        assert_eq!(version, "6.8.0");
    }

    const SIMPLE_EBUILD: &str = r#"
EAPI=8

DESCRIPTION="Standard (de)compression library"
HOMEPAGE="https://zlib.net/"
SRC_URI="https://zlib.net/zlib-${PV}.tar.xz"

LICENSE="ZLIB"
SLOT="0/1"
KEYWORDS="~alpha amd64 arm arm64"
IUSE="minizip static-libs"

RDEPEND=""
DEPEND=""

src_configure() {
    econf
}

src_install() {
    emake DESTDIR="${D}" install
}
"#;

    #[test]
    fn test_parse_simple_ebuild() {
        let pkg = parse_ebuild(SIMPLE_EBUILD, "zlib-1.3.2.ebuild").unwrap();
        assert_eq!(pkg.package.name, "zlib");
        assert_eq!(pkg.package.version, "1.3.2");
        assert_eq!(pkg.package.description, "Standard (de)compression library");
        assert_eq!(pkg.package.license, "ZLIB");
        assert!(pkg.dependencies.optional.contains_key("minizip"));
        assert!(pkg.dependencies.optional.contains_key("static-libs"));
    }

    #[test]
    fn test_extract_multiline_var() {
        let content = r#"
RDEPEND="
    dev-libs/openssl:=
    >=net-libs/nghttp2-1.0
    sys-libs/zlib
"
"#;
        let result = extract_multiline_var(content, "RDEPEND");
        assert!(result.contains("openssl"));
        assert!(result.contains("nghttp2"));
        assert!(result.contains("zlib"));
    }

    #[test]
    fn test_parse_dep_atoms() {
        let deps = ">=dev-libs/openssl-1.0.2:=[static-libs?] net-libs/nghttp2:= sys-libs/zlib";
        let mut warnings = ConversionWarnings::default();
        let result = parse_dep_atoms(deps, &mut warnings);
        assert!(result.contains(&"openssl".to_string()));
        assert!(result.contains(&"nghttp2".to_string()));
        assert!(result.contains(&"zlib".to_string()));
    }

    #[test]
    fn test_parse_use_flags() {
        let iuse = "+http2 +quic brotli debug test doc";
        let flags = parse_use_flags(iuse);
        assert!(flags.get("http2").unwrap().default);
        assert!(flags.get("quic").unwrap().default);
        assert!(!flags.get("brotli").unwrap().default);
        // test and doc should be filtered out
        assert!(!flags.contains_key("test"));
        assert!(!flags.contains_key("doc"));
    }

    #[test]
    fn test_convert_phase_to_commands() {
        let body = "econf --prefix=/usr\nemake\nemake DESTDIR=\"${D}\" install";
        let result = convert_phase_to_commands(body, &BuildSystem::Autotools);
        assert!(result.contains("./configure --prefix=/usr"));
        assert!(result.contains("make DESTDIR=\"${PKG}\" install"));
    }
}
34
src/dpack/src/converter/mod.rs
Normal file
@@ -0,0 +1,34 @@
//! Foreign package format converters.
//!
//! Converts CRUX Pkgfiles and Gentoo ebuilds to `.toml` dpack format.
//! Both converters are best-effort: they handle common patterns and flag
//! anything that requires manual review.
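//!
//! A minimal usage sketch (hypothetical input path; errors propagate to the
//! caller):
//!
//! ```ignore
//! use std::path::Path;
//!
//! // Detects the format from the filename and emits dpack TOML.
//! let toml = converter::convert_file(Path::new("ports/zlib/Pkgfile"))?;
//! println!("{}", toml);
//! ```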

pub mod crux;
pub mod gentoo;

use anyhow::{bail, Result};
use std::path::Path;

/// Detect the format of a foreign package file and convert it.
pub fn convert_file(path: &Path) -> Result<String> {
    let filename = path
        .file_name()
        .map(|f| f.to_string_lossy().to_string())
        .unwrap_or_default();

    if filename == "Pkgfile" {
        let content = std::fs::read_to_string(path)?;
        let pkg = crux::parse_pkgfile(&content)?;
        pkg.to_toml()
    } else if filename.ends_with(".ebuild") {
        let content = std::fs::read_to_string(path)?;
        let pkg = gentoo::parse_ebuild(&content, &filename)?;
        pkg.to_toml()
    } else {
        bail!(
            "Unknown package format: '{}'. Expected 'Pkgfile' or '*.ebuild'",
            filename
        );
    }
}
329
src/dpack/src/db/mod.rs
Normal file
@@ -0,0 +1,329 @@
//! Installed package database.
//!
//! File-based database stored at `/var/lib/dpack/db/`. One TOML file per
//! installed package, tracking: name, version, installed files, dependencies,
//! features enabled, install timestamp, originating repository, and size.
//!
//! The database is the source of truth for what's installed on the system.
//! It's used by the resolver (to skip already-installed packages), the
//! remove command (to know which files to delete), and the upgrade command
//! (to compare installed vs available versions).
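//!
//! A minimal usage sketch (hypothetical path; `InstalledPackage` values come
//! from the build pipeline):
//!
//! ```ignore
//! let mut db = PackageDb::open(Path::new("/var/lib/dpack/db"))?;
//! if !db.is_installed("zlib") {
//!     // build + install, then record it:
//!     // db.register(installed_pkg)?;
//! }
//! println!("{} packages, {} bytes total", db.count(), db.total_size());
//! ```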

use anyhow::{Context, Result};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::path::{Path, PathBuf};

/// A record of a single installed package.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct InstalledPackage {
    /// Package name
    pub name: String,

    /// Installed version
    pub version: String,

    /// Package description (copied from the definition at install time)
    pub description: String,

    /// Runtime dependencies at time of installation
    pub run_deps: Vec<String>,

    /// Build dependencies used during installation
    pub build_deps: Vec<String>,

    /// Features that were enabled during the build
    pub features: Vec<String>,

    /// All files installed by this package (absolute paths)
    pub files: Vec<PathBuf>,

    /// Install timestamp (seconds since the Unix epoch)
    pub installed_at: u64,

    /// Which repository this package came from
    pub repo: String,

    /// Size in bytes of all installed files
    pub size: u64,
}

/// The package database — manages the collection of installed package records.
pub struct PackageDb {
    /// Path to the database directory
    db_dir: PathBuf,

    /// In-memory cache of installed packages
    cache: HashMap<String, InstalledPackage>,
}

impl PackageDb {
    /// Open or create the package database at the given directory.
    pub fn open(db_dir: &Path) -> Result<Self> {
        std::fs::create_dir_all(db_dir)
            .with_context(|| format!("Failed to create db dir: {}", db_dir.display()))?;

        let mut db = Self {
            db_dir: db_dir.to_path_buf(),
            cache: HashMap::new(),
        };

        db.load_all()?;
        Ok(db)
    }

    /// Load all package records from disk into the cache.
    fn load_all(&mut self) -> Result<()> {
        self.cache.clear();

        if !self.db_dir.exists() {
            return Ok(());
        }

        for entry in std::fs::read_dir(&self.db_dir)
            .with_context(|| format!("Failed to read db dir: {}", self.db_dir.display()))?
        {
            let entry = entry?;
            let path = entry.path();

            if path.extension().map_or(false, |ext| ext == "toml") {
                match self.load_one(&path) {
                    Ok(pkg) => {
                        self.cache.insert(pkg.name.clone(), pkg);
                    }
                    Err(e) => {
                        log::warn!("Skipping corrupt db entry {}: {}", path.display(), e);
                    }
                }
            }
        }

        Ok(())
    }

    /// Load a single package record from a TOML file.
    fn load_one(&self, path: &Path) -> Result<InstalledPackage> {
        let content = std::fs::read_to_string(path)
            .with_context(|| format!("Failed to read: {}", path.display()))?;
        toml::from_str(&content).context("Failed to parse package db entry")
    }

    /// Register a newly installed package in the database.
    pub fn register(&mut self, pkg: InstalledPackage) -> Result<()> {
        let path = self.db_dir.join(format!("{}.toml", pkg.name));
        let content = toml::to_string_pretty(&pkg)
            .context("Failed to serialize package record")?;
        std::fs::write(&path, content)
            .with_context(|| format!("Failed to write db entry: {}", path.display()))?;

        self.cache.insert(pkg.name.clone(), pkg);
        Ok(())
    }

    /// Remove a package record from the database.
    pub fn unregister(&mut self, name: &str) -> Result<Option<InstalledPackage>> {
        let path = self.db_dir.join(format!("{}.toml", name));
        if path.exists() {
            std::fs::remove_file(&path)
                .with_context(|| format!("Failed to remove db entry: {}", path.display()))?;
        }
        Ok(self.cache.remove(name))
    }

    /// Check if a package is installed.
    pub fn is_installed(&self, name: &str) -> bool {
        self.cache.contains_key(name)
    }

    /// Get the installed version of a package.
    pub fn installed_version(&self, name: &str) -> Option<&str> {
        self.cache.get(name).map(|p| p.version.as_str())
    }

    /// Get the full record of an installed package.
    pub fn get(&self, name: &str) -> Option<&InstalledPackage> {
        self.cache.get(name)
    }

    /// List all installed packages, sorted by name.
    pub fn list_all(&self) -> Vec<&InstalledPackage> {
        let mut pkgs: Vec<_> = self.cache.values().collect();
        pkgs.sort_by_key(|p| &p.name);
        pkgs
    }

    /// Get a map of all installed packages: name -> version.
    /// Used by the dependency resolver.
    pub fn installed_versions(&self) -> HashMap<String, String> {
        self.cache
            .iter()
            .map(|(k, v)| (k.clone(), v.version.clone()))
            .collect()
    }

    /// Find all packages that own a specific file.
    pub fn who_owns(&self, file_path: &Path) -> Vec<String> {
        self.cache
            .values()
            .filter(|pkg| pkg.files.iter().any(|f| f == file_path))
            .map(|pkg| pkg.name.clone())
            .collect()
    }

    /// Find file conflicts (files owned by multiple packages).
    pub fn find_conflicts(&self) -> HashMap<PathBuf, Vec<String>> {
        let mut file_owners: HashMap<PathBuf, Vec<String>> = HashMap::new();

        for pkg in self.cache.values() {
            for file in &pkg.files {
                file_owners
                    .entry(file.clone())
                    .or_default()
                    .push(pkg.name.clone());
            }
        }

        // Return only files with multiple owners
        file_owners
            .into_iter()
            .filter(|(_, owners)| owners.len() > 1)
            .collect()
    }

    /// Get the total number of installed packages.
    pub fn count(&self) -> usize {
        self.cache.len()
    }

    /// Get the total disk usage of all installed packages.
    pub fn total_size(&self) -> u64 {
        self.cache.values().map(|p| p.size).sum()
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::time::{SystemTime, UNIX_EPOCH};

    fn make_installed(name: &str, version: &str, files: Vec<&str>) -> InstalledPackage {
        InstalledPackage {
            name: name.to_string(),
            version: version.to_string(),
            description: format!("Test package {}", name),
            run_deps: vec![],
            build_deps: vec![],
            features: vec![],
            files: files.into_iter().map(PathBuf::from).collect(),
            installed_at: SystemTime::now()
                .duration_since(UNIX_EPOCH)
                .unwrap()
                .as_secs(),
            repo: "core".to_string(),
            size: 1024,
        }
    }

    #[test]
    fn test_register_and_get() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-reg");
        let _ = std::fs::remove_dir_all(&tmpdir);

        let mut db = PackageDb::open(&tmpdir).unwrap();
        assert_eq!(db.count(), 0);

        let pkg = make_installed("zlib", "1.3.1", vec!["/usr/lib/libz.so"]);
        db.register(pkg).unwrap();

        assert!(db.is_installed("zlib"));
        assert_eq!(db.installed_version("zlib"), Some("1.3.1"));
        assert_eq!(db.count(), 1);

        let _ = std::fs::remove_dir_all(&tmpdir);
    }

    #[test]
    fn test_unregister() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-unreg");
        let _ = std::fs::remove_dir_all(&tmpdir);

        let mut db = PackageDb::open(&tmpdir).unwrap();
        db.register(make_installed("zlib", "1.3.1", vec![])).unwrap();

        assert!(db.is_installed("zlib"));
        let removed = db.unregister("zlib").unwrap();
        assert!(removed.is_some());
        assert!(!db.is_installed("zlib"));

        let _ = std::fs::remove_dir_all(&tmpdir);
    }

    #[test]
    fn test_persistence() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-persist");
        let _ = std::fs::remove_dir_all(&tmpdir);

        {
            let mut db = PackageDb::open(&tmpdir).unwrap();
            db.register(make_installed("zlib", "1.3.1", vec!["/usr/lib/libz.so"])).unwrap();
            db.register(make_installed("gcc", "15.2.0", vec!["/usr/bin/gcc"])).unwrap();
        }

        // Re-open and verify the data persisted
        let db = PackageDb::open(&tmpdir).unwrap();
        assert_eq!(db.count(), 2);
        assert!(db.is_installed("zlib"));
        assert!(db.is_installed("gcc"));

        let _ = std::fs::remove_dir_all(&tmpdir);
    }

    #[test]
    fn test_who_owns() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-owns");
        let _ = std::fs::remove_dir_all(&tmpdir);

        let mut db = PackageDb::open(&tmpdir).unwrap();
        db.register(make_installed("zlib", "1.3.1", vec!["/usr/lib/libz.so"])).unwrap();

        let owners = db.who_owns(Path::new("/usr/lib/libz.so"));
        assert_eq!(owners, vec!["zlib"]);

        let owners = db.who_owns(Path::new("/usr/lib/nonexistent.so"));
        assert!(owners.is_empty());

        let _ = std::fs::remove_dir_all(&tmpdir);
    }

    #[test]
    fn test_find_conflicts() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-conflicts");
        let _ = std::fs::remove_dir_all(&tmpdir);

        let mut db = PackageDb::open(&tmpdir).unwrap();
        db.register(make_installed("pkg-a", "1.0", vec!["/usr/lib/shared.so"])).unwrap();
        db.register(make_installed("pkg-b", "2.0", vec!["/usr/lib/shared.so"])).unwrap();

        let conflicts = db.find_conflicts();
        assert!(conflicts.contains_key(&PathBuf::from("/usr/lib/shared.so")));

        let _ = std::fs::remove_dir_all(&tmpdir);
    }

    #[test]
    fn test_list_all_sorted() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-list");
        let _ = std::fs::remove_dir_all(&tmpdir);

        let mut db = PackageDb::open(&tmpdir).unwrap();
        db.register(make_installed("zlib", "1.3.1", vec![])).unwrap();
        db.register(make_installed("bash", "5.3", vec![])).unwrap();
        db.register(make_installed("gcc", "15.2.0", vec![])).unwrap();

        let all = db.list_all();
        let names: Vec<&str> = all.iter().map(|p| p.name.as_str()).collect();
        assert_eq!(names, vec!["bash", "gcc", "zlib"]); // sorted

        let _ = std::fs::remove_dir_all(&tmpdir);
    }
}
21
src/dpack/src/lib.rs
Normal file
@@ -0,0 +1,21 @@
//! dpack library — core functionality for the DarkForge package manager.
//!
//! This crate provides:
//! - Package definition parsing (`config`)
//! - Dependency resolution (`resolver`)
//! - Build sandboxing (`sandbox`)
//! - Foreign format converters (`converter`)
//! - Installed package database (`db`)
//! - Build orchestration (`build`)

// Many public API items are not yet used from main.rs but will be
// consumed as later phases are implemented. Suppress dead_code warnings
// for the library crate.
#![allow(dead_code)]

pub mod config;
pub mod resolver;
pub mod sandbox;
pub mod converter;
pub mod db;
pub mod build;
389
src/dpack/src/main.rs
Normal file
@@ -0,0 +1,389 @@
|
||||
//! dpack — DarkForge Linux Package Manager
|
||||
//!
|
||||
//! A source-based package manager inspired by CRUX's pkgutils and Gentoo's emerge.
|
||||
//! Supports TOML package definitions, dependency resolution, sandboxed builds,
|
||||
//! and converters for CRUX Pkgfiles and Gentoo ebuilds.
|
||||
|
||||
// Public API items in submodules are used across phases — suppress dead_code
|
||||
// warnings for items not yet wired into CLI commands.
|
||||
#![allow(dead_code)]
|
||||
|
||||
use anyhow::{Context, Result};
|
||||
use clap::{Parser, Subcommand};
|
||||
use colored::Colorize;
|
||||
|
||||
mod config;
|
||||
mod resolver;
|
||||
mod sandbox;
|
||||
mod converter;
|
||||
mod db;
|
||||
mod build;
|
||||
|
||||
use config::{DpackConfig, PackageDefinition};
|
||||
use db::PackageDb;
|
||||
use build::BuildOrchestrator;
|
||||
|
||||
/// DarkForge package manager
|
||||
#[derive(Parser)]
|
||||
#[command(name = "dpack")]
|
||||
#[command(about = "DarkForge Linux package manager")]
|
||||
#[command(version)]
|
||||
struct Cli {
|
||||
/// Path to dpack configuration file
|
||||
#[arg(short, long, default_value = "/etc/dpack.conf")]
|
||||
config: String,
|
||||
|
||||
#[command(subcommand)]
|
||||
command: Commands,
|
||||
}
|
||||
|
||||
#[derive(Subcommand)]
|
||||
enum Commands {
|
||||
/// Install a package (resolve deps → build → install → update db)
|
||||
Install {
|
||||
/// Package name(s) to install
|
||||
#[arg(required = true)]
|
||||
packages: Vec<String>,
|
||||
},
|
||||
|
||||
/// Remove an installed package
|
||||
Remove {
|
||||
/// Package name(s) to remove
|
||||
#[arg(required = true)]
|
||||
packages: Vec<String>,
|
||||
},
|
||||
|
||||
/// Upgrade installed package(s) to latest version
|
||||
Upgrade {
|
||||
/// Package name(s) to upgrade, or empty for all
|
||||
packages: Vec<String>,
|
||||
},
|
||||
|
||||
/// Search for packages in the repository
|
||||
Search {
|
||||
/// Search query
|
||||
query: String,
|
||||
},
|
||||
|
||||
/// Show information about a package
|
||||
Info {
|
||||
/// Package name
|
||||
package: String,
|
||||
},
|
||||
|
||||
/// List installed packages
|
||||
List,
|
||||
|
||||
/// Convert a foreign package definition to dpack format
|
||||
Convert {
|
||||
/// Path to the foreign package file (Pkgfile or .ebuild)
|
||||
path: String,
|
||||
|
||||
/// Output path for the generated .toml file
|
||||
#[arg(short, long)]
|
||||
output: Option<String>,
|
||||
},
|
||||
|
||||
/// Check for shared library conflicts
|
||||
Check,
|
||||
}
|
||||
|
||||
fn main() {
|
||||
env_logger::init();
|
||||
let cli = Cli::parse();
|
||||
|
||||
let result = run(cli);
|
||||
|
||||
if let Err(e) = result {
|
||||
eprintln!("{}: {:#}", "error".red().bold(), e);
|
||||
std::process::exit(1);
|
||||
}
|
||||
}
|
||||
|
||||
fn run(cli: Cli) -> Result<()> {
|
||||
let config = if std::path::Path::new(&cli.config).exists() {
|
||||
DpackConfig::from_file(std::path::Path::new(&cli.config))?
|
||||
} else {
|
||||
DpackConfig::default()
|
||||
};
|
||||
|
||||
match cli.command {
|
||||
Commands::Install { packages } => {
|
||||
let db = PackageDb::open(&config.paths.db_dir)?;
|
||||
let mut orchestrator = BuildOrchestrator::new(config, db);
|
||||
orchestrator.install(&packages)?;
|
||||
}
|
||||
|
||||
Commands::Remove { packages } => {
|
||||
let mut db = PackageDb::open(&config.paths.db_dir)?;
|
||||
|
||||
// Load repos for reverse-dep checking
|
||||
let mut all_repo_packages = std::collections::HashMap::new();
|
||||
for repo in &config.repos {
|
||||
let repo_pkgs = resolver::DependencyGraph::load_repo(&repo.path)?;
|
||||
all_repo_packages.extend(repo_pkgs);
|
||||
}
|
||||
|
||||
let installed_names: std::collections::HashSet<String> =
|
||||
db.list_all().iter().map(|p| p.name.clone()).collect();
|
||||
|
||||
for name in &packages {
|
||||
// Check reverse dependencies before removing
|
||||
let rdeps = resolver::reverse_deps(name, &all_repo_packages, &installed_names);
|
||||
if !rdeps.is_empty() {
|
||||
println!(
|
||||
"{} '{}' is required by: {}",
|
||||
"Warning:".yellow().bold(),
|
||||
name,
|
||||
rdeps.join(", ")
|
||||
);
|
||||
println!(" Removing it may break these packages.");
|
||||
println!(" Proceeding anyway...");
|
||||
}
|
||||
|
||||
match db.unregister(name)? {
|
||||
Some(pkg) => {
|
||||
// Remove installed files (in reverse order to clean dirs)
|
||||
let mut files = pkg.files.clone();
|
||||
files.sort();
|
||||
files.reverse();
|
||||
|
||||
let mut removed_count = 0;
|
||||
for file in &files {
|
||||
if file.is_file() {
|
||||
if std::fs::remove_file(file).is_ok() {
|
||||
removed_count += 1;
|
||||
}
|
||||
}
|
||||
}
|
||||
// Try to remove empty parent directories
|
||||
for file in &files {
|
||||
if let Some(parent) = file.parent() {
|
||||
std::fs::remove_dir(parent).ok();
|
||||
}
|
||||
}
|
||||
println!(
|
||||
"{} {} (removed {}/{} files)",
|
||||
"Removed".green().bold(),
|
||||
name,
|
||||
removed_count,
|
||||
files.len()
|
||||
);
|
||||
}
|
||||
None => {
|
||||
println!("{} '{}' is not installed", "Warning:".yellow().bold(), name);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
        Commands::Upgrade { packages } => {
            let db = PackageDb::open(&config.paths.db_dir)?;

            // Load all repos to compare available vs installed versions
            let mut all_repo_packages = std::collections::HashMap::new();
            for repo in &config.repos {
                let repo_pkgs = resolver::DependencyGraph::load_repo(&repo.path)?;
                all_repo_packages.extend(repo_pkgs);
            }

            // Determine which packages to upgrade
            let targets: Vec<String> = if packages.is_empty() {
                // Upgrade all installed packages whose repo version differs
                db.list_all()
                    .iter()
                    .filter(|installed| {
                        all_repo_packages
                            .get(&installed.name)
                            .map_or(false, |repo_pkg| repo_pkg.package.version != installed.version)
                    })
                    .map(|p| p.name.clone())
                    .collect()
            } else {
                packages
            };

            if targets.is_empty() {
                println!("{}", "All packages are up to date.".green());
            } else {
                println!("Packages to upgrade:");
                for name in &targets {
                    let installed_ver = db.installed_version(name).unwrap_or("?");
                    let repo_ver = all_repo_packages
                        .get(name)
                        .map(|p| p.package.version.as_str())
                        .unwrap_or("?");
                    println!("  {} {} → {}", name.bold(), installed_ver.red(), repo_ver.green());
                }

                // Check for shared library conflicts before proceeding
                let _solib_map = resolver::solib::build_solib_map(&db);

                for name in &targets {
                    // Warn about packages that depend on this one
                    let rdeps = resolver::reverse_deps(
                        name,
                        &all_repo_packages,
                        &db.list_all().iter().map(|p| p.name.clone()).collect(),
                    );
                    if !rdeps.is_empty() {
                        println!(
                            "\n{} {} is depended on by: {}",
                            "Note:".cyan().bold(),
                            name,
                            rdeps.join(", ")
                        );
                    }
                }

                // Proceed with the upgrade (remove old, install new)
                println!("\nProceeding with upgrade...");
                let mut orchestrator = BuildOrchestrator::new(config, db);
                orchestrator.install(&targets)?;
            }
        }

        Commands::Search { query } => {
            // Search through all repos for matching package names/descriptions
            for repo in &config.repos {
                let packages = resolver::DependencyGraph::load_repo(&repo.path)?;
                for (name, pkg) in &packages {
                    if name.contains(&query)
                        || pkg.package.description.to_lowercase().contains(&query.to_lowercase())
                    {
                        println!(
                            "{}/{} {} — {}",
                            repo.name.cyan(),
                            name.bold(),
                            pkg.package.version.green(),
                            pkg.package.description
                        );
                    }
                }
            }
        }

        Commands::Info { package } => {
            // Check installed first
            let db = PackageDb::open(&config.paths.db_dir)?;
            if let Some(installed) = db.get(&package) {
                println!("{}: {}", "Name".bold(), installed.name);
                println!("{}: {}", "Version".bold(), installed.version);
                println!("{}: {}", "Description".bold(), installed.description);
                println!("{}: {}", "Status".bold(), "installed".green());
                println!("{}: {}", "Repo".bold(), installed.repo);
                println!("{}: {}", "Files".bold(), installed.files.len());
                println!("{}: {} bytes", "Size".bold(), installed.size);
                if !installed.features.is_empty() {
                    println!("{}: {}", "Features".bold(), installed.features.join(", "));
                }
                if !installed.run_deps.is_empty() {
                    println!("{}: {}", "Run deps".bold(), installed.run_deps.join(", "));
                }
            } else {
                // Search repos
                for repo in &config.repos {
                    let pkg_path = repo.path.join(&package).join(format!("{}.toml", package));
                    if pkg_path.exists() {
                        let pkg = PackageDefinition::from_file(&pkg_path)?;
                        println!("{}: {}", "Name".bold(), pkg.package.name);
                        println!("{}: {}", "Version".bold(), pkg.package.version);
                        println!("{}: {}", "Description".bold(), pkg.package.description);
                        println!("{}: {}", "Status".bold(), "not installed".yellow());
                        println!("{}: {}", "URL".bold(), pkg.package.url);
                        println!("{}: {}", "License".bold(), pkg.package.license);
                        if !pkg.dependencies.run.is_empty() {
                            println!("{}: {}", "Run deps".bold(), pkg.dependencies.run.join(", "));
                        }
                        if !pkg.dependencies.build.is_empty() {
                            println!("{}: {}", "Build deps".bold(), pkg.dependencies.build.join(", "));
                        }
                        return Ok(());
                    }
                }
                println!("{} Package '{}' not found", "Error:".red().bold(), package);
            }
        }

        Commands::List => {
            let db = PackageDb::open(&config.paths.db_dir)?;
            let all = db.list_all();
            if all.is_empty() {
                println!("No packages installed.");
            } else {
                println!("{} installed packages:", all.len());
                for pkg in &all {
                    println!(
                        "  {} {} [{}]",
                        pkg.name.bold(),
                        pkg.version.green(),
                        pkg.repo.cyan()
                    );
                }
                println!("\nTotal disk usage: {} MB", db.total_size() / (1024 * 1024));
            }
        }

        Commands::Convert { path, output } => {
            let input_path = std::path::Path::new(&path);
            if !input_path.exists() {
                anyhow::bail!("Input file not found: {}", path);
            }

            println!("Converting: {}", path);
            let toml_output = converter::convert_file(input_path)?;

            if let Some(out_path) = output {
                std::fs::write(&out_path, &toml_output)
                    .with_context(|| format!("Failed to write: {}", out_path))?;
                println!("{} Written to: {}", "Converted!".green().bold(), out_path);
            } else {
                // Print to stdout
                println!("{}", "--- Converted TOML ---".cyan().bold());
                println!("{}", toml_output);
            }
        }

        Commands::Check => {
            let db = PackageDb::open(&config.paths.db_dir)?;

            // Check for file ownership conflicts
            let file_conflicts = db.find_conflicts();
            if file_conflicts.is_empty() {
                println!("{}", "No file ownership conflicts detected.".green());
            } else {
                println!(
                    "{} {} file conflict(s) found:",
                    "Warning:".yellow().bold(),
                    file_conflicts.len()
                );
                for (file, owners) in &file_conflicts {
                    println!("  {} — owned by: {}", file.display(), owners.join(", "));
                }
            }

            // Build solib dependency map
            println!("\nScanning shared library dependencies...");
            let solib_map = resolver::solib::build_solib_map(&db);
            println!(
                "Tracked {} unique shared libraries across {} packages.",
                solib_map.len(),
                db.count()
            );

            // Report libraries linked by three or more packages
            let multi_user_libs: Vec<_> = solib_map
                .iter()
                .filter(|(_, pkgs)| pkgs.len() > 2)
                .collect();
            if !multi_user_libs.is_empty() {
                println!(
                    "\n{} libraries used by 3+ packages (upgrade with care):",
                    "Widely-used".cyan().bold()
                );
                for (lib, pkgs) in &multi_user_libs {
                    println!("  {} — {} packages", lib, pkgs.len());
                }
            }
        }
    }

    Ok(())
}
389 src/dpack/src/resolver/mod.rs Normal file
@@ -0,0 +1,389 @@
//! Dependency resolution engine.
//!
//! Resolves a package's full dependency tree into a topologically sorted
//! build order. Handles:
//! - Direct runtime dependencies
//! - Build-time dependencies
//! - Optional feature dependencies
//! - Circular dependency detection
//! - Version constraints (basic)
//!
//! The resolver operates on a `PackageGraph` built from the repository's
//! package definitions and the installed package database.

pub mod solib;

use anyhow::{bail, Context, Result};
use std::collections::{HashMap, HashSet};
use std::path::Path;

use crate::config::PackageDefinition;

/// The result of dependency resolution: an ordered list of packages to build.
#[derive(Debug, Clone)]
pub struct ResolutionPlan {
    /// Packages in topological order (build these first-to-last)
    pub build_order: Vec<ResolvedPackage>,

    /// Packages that are already installed and don't need rebuilding
    pub already_installed: Vec<String>,
}

/// A single package in the resolution plan.
#[derive(Debug, Clone)]
pub struct ResolvedPackage {
    /// Package name
    pub name: String,

    /// Package version
    pub version: String,

    /// Whether this is a build-only dependency (not needed at runtime)
    pub build_only: bool,

    /// Which features are enabled for this package
    pub features: Vec<String>,

    /// Path to the package definition file
    pub definition_path: std::path::PathBuf,
}

/// The dependency graph used internally for resolution.
pub struct DependencyGraph {
    /// All known package definitions, keyed by name
    packages: HashMap<String, PackageDefinition>,

    /// Set of currently installed packages (name -> version)
    installed: HashMap<String, String>,
}

impl DependencyGraph {
    /// Create a new graph from a set of package definitions and installed state.
    pub fn new(
        packages: HashMap<String, PackageDefinition>,
        installed: HashMap<String, String>,
    ) -> Self {
        Self { packages, installed }
    }

    /// Load all package definitions from a repository directory.
    ///
    /// Expects: `repo_dir/<name>/<name>.toml`
    pub fn load_repo(repo_dir: &Path) -> Result<HashMap<String, PackageDefinition>> {
        let mut packages = HashMap::new();

        if !repo_dir.is_dir() {
            return Ok(packages);
        }

        for entry in std::fs::read_dir(repo_dir)
            .with_context(|| format!("Failed to read repo: {}", repo_dir.display()))?
        {
            let entry = entry?;
            if !entry.file_type()?.is_dir() {
                continue;
            }

            let pkg_name = entry.file_name().to_string_lossy().to_string();
            let toml_path = entry.path().join(format!("{}.toml", pkg_name));

            if toml_path.exists() {
                match PackageDefinition::from_file(&toml_path) {
                    Ok(pkg) => {
                        packages.insert(pkg_name, pkg);
                    }
                    Err(e) => {
                        log::warn!("Skipping {}: {}", toml_path.display(), e);
                    }
                }
            }
        }

        Ok(packages)
    }

    /// Resolve all dependencies for the given package names.
    ///
    /// Returns a topologically sorted build order. Detects circular deps.
    pub fn resolve(
        &self,
        targets: &[String],
        enabled_features: &HashMap<String, Vec<String>>,
    ) -> Result<ResolutionPlan> {
        let mut visited: HashSet<String> = HashSet::new();
        let mut in_stack: HashSet<String> = HashSet::new();
        let mut order: Vec<ResolvedPackage> = Vec::new();
        let mut already_installed: Vec<String> = Vec::new();

        for target in targets {
            self.resolve_recursive(
                target,
                false, // not build-only
                enabled_features,
                &mut visited,
                &mut in_stack,
                &mut order,
                &mut already_installed,
            )?;
        }

        Ok(ResolutionPlan {
            build_order: order,
            already_installed,
        })
    }

    /// Recursive DFS for topological sort with cycle detection.
    fn resolve_recursive(
        &self,
        name: &str,
        build_only: bool,
        enabled_features: &HashMap<String, Vec<String>>,
        visited: &mut HashSet<String>,
        in_stack: &mut HashSet<String>,
        order: &mut Vec<ResolvedPackage>,
        already_installed: &mut Vec<String>,
    ) -> Result<()> {
        // Already fully resolved
        if visited.contains(name) {
            return Ok(());
        }

        // Circular dependency detected
        if in_stack.contains(name) {
            bail!(
                "Circular dependency detected: '{}' depends on itself (chain: {:?})",
                name,
                in_stack
            );
        }

        // Check if already installed at the right version
        if let Some(installed_version) = self.installed.get(name) {
            if let Some(pkg) = self.packages.get(name) {
                if installed_version == &pkg.package.version {
                    already_installed.push(name.to_string());
                    visited.insert(name.to_string());
                    return Ok(());
                }
            }
        }

        // Look up the package definition
        let pkg = self
            .packages
            .get(name)
            .with_context(|| format!("Package '{}' not found in any repository", name))?;

        in_stack.insert(name.to_string());

        // Get features for this package
        let features = enabled_features
            .get(name)
            .cloned()
            .unwrap_or_else(|| pkg.default_features());

        // Resolve build dependencies first
        for dep in &pkg.effective_build_deps(&features) {
            self.resolve_recursive(
                dep,
                true,
                enabled_features,
                visited,
                in_stack,
                order,
                already_installed,
            )?;
        }

        // Then resolve runtime dependencies
        for dep in &pkg.effective_run_deps(&features) {
            self.resolve_recursive(
                dep,
                false,
                enabled_features,
                visited,
                in_stack,
                order,
                already_installed,
            )?;
        }

        in_stack.remove(name);
        visited.insert(name.to_string());

        order.push(ResolvedPackage {
            name: name.to_string(),
            version: pkg.package.version.clone(),
            build_only,
            features,
            definition_path: std::path::PathBuf::new(), // Set by caller
        });

        Ok(())
    }
}

/// Perform a simple reverse-dependency lookup: which installed packages
/// depend on the given package?
pub fn reverse_deps(
    package: &str,
    all_packages: &HashMap<String, PackageDefinition>,
    installed: &HashSet<String>,
) -> Vec<String> {
    let mut rdeps = Vec::new();

    for inst_name in installed {
        if let Some(pkg) = all_packages.get(inst_name) {
            let features = pkg.default_features();
            let all_deps: Vec<String> = pkg
                .effective_run_deps(&features)
                .into_iter()
                .chain(pkg.effective_build_deps(&features))
                .collect();

            if all_deps.iter().any(|d| d == package) {
                rdeps.push(inst_name.clone());
            }
        }
    }

    rdeps
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::config::package::*;
    use std::collections::HashMap;

    /// Helper: create a minimal PackageDefinition for testing.
    fn make_pkg(
        name: &str,
        version: &str,
        run_deps: Vec<&str>,
        build_deps: Vec<&str>,
    ) -> PackageDefinition {
        PackageDefinition {
            package: PackageMetadata {
                name: name.to_string(),
                version: version.to_string(),
                description: format!("Test package {}", name),
                url: "https://example.com".to_string(),
                license: "MIT".to_string(),
                epoch: 0,
                revision: 1,
            },
            source: SourceInfo {
                url: format!("https://example.com/{}-{}.tar.xz", name, version),
                sha256: "a".repeat(64),
                patches: vec![],
            },
            dependencies: Dependencies {
                run: run_deps.into_iter().map(String::from).collect(),
                build: build_deps.into_iter().map(String::from).collect(),
                optional: HashMap::new(),
            },
            build: BuildInstructions {
                configure: "./configure --prefix=/usr".to_string(),
                make: "make".to_string(),
                install: "make DESTDIR=${PKG} install".to_string(),
                prepare: String::new(),
                post_install: String::new(),
                check: String::new(),
                flags: BuildFlags::default(),
                system: BuildSystem::default(),
            },
        }
    }

    #[test]
    fn test_simple_resolution() {
        let mut packages = HashMap::new();
        packages.insert("zlib".to_string(), make_pkg("zlib", "1.3.1", vec![], vec!["gcc", "make"]));
        packages.insert("gcc".to_string(), make_pkg("gcc", "15.2.0", vec![], vec![]));
        packages.insert("make".to_string(), make_pkg("make", "4.4.1", vec![], vec![]));

        let graph = DependencyGraph::new(packages, HashMap::new());
        let plan = graph.resolve(&["zlib".to_string()], &HashMap::new()).unwrap();

        assert_eq!(plan.build_order.len(), 3);
        // gcc and make should come before zlib
        let names: Vec<&str> = plan.build_order.iter().map(|p| p.name.as_str()).collect();
        let zlib_pos = names.iter().position(|&n| n == "zlib").unwrap();
        let gcc_pos = names.iter().position(|&n| n == "gcc").unwrap();
        let make_pos = names.iter().position(|&n| n == "make").unwrap();
        assert!(gcc_pos < zlib_pos);
        assert!(make_pos < zlib_pos);
    }

    #[test]
    fn test_circular_dependency_detection() {
        let mut packages = HashMap::new();
        packages.insert("a".to_string(), make_pkg("a", "1.0", vec!["b"], vec![]));
        packages.insert("b".to_string(), make_pkg("b", "1.0", vec!["a"], vec![]));

        let graph = DependencyGraph::new(packages, HashMap::new());
        let result = graph.resolve(&["a".to_string()], &HashMap::new());
        assert!(result.is_err());
        assert!(result.unwrap_err().to_string().contains("Circular"));
    }

    #[test]
    fn test_already_installed() {
        let mut packages = HashMap::new();
        packages.insert("zlib".to_string(), make_pkg("zlib", "1.3.1", vec![], vec![]));

        let mut installed = HashMap::new();
        installed.insert("zlib".to_string(), "1.3.1".to_string());

        let graph = DependencyGraph::new(packages, installed);
        let plan = graph.resolve(&["zlib".to_string()], &HashMap::new()).unwrap();

        assert!(plan.build_order.is_empty());
        assert_eq!(plan.already_installed, vec!["zlib"]);
    }

    #[test]
    fn test_missing_dependency() {
        let mut packages = HashMap::new();
        packages.insert("foo".to_string(), make_pkg("foo", "1.0", vec!["missing"], vec![]));

        let graph = DependencyGraph::new(packages, HashMap::new());
        let result = graph.resolve(&["foo".to_string()], &HashMap::new());
        assert!(result.is_err());
        assert!(result.unwrap_err().to_string().contains("missing"));
    }

    #[test]
    fn test_diamond_dependency() {
        let mut packages = HashMap::new();
        packages.insert("a".to_string(), make_pkg("a", "1.0", vec![], vec![]));
        packages.insert("b".to_string(), make_pkg("b", "1.0", vec!["a"], vec![]));
        packages.insert("c".to_string(), make_pkg("c", "1.0", vec!["a"], vec![]));
        packages.insert("d".to_string(), make_pkg("d", "1.0", vec!["b", "c"], vec![]));

        let graph = DependencyGraph::new(packages, HashMap::new());
        let plan = graph.resolve(&["d".to_string()], &HashMap::new()).unwrap();

        let names: Vec<&str> = plan.build_order.iter().map(|p| p.name.as_str()).collect();
        // "a" should appear only once
        assert_eq!(names.iter().filter(|&&n| n == "a").count(), 1);
        // "a" before "b" and "c"; "b" and "c" before "d"
        let a_pos = names.iter().position(|&n| n == "a").unwrap();
        let d_pos = names.iter().position(|&n| n == "d").unwrap();
        assert!(a_pos < d_pos);
    }

    #[test]
    fn test_reverse_deps() {
        let mut packages = HashMap::new();
        packages.insert("zlib".to_string(), make_pkg("zlib", "1.3.1", vec![], vec![]));
        packages.insert("curl".to_string(), make_pkg("curl", "8.0", vec!["zlib"], vec![]));
        packages.insert("git".to_string(), make_pkg("git", "2.0", vec!["curl"], vec![]));

        let installed: HashSet<String> =
            ["zlib", "curl", "git"].iter().map(|s| s.to_string()).collect();

        let rdeps = reverse_deps("zlib", &packages, &installed);
        assert!(rdeps.contains(&"curl".to_string()));
        assert!(!rdeps.contains(&"git".to_string())); // git depends on curl, not zlib directly
    }
}
311 src/dpack/src/resolver/solib.rs Normal file
@@ -0,0 +1,311 @@
//! Shared library conflict detection and resolution.
//!
//! When upgrading or installing a package that provides a shared library,
//! check if other installed packages depend on the old version of that library.
//!
//! Resolution strategies:
//! 1. Check if dependents have an update that works with the new lib version
//! 2. If yes, offer to upgrade them too
//! 3. If no, offer: (a) static compilation, (b) hold back, (c) force
//!
//! Implementation uses `readelf` or `objdump` to parse ELF shared library
//! dependencies from installed binaries.

use anyhow::{Context, Result};
use std::collections::{HashMap, HashSet};
use std::path::{Path, PathBuf};
use std::process::Command;

use crate::db::PackageDb;

/// A shared library dependency found in an ELF binary.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub struct SharedLib {
    /// Library soname (e.g., "libz.so.1")
    pub soname: String,

    /// Full path to the library file
    pub path: Option<PathBuf>,
}

/// A conflict where a library upgrade would break a dependent package.
#[derive(Debug, Clone)]
pub struct LibConflict {
    /// The library being upgraded
    pub library: String,

    /// The old soname that dependents link against
    pub old_soname: String,

    /// The new soname after the upgrade
    pub new_soname: String,

    /// Packages that depend on the old soname
    pub affected_packages: Vec<String>,
}

/// Resolution action chosen by the user.
#[derive(Debug, Clone)]
pub enum ConflictResolution {
    /// Upgrade all affected packages
    UpgradeAll,
    /// Compile the new package with static linking
    StaticLink,
    /// Hold back the library (don't upgrade)
    HoldBack,
    /// Force the upgrade (user accepts the risk)
    Force,
}

/// Scan an ELF binary for shared library dependencies.
///
/// Uses `readelf -d` to extract NEEDED entries.
pub fn get_needed_libs(binary_path: &Path) -> Result<Vec<String>> {
    let output = Command::new("readelf")
        .arg("-d")
        .arg(binary_path)
        .output()
        .or_else(|_| {
            // Fall back to objdump if readelf cannot be spawned
            Command::new("objdump").arg("-p").arg(binary_path).output()
        })
        .context("Neither readelf nor objdump available")?;

    let stdout = String::from_utf8_lossy(&output.stdout);
    let mut libs = Vec::new();

    for line in stdout.lines() {
        // Both tools mention NEEDED on relevant lines; an `else if` on the
        // objdump format would be unreachable here, so branch on the brackets.
        if line.contains("NEEDED") {
            // readelf format: 0x0000000000000001 (NEEDED) Shared library: [libz.so.1]
            if let (Some(start), Some(end)) = (line.find('['), line.find(']')) {
                libs.push(line[start + 1..end].to_string());
            }
            // objdump format: NEEDED libz.so.1
            else if let Some(lib) = line.split_whitespace().last() {
                libs.push(lib.to_string());
            }
        }
    }

    Ok(libs)
}

/// Get the soname of a shared library file.
///
/// Uses `readelf -d` to extract the SONAME entry.
pub fn get_soname(lib_path: &Path) -> Result<Option<String>> {
    let output = Command::new("readelf")
        .arg("-d")
        .arg(lib_path)
        .output()
        .context("readelf not available")?;

    let stdout = String::from_utf8_lossy(&output.stdout);

    for line in stdout.lines() {
        if line.contains("SONAME") {
            if let (Some(start), Some(end)) = (line.find('['), line.find(']')) {
                return Ok(Some(line[start + 1..end].to_string()));
            }
        }
    }

    Ok(None)
}

/// Build a map of soname → packages that link against it.
///
/// Scans all ELF binaries/libraries in installed packages.
pub fn build_solib_map(db: &PackageDb) -> HashMap<String, Vec<String>> {
    let mut map: HashMap<String, Vec<String>> = HashMap::new();

    for pkg in db.list_all() {
        for file in &pkg.files {
            // Only check likely ELF binaries and shared libraries
            let ext = file.extension().map(|e| e.to_string_lossy().to_string());
            let is_elf = file.starts_with("/usr/bin")
                || file.starts_with("/usr/lib")
                || file.starts_with("/usr/sbin")
                || ext.as_deref() == Some("so")
                || file.to_string_lossy().contains(".so.");

            if !is_elf || !file.exists() {
                continue;
            }

            if let Ok(libs) = get_needed_libs(file) {
                for lib in libs {
                    map.entry(lib).or_default().push(pkg.name.clone());
                }
            }
        }
    }

    // Deduplicate package names per soname
    for pkgs in map.values_mut() {
        pkgs.sort();
        pkgs.dedup();
    }

    map
}

/// Check if upgrading a package would cause shared library conflicts.
///
/// Compares the old package's provided sonames with the new package's sonames.
/// If a soname changes (e.g., `libfoo.so.1` → `libfoo.so.2`), find all
/// packages that link against the old soname.
pub fn check_upgrade_conflicts(
    package_name: &str,
    old_files: &[PathBuf],
    new_files: &[PathBuf],
    solib_map: &HashMap<String, Vec<String>>,
) -> Vec<LibConflict> {
    let mut conflicts = Vec::new();

    // Find sonames provided by the old and new versions
    let old_sonames = collect_provided_sonames(old_files);
    let new_sonames = collect_provided_sonames(new_files);

    // Check for sonames that exist in old but not in new
    for old_so in &old_sonames {
        if !new_sonames.contains(old_so) {
            // Find the replacement (if any) — same base name, different version
            let base = soname_base(old_so);
            let replacement = new_sonames
                .iter()
                .find(|s| soname_base(s) == base)
                .cloned()
                .unwrap_or_else(|| "REMOVED".to_string());

            // Find affected packages
            if let Some(dependents) = solib_map.get(old_so) {
                let affected: Vec<String> = dependents
                    .iter()
                    .filter(|p| p.as_str() != package_name) // Exclude self
                    .cloned()
                    .collect();

                if !affected.is_empty() {
                    conflicts.push(LibConflict {
                        library: package_name.to_string(),
                        old_soname: old_so.clone(),
                        new_soname: replacement,
                        affected_packages: affected,
                    });
                }
            }
        }
    }

    conflicts
}

/// Collect sonames provided by a set of files.
fn collect_provided_sonames(files: &[PathBuf]) -> HashSet<String> {
    let mut sonames = HashSet::new();

    for file in files {
        if file.to_string_lossy().contains(".so") && file.exists() {
            if let Ok(Some(soname)) = get_soname(file) {
                sonames.insert(soname);
            }
        }
    }

    sonames
}

/// Extract the base name from a soname (strip the version suffix).
/// e.g., "libz.so.1" → "libz.so", "libfoo.so.2.3.4" → "libfoo.so"
fn soname_base(soname: &str) -> String {
    if let Some(pos) = soname.find(".so.") {
        soname[..pos + 3].to_string() // Include ".so"
    } else {
        soname.to_string()
    }
}

/// Format a conflict report for display to the user.
pub fn format_conflict_report(conflicts: &[LibConflict]) -> String {
    if conflicts.is_empty() {
        return "No shared library conflicts detected.".to_string();
    }

    let mut report = String::new();
    report.push_str(&format!(
        "WARNING: {} shared library conflict(s) detected:\n\n",
        conflicts.len()
    ));

    for conflict in conflicts {
        report.push_str(&format!(
            "  Library: {} → {}\n",
            conflict.old_soname, conflict.new_soname
        ));
        report.push_str(&format!("  Source: {}\n", conflict.library));
        report.push_str(&format!(
            "  Affected packages: {}\n",
            conflict.affected_packages.join(", ")
        ));
        report.push_str("\n  Options:\n");
        report.push_str("    1. Upgrade affected packages\n");
        report.push_str("    2. Compile with static linking\n");
        report.push_str("    3. Hold back the upgrade\n");
        report.push_str("    4. Force (accept the risk)\n\n");
    }

    report
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_soname_base() {
        assert_eq!(soname_base("libz.so.1"), "libz.so");
        assert_eq!(soname_base("libfoo.so.2.3.4"), "libfoo.so");
        assert_eq!(soname_base("libbar.so"), "libbar.so");
    }

    #[test]
    fn test_check_upgrade_no_conflict() {
        let old_files: Vec<PathBuf> = vec![];
        let new_files: Vec<PathBuf> = vec![];
        let solib_map = HashMap::new();

        let conflicts = check_upgrade_conflicts("test", &old_files, &new_files, &solib_map);
        assert!(conflicts.is_empty());
    }

    #[test]
    fn test_format_empty_report() {
        let report = format_conflict_report(&[]);
        assert!(report.contains("No shared library conflicts"));
    }

    #[test]
    fn test_format_conflict_report() {
        let conflicts = vec![LibConflict {
            library: "zlib".to_string(),
            old_soname: "libz.so.1".to_string(),
            new_soname: "libz.so.2".to_string(),
            affected_packages: vec!["curl".to_string(), "git".to_string()],
        }];

        let report = format_conflict_report(&conflicts);
        assert!(report.contains("libz.so.1"));
        assert!(report.contains("libz.so.2"));
        assert!(report.contains("curl"));
        assert!(report.contains("git"));
    }
}
340 src/dpack/src/sandbox/mod.rs Normal file
@@ -0,0 +1,340 @@
|
||||
//! Build sandboxing using Linux namespaces or bubblewrap.
|
||||
//!
|
||||
//! Isolates package builds so that:
|
||||
//! - Only declared dependencies are visible in the sandbox's filesystem
|
||||
//! - Build processes run in a separate PID namespace
|
||||
//! - Network access is blocked by default (configurable)
|
||||
//! - Installed files are captured via `DESTDIR` to a staging area
|
||||
//!
|
||||
//! Two backends are supported:
|
||||
//! 1. **bubblewrap (bwrap)** — preferred, lightweight, unprivileged
|
||||
//! 2. **direct** — no sandboxing (fallback for bootstrapping or debugging)
|
||||
|
||||
use anyhow::{bail, Context, Result};
|
||||
use std::path::{Path, PathBuf};
|
||||
use std::process::Command;
|
||||
|
||||
use crate::config::{DpackConfig, PackageDefinition};
|
||||
|
||||
/// Sandbox backend selection.
|
||||
#[derive(Debug, Clone, PartialEq)]
|
||||
pub enum SandboxBackend {
|
||||
/// Use bubblewrap for isolation
|
||||
Bubblewrap,
|
||||
/// No sandboxing — run directly (for bootstrap or debugging)
|
||||
Direct,
|
||||
}
|
||||
|
||||
/// A configured build sandbox ready to execute a package build.
|
||||
pub struct BuildSandbox {
|
||||
/// The backend to use
|
||||
backend: SandboxBackend,
|
||||
|
||||
/// Working directory for the build (contains extracted source)
|
||||
build_dir: PathBuf,
|
||||
|
||||
/// Staging directory where `DESTDIR` installs to
|
||||
staging_dir: PathBuf,
|
||||
|
||||
/// Path to bubblewrap binary
|
||||
bwrap_path: PathBuf,
|
||||
|
||||
/// Whether to allow network access during build
|
||||
allow_network: bool,
|
||||
|
||||
/// Paths to bind-mount read-only into the sandbox (dependency install dirs)
|
||||
ro_binds: Vec<(PathBuf, PathBuf)>,
|
||||
|
||||
/// Environment variables to set in the sandbox
|
||||
env_vars: Vec<(String, String)>,
|
||||
}
|
||||
|
||||
impl BuildSandbox {
    /// Create a new sandbox for building a package.
    pub fn new(
        config: &DpackConfig,
        pkg: &PackageDefinition,
        build_dir: &Path,
        staging_dir: &Path,
    ) -> Result<Self> {
        std::fs::create_dir_all(build_dir)
            .with_context(|| format!("Failed to create build dir: {}", build_dir.display()))?;
        std::fs::create_dir_all(staging_dir)
            .with_context(|| format!("Failed to create staging dir: {}", staging_dir.display()))?;

        let backend = if config.sandbox.enabled {
            // Check whether the bwrap binary is actually present
            if config.sandbox.bwrap_path.exists() {
                SandboxBackend::Bubblewrap
            } else {
                log::warn!(
                    "Bubblewrap not found at {}, falling back to direct execution",
                    config.sandbox.bwrap_path.display()
                );
                SandboxBackend::Direct
            }
        } else {
            SandboxBackend::Direct
        };

        // Set up environment variables for the build; package-level flags
        // override the global configuration where provided
        let cflags = config.effective_cflags(&pkg.build.flags.cflags).to_string();
        let cxxflags = if pkg.build.flags.cxxflags.is_empty() {
            cflags.clone()
        } else {
            pkg.build.flags.cxxflags.clone()
        };
        let ldflags = config.effective_ldflags(&pkg.build.flags.ldflags).to_string();
        let makeflags = if pkg.build.flags.makeflags.is_empty() {
            config.flags.makeflags.clone()
        } else {
            pkg.build.flags.makeflags.clone()
        };

        let env_vars = vec![
            ("CFLAGS".to_string(), cflags),
            ("CXXFLAGS".to_string(), cxxflags),
            ("LDFLAGS".to_string(), ldflags),
            ("MAKEFLAGS".to_string(), makeflags),
            ("PKG".to_string(), staging_dir.to_string_lossy().to_string()),
            ("HOME".to_string(), "/tmp".to_string()),
            ("PATH".to_string(), "/usr/bin:/usr/sbin:/bin:/sbin".to_string()),
            ("LC_ALL".to_string(), "POSIX".to_string()),
        ];

        Ok(Self {
            backend,
            build_dir: build_dir.to_path_buf(),
            staging_dir: staging_dir.to_path_buf(),
            bwrap_path: config.sandbox.bwrap_path.clone(),
            allow_network: config.sandbox.allow_network,
            ro_binds: Vec::new(),
            env_vars,
        })
    }

    /// Add a read-only bind mount (e.g., dependency install paths).
    pub fn add_ro_bind(&mut self, host_path: PathBuf, sandbox_path: PathBuf) {
        self.ro_binds.push((host_path, sandbox_path));
    }

    /// Execute a shell command inside the sandbox.
    pub fn exec(&self, command: &str) -> Result<()> {
        if command.is_empty() {
            return Ok(());
        }

        log::info!("Sandbox exec: {}", command);

        match &self.backend {
            SandboxBackend::Direct => self.exec_direct(command),
            SandboxBackend::Bubblewrap => self.exec_bwrap(command),
        }
    }

    /// Execute without sandboxing.
    fn exec_direct(&self, command: &str) -> Result<()> {
        let mut cmd = Command::new("bash");
        cmd.arg("-c").arg(command);
        cmd.current_dir(&self.build_dir);

        for (key, val) in &self.env_vars {
            cmd.env(key, val);
        }

        let status = cmd
            .status()
            .with_context(|| format!("Failed to execute: {}", command))?;

        if !status.success() {
            bail!(
                "Command failed with exit code {}: {}",
                status.code().unwrap_or(-1),
                command
            );
        }

        Ok(())
    }

    /// Execute inside a bubblewrap sandbox.
    fn exec_bwrap(&self, command: &str) -> Result<()> {
        let mut cmd = Command::new(&self.bwrap_path);

        // Writable bind mounts for the build tree and the DESTDIR staging area
        cmd.arg("--bind").arg(&self.build_dir).arg("/build");
        cmd.arg("--bind").arg(&self.staging_dir).arg("/staging");

        // Mount essential system directories read-only
        cmd.arg("--ro-bind").arg("/usr").arg("/usr");
        cmd.arg("--ro-bind").arg("/lib").arg("/lib");
        if Path::new("/lib64").exists() {
            cmd.arg("--ro-bind").arg("/lib64").arg("/lib64");
        }
        cmd.arg("--ro-bind").arg("/bin").arg("/bin");
        cmd.arg("--ro-bind").arg("/sbin").arg("/sbin");

        // Minimal /dev, plus /proc and a fresh tmpfs on /tmp
        cmd.arg("--dev").arg("/dev");
        cmd.arg("--proc").arg("/proc");
        cmd.arg("--tmpfs").arg("/tmp");

        // Read-only bind mounts for declared dependencies
        for (host, sandbox) in &self.ro_binds {
            cmd.arg("--ro-bind").arg(host).arg(sandbox);
        }

        // Separate PID namespace
        cmd.arg("--unshare-pid");

        // Network isolation (unless explicitly allowed)
        if !self.allow_network {
            cmd.arg("--unshare-net");
        }

        // Set working directory
        cmd.arg("--chdir").arg("/build");

        // Environment variables
        for (key, val) in &self.env_vars {
            cmd.arg("--setenv").arg(key).arg(val);
        }

        // Override PKG to point at the sandbox-side staging path
        cmd.arg("--setenv").arg("PKG").arg("/staging");

        // The actual command
        cmd.arg("bash").arg("-c").arg(command);

        let status = cmd
            .status()
            .with_context(|| format!("Bubblewrap execution failed: {}", command))?;

        if !status.success() {
            bail!(
                "Sandboxed command failed with exit code {}: {}",
                status.code().unwrap_or(-1),
                command
            );
        }

        Ok(())
    }

    /// Run the full build sequence: prepare → configure → make → install.
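    /// The step commands come from the package's `[build]` table; an
    /// illustrative definition (the field values here are examples, not a
    /// real package):
    ///
    /// ```toml
    /// [build]
    /// configure = "./configure --prefix=/usr"
    /// make = "make"
    /// check = "make check"
    /// install = "make DESTDIR=${PKG} install"
    /// ```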
    pub fn run_build(&self, pkg: &PackageDefinition) -> Result<()> {
        // Prepare step (optional: patching, autoreconf, etc.)
        if !pkg.build.prepare.is_empty() {
            log::info!(">>> Prepare step");
            self.exec(&pkg.build.prepare)?;
        }

        // Configure step
        if !pkg.build.configure.is_empty() {
            log::info!(">>> Configure step");
            self.exec(&pkg.build.configure)?;
        }

        // Build step
        log::info!(">>> Build step");
        self.exec(&pkg.build.make)?;

        // Test step (optional)
        if !pkg.build.check.is_empty() {
            log::info!(">>> Check step");
            // Don't fail the build on test failures — log a warning instead
            if let Err(e) = self.exec(&pkg.build.check) {
                log::warn!("Check step failed (non-fatal): {}", e);
            }
        }

        // Install step
        log::info!(">>> Install step");
        self.exec(&pkg.build.install)?;

        // Post-install step (optional)
        if !pkg.build.post_install.is_empty() {
            log::info!(">>> Post-install step");
            self.exec(&pkg.build.post_install)?;
        }

        Ok(())
    }

    /// Get the path to the staging directory where installed files landed.
    pub fn staging_dir(&self) -> &Path {
        &self.staging_dir
    }

    /// Get the build directory path.
    pub fn build_dir(&self) -> &Path {
        &self.build_dir
    }
}

/// Collect all files in the staging directory (for database tracking).
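/// Paths are returned relative to the staging root and rejoined to `/`, so a
/// staged file maps directly to its eventual install path. Illustrative only:
///
/// ```text
/// <staging_dir>/usr/bin/dpack  ->  /usr/bin/dpack
/// ```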
pub fn collect_staged_files(staging_dir: &Path) -> Result<Vec<PathBuf>> {
    let mut files = Vec::new();

    if !staging_dir.exists() {
        return Ok(files);
    }

    for entry in walkdir::WalkDir::new(staging_dir)
        .min_depth(1)
        .into_iter()
        .filter_map(|e| e.ok())
    {
        if entry.file_type().is_file() || entry.file_type().is_symlink() {
            // Store the path relative to the staging dir (= absolute path on target)
            let rel = entry
                .path()
                .strip_prefix(staging_dir)
                .unwrap_or(entry.path());
            files.push(PathBuf::from("/").join(rel));
        }
    }

    Ok(files)
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_sandbox_backend_selection_disabled() {
        let mut config = DpackConfig::default();
        config.sandbox.enabled = false;

        let pkg_toml = r#"
[package]
name = "test"
version = "1.0"
description = "test"
url = "https://example.com"
license = "MIT"

[source]
url = "https://example.com/test-1.0.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]

[build]
install = "make DESTDIR=${PKG} install"
"#;
        let pkg = crate::config::PackageDefinition::from_str(pkg_toml).unwrap();
        let tmpdir = std::env::temp_dir().join("dpack-test-sandbox");
        let staging = std::env::temp_dir().join("dpack-test-staging");

        let sandbox = BuildSandbox::new(&config, &pkg, &tmpdir, &staging).unwrap();
        assert_eq!(sandbox.backend, SandboxBackend::Direct);

        // Cleanup
        let _ = std::fs::remove_dir_all(&tmpdir);
        let _ = std::fs::remove_dir_all(&staging);
    }
}