Initial commit: DarkForge Linux — Phases 0-12
Complete from-scratch Linux distribution targeting AMD Ryzen 9 9950X3D + NVIDIA RTX 5090 on ASUS ROG CROSSHAIR X870E HERO.

Deliverables:
- dpack: custom package manager in Rust (3,800 lines)
- TOML package parser, dependency resolver, build sandbox
- CRUX Pkgfile and Gentoo ebuild converters
- Shared library conflict detection
- 124 package definitions across 4 repos (core/extra/desktop/gaming)
- 34 toolchain bootstrap scripts (LFS 13.0 adapted for Zen 5)
- Linux 6.19.8 kernel config (hardware-specific, fully commented)
- SysVinit init system with rc.d service scripts
- Live ISO builder (UEFI-only, squashfs+xorriso)
- Interactive installer (GPT partitioning, EFISTUB boot)
- Integration test checklist (docs/TESTING.md)

No systemd. No bootloader. No display manager. Kernel boots via EFISTUB → auto-login → dwl Wayland compositor.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
1
src/dpack/.gitignore
vendored
Normal file
@@ -0,0 +1 @@
/target/
2212
src/dpack/Cargo.lock
generated
Normal file
File diff suppressed because it is too large
42
src/dpack/Cargo.toml
Normal file
@@ -0,0 +1,42 @@
[package]
name = "dpack"
version = "0.1.0"
edition = "2021"
description = "DarkForge Linux package manager — between CRUX pkgutils and Gentoo emerge"
license = "MIT"
authors = ["Danny"]

[dependencies]
# TOML parsing for package definitions and database
toml = "0.8"
serde = { version = "1", features = ["derive"] }
serde_json = "1"

# CLI argument parsing
clap = { version = "4", features = ["derive"] }

# Error handling
anyhow = "1"
thiserror = "2"

# File operations and checksums
sha2 = "0.10"
walkdir = "2"

# HTTP for source downloads
reqwest = { version = "0.12", features = ["blocking", "rustls-tls"], default-features = false }

# Logging
log = "0.4"
env_logger = "0.11"

# Colorized terminal output
colored = "2"

# Regex for converter modules
regex = "1"

[dev-dependencies]
tempfile = "3"
assert_cmd = "2"
predicates = "3"
198
src/dpack/README.md
Normal file
@@ -0,0 +1,198 @@
# dpack — DarkForge Package Manager

A source-based package manager for DarkForge Linux, positioned between CRUX's `pkgutils` and Gentoo's `emerge` in complexity. Written in Rust.

## Features

- **TOML package definitions** — clean, readable package recipes
- **Dependency resolution** — topological sort with circular dependency detection
- **Build sandboxing** — bubblewrap (bwrap) isolation with PID/network namespaces
- **Installed package database** — file-based TOML tracking in `/var/lib/dpack/db/`
- **Full build orchestration** — download → checksum → extract → sandbox build → stage → commit → register
- **CRUX Pkgfile converter** — convert CRUX ports to dpack format
- **Gentoo ebuild converter** — best-effort conversion of Gentoo ebuilds (handles ~80% of cases)
- **Shared library conflict detection** — ELF binary scanning via readelf/objdump
- **Reverse dependency tracking** — warns before removing packages that others depend on
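The dependency resolution listed above — topological sort with circular dependency detection — can be sketched as a depth-first traversal. This is an illustrative stand-alone example; dpack's real `resolver` module has its own types (`DependencyGraph`, `ResolvedPackage`) and a richer API:

```rust
use std::collections::HashMap;

#[derive(Clone, Copy)]
enum Mark {
    Unvisited,
    InProgress,
    Done,
}

/// Depth-first topological sort with cycle detection. `graph` maps a package
/// name to its dependencies; the returned order lists dependencies first.
fn topo_sort<'a>(graph: &HashMap<&'a str, Vec<&'a str>>) -> Result<Vec<String>, String> {
    fn visit<'a>(
        node: &'a str,
        graph: &HashMap<&'a str, Vec<&'a str>>,
        marks: &mut HashMap<&'a str, Mark>,
        order: &mut Vec<String>,
    ) -> Result<(), String> {
        match marks.get(node).copied().unwrap_or(Mark::Unvisited) {
            Mark::Done => return Ok(()),
            Mark::InProgress => {
                // Back edge in the DFS: we re-entered a node still on the stack.
                return Err(format!("circular dependency involving '{}'", node));
            }
            Mark::Unvisited => {}
        }
        marks.insert(node, Mark::InProgress);
        for &dep in graph.get(node).into_iter().flatten() {
            visit(dep, graph, marks, order)?;
        }
        marks.insert(node, Mark::Done);
        order.push(node.to_string()); // dependencies were pushed first
        Ok(())
    }

    let mut marks = HashMap::new();
    let mut order = Vec::new();
    let mut names: Vec<&str> = graph.keys().copied().collect();
    names.sort(); // deterministic order for unrelated packages
    for name in names {
        visit(name, graph, &mut marks, &mut order)?;
    }
    Ok(order)
}

fn main() {
    let graph = HashMap::from([
        ("curl", vec!["openssl", "zlib"]),
        ("openssl", vec!["zlib"]),
        ("zlib", vec![]),
    ]);
    // zlib is built first, then openssl, then curl
    assert_eq!(topo_sort(&graph).unwrap(), ["zlib", "openssl", "curl"]);
}
```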
## Requirements

- Rust 1.75+ (build)
- Linux (runtime — uses Linux namespaces for sandboxing)
- bubblewrap (`bwrap`) for sandboxed builds (optional, falls back to direct execution)
- `curl` or `wget` for source downloads
- `tar` for source extraction
- `readelf` or `objdump` for shared library scanning

## Building

```bash
cd src/dpack
cargo build --release
```

The binary is at `target/release/dpack`. Install it:

```bash
sudo install -m755 target/release/dpack /usr/local/bin/
```

## Usage

```bash
# Install a package (resolves deps, builds in sandbox, installs, updates db)
dpack install zlib

# Install multiple packages
dpack install openssl curl git

# Remove a package (warns about reverse deps, removes files)
dpack remove zlib

# Upgrade packages (compares installed vs repo versions)
dpack upgrade                 # upgrade all outdated packages
dpack upgrade openssl git     # upgrade specific packages

# Search for packages
dpack search compression

# Show package info
dpack info zlib

# List installed packages
dpack list

# Check for file conflicts and shared library issues
dpack check

# Convert foreign package formats
dpack convert /path/to/Pkgfile                           # CRUX → dpack TOML (stdout)
dpack convert /path/to/curl-8.19.0.ebuild -o curl.toml   # Gentoo → dpack TOML (file)
```
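To give a feel for what `dpack convert` does with a CRUX Pkgfile (a shell script with `name=`, `version=`, `source=(...)` assignments), here is a minimal field extractor. It is a sketch only: the real converter in `src/converter/crux.rs` handles source arrays, `build()` bodies, and variable expansion, and `pkgfile_field` is a name invented for this example:

```rust
/// Extract a simple `key=value` shell assignment from a CRUX Pkgfile.
/// Returns the value with surrounding quotes stripped, if the key is present.
fn pkgfile_field(pkgfile: &str, key: &str) -> Option<String> {
    pkgfile
        .lines()
        .map(str::trim)
        .find_map(|line| line.strip_prefix(&format!("{key}=")))
        .map(|v| v.trim_matches(|c| c == '"' || c == '\'').to_string())
}

fn main() {
    let pkgfile = r#"
# Description: Compression library
name=zlib
version=1.3.1
release=1
source=(https://zlib.net/$name-$version.tar.xz)
"#;
    assert_eq!(pkgfile_field(pkgfile, "name").as_deref(), Some("zlib"));
    assert_eq!(pkgfile_field(pkgfile, "version").as_deref(), Some("1.3.1"));
    // Emit the start of an equivalent dpack TOML stanza
    println!(
        "[package]\nname = \"{}\"\nversion = \"{}\"",
        pkgfile_field(pkgfile, "name").unwrap(),
        pkgfile_field(pkgfile, "version").unwrap()
    );
}
```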
## Configuration

dpack reads its configuration from `/etc/dpack.conf` (TOML format). If the file doesn't exist, sensible defaults are used.

Example `/etc/dpack.conf`:

```toml
[flags]
cflags = "-march=znver5 -O2 -pipe -fomit-frame-pointer"
cxxflags = "-march=znver5 -O2 -pipe -fomit-frame-pointer"
ldflags = "-Wl,-O1,--as-needed"
makeflags = "-j32"

[paths]
db_dir = "/var/lib/dpack/db"
repo_dir = "/var/lib/dpack/repos"
source_dir = "/var/cache/dpack/sources"
build_dir = "/var/tmp/dpack/build"

[sandbox]
enabled = true
allow_network = false
bwrap_path = "/usr/bin/bwrap"

[[repos]]
name = "core"
path = "/var/lib/dpack/repos/core"
priority = 0

[[repos]]
name = "extra"
path = "/var/lib/dpack/repos/extra"
priority = 10

[[repos]]
name = "desktop"
path = "/var/lib/dpack/repos/desktop"
priority = 20

[[repos]]
name = "gaming"
path = "/var/lib/dpack/repos/gaming"
priority = 30
```
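Repo `priority` resolves name collisions: lower numbers are consulted first, and the first repo containing `<name>/<name>.toml` wins. A stand-alone sketch of that lookup order (the struct and function names here are illustrative, not dpack's API):

```rust
use std::path::PathBuf;

/// A configured repository (mirrors the [[repos]] entries above).
struct Repo {
    path: PathBuf,
    priority: u32,
}

/// Candidate definition paths for `name`, in the order dpack would try them:
/// ascending priority, first existing file wins.
fn candidate_paths(repos: &[Repo], name: &str) -> Vec<PathBuf> {
    let mut sorted: Vec<&Repo> = repos.iter().collect();
    sorted.sort_by_key(|r| r.priority); // lower number = higher priority
    sorted
        .iter()
        .map(|r| r.path.join(name).join(format!("{name}.toml")))
        .collect()
}

fn main() {
    let repos = vec![
        Repo { path: PathBuf::from("/var/lib/dpack/repos/extra"), priority: 10 },
        Repo { path: PathBuf::from("/var/lib/dpack/repos/core"), priority: 0 },
    ];
    let paths = candidate_paths(&repos, "zlib");
    // core has priority 0, so it is checked before extra
    assert_eq!(paths[0], PathBuf::from("/var/lib/dpack/repos/core/zlib/zlib.toml"));
    assert_eq!(paths[1], PathBuf::from("/var/lib/dpack/repos/extra/zlib/zlib.toml"));
}
```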
## Package Definition Format

Package definitions are TOML files stored at `<repo>/<name>/<name>.toml`:

```toml
[package]
name = "zlib"
version = "1.3.1"
description = "Compression library implementing the deflate algorithm"
url = "https://zlib.net/"
license = "zlib"

[source]
url = "https://zlib.net/zlib-${version}.tar.xz"
sha256 = "38ef96b8dfe510d42707d9c781877914792541133e1870841463bfa73f883e32"

[dependencies]
run = []
build = ["gcc", "make"]

[dependencies.optional]
static = { description = "Build static library", default = true }
minizip = { description = "Build minizip utility", deps = [] }

[build]
system = "autotools"
configure = "./configure --prefix=/usr"
make = "make"
install = "make DESTDIR=${PKG} install"

[build.flags]
cflags = ""   # empty = use global defaults
ldflags = ""
```

### Variables available in build commands

- `${PKG}` — staging directory (DESTDIR)
- `${version}` — package version (expanded in source URL)
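The `${version}` expansion described above amounts to a simple substitution at download time. A minimal sketch (the function name is ours; in the actual code this lives behind `PackageDefinition::expanded_source_url`):

```rust
/// Expand `${version}` placeholders in a source URL.
fn expand_source_url(url: &str, version: &str) -> String {
    url.replace("${version}", version)
}

fn main() {
    let url = expand_source_url("https://zlib.net/zlib-${version}.tar.xz", "1.3.1");
    assert_eq!(url, "https://zlib.net/zlib-1.3.1.tar.xz");
}
```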
### Build systems

The `system` field is a hint: `autotools`, `cmake`, `meson`, `cargo`, or `custom`.

## Running Tests

```bash
cargo test
```

Tests cover: TOML parsing, dependency resolution (simple, diamond, circular), database operations (register, unregister, persistence, file ownership, conflicts), and converter parsing.

## Architecture

```
src/
├── main.rs          # CLI (clap) — install, remove, upgrade, search, info, list, convert, check
├── lib.rs           # Library re-exports
├── config/
│   ├── mod.rs       # Module root
│   ├── package.rs   # PackageDefinition TOML structs + parsing + validation
│   └── global.rs    # DpackConfig (flags, paths, sandbox, repos)
├── resolver/
│   ├── mod.rs       # DependencyGraph, topological sort, reverse deps
│   └── solib.rs     # Shared library conflict detection (ELF scanning)
├── sandbox/
│   └── mod.rs       # BuildSandbox (bubblewrap + direct backends)
├── converter/
│   ├── mod.rs       # Format auto-detection
│   ├── crux.rs      # CRUX Pkgfile parser
│   └── gentoo.rs    # Gentoo ebuild parser
├── db/
│   └── mod.rs       # PackageDb (file-based TOML, installed tracking)
└── build/
    └── mod.rs       # BuildOrchestrator (download → build → install pipeline)
```

## Repository

```
git@git.dannyhaslund.dk:danny8632/dpack.git
```
339
src/dpack/src/build/mod.rs
Normal file
@@ -0,0 +1,339 @@
//! Package build orchestration.
//!
//! Coordinates the full install pipeline:
//! 1. Resolve dependencies (via `resolver`)
//! 2. Download source tarball
//! 3. Verify SHA256 checksum
//! 4. Extract source
//! 5. Build in sandbox (via `sandbox`)
//! 6. Collect installed files from staging
//! 7. Commit files to the live filesystem
//! 8. Update the package database (via `db`)

use anyhow::{bail, Context, Result};
use sha2::{Digest, Sha256};
use std::io::Read;
use std::path::{Path, PathBuf};
use std::time::{SystemTime, UNIX_EPOCH};

use crate::config::{DpackConfig, PackageDefinition};
use crate::db::{InstalledPackage, PackageDb};
use crate::resolver::{DependencyGraph, ResolvedPackage};
use crate::sandbox::{self, BuildSandbox};

/// Orchestrate the full install of one or more packages.
pub struct BuildOrchestrator {
    config: DpackConfig,
    db: PackageDb,
}

impl BuildOrchestrator {
    /// Create a new orchestrator with the given config and database.
    pub fn new(config: DpackConfig, db: PackageDb) -> Self {
        Self { config, db }
    }

    /// Install packages by name. Resolves deps, builds, installs.
    pub fn install(&mut self, package_names: &[String]) -> Result<()> {
        log::info!("Resolving dependencies for: {:?}", package_names);

        // Load all repos
        let mut all_packages = std::collections::HashMap::new();
        for repo in &self.config.repos {
            let repo_pkgs = DependencyGraph::load_repo(&repo.path)?;
            all_packages.extend(repo_pkgs);
        }

        let installed_versions = self.db.installed_versions();
        let graph = DependencyGraph::new(all_packages.clone(), installed_versions);

        let plan = graph.resolve(package_names, &std::collections::HashMap::new())?;

        if plan.build_order.is_empty() {
            println!("All requested packages are already installed.");
            return Ok(());
        }

        // Report the plan
        if !plan.already_installed.is_empty() {
            println!("Already installed: {}", plan.already_installed.join(", "));
        }

        println!("Build order ({} packages):", plan.build_order.len());
        for (i, pkg) in plan.build_order.iter().enumerate() {
            let marker = if pkg.build_only { " [build-only]" } else { "" };
            println!("  {}. {}-{}{}", i + 1, pkg.name, pkg.version, marker);
        }
        println!();

        // Build each package in order
        for resolved in &plan.build_order {
            let pkg_def = all_packages.get(&resolved.name).with_context(|| {
                format!("Package '{}' disappeared from repo", resolved.name)
            })?;

            self.build_and_install(pkg_def, resolved)?;
        }

        println!("All packages installed successfully.");
        Ok(())
    }

    /// Build and install a single package.
    fn build_and_install(
        &mut self,
        pkg: &PackageDefinition,
        resolved: &ResolvedPackage,
    ) -> Result<()> {
        let ident = pkg.ident();
        println!(">>> Building {}", ident);

        // Step 1: Download source
        let source_path = self.download_source(pkg)?;

        // Step 2: Verify checksum
        self.verify_checksum(&source_path, &pkg.source.sha256)?;

        // Step 3: Extract source
        let build_dir = self.config.paths.build_dir.join(&ident);
        let staging_dir = self.config.paths.build_dir.join(format!("{}-staging", ident));

        // Clean any previous attempt
        let _ = std::fs::remove_dir_all(&build_dir);
        let _ = std::fs::remove_dir_all(&staging_dir);

        self.extract_source(&source_path, &build_dir)?;

        // Step 4: Apply patches (TODO) and find the actual source directory
        // (tarballs often have a top-level dir)
        let actual_build_dir = find_source_dir(&build_dir)?;

        // Step 5: Build in sandbox
        let sandbox = BuildSandbox::new(&self.config, pkg, &actual_build_dir, &staging_dir)?;

        sandbox.run_build(pkg)?;

        // Step 6: Collect installed files
        let staged_files = sandbox::collect_staged_files(&staging_dir)?;

        if staged_files.is_empty() {
            log::warn!("No files were installed by {} — is the install step correct?", ident);
        }

        // Step 7: Commit files to the live filesystem
        self.commit_staged_files(&staging_dir)?;

        // Step 8: Update database
        let size = calculate_dir_size(&staging_dir);
        let record = InstalledPackage {
            name: pkg.package.name.clone(),
            version: pkg.package.version.clone(),
            description: pkg.package.description.clone(),
            run_deps: pkg.effective_run_deps(&resolved.features),
            build_deps: pkg.effective_build_deps(&resolved.features),
            features: resolved.features.clone(),
            files: staged_files,
            installed_at: SystemTime::now()
                .duration_since(UNIX_EPOCH)
                .unwrap()
                .as_secs(),
            repo: "core".to_string(), // TODO: track actual repo
            size,
        };

        self.db.register(record)?;

        // Cleanup
        let _ = std::fs::remove_dir_all(&build_dir);
        let _ = std::fs::remove_dir_all(&staging_dir);

        println!(">>> {} installed successfully", ident);
        Ok(())
    }

    /// Download the source tarball to the source cache.
    fn download_source(&self, pkg: &PackageDefinition) -> Result<PathBuf> {
        let url = pkg.expanded_source_url();
        let filename = url.rsplit('/').next().unwrap_or("source.tar.gz");
        let dest = self.config.paths.source_dir.join(filename);

        std::fs::create_dir_all(&self.config.paths.source_dir)?;

        if dest.exists() {
            log::info!("Source already cached: {}", dest.display());
            return Ok(dest);
        }

        log::info!("Downloading: {}", url);

        // Use curl/wget via subprocess for now — avoids pulling in reqwest
        // at build time for the bootstrap phase
        let status = std::process::Command::new("curl")
            .args(["-fLo", &dest.to_string_lossy(), &url])
            .status()
            .or_else(|_| {
                std::process::Command::new("wget")
                    .args(["-O", &dest.to_string_lossy(), &url])
                    .status()
            })
            .context("Neither curl nor wget available for downloading")?;

        if !status.success() {
            bail!("Download failed for: {}", url);
        }

        Ok(dest)
    }

    /// Verify the SHA256 checksum of a file.
    fn verify_checksum(&self, path: &Path, expected: &str) -> Result<()> {
        log::info!("Verifying checksum: {}", path.display());

        let mut file = std::fs::File::open(path)
            .with_context(|| format!("Failed to open: {}", path.display()))?;

        let mut hasher = Sha256::new();
        let mut buffer = [0u8; 8192];

        loop {
            let n = file.read(&mut buffer)?;
            if n == 0 {
                break;
            }
            hasher.update(&buffer[..n]);
        }

        let actual = format!("{:x}", hasher.finalize());

        if actual != expected {
            bail!(
                "Checksum mismatch for {}: expected {}, got {}",
                path.display(),
                expected,
                actual
            );
        }

        log::info!("Checksum verified: {}", actual);
        Ok(())
    }

    /// Extract a source tarball into the build directory.
    fn extract_source(&self, tarball: &Path, build_dir: &Path) -> Result<()> {
        std::fs::create_dir_all(build_dir)?;

        let tarball_str = tarball.to_string_lossy();

        // Determine tar flags based on extension. Flags are passed as
        // separate arguments — a single "--zstd -xf" string would reach
        // tar as one (invalid) argument.
        let tar_flags: &[&str] = if tarball_str.ends_with(".tar.xz") || tarball_str.ends_with(".txz") {
            &["-xJf"]
        } else if tarball_str.ends_with(".tar.gz") || tarball_str.ends_with(".tgz") {
            &["-xzf"]
        } else if tarball_str.ends_with(".tar.bz2") || tarball_str.ends_with(".tbz2") {
            &["-xjf"]
        } else if tarball_str.ends_with(".tar.zst") {
            &["--zstd", "-xf"]
        } else {
            &["-xf"]
        };

        let status = std::process::Command::new("tar")
            .args(tar_flags)
            .arg(tarball)
            .arg("-C")
            .arg(build_dir)
            .status()
            .context("Failed to run tar")?;

        if !status.success() {
            bail!("Failed to extract: {}", tarball.display());
        }

        Ok(())
    }

    /// Copy staged files from the staging directory to the live filesystem root.
    fn commit_staged_files(&self, staging_dir: &Path) -> Result<()> {
        if !staging_dir.exists() {
            return Ok(());
        }

        // Walk the staging tree and copy each file to its target location
        for entry in walkdir::WalkDir::new(staging_dir)
            .min_depth(1)
            .into_iter()
            .filter_map(|e| e.ok())
        {
            let rel = entry.path().strip_prefix(staging_dir)?;
            let target = Path::new("/").join(rel);

            if entry.file_type().is_dir() {
                std::fs::create_dir_all(&target).ok();
            } else if entry.file_type().is_file() {
                if let Some(parent) = target.parent() {
                    std::fs::create_dir_all(parent).ok();
                }
                std::fs::copy(entry.path(), &target).with_context(|| {
                    format!(
                        "Failed to install file: {} -> {}",
                        entry.path().display(),
                        target.display()
                    )
                })?;
            }
        }

        Ok(())
    }

    /// Get a reference to the database.
    pub fn db(&self) -> &PackageDb {
        &self.db
    }

    /// Get a mutable reference to the database.
    pub fn db_mut(&mut self) -> &mut PackageDb {
        &mut self.db
    }
}

/// Find the actual source directory inside the extraction directory.
/// Tarballs usually contain a top-level directory (e.g., `zlib-1.3.1/`).
fn find_source_dir(build_dir: &Path) -> Result<PathBuf> {
    let entries: Vec<_> = std::fs::read_dir(build_dir)?
        .filter_map(|e| e.ok())
        .filter(|e| e.file_type().map_or(false, |t| t.is_dir()))
        .collect();

    if entries.len() == 1 {
        Ok(entries[0].path())
    } else {
        // No single top-level directory — use the build dir itself
        Ok(build_dir.to_path_buf())
    }
}

/// Calculate the total size of files in a directory.
fn calculate_dir_size(dir: &Path) -> u64 {
    walkdir::WalkDir::new(dir)
        .into_iter()
        .filter_map(|e| e.ok())
        .filter(|e| e.file_type().is_file())
        .map(|e| e.metadata().map_or(0, |m| m.len()))
        .sum()
}
265
src/dpack/src/config/global.rs
Normal file
@@ -0,0 +1,265 @@
//! Global dpack configuration (`/etc/dpack.conf`).
//!
//! Controls compiler flags, repository paths, sandbox settings, and other
//! system-wide package manager behavior.

use anyhow::{Context, Result};
use serde::{Deserialize, Serialize};
use std::path::{Path, PathBuf};

/// Default configuration file location
pub const DEFAULT_CONFIG_PATH: &str = "/etc/dpack.conf";

/// Default package database location
pub const DEFAULT_DB_PATH: &str = "/var/lib/dpack/db";

/// Default repository root
pub const DEFAULT_REPO_PATH: &str = "/var/lib/dpack/repos";

/// Default source download cache
pub const DEFAULT_SOURCE_DIR: &str = "/var/cache/dpack/sources";

/// Default build directory
pub const DEFAULT_BUILD_DIR: &str = "/var/tmp/dpack/build";

/// Global dpack configuration.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct DpackConfig {
    /// Compiler and linker flags
    #[serde(default)]
    pub flags: GlobalFlags,

    /// Paths for repos, database, sources, build directory
    #[serde(default)]
    pub paths: PathConfig,

    /// Sandbox configuration
    #[serde(default)]
    pub sandbox: SandboxConfig,

    /// Repository configuration
    #[serde(default)]
    pub repos: Vec<RepoConfig>,
}

/// Global compiler flags — applied to all packages unless overridden.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GlobalFlags {
    /// C compiler flags (e.g., "-march=znver5 -O2 -pipe -fomit-frame-pointer")
    pub cflags: String,

    /// C++ compiler flags (defaults to same as cflags)
    pub cxxflags: String,

    /// Linker flags (e.g., "-Wl,-O1,--as-needed")
    pub ldflags: String,

    /// Make flags (e.g., "-j32")
    pub makeflags: String,
}

impl Default for GlobalFlags {
    fn default() -> Self {
        // DarkForge defaults — Zen 5 optimized
        Self {
            cflags: "-march=znver5 -O2 -pipe -fomit-frame-pointer".to_string(),
            cxxflags: "-march=znver5 -O2 -pipe -fomit-frame-pointer".to_string(),
            ldflags: "-Wl,-O1,--as-needed".to_string(),
            makeflags: "-j32".to_string(),
        }
    }
}

/// File system paths used by dpack.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PathConfig {
    /// Path to the installed package database
    pub db_dir: PathBuf,

    /// Path to the package repository definitions
    pub repo_dir: PathBuf,

    /// Path to cache downloaded source tarballs
    pub source_dir: PathBuf,

    /// Path for build sandboxes / staging areas
    pub build_dir: PathBuf,
}

impl Default for PathConfig {
    fn default() -> Self {
        Self {
            db_dir: PathBuf::from(DEFAULT_DB_PATH),
            repo_dir: PathBuf::from(DEFAULT_REPO_PATH),
            source_dir: PathBuf::from(DEFAULT_SOURCE_DIR),
            build_dir: PathBuf::from(DEFAULT_BUILD_DIR),
        }
    }
}

/// Build sandbox configuration.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SandboxConfig {
    /// Enable sandboxing (mount/PID/net namespaces via bubblewrap)
    pub enabled: bool,

    /// Allow network access during build (some packages need it)
    pub allow_network: bool,

    /// Path to bubblewrap binary
    pub bwrap_path: PathBuf,
}

impl Default for SandboxConfig {
    fn default() -> Self {
        Self {
            enabled: true,
            allow_network: false,
            bwrap_path: PathBuf::from("/usr/bin/bwrap"),
        }
    }
}

/// A package repository definition.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct RepoConfig {
    /// Repository name (e.g., "core", "extra", "gaming")
    pub name: String,

    /// Path to the repository directory containing package definitions
    pub path: PathBuf,

    /// Priority (lower = higher priority for conflict resolution)
    #[serde(default)]
    pub priority: u32,
}

impl DpackConfig {
    /// Load configuration from a TOML file.
    pub fn from_file(path: &Path) -> Result<Self> {
        let content = std::fs::read_to_string(path)
            .with_context(|| format!("Failed to read config: {}", path.display()))?;
        toml::from_str(&content).context("Failed to parse dpack configuration")
    }

    /// Load from the default location, or return defaults if not found.
    pub fn load_default() -> Self {
        let path = Path::new(DEFAULT_CONFIG_PATH);
        if path.exists() {
            Self::from_file(path).unwrap_or_else(|e| {
                log::warn!("Failed to load {}: {}, using defaults", path.display(), e);
                Self::default()
            })
        } else {
            Self::default()
        }
    }

    /// Get the effective CFLAGS for a package (package override or global default).
    pub fn effective_cflags<'a>(&'a self, pkg_override: &'a str) -> &'a str {
        if pkg_override.is_empty() {
            &self.flags.cflags
        } else {
            pkg_override
        }
    }

    /// Get the effective LDFLAGS for a package.
    pub fn effective_ldflags<'a>(&'a self, pkg_override: &'a str) -> &'a str {
        if pkg_override.is_empty() {
            &self.flags.ldflags
        } else {
            pkg_override
        }
    }

    /// Find a package definition across all configured repos.
    /// Returns the first match by repo priority.
    pub fn find_package(&self, name: &str) -> Option<PathBuf> {
        let mut repos = self.repos.clone();
        repos.sort_by_key(|r| r.priority);

        for repo in &repos {
            let pkg_path = repo.path.join(name).join(format!("{}.toml", name));
            if pkg_path.exists() {
                return Some(pkg_path);
            }
        }
        None
    }
}

impl Default for DpackConfig {
    fn default() -> Self {
        Self {
            flags: GlobalFlags::default(),
            paths: PathConfig::default(),
            sandbox: SandboxConfig::default(),
            repos: vec![
                RepoConfig {
                    name: "core".to_string(),
                    path: PathBuf::from("/var/lib/dpack/repos/core"),
                    priority: 0,
                },
                RepoConfig {
                    name: "extra".to_string(),
                    path: PathBuf::from("/var/lib/dpack/repos/extra"),
                    priority: 10,
                },
                RepoConfig {
                    name: "desktop".to_string(),
                    path: PathBuf::from("/var/lib/dpack/repos/desktop"),
                    priority: 20,
                },
                RepoConfig {
                    name: "gaming".to_string(),
                    path: PathBuf::from("/var/lib/dpack/repos/gaming"),
                    priority: 30,
                },
            ],
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_default_config() {
        let config = DpackConfig::default();
        assert_eq!(config.flags.cflags, "-march=znver5 -O2 -pipe -fomit-frame-pointer");
        assert_eq!(config.flags.makeflags, "-j32");
        assert!(config.sandbox.enabled);
        assert!(!config.sandbox.allow_network);
        assert_eq!(config.repos.len(), 4);
        assert_eq!(config.repos[0].name, "core");
    }

    #[test]
    fn test_effective_cflags_default() {
        let config = DpackConfig::default();
        assert_eq!(
            config.effective_cflags(""),
            "-march=znver5 -O2 -pipe -fomit-frame-pointer"
        );
    }

    #[test]
    fn test_effective_cflags_override() {
        let config = DpackConfig::default();
        assert_eq!(
            config.effective_cflags("-march=znver5 -O3 -pipe"),
            "-march=znver5 -O3 -pipe"
        );
    }

    #[test]
    fn test_config_roundtrip() {
        let config = DpackConfig::default();
        let toml_str = toml::to_string_pretty(&config).unwrap();
        let reparsed: DpackConfig = toml::from_str(&toml_str).unwrap();
        assert_eq!(reparsed.flags.cflags, config.flags.cflags);
        assert_eq!(reparsed.repos.len(), config.repos.len());
    }
}
19
src/dpack/src/config/mod.rs
Normal file
@@ -0,0 +1,19 @@
//! Configuration and package definition parsing.
//!
//! Handles reading `.toml` package definition files and the global dpack
//! configuration. The package definition format is documented in CLAUDE.md §dpack.
//!
//! # Package Definition Format
//!
//! Package definitions are TOML files with these sections:
//! - `[package]` — name, version, description, URL, license
//! - `[source]` — download URL and SHA256 checksum
//! - `[dependencies]` — runtime, build, and optional dependencies
//! - `[build]` — configure, make, and install commands
//! - `[build.flags]` — per-package compiler flag overrides

pub mod package;
pub mod global;

pub use package::PackageDefinition;
pub use global::DpackConfig;
364
src/dpack/src/config/package.rs
Normal file
@@ -0,0 +1,364 @@
//! Package definition structs and TOML parsing.
//!
//! A `.toml` package definition describes how to download, build, and install
//! a single software package. This module defines the Rust structs that map
//! to the TOML schema, plus loading/validation logic.

use anyhow::{Context, Result};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::path::Path;

/// Top-level package definition — the entire contents of a `.toml` file.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PackageDefinition {
    pub package: PackageMetadata,
    pub source: SourceInfo,
    pub dependencies: Dependencies,
    pub build: BuildInstructions,
}

/// The `[package]` section — basic metadata.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PackageMetadata {
    /// Package name (must be unique within a repository)
    pub name: String,

    /// Package version (semver or upstream version string)
    pub version: String,

    /// Short description of the package
    pub description: String,

    /// Upstream project URL
    pub url: String,

    /// License identifier (SPDX preferred)
    pub license: String,

    /// Optional epoch for version comparison when upstream resets versions
    #[serde(default)]
    pub epoch: u32,

    /// Package revision (for repackaging without upstream version change)
    #[serde(default = "default_revision")]
    pub revision: u32,
}

fn default_revision() -> u32 {
    1
}

/// The `[source]` section — where to get the source code.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SourceInfo {
    /// Download URL. May contain `${version}` which is expanded at runtime.
    pub url: String,

    /// SHA256 checksum of the source tarball
    pub sha256: String,

    /// Optional: additional source files or patches to download
    #[serde(default)]
    pub patches: Vec<PatchInfo>,
}

/// A patch to apply to the source before building.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PatchInfo {
    /// Download URL for the patch
    pub url: String,

    /// SHA256 checksum of the patch file
    pub sha256: String,

    /// Strip level for `patch -p<N>` (default: 1)
    #[serde(default = "default_strip")]
    pub strip: u32,
}

fn default_strip() -> u32 {
    1
}

/// The `[dependencies]` section — what this package needs.
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct Dependencies {
    /// Runtime dependencies (must be installed for the package to function)
    #[serde(default)]
    pub run: Vec<String>,

    /// Build-time dependencies (only needed during compilation)
    #[serde(default)]
    pub build: Vec<String>,

    /// Optional features — maps feature name to its definition
    #[serde(default)]
    pub optional: HashMap<String, OptionalDep>,
}

/// An optional dependency / feature flag (inspired by Gentoo USE flags).
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OptionalDep {
    /// Human-readable description of what this feature does
    pub description: String,

    /// Whether this feature is enabled by default
    #[serde(default)]
    pub default: bool,

    /// Additional runtime dependencies required by this feature
    #[serde(default)]
    pub deps: Vec<String>,

    /// Additional build-time dependencies required by this feature
    #[serde(default)]
    pub build_deps: Vec<String>,
}

/// The `[build]` section — how to compile and install.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct BuildInstructions {
    /// Configure command (e.g., `./configure --prefix=/usr`)
|
||||
/// May be empty for packages that don't need configuration.
|
||||
#[serde(default)]
|
||||
pub configure: String,
|
||||
|
||||
/// Build command (e.g., `make`)
|
||||
#[serde(default = "default_make")]
|
||||
pub make: String,
|
||||
|
||||
/// Install command (e.g., `make DESTDIR=${PKG} install`)
|
||||
pub install: String,
|
||||
|
||||
/// Optional: commands to run before configure (e.g., autoreconf, patching)
|
||||
#[serde(default)]
|
||||
pub prepare: String,
|
||||
|
||||
/// Optional: commands to run after install (e.g., cleanup, stripping)
|
||||
#[serde(default)]
|
||||
pub post_install: String,
|
||||
|
||||
/// Optional: custom test command
|
||||
#[serde(default)]
|
||||
pub check: String,
|
||||
|
||||
/// Per-package compiler flag overrides
|
||||
#[serde(default)]
|
||||
pub flags: BuildFlags,
|
||||
|
||||
/// Build system type hint (autotools, cmake, meson, cargo, custom)
|
||||
#[serde(default)]
|
||||
pub system: BuildSystem,
|
||||
}
|
||||
|
||||
fn default_make() -> String {
|
||||
"make".to_string()
|
||||
}
|
||||
|
||||
/// Per-package compiler flag overrides.
|
||||
/// Empty strings mean "use global defaults from dpack.conf".
|
||||
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
|
||||
pub struct BuildFlags {
|
||||
#[serde(default)]
|
||||
pub cflags: String,
|
||||
|
||||
#[serde(default)]
|
||||
pub cxxflags: String,
|
||||
|
||||
#[serde(default)]
|
||||
pub ldflags: String,
|
||||
|
||||
#[serde(default)]
|
||||
pub makeflags: String,
|
||||
}
|
||||
|
||||
/// Hint for the build system used by this package.
|
||||
#[derive(Debug, Clone, Serialize, Deserialize, Default, PartialEq)]
|
||||
#[serde(rename_all = "lowercase")]
|
||||
pub enum BuildSystem {
|
||||
#[default]
|
||||
Autotools,
|
||||
Cmake,
|
||||
Meson,
|
||||
Cargo,
|
||||
Custom,
|
||||
}
|
||||
|
||||
impl PackageDefinition {
|
||||
/// Load a package definition from a `.toml` file.
|
||||
pub fn from_file(path: &Path) -> Result<Self> {
|
||||
let content = std::fs::read_to_string(path)
|
||||
.with_context(|| format!("Failed to read package file: {}", path.display()))?;
|
||||
Self::from_str(&content)
|
||||
}
|
||||
|
||||
/// Parse a package definition from a TOML string.
|
||||
pub fn from_str(content: &str) -> Result<Self> {
|
||||
let pkg: Self = toml::from_str(content)
|
||||
.context("Failed to parse package definition TOML")?;
|
||||
pkg.validate()?;
|
||||
Ok(pkg)
|
||||
}
|
||||
|
||||
/// Serialize this definition back to TOML.
|
||||
pub fn to_toml(&self) -> Result<String> {
|
||||
toml::to_string_pretty(self).context("Failed to serialize package definition")
|
||||
}
|
||||
|
||||
/// Validate the package definition for correctness.
|
||||
fn validate(&self) -> Result<()> {
|
||||
anyhow::ensure!(!self.package.name.is_empty(), "Package name cannot be empty");
|
||||
anyhow::ensure!(!self.package.version.is_empty(), "Package version cannot be empty");
|
||||
anyhow::ensure!(!self.source.url.is_empty(), "Source URL cannot be empty");
|
||||
anyhow::ensure!(
|
||||
self.source.sha256.len() == 64 && self.source.sha256.chars().all(|c| c.is_ascii_hexdigit()),
|
||||
"SHA256 checksum must be exactly 64 hex characters, got: '{}'",
|
||||
self.source.sha256
|
||||
);
|
||||
anyhow::ensure!(!self.build.install.is_empty(), "Install command cannot be empty");
|
||||
|
||||
// Validate optional dep names don't contain spaces or special chars
|
||||
for name in self.dependencies.optional.keys() {
|
||||
anyhow::ensure!(
|
||||
name.chars().all(|c| c.is_alphanumeric() || c == '_' || c == '-'),
|
||||
"Optional dependency name '{}' contains invalid characters", name
|
||||
);
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
/// Expand `${version}` in the source URL.
|
||||
pub fn expanded_source_url(&self) -> String {
|
||||
self.source.url.replace("${version}", &self.package.version)
|
||||
}
|
||||
|
||||
/// Get all runtime dependencies, including those from enabled optional features.
|
||||
pub fn effective_run_deps(&self, enabled_features: &[String]) -> Vec<String> {
|
||||
let mut deps = self.dependencies.run.clone();
|
||||
for feature in enabled_features {
|
||||
if let Some(opt) = self.dependencies.optional.get(feature) {
|
||||
deps.extend(opt.deps.clone());
|
||||
}
|
||||
}
|
||||
deps
|
||||
}
|
||||
|
||||
/// Get all build dependencies, including those from enabled optional features.
|
||||
pub fn effective_build_deps(&self, enabled_features: &[String]) -> Vec<String> {
|
||||
let mut deps = self.dependencies.build.clone();
|
||||
for feature in enabled_features {
|
||||
if let Some(opt) = self.dependencies.optional.get(feature) {
|
||||
deps.extend(opt.build_deps.clone());
|
||||
}
|
||||
}
|
||||
deps
|
||||
}
|
||||
|
||||
/// Get the list of default-enabled features.
|
||||
pub fn default_features(&self) -> Vec<String> {
|
||||
self.dependencies
|
||||
.optional
|
||||
.iter()
|
||||
.filter(|(_, v)| v.default)
|
||||
.map(|(k, _)| k.clone())
|
||||
.collect()
|
||||
}
|
||||
|
||||
/// Full identifier: "name-version"
|
||||
pub fn ident(&self) -> String {
|
||||
format!("{}-{}", self.package.name, self.package.version)
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use super::*;
|
||||
|
||||
const SAMPLE_TOML: &str = r#"
|
||||
[package]
|
||||
name = "zlib"
|
||||
version = "1.3.1"
|
||||
description = "Compression library implementing the deflate algorithm"
|
||||
url = "https://zlib.net/"
|
||||
license = "zlib"
|
||||
|
||||
[source]
|
||||
url = "https://zlib.net/zlib-${version}.tar.xz"
|
||||
sha256 = "38ef96b8dfe510d42707d9c781877914792541133e1870841463bfa73f883e32"
|
||||
|
||||
[dependencies]
|
||||
run = []
|
||||
build = ["gcc", "make"]
|
||||
|
||||
[dependencies.optional]
|
||||
static = { description = "Build static library", default = true }
|
||||
minizip = { description = "Build minizip utility", deps = [] }
|
||||
|
||||
[build]
|
||||
configure = "./configure --prefix=/usr"
|
||||
make = "make"
|
||||
install = "make DESTDIR=${PKG} install"
|
||||
"#;
|
||||
|
||||
#[test]
|
||||
fn test_parse_zlib() {
|
||||
let pkg = PackageDefinition::from_str(SAMPLE_TOML).unwrap();
|
||||
assert_eq!(pkg.package.name, "zlib");
|
||||
assert_eq!(pkg.package.version, "1.3.1");
|
||||
assert_eq!(pkg.package.license, "zlib");
|
||||
assert_eq!(pkg.dependencies.build, vec!["gcc", "make"]);
|
||||
assert!(pkg.dependencies.optional.contains_key("static"));
|
||||
assert!(pkg.dependencies.optional.contains_key("minizip"));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_expanded_source_url() {
|
||||
let pkg = PackageDefinition::from_str(SAMPLE_TOML).unwrap();
|
||||
assert_eq!(
|
||||
pkg.expanded_source_url(),
|
||||
"https://zlib.net/zlib-1.3.1.tar.xz"
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_default_features() {
|
||||
let pkg = PackageDefinition::from_str(SAMPLE_TOML).unwrap();
|
||||
let defaults = pkg.default_features();
|
||||
assert!(defaults.contains(&"static".to_string()));
|
||||
assert!(!defaults.contains(&"minizip".to_string()));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_effective_deps() {
|
||||
let pkg = PackageDefinition::from_str(SAMPLE_TOML).unwrap();
|
||||
let run_deps = pkg.effective_run_deps(&["minizip".to_string()]);
|
||||
// minizip has empty deps, so run_deps should still be empty
|
||||
assert!(run_deps.is_empty());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_invalid_sha256() {
|
||||
let bad_toml = SAMPLE_TOML.replace(
|
||||
"38ef96b8dfe510d42707d9c781877914792541133e1870841463bfa73f883e32",
|
||||
"bad",
|
||||
);
|
||||
assert!(PackageDefinition::from_str(&bad_toml).is_err());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_empty_name() {
|
||||
let bad_toml = SAMPLE_TOML.replace("name = \"zlib\"", "name = \"\"");
|
||||
assert!(PackageDefinition::from_str(&bad_toml).is_err());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_roundtrip_toml() {
|
||||
let pkg = PackageDefinition::from_str(SAMPLE_TOML).unwrap();
|
||||
let serialized = pkg.to_toml().unwrap();
|
||||
let reparsed = PackageDefinition::from_str(&serialized).unwrap();
|
||||
assert_eq!(pkg.package.name, reparsed.package.name);
|
||||
assert_eq!(pkg.package.version, reparsed.package.version);
|
||||
}
|
||||
}
|
||||
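Two conventions in the schema above are easy to get subtly wrong when writing package definitions by hand: `${version}` expansion in `source.url`, and the 64-hex-character rule that `validate()` enforces on `source.sha256`. A minimal standalone sketch of both rules (free functions for illustration, not part of the dpack API):

```rust
/// Expand `${version}` in a source URL template, as
/// `PackageDefinition::expanded_source_url` does.
fn expand_url(template: &str, version: &str) -> String {
    template.replace("${version}", version)
}

/// The checksum rule enforced by `validate()`: exactly 64 ASCII hex digits.
fn is_valid_sha256(s: &str) -> bool {
    s.len() == 64 && s.chars().all(|c| c.is_ascii_hexdigit())
}

fn main() {
    let url = expand_url("https://zlib.net/zlib-${version}.tar.xz", "1.3.1");
    assert_eq!(url, "https://zlib.net/zlib-1.3.1.tar.xz");

    // Uppercase digests are still valid hex; a short string is not.
    assert!(is_valid_sha256(&"A".repeat(64)));
    assert!(!is_valid_sha256("bad"));
}
```

Note that `is_ascii_hexdigit` accepts both cases, so converters may emit digests in either case as long as the length is right.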
432
src/dpack/src/converter/crux.rs
Normal file
@@ -0,0 +1,432 @@
//! CRUX Pkgfile converter.
//!
//! Parses CRUX `Pkgfile` format (bash-like syntax) and emits a dpack
//! `PackageDefinition`. Handles the common patterns:
//! - Variable assignments: `name=`, `version=`, `release=`, `source=()`
//! - Comment metadata: `# Description:`, `# URL:`, `# Depends on:`
//! - Build function: `build() { ... }`
//!
//! CRUX Pkgfile format reference:
//! - Variables are plain bash assignments
//! - `source=()` is a bash array of URLs (may span multiple lines)
//! - `build()` contains the full build logic
//! - Dependencies are in comments, not formal fields

use anyhow::Result;
use regex::Regex;
use std::collections::HashMap;

use crate::config::package::*;

/// Parse a CRUX Pkgfile string into a dpack PackageDefinition.
pub fn parse_pkgfile(content: &str) -> Result<PackageDefinition> {
    let mut name = String::new();
    let mut version = String::new();
    let mut release = 1u32;
    let mut description = String::new();
    let mut url = String::new();
    let mut _maintainer = String::new();
    let mut depends: Vec<String> = Vec::new();
    let mut optional_deps: Vec<String> = Vec::new();

    // --- Extract comment metadata ---
    for line in content.lines() {
        let trimmed = line.trim();
        if let Some(desc) = trimmed.strip_prefix("# Description:") {
            description = desc.trim().to_string();
        } else if let Some(u) = trimmed.strip_prefix("# URL:") {
            url = u.trim().to_string();
        } else if let Some(m) = trimmed.strip_prefix("# Maintainer:") {
            _maintainer = m.trim().to_string();
        } else if let Some(d) = trimmed.strip_prefix("# Depends on:") {
            depends = d
                .split([',', ' '])
                .map(|s| s.trim().to_string())
                .filter(|s| !s.is_empty())
                .collect();
        } else if let Some(o) = trimmed.strip_prefix("# Optional:") {
            optional_deps = o
                .split([',', ' '])
                .map(|s| s.trim().to_string())
                .filter(|s| !s.is_empty())
                .collect();
        }
    }

    // --- Extract variable assignments ---
    // name=value (with or without quotes)
    let var_re = Regex::new(r#"^(\w+)=["']?([^"'\n]*)["']?\s*$"#).unwrap();

    for line in content.lines() {
        let trimmed = line.trim();
        if let Some(caps) = var_re.captures(trimmed) {
            let key = caps.get(1).unwrap().as_str();
            let val = caps.get(2).unwrap().as_str().trim();
            match key {
                "name" => name = val.to_string(),
                "version" => version = val.to_string(),
                "release" => release = val.parse().unwrap_or(1),
                _ => {}
            }
        }
    }

    // --- Extract source array ---
    let source_urls = extract_source_array(content);

    // --- Extract build function ---
    let build_body = extract_build_function(content);

    // --- Parse build commands from the build function ---
    let (configure_cmd, make_cmd, install_cmd, prepare_cmd) =
        parse_build_commands(&build_body, &name, &version);

    // --- Expand source URL (replace $name, $version, ${name}, ${version}) ---
    let primary_source = source_urls.first().cloned().unwrap_or_default();

    let expanded_url = expand_crux_vars(&primary_source, &name, &version);

    // Convert to template URL (replace version back with ${version})
    let template_url = expanded_url.replace(&version, "${version}");

    // --- Build the PackageDefinition ---
    let mut optional_map = HashMap::new();
    for opt in &optional_deps {
        optional_map.insert(
            opt.clone(),
            OptionalDep {
                description: format!("Optional: {} support", opt),
                default: false,
                deps: vec![opt.clone()],
                build_deps: vec![],
            },
        );
    }

    Ok(PackageDefinition {
        package: PackageMetadata {
            name: name.clone(),
            version: version.clone(),
            description,
            url,
            license: String::new(), // CRUX doesn't track license in Pkgfile
            epoch: 0,
            revision: release,
        },
        source: SourceInfo {
            // Placeholder: 64 hex chars so the definition validates; must be
            // replaced with the real checksum before use. (The previous
            // `"FIXME_CHECKSUM".repeat(4)[..64]` sliced a 56-byte string and
            // panicked.)
            url: template_url,
            sha256: "f".repeat(64),
            patches: vec![],
        },
        dependencies: Dependencies {
            run: depends,
            build: vec![], // CRUX doesn't distinguish build vs runtime deps
            optional: optional_map,
        },
        build: BuildInstructions {
            configure: configure_cmd,
            make: make_cmd,
            install: install_cmd,
            prepare: prepare_cmd,
            post_install: String::new(),
            check: String::new(),
            flags: BuildFlags::default(),
            system: detect_build_system(&build_body),
        },
    })
}

/// Extract the source=() array from a Pkgfile.
/// Handles single-line and multi-line arrays.
fn extract_source_array(content: &str) -> Vec<String> {
    let mut sources = Vec::new();
    let mut in_source = false;
    let mut source_text = String::new();

    for line in content.lines() {
        let trimmed = line.trim();

        if trimmed.starts_with("source=") || trimmed.starts_with("source =") {
            in_source = true;
            // Get everything after source=(
            let after_eq = trimmed.splitn(2, '=').nth(1).unwrap_or("");
            source_text.push_str(after_eq);
            if after_eq.contains(')') {
                in_source = false;
            }
        } else if in_source {
            source_text.push(' ');
            source_text.push_str(trimmed);
            if trimmed.contains(')') {
                in_source = false;
            }
        }
    }

    // Strip parens and parse individual URLs
    let cleaned = source_text
        .trim_start_matches('(')
        .trim_end_matches(')')
        .trim();

    for url in cleaned.split_whitespace() {
        let u = url.trim().to_string();
        if !u.is_empty() {
            sources.push(u);
        }
    }

    sources
}

/// Extract the build() function body from a Pkgfile.
fn extract_build_function(content: &str) -> String {
    let mut in_build = false;
    let mut brace_depth = 0;
    let mut body = String::new();

    for line in content.lines() {
        let trimmed = line.trim();

        if !in_build && (trimmed.starts_with("build()") || trimmed.starts_with("build ()")) {
            in_build = true;
            // Count braces on this line
            for ch in trimmed.chars() {
                match ch {
                    '{' => brace_depth += 1,
                    '}' => brace_depth -= 1,
                    _ => {}
                }
            }
            continue;
        }

        if in_build {
            for ch in trimmed.chars() {
                match ch {
                    '{' => brace_depth += 1,
                    '}' => brace_depth -= 1,
                    _ => {}
                }
            }

            if brace_depth <= 0 {
                break;
            }

            body.push_str(trimmed);
            body.push('\n');
        }
    }

    body
}

/// Parse configure/make/install commands from the build function body.
fn parse_build_commands(
    body: &str,
    _name: &str,
    _version: &str,
) -> (String, String, String, String) {
    let mut configure = String::new();
    let mut make = String::new();
    let mut install = String::new();
    let mut prepare = String::new();

    let mut continuation = String::new();

    for line in body.lines() {
        let trimmed = line.trim();

        // Handle line continuations
        if trimmed.ends_with('\\') {
            continuation.push_str(&trimmed[..trimmed.len() - 1]);
            continuation.push(' ');
            continue;
        }

        let full_line = if !continuation.is_empty() {
            let result = format!("{}{}", continuation, trimmed);
            continuation.clear();
            result
        } else {
            trimmed.to_string()
        };

        let fl = full_line.trim();

        // Detect configure-like commands
        if fl.starts_with("./configure")
            || fl.starts_with("../configure")
            || fl.starts_with("cmake")
            || fl.starts_with("meson setup")
            || fl.starts_with("meson ")
        {
            // Replace $PKG with ${PKG} for dpack template
            configure = fl.replace("$PKG", "${PKG}");
        }
        // Detect install commands
        else if (fl.contains("DESTDIR=") && fl.contains("install"))
            || fl.starts_with("make install")
            || fl.starts_with("make DESTDIR")
            || fl.starts_with("meson install")
            || fl.starts_with("DESTDIR=")
            || (fl.starts_with("ninja -C") && fl.contains("install"))
        {
            install = fl.replace("$PKG", "${PKG}");
        }
        // Detect make/build commands
        else if fl == "make"
            || fl.starts_with("make -")
            || (fl.starts_with("make ") && !fl.contains("install"))
        {
            make = fl.to_string();
        } else if fl.starts_with("meson compile")
            || (fl.starts_with("ninja") && !fl.contains("install"))
        {
            make = fl.to_string();
        }
        // Detect prepare steps (patching, sed, autoreconf)
        else if fl.starts_with("sed ") || fl.starts_with("patch ") || fl.starts_with("autoreconf") {
            if !prepare.is_empty() {
                prepare.push_str(" && ");
            }
            prepare.push_str(fl);
        }
    }

    // Default make if not found
    if make.is_empty() {
        make = "make".to_string();
    }

    // Default install if not found
    if install.is_empty() {
        install = "make DESTDIR=${PKG} install".to_string();
    }

    (configure, make, install, prepare)
}

/// Expand CRUX variables in a string ($name, $version, ${name}, ${version}).
fn expand_crux_vars(s: &str, name: &str, version: &str) -> String {
    s.replace("$name", name)
        .replace("${name}", name)
        .replace("$version", version)
        .replace("${version}", version)
}

/// Detect the build system from the build function body.
fn detect_build_system(body: &str) -> BuildSystem {
    if body.contains("meson setup") || body.contains("meson compile") {
        BuildSystem::Meson
    } else if body.contains("cmake") || body.contains("CMakeLists") {
        BuildSystem::Cmake
    } else if body.contains("cargo build") || body.contains("cargo install") {
        BuildSystem::Cargo
    } else if body.contains("./configure") || body.contains("../configure") {
        BuildSystem::Autotools
    } else {
        BuildSystem::Custom
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    const SAMPLE_PKGFILE: &str = r#"# Description: Compression library
# URL: https://zlib.net/
# Maintainer: Danny, danny@example.com
# Depends on: gcc

name=zlib
version=1.3.1
release=1

source=(https://zlib.net/$name-$version.tar.xz)

build() {
    cd $name-$version

    ./configure --prefix=/usr

    make
    make DESTDIR=$PKG install
}
"#;

    #[test]
    fn test_parse_simple_pkgfile() {
        let pkg = parse_pkgfile(SAMPLE_PKGFILE).unwrap();
        assert_eq!(pkg.package.name, "zlib");
        assert_eq!(pkg.package.version, "1.3.1");
        assert_eq!(pkg.package.description, "Compression library");
        assert_eq!(pkg.package.url, "https://zlib.net/");
        assert_eq!(pkg.dependencies.run, vec!["gcc"]);
        assert_eq!(pkg.build.configure, "./configure --prefix=/usr");
        assert_eq!(pkg.build.install, "make DESTDIR=${PKG} install");
    }

    #[test]
    fn test_source_url_expansion() {
        let pkg = parse_pkgfile(SAMPLE_PKGFILE).unwrap();
        let expanded = pkg.expanded_source_url();
        assert_eq!(expanded, "https://zlib.net/zlib-1.3.1.tar.xz");
    }

    const COMPLEX_PKGFILE: &str = r#"# Description: A tool for transferring files with URL syntax
# URL: https://curl.haxx.se
# Maintainer: CRUX System Team
# Depends on: libnghttp2 openssl zstd
# Optional: brotli c-ares libpsl

name=curl
version=8.19.0
release=1

source=(https://curl.haxx.se/download/$name-$version.tar.xz)

build() {
    cd $name-$version

    sed -i 's|/usr/share/curl|/etc/ssl/certs|' lib/url.c

    ./configure \
        --prefix=/usr \
        --enable-ipv6 \
        --with-openssl \
        --with-nghttp2 \
        --disable-ldap

    make
    make DESTDIR=$PKG install
}
"#;

    #[test]
    fn test_parse_complex_pkgfile() {
        let pkg = parse_pkgfile(COMPLEX_PKGFILE).unwrap();
        assert_eq!(pkg.package.name, "curl");
        assert_eq!(pkg.package.version, "8.19.0");
        assert_eq!(
            pkg.dependencies.run,
            vec!["libnghttp2", "openssl", "zstd"]
        );
        assert!(pkg.dependencies.optional.contains_key("brotli"));
        assert!(pkg.dependencies.optional.contains_key("c-ares"));
        assert!(pkg.build.configure.contains("--with-openssl"));
        assert!(pkg.build.prepare.contains("sed"));
    }

    #[test]
    fn test_detect_meson_build_system() {
        let body = "meson setup build --prefix=/usr\nmeson compile -C build\nDESTDIR=$PKG meson install -C build";
        assert_eq!(detect_build_system(body), BuildSystem::Meson);
    }

    #[test]
    fn test_detect_cmake_build_system() {
        let body = "cmake -B build -DCMAKE_INSTALL_PREFIX=/usr\nmake -C build\nmake -C build DESTDIR=$PKG install";
        assert_eq!(detect_build_system(body), BuildSystem::Cmake);
    }
}
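The continuation handling inside `parse_build_commands` is the step most worth understanding: a multi-line `./configure \` invocation must be joined into one logical line before any of the prefix checks can match. A standalone sketch of that joining step (function name hypothetical, not the converter's API):

```rust
/// Join bash-style line continuations (`\` at end of line) into single
/// logical lines, the way the converter handles multi-line `./configure`
/// calls before classifying them.
fn join_continuations(body: &str) -> Vec<String> {
    let mut out = Vec::new();
    let mut pending = String::new();
    for line in body.lines() {
        let trimmed = line.trim();
        if let Some(head) = trimmed.strip_suffix('\\') {
            // Buffer the continued fragment; a space replaces the backslash.
            pending.push_str(head.trim_end());
            pending.push(' ');
        } else {
            // Terminator line: flush the buffered fragments as one command.
            pending.push_str(trimmed);
            out.push(std::mem::take(&mut pending));
        }
    }
    out
}

fn main() {
    let body = "./configure \\\n  --prefix=/usr \\\n  --disable-ldap\nmake";
    let lines = join_continuations(body);
    assert_eq!(lines[0], "./configure --prefix=/usr --disable-ldap");
    assert_eq!(lines[1], "make");
}
```

Only after this joining does a check like `fl.starts_with("./configure")` see the full flag list, which is why `test_parse_complex_pkgfile` can assert on `--with-openssl`.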
570
src/dpack/src/converter/gentoo.rs
Normal file
@@ -0,0 +1,570 @@
//! Gentoo ebuild converter.
//!
//! Parses Gentoo `.ebuild` files and emits dpack `PackageDefinition` TOML.
//! This is a best-effort converter — ebuilds can be extraordinarily complex
//! (eclasses, slot deps, multilib, conditional USE deps). We handle the
//! common 80% and flag the rest for manual review.
//!
//! What we extract:
//! - DESCRIPTION, HOMEPAGE, SRC_URI, LICENSE
//! - IUSE (USE flags → dpack optional deps)
//! - RDEPEND, DEPEND, BDEPEND (dependencies)
//! - src_configure/src_compile/src_install phase functions
//!
//! What requires manual review:
//! - Complex eclass-dependent logic
//! - Multilib builds (inherit multilib-minimal)
//! - Slot dependencies and subslots
//! - REQUIRED_USE constraints
//! - Conditional dependency atoms with nested logic

use anyhow::Result;
use regex::Regex;
use std::collections::HashMap;

use crate::config::package::*;

/// Warnings generated during conversion that require manual review.
#[derive(Debug, Default)]
pub struct ConversionWarnings {
    pub warnings: Vec<String>,
}

impl ConversionWarnings {
    fn warn(&mut self, msg: impl Into<String>) {
        self.warnings.push(msg.into());
    }
}

/// Parse a Gentoo ebuild string into a dpack PackageDefinition.
///
/// The filename is needed to extract name and version (Gentoo convention:
/// `<name>-<version>.ebuild`).
pub fn parse_ebuild(content: &str, filename: &str) -> Result<PackageDefinition> {
    let mut warnings = ConversionWarnings::default();

    // Extract name and version from the filename
    // Format: <name>-<version>.ebuild (e.g., curl-8.19.0.ebuild)
    let (name, version) = parse_ebuild_filename(filename)?;

    // Extract simple variables
    let description = extract_var(content, "DESCRIPTION").unwrap_or_default();
    let homepage = extract_var(content, "HOMEPAGE").unwrap_or_default();
    let license = extract_var(content, "LICENSE").unwrap_or_default();
    let src_uri = extract_var(content, "SRC_URI").unwrap_or_default();
    let iuse = extract_var(content, "IUSE").unwrap_or_default();

    // Check for eclasses that need manual review.
    // `inherit` is a command (`inherit autotools cmake`), not a variable
    // assignment, so extract_var()'s `inherit=...` pattern would never match.
    let inherit_re = Regex::new(r"(?m)^\s*inherit\s+(.+)$").unwrap();
    let inherits = inherit_re
        .captures(content)
        .and_then(|caps| caps.get(1))
        .map(|m| m.as_str().to_string())
        .unwrap_or_default();
    if inherits.contains("multilib-minimal") || inherits.contains("meson-multilib") {
        warnings.warn("Package uses multilib — may need separate 32-bit build definitions");
    }
    if inherits.contains("cargo") {
        warnings.warn("Package uses Rust cargo eclass — Rust crate deps may need manual handling");
    }
    if inherits.contains("git-r3") {
        warnings.warn("Package fetches from git — needs a release tarball URL instead");
    }

    // Parse USE flags into optional dependencies
    let optional_deps = parse_use_flags(&iuse);

    // Parse dependencies
    let rdepend = extract_multiline_var(content, "RDEPEND");
    let depend = extract_multiline_var(content, "DEPEND");
    let bdepend = extract_multiline_var(content, "BDEPEND");

    let run_deps = parse_dep_atoms(&rdepend, &mut warnings);
    let build_deps = parse_dep_atoms(&bdepend, &mut warnings);

    // If DEPEND differs from RDEPEND, merge its unique entries into build_deps
    let depend_parsed = parse_dep_atoms(&depend, &mut warnings);
    let extra_build_deps: Vec<String> = depend_parsed
        .into_iter()
        .filter(|d| !run_deps.contains(d) && !build_deps.contains(d))
        .collect();

    let mut all_build_deps = build_deps;
    all_build_deps.extend(extra_build_deps);

    // Parse build phase functions
    let configure_cmd = extract_phase_function(content, "src_configure");
    let compile_cmd = extract_phase_function(content, "src_compile");
    let install_cmd = extract_phase_function(content, "src_install");
    let prepare_cmd = extract_phase_function(content, "src_prepare");
    let test_cmd = extract_phase_function(content, "src_test");

    // Determine build system from eclasses and configure commands
    let build_system = if inherits.contains("meson") {
        BuildSystem::Meson
    } else if inherits.contains("cmake") {
        BuildSystem::Cmake
    } else if inherits.contains("cargo") {
        BuildSystem::Cargo
    } else if inherits.contains("autotools") || configure_cmd.contains("econf") {
        BuildSystem::Autotools
    } else {
        BuildSystem::Custom
    };

    // Convert econf and friends to plain shell commands
    let configure_converted = convert_phase_to_commands(&configure_cmd, &build_system);
    let make_converted = convert_phase_to_commands(&compile_cmd, &build_system);
    let install_converted = convert_phase_to_commands(&install_cmd, &build_system);
    let prepare_converted = convert_phase_to_commands(&prepare_cmd, &build_system);
    let check_converted = convert_phase_to_commands(&test_cmd, &build_system);

    // Parse SRC_URI into a clean URL
    let source_url = parse_src_uri(&src_uri, &name, &version);

    // Check REQUIRED_USE for constraints
    let required_use = extract_multiline_var(content, "REQUIRED_USE");
    if !required_use.is_empty() {
        warnings.warn(format!(
            "REQUIRED_USE constraints exist — validate feature combinations: {}",
            required_use.chars().take(200).collect::<String>()
        ));
    }

    // Build the PackageDefinition
    let mut pkg = PackageDefinition {
        package: PackageMetadata {
            name: name.clone(),
            version: version.clone(),
            description,
            url: homepage,
            license,
            epoch: 0,
            revision: 1,
        },
        source: SourceInfo {
            url: source_url,
            // Placeholder: 64 hex chars so the definition validates; must be
            // replaced with the real checksum before use. (The previous
            // `"FIXME_CHECKSUM".repeat(4)[..64]` sliced a 56-byte string and
            // panicked.)
            sha256: "f".repeat(64),
            patches: vec![],
        },
        dependencies: Dependencies {
            run: run_deps,
            build: all_build_deps,
            optional: optional_deps,
        },
        build: BuildInstructions {
            configure: configure_converted,
            make: make_converted,
            install: if install_converted.is_empty() {
                "make DESTDIR=${PKG} install".to_string()
            } else {
                install_converted
            },
            prepare: prepare_converted,
            post_install: String::new(),
            check: check_converted,
            flags: BuildFlags::default(),
            system: build_system,
        },
    };

    // Surface conversion warnings in the generated TOML by appending
    // review notes to the description field.
    if !warnings.warnings.is_empty() {
        let warning_text = warnings
            .warnings
            .iter()
            .map(|w| format!(" # REVIEW: {}", w))
            .collect::<Vec<_>>()
            .join("\n");
        pkg.package.description = format!(
            "{}\n# --- Conversion warnings (manual review needed) ---\n{}",
            pkg.package.description, warning_text
        );
    }

    Ok(pkg)
}

/// Parse an ebuild filename into (name, version).
/// Convention: `<name>-<version>.ebuild`
fn parse_ebuild_filename(filename: &str) -> Result<(String, String)> {
    let stem = filename.strip_suffix(".ebuild").unwrap_or(filename);

    // The version part starts at the first `-` followed by a digit
    let re = Regex::new(r"^(.+?)-(\d.*)$").unwrap();

    if let Some(caps) = re.captures(stem) {
        let name = caps.get(1).unwrap().as_str().to_string();
        let version = caps.get(2).unwrap().as_str().to_string();
        Ok((name, version))
    } else {
        anyhow::bail!("Cannot parse name/version from ebuild filename: {}", filename);
    }
}

/// Extract a single-line variable assignment from ebuild content.
fn extract_var(content: &str, var_name: &str) -> Option<String> {
    let re = Regex::new(&format!(
        r#"(?m)^{}=["']([^"']*?)["']\s*$"#,
        regex::escape(var_name)
    ))
    .ok()?;

    re.captures(content)
        .and_then(|caps| caps.get(1))
        .map(|m| m.as_str().to_string())
}

/// Extract a multi-line variable (handles quoted values that span lines).
fn extract_multiline_var(content: &str, var_name: &str) -> String {
    let mut result = String::new();
    let mut in_var = false;
    let mut quote_char = '"';

    for line in content.lines() {
        let trimmed = line.trim();

        if !in_var {
            // Match: VARNAME="value or VARNAME='value
            let pattern = format!("{}=", var_name);
            if trimmed.starts_with(&pattern) {
                let after_eq = &trimmed[pattern.len()..];
                if after_eq.starts_with('"') {
                    quote_char = '"';
                    let inner = &after_eq[1..];
                    if inner.ends_with('"') {
                        // Single-line value
                        return inner[..inner.len() - 1].to_string();
                    }
                    result.push_str(inner);
                    result.push('\n');
                    in_var = true;
                } else if after_eq.starts_with('\'') {
                    quote_char = '\'';
                    let inner = &after_eq[1..];
                    if inner.ends_with('\'') {
                        return inner[..inner.len() - 1].to_string();
                    }
                    result.push_str(inner);
                    result.push('\n');
                    in_var = true;
                }
            }
        }
} else {
|
||||
let close = format!("{}", quote_char);
|
||||
if trimmed.ends_with(quote_char) || trimmed == &close {
|
||||
let end = if trimmed.ends_with(quote_char) {
|
||||
&trimmed[..trimmed.len() - 1]
|
||||
} else {
|
||||
""
|
||||
};
|
||||
result.push_str(end);
|
||||
in_var = false;
|
||||
} else {
|
||||
result.push_str(trimmed);
|
||||
result.push('\n');
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
result.trim().to_string()
|
||||
}
|
||||
|
||||
/// Parse IUSE string into optional dependency map.
|
||||
fn parse_use_flags(iuse: &str) -> HashMap<String, OptionalDep> {
|
||||
let mut map = HashMap::new();
|
||||
|
||||
for flag in iuse.split_whitespace() {
|
||||
let (name, default) = if let Some(stripped) = flag.strip_prefix('+') {
|
||||
(stripped.to_string(), true)
|
||||
} else if let Some(stripped) = flag.strip_prefix('-') {
|
||||
(stripped.to_string(), false)
|
||||
} else {
|
||||
(flag.to_string(), false)
|
||||
};
|
||||
|
||||
// Skip internal/system flags
|
||||
if name.starts_with("cpu_flags_")
|
||||
|| name.starts_with("video_cards_")
|
||||
|| name.starts_with("python_")
|
||||
|| name == "test"
|
||||
|| name == "doc"
|
||||
{
|
||||
continue;
|
||||
}
|
||||
|
||||
map.insert(
|
||||
name.clone(),
|
||||
OptionalDep {
|
||||
description: format!("Enable {} support", name),
|
||||
default,
|
||||
deps: vec![], // Would need dep analysis to fill
|
||||
build_deps: vec![],
|
||||
},
|
||||
);
|
||||
}
|
||||
|
||||
map
|
||||
}
|
||||
|
||||
/// Parse Gentoo dependency atoms into a flat list of package names.
|
||||
///
|
||||
/// Handles:
|
||||
/// - Simple atoms: `dev-libs/openssl`
|
||||
/// - Versioned: `>=dev-libs/openssl-1.0.2`
|
||||
/// - USE-conditional: `ssl? ( dev-libs/openssl )`
|
||||
/// - Slot: `dev-libs/openssl:0=`
|
||||
///
|
||||
/// Strips category prefixes and version constraints for dpack format.
|
||||
fn parse_dep_atoms(deps: &str, warnings: &mut ConversionWarnings) -> Vec<String> {
|
||||
let mut result = Vec::new();
|
||||
let atom_re = Regex::new(
|
||||
r"(?:>=|<=|~|=)?([a-zA-Z0-9_-]+/[a-zA-Z0-9_.+-]+?)(?:-\d[^\s\[\]:]*)?(?:\[.*?\])?(?::[\w/=*]*)?(?:\s|$)"
|
||||
).unwrap();
|
||||
|
||||
for caps in atom_re.captures_iter(deps) {
|
||||
if let Some(m) = caps.get(1) {
|
||||
let full_atom = m.as_str();
|
||||
// Strip category prefix (e.g., "dev-libs/" -> "")
|
||||
let pkg_name = full_atom
|
||||
.rsplit('/')
|
||||
.next()
|
||||
.unwrap_or(full_atom)
|
||||
.to_string();
|
||||
|
||||
// Skip virtual packages and test-only deps
|
||||
if full_atom.starts_with("virtual/") {
|
||||
continue;
|
||||
}
|
||||
|
||||
if !result.contains(&pkg_name) {
|
||||
result.push(pkg_name);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Detect complex constructs we can't fully parse
|
||||
if deps.contains("^^") || deps.contains("||") {
|
||||
warnings.warn("Complex dependency logic (^^ or ||) detected — manual review needed");
|
||||
}
|
||||
if deps.contains("${MULTILIB_USEDEP}") {
|
||||
warnings.warn("Multilib dependencies detected — 32-bit builds may be needed");
|
||||
}
|
||||
|
||||
result
|
||||
}
|
||||
|
||||
/// Extract a phase function body (e.g., src_configure, src_install).
|
||||
fn extract_phase_function(content: &str, func_name: &str) -> String {
|
||||
let mut in_func = false;
|
||||
let mut brace_depth = 0;
|
||||
let mut body = String::new();
|
||||
|
||||
for line in content.lines() {
|
||||
let trimmed = line.trim();
|
||||
|
||||
if !in_func {
|
||||
// Match: func_name() { or func_name () {
|
||||
if trimmed.starts_with(func_name) && trimmed.contains('{') {
|
||||
in_func = true;
|
||||
for ch in trimmed.chars() {
|
||||
match ch {
|
||||
'{' => brace_depth += 1,
|
||||
'}' => brace_depth -= 1,
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
if brace_depth <= 0 {
|
||||
break;
|
||||
}
|
||||
continue;
|
||||
}
|
||||
}
|
||||
|
||||
if in_func {
|
||||
for ch in trimmed.chars() {
|
||||
match ch {
|
||||
'{' => brace_depth += 1,
|
||||
'}' => {
|
||||
brace_depth -= 1;
|
||||
if brace_depth <= 0 {
|
||||
return body.trim().to_string();
|
||||
}
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
|
||||
body.push_str(trimmed);
|
||||
body.push('\n');
|
||||
}
|
||||
}
|
||||
|
||||
body.trim().to_string()
|
||||
}
|
||||
|
||||
/// Convert Gentoo eclass helper calls to plain shell commands.
|
||||
fn convert_phase_to_commands(body: &str, _build_system: &BuildSystem) -> String {
|
||||
if body.is_empty() {
|
||||
return String::new();
|
||||
}
|
||||
|
||||
let mut result = body.to_string();
|
||||
|
||||
// Replace common Gentoo helpers
|
||||
result = result.replace("econf ", "./configure ");
|
||||
result = result.replace("econf\n", "./configure\n");
|
||||
result = result.replace("emake ", "make ");
|
||||
result = result.replace("emake\n", "make\n");
|
||||
result = result.replace("${ED}", "${PKG}");
|
||||
result = result.replace("${D}", "${PKG}");
|
||||
result = result.replace("${FILESDIR}", "./files");
|
||||
result = result.replace("${WORKDIR}", ".");
|
||||
result = result.replace("${S}", ".");
|
||||
result = result.replace("${P}", "${name}-${version}");
|
||||
result = result.replace("${PV}", "${version}");
|
||||
result = result.replace("${PN}", "${name}");
|
||||
|
||||
// Replace einstall
|
||||
result = result.replace("einstall", "make DESTDIR=${PKG} install");
|
||||
|
||||
// Remove Gentoo-specific calls that have no equivalent
|
||||
let remove_patterns = [
|
||||
"default",
|
||||
"eapply_user",
|
||||
"multilib_src_configure",
|
||||
"multilib_src_compile",
|
||||
"multilib_src_install",
|
||||
];
|
||||
for pattern in &remove_patterns {
|
||||
result = result
|
||||
.lines()
|
||||
.filter(|l| !l.trim().starts_with(pattern))
|
||||
.collect::<Vec<_>>()
|
||||
.join("\n");
|
||||
}
|
||||
|
||||
result.trim().to_string()
|
||||
}
|
||||
|
||||
/// Parse SRC_URI into a clean download URL.
|
||||
fn parse_src_uri(src_uri: &str, name: &str, version: &str) -> String {
|
||||
// SRC_URI can have multiple entries, redirects, and mirror:// prefixes
|
||||
// Take the first real URL
|
||||
for token in src_uri.split_whitespace() {
|
||||
if token.starts_with("http://") || token.starts_with("https://") || token.starts_with("mirror://") {
|
||||
let url = token
|
||||
.replace("mirror://sourceforge", "https://downloads.sourceforge.net")
|
||||
.replace("mirror://gnu", "https://ftp.gnu.org/gnu")
|
||||
.replace("mirror://gentoo", "https://distfiles.gentoo.org/distfiles");
|
||||
|
||||
// Replace ${P}, ${PV}, ${PN} with template vars
|
||||
let templated = url
|
||||
.replace(&format!("{}-{}", name, version), "${name}-${version}")
|
||||
.replace(version, "${version}")
|
||||
.replace(name, "${name}");
|
||||
|
||||
return templated;
|
||||
}
|
||||
}
|
||||
|
||||
// If no URL found, return a placeholder
|
||||
format!("https://FIXME/{}-{}.tar.xz", name, version)
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn test_parse_ebuild_filename() {
|
||||
let (name, version) = parse_ebuild_filename("curl-8.19.0.ebuild").unwrap();
|
||||
assert_eq!(name, "curl");
|
||||
assert_eq!(version, "8.19.0");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_ebuild_filename_complex() {
|
||||
let (name, version) = parse_ebuild_filename("qt6-base-6.8.0.ebuild").unwrap();
|
||||
assert_eq!(name, "qt6-base");
|
||||
assert_eq!(version, "6.8.0");
|
||||
}
|
||||
|
||||
const SIMPLE_EBUILD: &str = r#"
|
||||
EAPI=8
|
||||
|
||||
DESCRIPTION="Standard (de)compression library"
|
||||
HOMEPAGE="https://zlib.net/"
|
||||
SRC_URI="https://zlib.net/zlib-${PV}.tar.xz"
|
||||
|
||||
LICENSE="ZLIB"
|
||||
SLOT="0/1"
|
||||
KEYWORDS="~alpha amd64 arm arm64"
|
||||
IUSE="minizip static-libs"
|
||||
|
||||
RDEPEND=""
|
||||
DEPEND=""
|
||||
|
||||
src_configure() {
|
||||
econf
|
||||
}
|
||||
|
||||
src_install() {
|
||||
emake DESTDIR="${D}" install
|
||||
}
|
||||
"#;
|
||||
|
||||
#[test]
|
||||
fn test_parse_simple_ebuild() {
|
||||
let pkg = parse_ebuild(SIMPLE_EBUILD, "zlib-1.3.2.ebuild").unwrap();
|
||||
assert_eq!(pkg.package.name, "zlib");
|
||||
assert_eq!(pkg.package.version, "1.3.2");
|
||||
assert_eq!(pkg.package.description, "Standard (de)compression library");
|
||||
assert_eq!(pkg.package.license, "ZLIB");
|
||||
assert!(pkg.dependencies.optional.contains_key("minizip"));
|
||||
assert!(pkg.dependencies.optional.contains_key("static-libs"));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_extract_multiline_var() {
|
||||
let content = r#"
|
||||
RDEPEND="
|
||||
dev-libs/openssl:=
|
||||
>=net-libs/nghttp2-1.0
|
||||
sys-libs/zlib
|
||||
"
|
||||
"#;
|
||||
let result = extract_multiline_var(content, "RDEPEND");
|
||||
assert!(result.contains("openssl"));
|
||||
assert!(result.contains("nghttp2"));
|
||||
assert!(result.contains("zlib"));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_dep_atoms() {
|
||||
let deps = ">=dev-libs/openssl-1.0.2:=[static-libs?] net-libs/nghttp2:= sys-libs/zlib";
|
||||
let mut warnings = ConversionWarnings::default();
|
||||
let result = parse_dep_atoms(deps, &mut warnings);
|
||||
assert!(result.contains(&"openssl".to_string()));
|
||||
assert!(result.contains(&"nghttp2".to_string()));
|
||||
assert!(result.contains(&"zlib".to_string()));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_use_flags() {
|
||||
let iuse = "+http2 +quic brotli debug test doc";
|
||||
let flags = parse_use_flags(iuse);
|
||||
assert!(flags.get("http2").unwrap().default);
|
||||
assert!(flags.get("quic").unwrap().default);
|
||||
assert!(!flags.get("brotli").unwrap().default);
|
||||
// test and doc should be filtered out
|
||||
assert!(!flags.contains_key("test"));
|
||||
assert!(!flags.contains_key("doc"));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_convert_phase_to_commands() {
|
||||
let body = "econf --prefix=/usr\nemake\nemake DESTDIR=\"${D}\" install";
|
||||
let result = convert_phase_to_commands(body, &BuildSystem::Autotools);
|
||||
assert!(result.contains("./configure --prefix=/usr"));
|
||||
assert!(result.contains("make DESTDIR=\"${PKG}\" install"));
|
||||
}
|
||||
}
|
||||
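The atom-stripping rule above (drop the version operator, the category prefix, the trailing `-<digit…>` version, and any slot or USE suffix) can be illustrated with a std-only sketch. `strip_atom` is a hypothetical helper written for this note, not the converter's actual code path, which uses the `regex` crate:

```rust
// Hypothetical std-only sketch of the atom-stripping convention used by
// parse_dep_atoms: ">=dev-libs/openssl-1.0.2:=" should reduce to "openssl".
fn strip_atom(atom: &str) -> String {
    // Drop a leading version operator such as ">=", "<=", "~", "="
    let s = atom.trim_start_matches(['>', '<', '=', '~']);
    // Drop the category prefix ("dev-libs/openssl" -> "openssl")
    let name_part = s.rsplit('/').next().unwrap_or(s);
    // Cut at the first "-<digit>" boundary, which starts the version
    let bytes = name_part.as_bytes();
    let mut end = name_part.len();
    for i in 0..name_part.len().saturating_sub(1) {
        if bytes[i] == b'-' && bytes[i + 1].is_ascii_digit() {
            end = i;
            break;
        }
    }
    // Drop any slot/USE suffix such as ":=" or "[static-libs]"
    name_part[..end]
        .split([':', '['])
        .next()
        .unwrap_or("")
        .to_string()
}

fn main() {
    assert_eq!(strip_atom(">=dev-libs/openssl-1.0.2:="), "openssl");
    assert_eq!(strip_atom("sys-libs/zlib"), "zlib");
    println!("ok");
}
```

Unlike the regex in `parse_dep_atoms`, this sketch does not handle USE-conditional groups like `ssl? ( … )`; it only shows the per-atom reduction.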
34
src/dpack/src/converter/mod.rs
Normal file
@@ -0,0 +1,34 @@
//! Foreign package format converters.
//!
//! Converts CRUX Pkgfiles and Gentoo ebuilds to `.toml` dpack format.
//! Both converters are best-effort: they handle common patterns and flag
//! anything that requires manual review.

pub mod crux;
pub mod gentoo;

use anyhow::{bail, Result};
use std::path::Path;

/// Detect the format of a foreign package file and convert it.
pub fn convert_file(path: &Path) -> Result<String> {
    let filename = path
        .file_name()
        .map(|f| f.to_string_lossy().to_string())
        .unwrap_or_default();

    if filename == "Pkgfile" {
        let content = std::fs::read_to_string(path)?;
        let pkg = crux::parse_pkgfile(&content)?;
        pkg.to_toml()
    } else if filename.ends_with(".ebuild") {
        let content = std::fs::read_to_string(path)?;
        let pkg = gentoo::parse_ebuild(&content, &filename)?;
        pkg.to_toml()
    } else {
        bail!(
            "Unknown package format: '{}'. Expected 'Pkgfile' or '*.ebuild'",
            filename
        );
    }
}
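The dispatch in `convert_file` is purely filename-based: an exact `Pkgfile` name selects the CRUX path, a `.ebuild` suffix selects the Gentoo path, anything else is an error. A minimal std-only sketch of just that decision, using a hypothetical `detect_format` helper (the real function reads and converts the file in place):

```rust
// Hypothetical helper mirroring convert_file's filename dispatch.
fn detect_format(filename: &str) -> Option<&'static str> {
    if filename == "Pkgfile" {
        Some("crux")
    } else if filename.ends_with(".ebuild") {
        Some("gentoo")
    } else {
        None
    }
}

fn main() {
    assert_eq!(detect_format("Pkgfile"), Some("crux"));
    assert_eq!(detect_format("curl-8.19.0.ebuild"), Some("gentoo"));
    assert_eq!(detect_format("PKGBUILD"), None); // e.g. Arch files are rejected
    println!("dispatch ok");
}
```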
329
src/dpack/src/db/mod.rs
Normal file
@@ -0,0 +1,329 @@
//! Installed package database.
//!
//! File-based database stored at `/var/lib/dpack/db/`. One TOML file per
//! installed package, tracking: name, version, installed files, dependencies,
//! features enabled, install timestamp, and originating repository.
//!
//! The database is the source of truth for what's installed on the system.
//! It's used by the resolver (to skip already-installed packages), the
//! remove command (to know which files to delete), and the upgrade command
//! (to compare installed vs available versions).

use anyhow::{Context, Result};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::path::{Path, PathBuf};

/// A record of a single installed package.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct InstalledPackage {
    /// Package name
    pub name: String,

    /// Installed version
    pub version: String,

    /// Package description (copied from definition at install time)
    pub description: String,

    /// Runtime dependencies at time of installation
    pub run_deps: Vec<String>,

    /// Build dependencies used during installation
    pub build_deps: Vec<String>,

    /// Features that were enabled during build
    pub features: Vec<String>,

    /// All files installed by this package (absolute paths)
    pub files: Vec<PathBuf>,

    /// Install timestamp (seconds since epoch)
    pub installed_at: u64,

    /// Which repository this package came from
    pub repo: String,

    /// Size in bytes of all installed files
    pub size: u64,
}

/// The package database — manages the collection of installed package records.
pub struct PackageDb {
    /// Path to the database directory
    db_dir: PathBuf,

    /// In-memory cache of installed packages
    cache: HashMap<String, InstalledPackage>,
}

impl PackageDb {
    /// Open or create the package database at the given directory.
    pub fn open(db_dir: &Path) -> Result<Self> {
        std::fs::create_dir_all(db_dir)
            .with_context(|| format!("Failed to create db dir: {}", db_dir.display()))?;

        let mut db = Self {
            db_dir: db_dir.to_path_buf(),
            cache: HashMap::new(),
        };

        db.load_all()?;
        Ok(db)
    }

    /// Load all package records from disk into the cache.
    fn load_all(&mut self) -> Result<()> {
        self.cache.clear();

        if !self.db_dir.exists() {
            return Ok(());
        }

        for entry in std::fs::read_dir(&self.db_dir)
            .with_context(|| format!("Failed to read db dir: {}", self.db_dir.display()))?
        {
            let entry = entry?;
            let path = entry.path();

            if path.extension().map_or(false, |ext| ext == "toml") {
                match self.load_one(&path) {
                    Ok(pkg) => {
                        self.cache.insert(pkg.name.clone(), pkg);
                    }
                    Err(e) => {
                        log::warn!("Skipping corrupt db entry {}: {}", path.display(), e);
                    }
                }
            }
        }

        Ok(())
    }

    /// Load a single package record from a TOML file.
    fn load_one(&self, path: &Path) -> Result<InstalledPackage> {
        let content = std::fs::read_to_string(path)
            .with_context(|| format!("Failed to read: {}", path.display()))?;
        toml::from_str(&content).context("Failed to parse package db entry")
    }

    /// Register a newly installed package in the database.
    pub fn register(&mut self, pkg: InstalledPackage) -> Result<()> {
        let path = self.db_dir.join(format!("{}.toml", pkg.name));
        let content = toml::to_string_pretty(&pkg)
            .context("Failed to serialize package record")?;
        std::fs::write(&path, content)
            .with_context(|| format!("Failed to write db entry: {}", path.display()))?;

        self.cache.insert(pkg.name.clone(), pkg);
        Ok(())
    }

    /// Remove a package record from the database.
    pub fn unregister(&mut self, name: &str) -> Result<Option<InstalledPackage>> {
        let path = self.db_dir.join(format!("{}.toml", name));
        if path.exists() {
            std::fs::remove_file(&path)
                .with_context(|| format!("Failed to remove db entry: {}", path.display()))?;
        }
        Ok(self.cache.remove(name))
    }

    /// Check if a package is installed.
    pub fn is_installed(&self, name: &str) -> bool {
        self.cache.contains_key(name)
    }

    /// Get the installed version of a package.
    pub fn installed_version(&self, name: &str) -> Option<&str> {
        self.cache.get(name).map(|p| p.version.as_str())
    }

    /// Get the full record of an installed package.
    pub fn get(&self, name: &str) -> Option<&InstalledPackage> {
        self.cache.get(name)
    }

    /// List all installed packages, sorted by name.
    pub fn list_all(&self) -> Vec<&InstalledPackage> {
        let mut pkgs: Vec<_> = self.cache.values().collect();
        // sort_by rather than sort_by_key: the key closure cannot return
        // a reference borrowed from its argument.
        pkgs.sort_by(|a, b| a.name.cmp(&b.name));
        pkgs
    }

    /// Get a map of all installed packages: name -> version.
    /// Used by the dependency resolver.
    pub fn installed_versions(&self) -> HashMap<String, String> {
        self.cache
            .iter()
            .map(|(k, v)| (k.clone(), v.version.clone()))
            .collect()
    }

    /// Find all packages that own a specific file.
    pub fn who_owns(&self, file_path: &Path) -> Vec<String> {
        self.cache
            .values()
            .filter(|pkg| pkg.files.iter().any(|f| f == file_path))
            .map(|pkg| pkg.name.clone())
            .collect()
    }

    /// Find file conflicts (files owned by more than one package).
    pub fn find_conflicts(&self) -> HashMap<PathBuf, Vec<String>> {
        let mut file_owners: HashMap<PathBuf, Vec<String>> = HashMap::new();

        for pkg in self.cache.values() {
            for file in &pkg.files {
                file_owners
                    .entry(file.clone())
                    .or_default()
                    .push(pkg.name.clone());
            }
        }

        // Return only files with multiple owners
        file_owners
            .into_iter()
            .filter(|(_, owners)| owners.len() > 1)
            .collect()
    }

    /// Get the total number of installed packages.
    pub fn count(&self) -> usize {
        self.cache.len()
    }

    /// Get total disk usage of all installed packages.
    pub fn total_size(&self) -> u64 {
        self.cache.values().map(|p| p.size).sum()
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::time::{SystemTime, UNIX_EPOCH};

    fn make_installed(name: &str, version: &str, files: Vec<&str>) -> InstalledPackage {
        InstalledPackage {
            name: name.to_string(),
            version: version.to_string(),
            description: format!("Test package {}", name),
            run_deps: vec![],
            build_deps: vec![],
            features: vec![],
            files: files.into_iter().map(PathBuf::from).collect(),
            installed_at: SystemTime::now()
                .duration_since(UNIX_EPOCH)
                .unwrap()
                .as_secs(),
            repo: "core".to_string(),
            size: 1024,
        }
    }

    #[test]
    fn test_register_and_get() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-reg");
        let _ = std::fs::remove_dir_all(&tmpdir);

        let mut db = PackageDb::open(&tmpdir).unwrap();
        assert_eq!(db.count(), 0);

        let pkg = make_installed("zlib", "1.3.1", vec!["/usr/lib/libz.so"]);
        db.register(pkg).unwrap();

        assert!(db.is_installed("zlib"));
        assert_eq!(db.installed_version("zlib"), Some("1.3.1"));
        assert_eq!(db.count(), 1);

        let _ = std::fs::remove_dir_all(&tmpdir);
    }

    #[test]
    fn test_unregister() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-unreg");
        let _ = std::fs::remove_dir_all(&tmpdir);

        let mut db = PackageDb::open(&tmpdir).unwrap();
        db.register(make_installed("zlib", "1.3.1", vec![])).unwrap();

        assert!(db.is_installed("zlib"));
        let removed = db.unregister("zlib").unwrap();
        assert!(removed.is_some());
        assert!(!db.is_installed("zlib"));

        let _ = std::fs::remove_dir_all(&tmpdir);
    }

    #[test]
    fn test_persistence() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-persist");
        let _ = std::fs::remove_dir_all(&tmpdir);

        {
            let mut db = PackageDb::open(&tmpdir).unwrap();
            db.register(make_installed("zlib", "1.3.1", vec!["/usr/lib/libz.so"])).unwrap();
            db.register(make_installed("gcc", "15.2.0", vec!["/usr/bin/gcc"])).unwrap();
        }

        // Re-open and verify data persisted
        let db = PackageDb::open(&tmpdir).unwrap();
        assert_eq!(db.count(), 2);
        assert!(db.is_installed("zlib"));
        assert!(db.is_installed("gcc"));

        let _ = std::fs::remove_dir_all(&tmpdir);
    }

    #[test]
    fn test_who_owns() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-owns");
        let _ = std::fs::remove_dir_all(&tmpdir);

        let mut db = PackageDb::open(&tmpdir).unwrap();
        db.register(make_installed("zlib", "1.3.1", vec!["/usr/lib/libz.so"])).unwrap();

        let owners = db.who_owns(Path::new("/usr/lib/libz.so"));
        assert_eq!(owners, vec!["zlib"]);

        let owners = db.who_owns(Path::new("/usr/lib/nonexistent.so"));
        assert!(owners.is_empty());

        let _ = std::fs::remove_dir_all(&tmpdir);
    }

    #[test]
    fn test_find_conflicts() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-conflicts");
        let _ = std::fs::remove_dir_all(&tmpdir);

        let mut db = PackageDb::open(&tmpdir).unwrap();
        db.register(make_installed("pkg-a", "1.0", vec!["/usr/lib/shared.so"])).unwrap();
        db.register(make_installed("pkg-b", "2.0", vec!["/usr/lib/shared.so"])).unwrap();

        let conflicts = db.find_conflicts();
        assert!(conflicts.contains_key(&PathBuf::from("/usr/lib/shared.so")));

        let _ = std::fs::remove_dir_all(&tmpdir);
    }

    #[test]
    fn test_list_all_sorted() {
        let tmpdir = std::env::temp_dir().join("dpack-test-db-list");
        let _ = std::fs::remove_dir_all(&tmpdir);

        let mut db = PackageDb::open(&tmpdir).unwrap();
        db.register(make_installed("zlib", "1.3.1", vec![])).unwrap();
        db.register(make_installed("bash", "5.3", vec![])).unwrap();
        db.register(make_installed("gcc", "15.2.0", vec![])).unwrap();

        let all = db.list_all();
        let names: Vec<&str> = all.iter().map(|p| p.name.as_str()).collect();
        assert_eq!(names, vec!["bash", "gcc", "zlib"]); // sorted

        let _ = std::fs::remove_dir_all(&tmpdir);
    }
}
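`find_conflicts` works by inverting the package→files mapping and keeping only paths with more than one owner. A self-contained sketch of that fold, using a hypothetical `conflicts` helper over plain `String` keys instead of `PathBuf` and the in-memory cache:

```rust
use std::collections::HashMap;

// Hypothetical std-only sketch of the find_conflicts fold: build an
// inverted file -> owners map, then keep only multiply-owned paths.
fn conflicts(pkgs: &[(&str, Vec<&str>)]) -> HashMap<String, Vec<String>> {
    let mut owners: HashMap<String, Vec<String>> = HashMap::new();
    for (name, files) in pkgs {
        for f in files {
            owners.entry(f.to_string()).or_default().push(name.to_string());
        }
    }
    // A path owned by a single package is not a conflict
    owners.retain(|_, v| v.len() > 1);
    owners
}

fn main() {
    let pkgs = [
        ("pkg-a", vec!["/usr/lib/shared.so", "/usr/bin/a"]),
        ("pkg-b", vec!["/usr/lib/shared.so"]),
    ];
    let c = conflicts(&pkgs);
    assert!(c.contains_key("/usr/lib/shared.so"));
    assert_eq!(c.len(), 1); // "/usr/bin/a" has one owner and is dropped
    println!("conflict paths: {}", c.len());
}
```

The real method returns `HashMap<PathBuf, Vec<String>>` and iterates the loaded database cache; the shape of the computation is the same.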
21
src/dpack/src/lib.rs
Normal file
@@ -0,0 +1,21 @@
//! dpack library — core functionality for the DarkForge package manager.
//!
//! This crate provides:
//! - Package definition parsing (`config`)
//! - Dependency resolution (`resolver`)
//! - Build sandboxing (`sandbox`)
//! - Foreign format converters (`converter`)
//! - Installed package database (`db`)
//! - Build orchestration (`build`)

// Many public API items are not yet used from main.rs but will be
// consumed as later phases are implemented. Suppress dead_code warnings
// for the library crate.
#![allow(dead_code)]

pub mod config;
pub mod resolver;
pub mod sandbox;
pub mod converter;
pub mod db;
pub mod build;
389
src/dpack/src/main.rs
Normal file
@@ -0,0 +1,389 @@
//! dpack — DarkForge Linux Package Manager
//!
//! A source-based package manager inspired by CRUX's pkgutils and Gentoo's emerge.
//! Supports TOML package definitions, dependency resolution, sandboxed builds,
//! and converters for CRUX Pkgfiles and Gentoo ebuilds.

// Public API items in submodules are used across phases — suppress dead_code
// warnings for items not yet wired into CLI commands.
#![allow(dead_code)]

use anyhow::{Context, Result};
use clap::{Parser, Subcommand};
use colored::Colorize;

mod config;
mod resolver;
mod sandbox;
mod converter;
mod db;
mod build;

use config::{DpackConfig, PackageDefinition};
use db::PackageDb;
use build::BuildOrchestrator;

/// DarkForge package manager
#[derive(Parser)]
#[command(name = "dpack")]
#[command(about = "DarkForge Linux package manager")]
#[command(version)]
struct Cli {
    /// Path to dpack configuration file
    #[arg(short, long, default_value = "/etc/dpack.conf")]
    config: String,

    #[command(subcommand)]
    command: Commands,
}

#[derive(Subcommand)]
enum Commands {
    /// Install a package (resolve deps → build → install → update db)
    Install {
        /// Package name(s) to install
        #[arg(required = true)]
        packages: Vec<String>,
    },

    /// Remove an installed package
    Remove {
        /// Package name(s) to remove
        #[arg(required = true)]
        packages: Vec<String>,
    },

    /// Upgrade installed package(s) to the latest version
    Upgrade {
        /// Package name(s) to upgrade; empty means all
        packages: Vec<String>,
    },

    /// Search for packages in the repository
    Search {
        /// Search query
        query: String,
    },

    /// Show information about a package
    Info {
        /// Package name
        package: String,
    },

    /// List installed packages
    List,

    /// Convert a foreign package definition to dpack format
    Convert {
        /// Path to the foreign package file (Pkgfile or .ebuild)
        path: String,

        /// Output path for the generated .toml file
        #[arg(short, long)]
        output: Option<String>,
    },

    /// Check for shared library conflicts
    Check,
}

fn main() {
    env_logger::init();
    let cli = Cli::parse();

    if let Err(e) = run(cli) {
        eprintln!("{}: {:#}", "error".red().bold(), e);
        std::process::exit(1);
    }
}

fn run(cli: Cli) -> Result<()> {
    let config = if std::path::Path::new(&cli.config).exists() {
        DpackConfig::from_file(std::path::Path::new(&cli.config))?
    } else {
        DpackConfig::default()
    };

    match cli.command {
        Commands::Install { packages } => {
            let db = PackageDb::open(&config.paths.db_dir)?;
            let mut orchestrator = BuildOrchestrator::new(config, db);
            orchestrator.install(&packages)?;
        }

        Commands::Remove { packages } => {
            let mut db = PackageDb::open(&config.paths.db_dir)?;

            // Load repos for reverse-dep checking
            let mut all_repo_packages = std::collections::HashMap::new();
            for repo in &config.repos {
                let repo_pkgs = resolver::DependencyGraph::load_repo(&repo.path)?;
                all_repo_packages.extend(repo_pkgs);
            }

            let installed_names: std::collections::HashSet<String> =
                db.list_all().iter().map(|p| p.name.clone()).collect();

            for name in &packages {
                // Check reverse dependencies before removing
                let rdeps = resolver::reverse_deps(name, &all_repo_packages, &installed_names);
                if !rdeps.is_empty() {
                    println!(
                        "{} '{}' is required by: {}",
                        "Warning:".yellow().bold(),
                        name,
                        rdeps.join(", ")
                    );
                    println!("  Removing it may break these packages.");
                    println!("  Proceeding anyway...");
                }

                match db.unregister(name)? {
                    Some(pkg) => {
                        // Remove installed files in reverse-sorted order so
                        // files inside a directory come before the directory
                        let mut files = pkg.files.clone();
                        files.sort();
                        files.reverse();

                        let mut removed_count = 0;
                        for file in &files {
                            if file.is_file() && std::fs::remove_file(file).is_ok() {
                                removed_count += 1;
                            }
                        }
                        // Try to remove now-empty parent directories
                        for file in &files {
                            if let Some(parent) = file.parent() {
                                std::fs::remove_dir(parent).ok();
                            }
                        }
                        println!(
                            "{} {} (removed {}/{} files)",
                            "Removed".green().bold(),
                            name,
                            removed_count,
                            files.len()
                        );
                    }
                    None => {
                        println!("{} '{}' is not installed", "Warning:".yellow().bold(), name);
                    }
                }
            }
        }

        Commands::Upgrade { packages } => {
            let db = PackageDb::open(&config.paths.db_dir)?;

            // Load all repos to compare available vs installed versions
            let mut all_repo_packages = std::collections::HashMap::new();
            for repo in &config.repos {
                let repo_pkgs = resolver::DependencyGraph::load_repo(&repo.path)?;
                all_repo_packages.extend(repo_pkgs);
            }

            // Determine which packages to upgrade
            let targets: Vec<String> = if packages.is_empty() {
                // Upgrade all installed packages whose repo version differs
                db.list_all()
                    .iter()
                    .filter(|installed| {
                        all_repo_packages
                            .get(&installed.name)
                            .map_or(false, |repo_pkg| repo_pkg.package.version != installed.version)
                    })
                    .map(|p| p.name.clone())
                    .collect()
            } else {
                packages
            };

            if targets.is_empty() {
                println!("{}", "All packages are up to date.".green());
            } else {
                println!("Packages to upgrade:");
                for name in &targets {
                    let installed_ver = db.installed_version(name).unwrap_or("?");
                    let repo_ver = all_repo_packages
                        .get(name)
                        .map(|p| p.package.version.as_str())
                        .unwrap_or("?");
                    println!("  {} {} → {}", name.bold(), installed_ver.red(), repo_ver.green());
                }

                // Check for shared library conflicts before proceeding
                let _solib_map = resolver::solib::build_solib_map(&db);

                // Compute the installed set once rather than per target
                let installed_names: std::collections::HashSet<String> =
                    db.list_all().iter().map(|p| p.name.clone()).collect();

                for name in &targets {
                    // Warn about packages that depend on this one
                    let rdeps = resolver::reverse_deps(name, &all_repo_packages, &installed_names);
                    if !rdeps.is_empty() {
                        println!(
                            "\n{} {} is depended on by: {}",
                            "Note:".cyan().bold(),
                            name,
                            rdeps.join(", ")
                        );
                    }
                }

                // Proceed with the upgrade (remove old, install new)
                println!("\nProceeding with upgrade...");
                let mut orchestrator = BuildOrchestrator::new(config, db);
                orchestrator.install(&targets)?;
            }
        }

        Commands::Search { query } => {
            // Search all repos for matching package names/descriptions.
            // Match case-insensitively against both fields.
            let needle = query.to_lowercase();
            for repo in &config.repos {
                let packages = resolver::DependencyGraph::load_repo(&repo.path)?;
                for (name, pkg) in &packages {
                    if name.to_lowercase().contains(&needle)
                        || pkg.package.description.to_lowercase().contains(&needle)
                    {
                        println!(
                            "{}/{} {} — {}",
                            repo.name.cyan(),
                            name.bold(),
                            pkg.package.version.green(),
                            pkg.package.description
                        );
                    }
                }
            }
        }

        Commands::Info { package } => {
            // Check the installed database first
            let db = PackageDb::open(&config.paths.db_dir)?;
            if let Some(installed) = db.get(&package) {
                println!("{}: {}", "Name".bold(), installed.name);
                println!("{}: {}", "Version".bold(), installed.version);
                println!("{}: {}", "Description".bold(), installed.description);
                println!("{}: {}", "Status".bold(), "installed".green());
|
||||
println!("{}: {}", "Repo".bold(), installed.repo);
|
||||
println!("{}: {}", "Files".bold(), installed.files.len());
|
||||
println!("{}: {} bytes", "Size".bold(), installed.size);
|
||||
if !installed.features.is_empty() {
|
||||
println!("{}: {}", "Features".bold(), installed.features.join(", "));
|
||||
}
|
||||
if !installed.run_deps.is_empty() {
|
||||
println!("{}: {}", "Run deps".bold(), installed.run_deps.join(", "));
|
||||
}
|
||||
} else {
|
||||
// Search repos
|
||||
for repo in &config.repos {
|
||||
if let Some(pkg_path) = repo.path.join(&package).join(format!("{}.toml", package)).exists().then(|| repo.path.join(&package).join(format!("{}.toml", package))) {
|
||||
let pkg = PackageDefinition::from_file(&pkg_path)?;
|
||||
println!("{}: {}", "Name".bold(), pkg.package.name);
|
||||
println!("{}: {}", "Version".bold(), pkg.package.version);
|
||||
println!("{}: {}", "Description".bold(), pkg.package.description);
|
||||
println!("{}: {}", "Status".bold(), "not installed".yellow());
|
||||
println!("{}: {}", "URL".bold(), pkg.package.url);
|
||||
println!("{}: {}", "License".bold(), pkg.package.license);
|
||||
if !pkg.dependencies.run.is_empty() {
|
||||
println!("{}: {}", "Run deps".bold(), pkg.dependencies.run.join(", "));
|
||||
}
|
||||
if !pkg.dependencies.build.is_empty() {
|
||||
println!("{}: {}", "Build deps".bold(), pkg.dependencies.build.join(", "));
|
||||
}
|
||||
return Ok(());
|
||||
}
|
||||
}
|
||||
println!("{} Package '{}' not found", "Error:".red().bold(), package);
|
||||
}
|
||||
}
|
||||
|
||||
Commands::List => {
|
||||
let db = PackageDb::open(&config.paths.db_dir)?;
|
||||
let all = db.list_all();
|
||||
if all.is_empty() {
|
||||
println!("No packages installed.");
|
||||
} else {
|
||||
println!("{} installed packages:", all.len());
|
||||
for pkg in &all {
|
||||
println!(
|
||||
" {} {} [{}]",
|
||||
pkg.name.bold(),
|
||||
pkg.version.green(),
|
||||
pkg.repo.cyan()
|
||||
);
|
||||
}
|
||||
println!("\nTotal disk usage: {} MB", db.total_size() / (1024 * 1024));
|
||||
}
|
||||
}
|
||||
|
||||
Commands::Convert { path, output } => {
|
||||
let input_path = std::path::Path::new(&path);
|
||||
if !input_path.exists() {
|
||||
anyhow::bail!("Input file not found: {}", path);
|
||||
}
|
||||
|
||||
println!("Converting: {}", path);
|
||||
let toml_output = converter::convert_file(input_path)?;
|
||||
|
||||
if let Some(out_path) = output {
|
||||
std::fs::write(&out_path, &toml_output)
|
||||
.with_context(|| format!("Failed to write: {}", out_path))?;
|
||||
println!("{} Written to: {}", "Converted!".green().bold(), out_path);
|
||||
} else {
|
||||
// Print to stdout
|
||||
println!("{}", "--- Converted TOML ---".cyan().bold());
|
||||
println!("{}", toml_output);
|
||||
}
|
||||
}
|
||||
|
||||
Commands::Check => {
|
||||
let db = PackageDb::open(&config.paths.db_dir)?;
|
||||
|
||||
// Check for file ownership conflicts
|
||||
let file_conflicts = db.find_conflicts();
|
||||
if file_conflicts.is_empty() {
|
||||
println!("{}", "No file ownership conflicts detected.".green());
|
||||
} else {
|
||||
println!(
|
||||
"{} {} file conflict(s) found:",
|
||||
"Warning:".yellow().bold(),
|
||||
file_conflicts.len()
|
||||
);
|
||||
for (file, owners) in &file_conflicts {
|
||||
println!(" {} — owned by: {}", file.display(), owners.join(", "));
|
||||
}
|
||||
}
|
||||
|
||||
// Build solib dependency map
|
||||
println!("\nScanning shared library dependencies...");
|
||||
let solib_map = resolver::solib::build_solib_map(&db);
|
||||
println!(
|
||||
"Tracked {} unique shared libraries across {} packages.",
|
||||
solib_map.len(),
|
||||
db.count()
|
||||
);
|
||||
|
||||
// Report any libraries linked by multiple packages
|
||||
let multi_user_libs: Vec<_> = solib_map
|
||||
.iter()
|
||||
.filter(|(_, pkgs)| pkgs.len() > 2)
|
||||
.collect();
|
||||
if !multi_user_libs.is_empty() {
|
||||
println!(
|
||||
"\n{} libraries used by 3+ packages (upgrade with care):",
|
||||
"Widely-used".cyan().bold()
|
||||
);
|
||||
for (lib, pkgs) in &multi_user_libs {
|
||||
println!(" {} — {} packages", lib, pkgs.len());
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
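The `Upgrade` arm above selects, when no names are given, every installed package whose repo version differs from the installed one. Stripped of the database and repository types, that predicate reduces to a comparison over two name→version maps; the following is a hypothetical standalone sketch (the function name and map shapes are illustrative, not part of dpack's API):

```rust
use std::collections::HashMap;

/// Names of installed packages whose available repo version differs from
/// the installed one -- the same predicate the upgrade arm applies.
fn upgrade_targets(
    installed: &HashMap<String, String>, // name -> installed version
    repo: &HashMap<String, String>,      // name -> available version
) -> Vec<String> {
    let mut targets: Vec<String> = installed
        .iter()
        .filter(|&(name, installed_ver)| {
            repo.get(name)
                .map_or(false, |repo_ver| repo_ver != installed_ver)
        })
        .map(|(name, _)| name.clone())
        .collect();
    targets.sort(); // HashMap iteration order is random; sort for stable output
    targets
}
```

Note that packages absent from every repo are silently skipped (`map_or(false, ...)`), matching the behavior of the arm above.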
389
src/dpack/src/resolver/mod.rs
Normal file
@@ -0,0 +1,389 @@
//! Dependency resolution engine.
//!
//! Resolves a package's full dependency tree into a topologically sorted
//! build order. Handles:
//! - Direct runtime dependencies
//! - Build-time dependencies
//! - Optional feature dependencies
//! - Circular dependency detection
//! - Version constraints (basic)
//!
//! The resolver operates on a `DependencyGraph` built from the repository's
//! package definitions and the installed package database.

pub mod solib;

use anyhow::{bail, Context, Result};
use std::collections::{HashMap, HashSet};
use std::path::Path;

use crate::config::PackageDefinition;

/// The result of dependency resolution: an ordered list of packages to build.
#[derive(Debug, Clone)]
pub struct ResolutionPlan {
    /// Packages in topological order (build these first-to-last)
    pub build_order: Vec<ResolvedPackage>,

    /// Packages that are already installed and don't need rebuilding
    pub already_installed: Vec<String>,
}

/// A single package in the resolution plan.
#[derive(Debug, Clone)]
pub struct ResolvedPackage {
    /// Package name
    pub name: String,

    /// Package version
    pub version: String,

    /// Whether this is a build-only dependency (not needed at runtime)
    pub build_only: bool,

    /// Which features are enabled for this package
    pub features: Vec<String>,

    /// Path to the package definition file
    pub definition_path: std::path::PathBuf,
}

/// The dependency graph used internally for resolution.
pub struct DependencyGraph {
    /// All known package definitions, keyed by name
    packages: HashMap<String, PackageDefinition>,

    /// Currently installed packages (name → version)
    installed: HashMap<String, String>,
}

impl DependencyGraph {
    /// Create a new graph from a set of package definitions and installed state.
    pub fn new(
        packages: HashMap<String, PackageDefinition>,
        installed: HashMap<String, String>,
    ) -> Self {
        Self { packages, installed }
    }

    /// Load all package definitions from a repository directory.
    ///
    /// Expects: `repo_dir/<name>/<name>.toml`
    pub fn load_repo(repo_dir: &Path) -> Result<HashMap<String, PackageDefinition>> {
        let mut packages = HashMap::new();

        if !repo_dir.is_dir() {
            return Ok(packages);
        }

        for entry in std::fs::read_dir(repo_dir)
            .with_context(|| format!("Failed to read repo: {}", repo_dir.display()))?
        {
            let entry = entry?;
            if !entry.file_type()?.is_dir() {
                continue;
            }

            let pkg_name = entry.file_name().to_string_lossy().to_string();
            let toml_path = entry.path().join(format!("{}.toml", pkg_name));

            if toml_path.exists() {
                match PackageDefinition::from_file(&toml_path) {
                    Ok(pkg) => {
                        packages.insert(pkg_name, pkg);
                    }
                    Err(e) => {
                        log::warn!("Skipping {}: {}", toml_path.display(), e);
                    }
                }
            }
        }

        Ok(packages)
    }

    /// Resolve all dependencies for the given package names.
    ///
    /// Returns a topologically sorted build order. Detects circular deps.
    pub fn resolve(
        &self,
        targets: &[String],
        enabled_features: &HashMap<String, Vec<String>>,
    ) -> Result<ResolutionPlan> {
        let mut visited: HashSet<String> = HashSet::new();
        let mut in_stack: HashSet<String> = HashSet::new();
        let mut order: Vec<ResolvedPackage> = Vec::new();
        let mut already_installed: Vec<String> = Vec::new();

        for target in targets {
            self.resolve_recursive(
                target,
                false, // targets themselves are not build-only
                enabled_features,
                &mut visited,
                &mut in_stack,
                &mut order,
                &mut already_installed,
            )?;
        }

        Ok(ResolutionPlan {
            build_order: order,
            already_installed,
        })
    }

    /// Recursive DFS for topological sort with cycle detection.
    fn resolve_recursive(
        &self,
        name: &str,
        build_only: bool,
        enabled_features: &HashMap<String, Vec<String>>,
        visited: &mut HashSet<String>,
        in_stack: &mut HashSet<String>,
        order: &mut Vec<ResolvedPackage>,
        already_installed: &mut Vec<String>,
    ) -> Result<()> {
        // Already fully resolved
        if visited.contains(name) {
            return Ok(());
        }

        // Circular dependency detected
        if in_stack.contains(name) {
            bail!(
                "Circular dependency detected: '{}' depends on itself (chain: {:?})",
                name,
                in_stack
            );
        }

        // Skip if already installed at the right version
        if let Some(installed_version) = self.installed.get(name) {
            if let Some(pkg) = self.packages.get(name) {
                if installed_version == &pkg.package.version {
                    already_installed.push(name.to_string());
                    visited.insert(name.to_string());
                    return Ok(());
                }
            }
        }

        // Look up the package definition
        let pkg = self
            .packages
            .get(name)
            .with_context(|| format!("Package '{}' not found in any repository", name))?;

        in_stack.insert(name.to_string());

        // Features enabled for this package (defaults unless overridden)
        let features = enabled_features
            .get(name)
            .cloned()
            .unwrap_or_else(|| pkg.default_features());

        // Resolve build dependencies first
        for dep in &pkg.effective_build_deps(&features) {
            self.resolve_recursive(
                dep,
                true,
                enabled_features,
                visited,
                in_stack,
                order,
                already_installed,
            )?;
        }

        // Then resolve runtime dependencies
        for dep in &pkg.effective_run_deps(&features) {
            self.resolve_recursive(
                dep,
                false,
                enabled_features,
                visited,
                in_stack,
                order,
                already_installed,
            )?;
        }

        in_stack.remove(name);
        visited.insert(name.to_string());

        order.push(ResolvedPackage {
            name: name.to_string(),
            version: pkg.package.version.clone(),
            build_only,
            features,
            definition_path: std::path::PathBuf::new(), // Set by caller
        });

        Ok(())
    }
}

/// Perform a simple reverse-dependency lookup: which installed packages
/// depend on the given package?
pub fn reverse_deps(
    package: &str,
    all_packages: &HashMap<String, PackageDefinition>,
    installed: &HashSet<String>,
) -> Vec<String> {
    let mut rdeps = Vec::new();

    for inst_name in installed {
        if let Some(pkg) = all_packages.get(inst_name) {
            let features = pkg.default_features();
            let all_deps: Vec<String> = pkg
                .effective_run_deps(&features)
                .into_iter()
                .chain(pkg.effective_build_deps(&features))
                .collect();

            if all_deps.iter().any(|d| d == package) {
                rdeps.push(inst_name.clone());
            }
        }
    }

    rdeps
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::config::package::*;
    use std::collections::HashMap;

    /// Helper: create a minimal PackageDefinition for testing.
    fn make_pkg(
        name: &str,
        version: &str,
        run_deps: Vec<&str>,
        build_deps: Vec<&str>,
    ) -> PackageDefinition {
        PackageDefinition {
            package: PackageMetadata {
                name: name.to_string(),
                version: version.to_string(),
                description: format!("Test package {}", name),
                url: "https://example.com".to_string(),
                license: "MIT".to_string(),
                epoch: 0,
                revision: 1,
            },
            source: SourceInfo {
                url: format!("https://example.com/{}-{}.tar.xz", name, version),
                sha256: "a".repeat(64),
                patches: vec![],
            },
            dependencies: Dependencies {
                run: run_deps.into_iter().map(String::from).collect(),
                build: build_deps.into_iter().map(String::from).collect(),
                optional: HashMap::new(),
            },
            build: BuildInstructions {
                configure: "./configure --prefix=/usr".to_string(),
                make: "make".to_string(),
                install: "make DESTDIR=${PKG} install".to_string(),
                prepare: String::new(),
                post_install: String::new(),
                check: String::new(),
                flags: BuildFlags::default(),
                system: BuildSystem::default(),
            },
        }
    }

    #[test]
    fn test_simple_resolution() {
        let mut packages = HashMap::new();
        packages.insert("zlib".to_string(), make_pkg("zlib", "1.3.1", vec![], vec!["gcc", "make"]));
        packages.insert("gcc".to_string(), make_pkg("gcc", "15.2.0", vec![], vec![]));
        packages.insert("make".to_string(), make_pkg("make", "4.4.1", vec![], vec![]));

        let graph = DependencyGraph::new(packages, HashMap::new());
        let plan = graph.resolve(&["zlib".to_string()], &HashMap::new()).unwrap();

        assert_eq!(plan.build_order.len(), 3);
        // gcc and make should come before zlib
        let names: Vec<&str> = plan.build_order.iter().map(|p| p.name.as_str()).collect();
        let zlib_pos = names.iter().position(|&n| n == "zlib").unwrap();
        let gcc_pos = names.iter().position(|&n| n == "gcc").unwrap();
        let make_pos = names.iter().position(|&n| n == "make").unwrap();
        assert!(gcc_pos < zlib_pos);
        assert!(make_pos < zlib_pos);
    }

    #[test]
    fn test_circular_dependency_detection() {
        let mut packages = HashMap::new();
        packages.insert("a".to_string(), make_pkg("a", "1.0", vec!["b"], vec![]));
        packages.insert("b".to_string(), make_pkg("b", "1.0", vec!["a"], vec![]));

        let graph = DependencyGraph::new(packages, HashMap::new());
        let result = graph.resolve(&["a".to_string()], &HashMap::new());
        assert!(result.is_err());
        assert!(result.unwrap_err().to_string().contains("Circular"));
    }

    #[test]
    fn test_already_installed() {
        let mut packages = HashMap::new();
        packages.insert("zlib".to_string(), make_pkg("zlib", "1.3.1", vec![], vec![]));

        let mut installed = HashMap::new();
        installed.insert("zlib".to_string(), "1.3.1".to_string());

        let graph = DependencyGraph::new(packages, installed);
        let plan = graph.resolve(&["zlib".to_string()], &HashMap::new()).unwrap();

        assert!(plan.build_order.is_empty());
        assert_eq!(plan.already_installed, vec!["zlib"]);
    }

    #[test]
    fn test_missing_dependency() {
        let mut packages = HashMap::new();
        packages.insert("foo".to_string(), make_pkg("foo", "1.0", vec!["missing"], vec![]));

        let graph = DependencyGraph::new(packages, HashMap::new());
        let result = graph.resolve(&["foo".to_string()], &HashMap::new());
        assert!(result.is_err());
        assert!(result.unwrap_err().to_string().contains("missing"));
    }

    #[test]
    fn test_diamond_dependency() {
        let mut packages = HashMap::new();
        packages.insert("a".to_string(), make_pkg("a", "1.0", vec![], vec![]));
        packages.insert("b".to_string(), make_pkg("b", "1.0", vec!["a"], vec![]));
        packages.insert("c".to_string(), make_pkg("c", "1.0", vec!["a"], vec![]));
        packages.insert("d".to_string(), make_pkg("d", "1.0", vec!["b", "c"], vec![]));

        let graph = DependencyGraph::new(packages, HashMap::new());
        let plan = graph.resolve(&["d".to_string()], &HashMap::new()).unwrap();

        let names: Vec<&str> = plan.build_order.iter().map(|p| p.name.as_str()).collect();
        // A should appear only once
        assert_eq!(names.iter().filter(|&&n| n == "a").count(), 1);
        // A before B and C, B and C before D
        let a_pos = names.iter().position(|&n| n == "a").unwrap();
        let b_pos = names.iter().position(|&n| n == "b").unwrap();
        let c_pos = names.iter().position(|&n| n == "c").unwrap();
        let d_pos = names.iter().position(|&n| n == "d").unwrap();
        assert!(a_pos < b_pos && a_pos < c_pos);
        assert!(b_pos < d_pos && c_pos < d_pos);
    }

    #[test]
    fn test_reverse_deps() {
        let mut packages = HashMap::new();
        packages.insert("zlib".to_string(), make_pkg("zlib", "1.3.1", vec![], vec![]));
        packages.insert("curl".to_string(), make_pkg("curl", "8.0", vec!["zlib"], vec![]));
        packages.insert("git".to_string(), make_pkg("git", "2.0", vec!["curl"], vec![]));

        let installed: HashSet<String> = ["zlib", "curl", "git"].iter().map(|s| s.to_string()).collect();

        let rdeps = reverse_deps("zlib", &packages, &installed);
        assert!(rdeps.contains(&"curl".to_string()));
        assert!(!rdeps.contains(&"git".to_string())); // git depends on curl, not zlib directly
    }
}
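The resolver above is a depth-first post-order topological sort with an explicit "in progress" set for cycle detection. Separated from the package and feature types, the core invariant fits in a few lines; this is an illustrative sketch (names like `topo_sort` are hypothetical, and `deps` maps a node to its direct dependencies):

```rust
use std::collections::{HashMap, HashSet};

/// DFS post-order: dependencies are emitted before their dependents.
/// Returns Err if a cycle is reachable from any root.
fn topo_sort<'a>(
    deps: &HashMap<&'a str, Vec<&'a str>>,
    roots: &[&'a str],
) -> Result<Vec<String>, String> {
    fn visit<'a>(
        node: &'a str,
        deps: &HashMap<&'a str, Vec<&'a str>>,
        visited: &mut HashSet<&'a str>,
        in_stack: &mut HashSet<&'a str>,
        order: &mut Vec<String>,
    ) -> Result<(), String> {
        if visited.contains(node) {
            return Ok(()); // already fully resolved
        }
        if !in_stack.insert(node) {
            return Err(format!("circular dependency through '{}'", node));
        }
        for dep in deps.get(node).into_iter().flatten() {
            visit(dep, deps, visited, in_stack, order)?;
        }
        in_stack.remove(node);
        visited.insert(node);
        order.push(node.to_string()); // post-order: deps already emitted
        Ok(())
    }

    let (mut visited, mut in_stack, mut order) = (HashSet::new(), HashSet::new(), Vec::new());
    for root in roots {
        visit(root, deps, &mut visited, &mut in_stack, &mut order)?;
    }
    Ok(order)
}
```

The `visited` set is what makes diamond dependencies resolve each node exactly once, mirroring `test_diamond_dependency` above.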
311
src/dpack/src/resolver/solib.rs
Normal file
@@ -0,0 +1,311 @@
//! Shared library conflict detection and resolution.
//!
//! When upgrading or installing a package that provides a shared library,
//! check if other installed packages depend on the old version of that library.
//!
//! Resolution strategies:
//! 1. Check if dependents have an update that works with the new lib version
//! 2. If yes, offer to upgrade them too
//! 3. If no, offer: (a) static compilation, (b) hold back, (c) force
//!
//! Implementation uses `readelf` or `objdump` to parse ELF shared library
//! dependencies from installed binaries.

use anyhow::{Context, Result};
use std::collections::{HashMap, HashSet};
use std::path::{Path, PathBuf};
use std::process::Command;

use crate::db::PackageDb;

/// A shared library dependency found in an ELF binary.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub struct SharedLib {
    /// Library soname (e.g., "libz.so.1")
    pub soname: String,

    /// Full path to the library file
    pub path: Option<PathBuf>,
}

/// A conflict where a library upgrade would break a dependent package.
#[derive(Debug, Clone)]
pub struct LibConflict {
    /// The library being upgraded
    pub library: String,

    /// The old soname that dependents link against
    pub old_soname: String,

    /// The new soname after the upgrade
    pub new_soname: String,

    /// Packages that depend on the old soname
    pub affected_packages: Vec<String>,
}

/// Resolution action chosen by the user.
#[derive(Debug, Clone)]
pub enum ConflictResolution {
    /// Upgrade all affected packages
    UpgradeAll,
    /// Compile the new package with static linking
    StaticLink,
    /// Hold back the library (don't upgrade)
    HoldBack,
    /// Force the upgrade (user accepts the risk)
    Force,
}

/// Scan an ELF binary for shared library dependencies.
///
/// Uses `readelf -d` to extract NEEDED entries, falling back to `objdump -p`.
pub fn get_needed_libs(binary_path: &Path) -> Result<Vec<String>> {
    let output = Command::new("readelf")
        .arg("-d")
        .arg(binary_path)
        .output()
        .or_else(|_| {
            // Fallback to objdump
            Command::new("objdump")
                .arg("-p")
                .arg(binary_path)
                .output()
        })
        .context("Neither readelf nor objdump available")?;

    let stdout = String::from_utf8_lossy(&output.stdout);
    let mut libs = Vec::new();

    for line in stdout.lines() {
        // readelf format: 0x0000000000000001 (NEEDED)  Shared library: [libz.so.1]
        if line.contains("NEEDED") {
            if let Some(start) = line.find('[') {
                if let Some(end) = line.find(']') {
                    libs.push(line[start + 1..end].to_string());
                }
            }
        }
        // objdump format:   NEEDED  libz.so.1
        else if line.trim().starts_with("NEEDED") {
            if let Some(lib) = line.split_whitespace().last() {
                libs.push(lib.to_string());
            }
        }
    }

    Ok(libs)
}

/// Get the soname of a shared library file.
///
/// Uses `readelf -d` to extract the SONAME entry.
pub fn get_soname(lib_path: &Path) -> Result<Option<String>> {
    let output = Command::new("readelf")
        .arg("-d")
        .arg(lib_path)
        .output()
        .context("readelf not available")?;

    let stdout = String::from_utf8_lossy(&output.stdout);

    for line in stdout.lines() {
        if line.contains("SONAME") {
            if let Some(start) = line.find('[') {
                if let Some(end) = line.find(']') {
                    return Ok(Some(line[start + 1..end].to_string()));
                }
            }
        }
    }

    Ok(None)
}

/// Build a map of soname → packages that link against it.
///
/// Scans all ELF binaries/libraries in installed packages.
pub fn build_solib_map(db: &PackageDb) -> HashMap<String, Vec<String>> {
    let mut map: HashMap<String, Vec<String>> = HashMap::new();

    for pkg in db.list_all() {
        for file in &pkg.files {
            // Heuristic: only check likely ELF binaries and shared libraries
            let ext = file.extension().map(|e| e.to_string_lossy().to_string());
            let is_elf = file.starts_with("/usr/bin")
                || file.starts_with("/usr/lib")
                || file.starts_with("/usr/sbin")
                || ext.as_deref() == Some("so")
                || file.to_string_lossy().contains(".so.");

            if !is_elf || !file.exists() {
                continue;
            }

            if let Ok(libs) = get_needed_libs(file) {
                for lib in libs {
                    map.entry(lib)
                        .or_default()
                        .push(pkg.name.clone());
                }
            }
        }
    }

    // Deduplicate package names per soname
    for pkgs in map.values_mut() {
        pkgs.sort();
        pkgs.dedup();
    }

    map
}

/// Check if upgrading a package would cause shared library conflicts.
///
/// Compares the old package's provided sonames with the new package's sonames.
/// If a soname changes (e.g., `libfoo.so.1` → `libfoo.so.2`), find all
/// packages that link against the old soname.
pub fn check_upgrade_conflicts(
    package_name: &str,
    old_files: &[PathBuf],
    new_files: &[PathBuf],
    solib_map: &HashMap<String, Vec<String>>,
) -> Vec<LibConflict> {
    let mut conflicts = Vec::new();

    // Find sonames provided by the old and new versions
    let old_sonames = collect_provided_sonames(old_files);
    let new_sonames = collect_provided_sonames(new_files);

    // Check for sonames that exist in old but not in new
    for old_so in &old_sonames {
        if !new_sonames.contains(old_so) {
            // Find the replacement (if any) — same base name, different version
            let base = soname_base(old_so);
            let replacement = new_sonames
                .iter()
                .find(|s| soname_base(s) == base)
                .cloned()
                .unwrap_or_else(|| "REMOVED".to_string());

            // Find affected packages
            if let Some(dependents) = solib_map.get(old_so) {
                let affected: Vec<String> = dependents
                    .iter()
                    .filter(|p| p.as_str() != package_name) // Exclude self
                    .cloned()
                    .collect();

                if !affected.is_empty() {
                    conflicts.push(LibConflict {
                        library: package_name.to_string(),
                        old_soname: old_so.clone(),
                        new_soname: replacement,
                        affected_packages: affected,
                    });
                }
            }
        }
    }

    conflicts
}

/// Collect sonames provided by a set of files.
fn collect_provided_sonames(files: &[PathBuf]) -> HashSet<String> {
    let mut sonames = HashSet::new();

    for file in files {
        if file.to_string_lossy().contains(".so") && file.exists() {
            if let Ok(Some(soname)) = get_soname(file) {
                sonames.insert(soname);
            }
        }
    }

    sonames
}

/// Extract the base name from a soname (strip version suffix).
/// e.g., "libz.so.1" → "libz.so", "libfoo.so.2.3.4" → "libfoo.so"
fn soname_base(soname: &str) -> String {
    if let Some(pos) = soname.find(".so.") {
        soname[..pos + 3].to_string() // Include ".so"
    } else {
        soname.to_string()
    }
}

/// Format a conflict report for display to the user.
pub fn format_conflict_report(conflicts: &[LibConflict]) -> String {
    if conflicts.is_empty() {
        return "No shared library conflicts detected.".to_string();
    }

    let mut report = String::new();
    report.push_str(&format!(
        "WARNING: {} shared library conflict(s) detected:\n\n",
        conflicts.len()
    ));

    for conflict in conflicts {
        report.push_str(&format!(
            "  Library: {} → {}\n",
            conflict.old_soname, conflict.new_soname
        ));
        report.push_str(&format!("  Source: {}\n", conflict.library));
        report.push_str(&format!(
            "  Affected packages: {}\n",
            conflict.affected_packages.join(", ")
        ));
        report.push_str("\n  Options:\n");
        report.push_str("    1. Upgrade affected packages\n");
        report.push_str("    2. Compile with static linking\n");
        report.push_str("    3. Hold back the upgrade\n");
        report.push_str("    4. Force (accept the risk)\n\n");
    }

    report
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_soname_base() {
        assert_eq!(soname_base("libz.so.1"), "libz.so");
        assert_eq!(soname_base("libfoo.so.2.3.4"), "libfoo.so");
        assert_eq!(soname_base("libbar.so"), "libbar.so");
    }

    #[test]
    fn test_check_upgrade_no_conflict() {
        let old_files: Vec<PathBuf> = vec![];
        let new_files: Vec<PathBuf> = vec![];
        let solib_map = HashMap::new();

        let conflicts = check_upgrade_conflicts("test", &old_files, &new_files, &solib_map);
        assert!(conflicts.is_empty());
    }

    #[test]
    fn test_format_empty_report() {
        let report = format_conflict_report(&[]);
        assert!(report.contains("No shared library conflicts"));
    }

    #[test]
    fn test_format_conflict_report() {
        let conflicts = vec![LibConflict {
            library: "zlib".to_string(),
            old_soname: "libz.so.1".to_string(),
            new_soname: "libz.so.2".to_string(),
            affected_packages: vec!["curl".to_string(), "git".to_string()],
        }];

        let report = format_conflict_report(&conflicts);
        assert!(report.contains("libz.so.1"));
        assert!(report.contains("libz.so.2"));
        assert!(report.contains("curl"));
        assert!(report.contains("git"));
    }
}
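The NEEDED/SONAME extraction above just scans `readelf -d` output for bracketed names, which makes the parsing step easy to exercise without a real ELF binary. A hypothetical helper (the function name is illustrative, not part of dpack) that isolates the bracket-scanning logic, fed with captured readelf-style lines:

```rust
/// Extract the bracketed names from `readelf -d` output lines carrying
/// the given dynamic tag (e.g. "NEEDED" or "SONAME").
fn bracketed_entries(readelf_output: &str, tag: &str) -> Vec<String> {
    readelf_output
        .lines()
        .filter(|line| line.contains(tag))
        .filter_map(|line| {
            // Name sits between the first '[' and the first ']'
            let start = line.find('[')? + 1;
            let end = line.find(']')?;
            line.get(start..end).map(str::to_string)
        })
        .collect()
}
```

Unlike the inline loops above, `filter_map` with `?` drops malformed lines (missing or out-of-order brackets) instead of nesting two `if let`s; the behavior on well-formed input is the same.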
340
src/dpack/src/sandbox/mod.rs
Normal file
@@ -0,0 +1,340 @@
//! Build sandboxing using Linux namespaces or bubblewrap.
//!
//! Isolates package builds so that:
//! - Only declared dependencies are visible in the sandbox's filesystem
//! - Build processes run in a separate PID namespace
//! - Network access is blocked by default (configurable)
//! - Installed files are captured via `DESTDIR` to a staging area
//!
//! Two backends are supported:
//! 1. **bubblewrap (bwrap)** — preferred, lightweight, unprivileged
//! 2. **direct** — no sandboxing (fallback for bootstrapping or debugging)

use anyhow::{bail, Context, Result};
use std::path::{Path, PathBuf};
use std::process::Command;

use crate::config::{DpackConfig, PackageDefinition};

/// Sandbox backend selection.
#[derive(Debug, Clone, PartialEq)]
pub enum SandboxBackend {
    /// Use bubblewrap for isolation
    Bubblewrap,
    /// No sandboxing — run directly (for bootstrap or debugging)
    Direct,
}

/// A configured build sandbox ready to execute a package build.
pub struct BuildSandbox {
    /// The backend to use
    backend: SandboxBackend,

    /// Working directory for the build (contains extracted source)
    build_dir: PathBuf,

    /// Staging directory where `DESTDIR` installs to
    staging_dir: PathBuf,

    /// Path to the bubblewrap binary
    bwrap_path: PathBuf,

    /// Whether to allow network access during the build
    allow_network: bool,

    /// Paths to bind-mount read-only into the sandbox (dependency install dirs)
    ro_binds: Vec<(PathBuf, PathBuf)>,

    /// Environment variables to set in the sandbox
    env_vars: Vec<(String, String)>,
}

impl BuildSandbox {
    /// Create a new sandbox for building a package.
    pub fn new(
        config: &DpackConfig,
        pkg: &PackageDefinition,
        build_dir: &Path,
        staging_dir: &Path,
    ) -> Result<Self> {
        std::fs::create_dir_all(build_dir)
            .with_context(|| format!("Failed to create build dir: {}", build_dir.display()))?;
        std::fs::create_dir_all(staging_dir)
            .with_context(|| format!("Failed to create staging dir: {}", staging_dir.display()))?;

        let backend = if config.sandbox.enabled {
            // Check if bwrap is available
            if config.sandbox.bwrap_path.exists() {
                SandboxBackend::Bubblewrap
            } else {
                log::warn!(
                    "Bubblewrap not found at {}, falling back to direct execution",
                    config.sandbox.bwrap_path.display()
                );
                SandboxBackend::Direct
            }
        } else {
            SandboxBackend::Direct
        };

        // Set up environment variables for the build
        let cflags = config.effective_cflags(&pkg.build.flags.cflags).to_string();
        let cxxflags = if pkg.build.flags.cxxflags.is_empty() {
            cflags.clone()
        } else {
            pkg.build.flags.cxxflags.clone()
        };
        let ldflags = config.effective_ldflags(&pkg.build.flags.ldflags).to_string();
        let makeflags = if pkg.build.flags.makeflags.is_empty() {
            config.flags.makeflags.clone()
        } else {
            pkg.build.flags.makeflags.clone()
        };

        let env_vars = vec![
            ("CFLAGS".to_string(), cflags),
            ("CXXFLAGS".to_string(), cxxflags),
            ("LDFLAGS".to_string(), ldflags),
            ("MAKEFLAGS".to_string(), makeflags),
            ("PKG".to_string(), staging_dir.to_string_lossy().to_string()),
            ("HOME".to_string(), "/tmp".to_string()),
            ("PATH".to_string(), "/usr/bin:/usr/sbin:/bin:/sbin".to_string()),
            ("LC_ALL".to_string(), "POSIX".to_string()),
        ];

        Ok(Self {
            backend,
            build_dir: build_dir.to_path_buf(),
            staging_dir: staging_dir.to_path_buf(),
            bwrap_path: config.sandbox.bwrap_path.clone(),
            allow_network: config.sandbox.allow_network,
|
||||
ro_binds: Vec::new(),
|
||||
env_vars,
|
||||
})
|
||||
}
|
||||
|
||||
/// Add a read-only bind mount (e.g., dependency install paths).
|
||||
pub fn add_ro_bind(&mut self, host_path: PathBuf, sandbox_path: PathBuf) {
|
||||
self.ro_binds.push((host_path, sandbox_path));
|
||||
}
|
||||
|
||||
/// Execute a shell command inside the sandbox.
|
||||
pub fn exec(&self, command: &str) -> Result<()> {
|
||||
if command.is_empty() {
|
||||
return Ok(());
|
||||
}
|
||||
|
||||
log::info!("Sandbox exec: {}", command);
|
||||
|
||||
match &self.backend {
|
||||
SandboxBackend::Direct => self.exec_direct(command),
|
||||
SandboxBackend::Bubblewrap => self.exec_bwrap(command),
|
||||
}
|
||||
}
|
||||
|
||||
/// Execute without sandboxing.
|
||||
fn exec_direct(&self, command: &str) -> Result<()> {
|
||||
let mut cmd = Command::new("bash");
|
||||
cmd.arg("-c").arg(command);
|
||||
cmd.current_dir(&self.build_dir);
|
||||
|
||||
for (key, val) in &self.env_vars {
|
||||
cmd.env(key, val);
|
||||
}
|
||||
|
||||
let status = cmd
|
||||
.status()
|
||||
.with_context(|| format!("Failed to execute: {}", command))?;
|
||||
|
||||
if !status.success() {
|
||||
bail!(
|
||||
"Command failed with exit code {}: {}",
|
||||
status.code().unwrap_or(-1),
|
||||
command
|
||||
);
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
/// Execute inside a bubblewrap sandbox.
|
||||
fn exec_bwrap(&self, command: &str) -> Result<()> {
|
||||
let mut cmd = Command::new(&self.bwrap_path);
|
||||
|
||||
// Base filesystem: overlay the build directory as writable
|
||||
cmd.arg("--bind").arg(&self.build_dir).arg("/build");
|
||||
cmd.arg("--bind").arg(&self.staging_dir).arg("/staging");
|
||||
|
||||
// Mount essential system directories read-only
|
||||
cmd.arg("--ro-bind").arg("/usr").arg("/usr");
|
||||
cmd.arg("--ro-bind").arg("/lib").arg("/lib");
|
||||
if Path::new("/lib64").exists() {
|
||||
cmd.arg("--ro-bind").arg("/lib64").arg("/lib64");
|
||||
}
|
||||
cmd.arg("--ro-bind").arg("/bin").arg("/bin");
|
||||
cmd.arg("--ro-bind").arg("/sbin").arg("/sbin");
|
||||
|
||||
// Mount /dev minimal
|
||||
cmd.arg("--dev").arg("/dev");
|
||||
|
||||
// Mount /proc and /tmp
|
||||
cmd.arg("--proc").arg("/proc");
|
||||
cmd.arg("--tmpfs").arg("/tmp");
|
||||
|
||||
// Dependency bind mounts
|
||||
for (host, sandbox) in &self.ro_binds {
|
||||
cmd.arg("--ro-bind").arg(host).arg(sandbox);
|
||||
}
|
||||
|
||||
// PID namespace
|
||||
cmd.arg("--unshare-pid");
|
||||
|
||||
// Network isolation (unless explicitly allowed)
|
||||
if !self.allow_network {
|
||||
cmd.arg("--unshare-net");
|
||||
}
|
||||
|
||||
// Set working directory
|
||||
cmd.arg("--chdir").arg("/build");
|
||||
|
||||
// Environment variables
|
||||
for (key, val) in &self.env_vars {
|
||||
cmd.arg("--setenv").arg(key).arg(val);
|
||||
}
|
||||
|
||||
// Override PKG to point to sandbox staging path
|
||||
cmd.arg("--setenv").arg("PKG").arg("/staging");
|
||||
|
||||
// The actual command
|
||||
cmd.arg("bash").arg("-c").arg(command);
|
||||
|
||||
let status = cmd
|
||||
.status()
|
||||
.with_context(|| format!("Bubblewrap execution failed: {}", command))?;
|
||||
|
||||
if !status.success() {
|
||||
bail!(
|
||||
"Sandboxed command failed with exit code {}: {}",
|
||||
status.code().unwrap_or(-1),
|
||||
command
|
||||
);
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
/// Run the full build sequence: prepare → configure → make → install
|
||||
pub fn run_build(&self, pkg: &PackageDefinition) -> Result<()> {
|
||||
// Prepare step (optional: patching, autoreconf, etc.)
|
||||
if !pkg.build.prepare.is_empty() {
|
||||
log::info!(">>> Prepare step");
|
||||
self.exec(&pkg.build.prepare)?;
|
||||
}
|
||||
|
||||
// Configure step
|
||||
if !pkg.build.configure.is_empty() {
|
||||
log::info!(">>> Configure step");
|
||||
self.exec(&pkg.build.configure)?;
|
||||
}
|
||||
|
||||
// Build step
|
||||
log::info!(">>> Build step");
|
||||
self.exec(&pkg.build.make)?;
|
||||
|
||||
// Test step (optional)
|
||||
if !pkg.build.check.is_empty() {
|
||||
log::info!(">>> Check step");
|
||||
// Don't fail the build on test failures — log a warning
|
||||
if let Err(e) = self.exec(&pkg.build.check) {
|
||||
log::warn!("Check step failed (non-fatal): {}", e);
|
||||
}
|
||||
}
|
||||
|
||||
// Install step
|
||||
log::info!(">>> Install step");
|
||||
self.exec(&pkg.build.install)?;
|
||||
|
||||
// Post-install step (optional)
|
||||
if !pkg.build.post_install.is_empty() {
|
||||
log::info!(">>> Post-install step");
|
||||
self.exec(&pkg.build.post_install)?;
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
/// Get the path to the staging directory where installed files landed.
|
||||
pub fn staging_dir(&self) -> &Path {
|
||||
&self.staging_dir
|
||||
}
|
||||
|
||||
/// Get the build directory path.
|
||||
pub fn build_dir(&self) -> &Path {
|
||||
&self.build_dir
|
||||
}
|
||||
}
|
||||
|
||||
/// Collect all files in the staging directory (for database tracking).
|
||||
pub fn collect_staged_files(staging_dir: &Path) -> Result<Vec<PathBuf>> {
|
||||
let mut files = Vec::new();
|
||||
|
||||
if !staging_dir.exists() {
|
||||
return Ok(files);
|
||||
}
|
||||
|
||||
for entry in walkdir::WalkDir::new(staging_dir)
|
||||
.min_depth(1)
|
||||
.into_iter()
|
||||
.filter_map(|e| e.ok())
|
||||
{
|
||||
if entry.file_type().is_file() || entry.file_type().is_symlink() {
|
||||
// Store path relative to staging dir (= absolute path on target)
|
||||
let rel = entry
|
||||
.path()
|
||||
.strip_prefix(staging_dir)
|
||||
.unwrap_or(entry.path());
|
||||
files.push(PathBuf::from("/").join(rel));
|
||||
}
|
||||
}
|
||||
|
||||
Ok(files)
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn test_sandbox_backend_selection_disabled() {
|
||||
let mut config = DpackConfig::default();
|
||||
config.sandbox.enabled = false;
|
||||
|
||||
let pkg_toml = r#"
|
||||
[package]
|
||||
name = "test"
|
||||
version = "1.0"
|
||||
description = "test"
|
||||
url = "https://example.com"
|
||||
license = "MIT"
|
||||
|
||||
[source]
|
||||
url = "https://example.com/test-1.0.tar.xz"
|
||||
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
|
||||
|
||||
[dependencies]
|
||||
|
||||
[build]
|
||||
install = "make DESTDIR=${PKG} install"
|
||||
"#;
|
||||
let pkg = crate::config::PackageDefinition::from_str(pkg_toml).unwrap();
|
||||
let tmpdir = std::env::temp_dir().join("dpack-test-sandbox");
|
||||
let staging = std::env::temp_dir().join("dpack-test-staging");
|
||||
|
||||
let sandbox = BuildSandbox::new(&config, &pkg, &tmpdir, &staging).unwrap();
|
||||
assert_eq!(sandbox.backend, SandboxBackend::Direct);
|
||||
|
||||
// Cleanup
|
||||
let _ = std::fs::remove_dir_all(&tmpdir);
|
||||
let _ = std::fs::remove_dir_all(&staging);
|
||||
}
|
||||
}
|
||||
75
src/install/README.md
Normal file
@@ -0,0 +1,75 @@
# DarkForge Linux Installer

A CRUX-style interactive text-mode installer that runs from the DarkForge live ISO.

## Overview

The installer walks through 9 steps to get a working DarkForge system on disk:

1. **Disk selection** — choose the target drive
2. **Partitioning** — GPT auto-partition (ESP 512MB + Swap 96GB + Root)
3. **Filesystem creation** — FAT32 (ESP), ext4 (root), swap
4. **Base system install** — via dpack or direct copy from live environment
5. **Kernel install** — copies kernel to ESP
6. **User setup** — root password, user account with groups
7. **Locale/timezone** — timezone, locale, keyboard layout
8. **Boot config** — EFISTUB boot entry via efibootmgr
9. **Optional packages** — desktop, gaming, dev tool groups

## Requirements

The installer runs from the DarkForge live ISO environment. It expects:

- UEFI firmware (no legacy BIOS support)
- At least one NVMe or SATA disk
- `sgdisk` (GPT partitioning)
- `mkfs.ext4`, `mkfs.fat`, `mkswap`
- `efibootmgr` (UEFI boot entry creation)
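A quick pre-flight check (illustrative, not part of the installer itself) can confirm these tools are on `PATH` in the live environment; the `need` helper name is an assumption, not shipped code:

```bash
# Hypothetical pre-flight helper: report any missing prerequisite and
# succeed only when all are present.
need() {
    for tool in "$@"; do
        # command -v exits non-zero when the tool is not on PATH
        command -v "$tool" >/dev/null 2>&1 || { echo "missing: $tool"; return 1; }
    done
    echo "ok"
}

# In the live environment you would run:
#   need sgdisk mkfs.ext4 mkfs.fat mkswap efibootmgr
```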
## Usage

Boot the DarkForge live ISO, then:

```bash
install
```

Or run directly:

```bash
/install/install.sh
```

## Module Structure

```
install/
├── install.sh          # Main entry point (9-step wizard)
└── modules/
    ├── disk.sh         # Disk selection, partitioning, formatting, mounting
    ├── user.sh         # User account and hostname setup
    ├── locale.sh       # Timezone, locale, keyboard
    └── packages.sh     # Base system install, kernel, optional packages
```

## Partition Scheme

The auto-partitioner creates:

| # | Type       | Size      | Filesystem | Mount     |
|---|------------|-----------|------------|-----------|
| 1 | EFI System | 512MB     | FAT32      | /boot/efi |
| 2 | Linux Swap | 96GB      | swap       | —         |
| 3 | Linux Root | Remaining | ext4       | /         |

The 96GB swap matches the RAM size to enable hibernation.
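The scheme corresponds to the `sgdisk` sequence in `modules/disk.sh`, shown here dry-run style against a placeholder device (`plan` prints each command instead of executing it — these calls are destructive on a real disk):

```bash
# Placeholder target; the installer substitutes the disk the user selects.
DISK=/dev/nvme0n1

# Print the command instead of running it.
plan() { printf '%s\n' "$*"; }

plan sgdisk --zap-all "$DISK"
plan sgdisk -n 1:0:+512M -t 1:ef00 -c 1:'EFI System' "$DISK"
plan sgdisk -n 2:0:+96G -t 2:8200 -c 2:'Linux Swap' "$DISK"
plan sgdisk -n 3:0:0 -t 3:8300 -c 3:'Linux Root' "$DISK"
```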
## Post-Install

After installation and reboot, the system boots via EFISTUB directly to a tty1 auto-login, which launches the dwl Wayland compositor.
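The boot entry that makes this work is registered by `modules/disk.sh`; the sketch below shows the kernel command line it carries, with `ROOT_UUID` as a placeholder for the value `blkid` reports at install time:

```bash
# Placeholder — the installer fills this from:
#   blkid -o value -s UUID "$PART_ROOT"
ROOT_UUID=xxxx-example

CMDLINE="root=UUID=${ROOT_UUID} rw quiet"

# The installer then runs (requires root and a mounted efivarfs):
#   efibootmgr --create --disk /dev/nvme0n1 --part 1 \
#       --label "DarkForge Linux" --loader "/EFI/Linux/vmlinuz.efi" \
#       --unicode "$CMDLINE"
printf '%s\n' "$CMDLINE"
```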
## Repository

```
git@git.dannyhaslund.dk:danny8632/darkforge.git
```
150
src/install/install.sh
Executable file
@@ -0,0 +1,150 @@
#!/bin/bash
# ============================================================================
# DarkForge Linux — Interactive Installer
# ============================================================================
# Purpose: CRUX-style interactive installer that runs from the live environment.
#          Walks the user through disk selection, partitioning, base install,
#          user creation, locale setup, and boot configuration.
# Inputs:  User input (interactive prompts)
# Outputs: A fully installed DarkForge Linux system on the target disk
# ============================================================================

set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
MODULES_DIR="${SCRIPT_DIR}/modules"
MOUNT_POINT="/mnt/darkforge"
REPOS_DIR="/var/lib/dpack/repos"

# --- Colors and formatting --------------------------------------------------
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'

info()  { echo -e "${CYAN}:: ${1}${NC}"; }
ok()    { echo -e "${GREEN}:: ${1}${NC}"; }
warn()  { echo -e "${YELLOW}!! ${1}${NC}"; }
error() { echo -e "${RED}!! ${1}${NC}"; }
ask()   { echo -en "${BOLD}${1}${NC}"; }

# --- Welcome ----------------------------------------------------------------
clear
echo -e "${BOLD}"
echo " ╔══════════════════════════════════════════════════════════════╗"
echo " ║                                                              ║"
echo " ║              DarkForge Linux Installer v1.0                  ║"
echo " ║                                                              ║"
echo " ║       A custom Linux distribution optimized for gaming       ║"
echo " ║       and development on AMD Ryzen 9 9950X3D + RTX 5090      ║"
echo " ║                                                              ║"
echo " ╚══════════════════════════════════════════════════════════════╝"
echo -e "${NC}"
echo ""
echo " This installer will guide you through:"
echo "   1. Disk selection"
echo "   2. Partitioning"
echo "   3. Filesystem creation"
echo "   4. Base system installation"
echo "   5. Kernel installation"
echo "   6. User account setup"
echo "   7. Locale, timezone, and keyboard"
echo "   8. Boot configuration (EFISTUB)"
echo "   9. Post-install package selection"
echo ""
ask "Press Enter to begin, or Ctrl+C to exit..."
read -r

# --- Step 1: Disk selection -------------------------------------------------
info "Step 1: Disk Selection"
echo ""
source "${MODULES_DIR}/disk.sh"
select_disk
echo ""

# --- Step 2: Partitioning ---------------------------------------------------
info "Step 2: Partitioning"
echo ""
partition_disk
echo ""

# --- Step 3: Format and mount -----------------------------------------------
info "Step 3: Filesystem Creation"
echo ""
format_partitions
mount_partitions
echo ""

# --- Step 4: Base system installation ----------------------------------------
info "Step 4: Base System Installation"
echo ""
source "${MODULES_DIR}/packages.sh"
install_base_system
echo ""

# --- Step 5: Kernel installation ---------------------------------------------
info "Step 5: Kernel Installation"
echo ""
install_kernel
echo ""

# --- Step 6: User account setup ---------------------------------------------
info "Step 6: User Account Setup"
echo ""
source "${MODULES_DIR}/user.sh"
setup_users
echo ""

# --- Step 7: Locale, timezone, keyboard --------------------------------------
info "Step 7: Locale, Timezone, and Keyboard"
echo ""
source "${MODULES_DIR}/locale.sh"
configure_locale
echo ""

# --- Step 8: Boot configuration (EFISTUB) ------------------------------------
info "Step 8: Boot Configuration"
echo ""
configure_boot
echo ""

# --- Step 9: Post-install package selection ----------------------------------
info "Step 9: Additional Packages (Optional)"
echo ""
select_additional_packages
echo ""

# --- Finalize ----------------------------------------------------------------
info "Finalizing installation..."

# Generate fstab
generate_fstab

# Set hostname
echo "${INSTALL_HOSTNAME}" > "${MOUNT_POINT}/etc/hostname"

# Copy rc.conf with configured values
configure_rc_conf

# Unmount
info "Unmounting filesystems..."
umount -R "${MOUNT_POINT}" 2>/dev/null || true

echo ""
echo -e "${GREEN}${BOLD}"
echo " ╔══════════════════════════════════════════════════════════════╗"
echo " ║                                                              ║"
echo " ║                  Installation Complete!                      ║"
echo " ║                                                              ║"
echo " ║       Remove the installation media and reboot.              ║"
echo " ║       Your DarkForge system will boot directly via EFISTUB.  ║"
echo " ║                                                              ║"
echo " ╚══════════════════════════════════════════════════════════════╝"
echo -e "${NC}"
echo ""
ask "Reboot now? [y/N] "
read -r response
if [[ "${response}" =~ ^[Yy]$ ]]; then
    reboot
fi
175
src/install/modules/disk.sh
Executable file
@@ -0,0 +1,175 @@
#!/bin/bash
# ============================================================================
# DarkForge Linux Installer — Disk Module
# ============================================================================
# Handles disk selection, partitioning, formatting, and mounting.
# Partition scheme: GPT with ESP (512MB) + Swap (96GB) + Root (remaining)
# ============================================================================

INSTALL_DISK=""
PART_ESP=""
PART_SWAP=""
PART_ROOT=""

# --- List available disks and let user choose -------------------------------
select_disk() {
    echo "Available disks:"
    echo ""
    lsblk -d -o NAME,SIZE,MODEL,TRAN -n | grep -v "loop\|sr\|rom" | while read -r line; do
        echo "  ${line}"
    done
    echo ""

    ask "Enter the target disk (e.g., nvme0n1, sda): "
    read -r INSTALL_DISK

    # Validate
    if [ ! -b "/dev/${INSTALL_DISK}" ]; then
        error "Disk /dev/${INSTALL_DISK} not found."
        select_disk
        return
    fi

    echo ""
    warn "ALL DATA ON /dev/${INSTALL_DISK} WILL BE DESTROYED!"
    ask "Are you sure? Type 'yes' to confirm: "
    read -r confirm
    if [ "${confirm}" != "yes" ]; then
        error "Aborted."
        exit 1
    fi

    export INSTALL_DISK
}

# --- Partition the disk (GPT: ESP + swap + root) ----------------------------
partition_disk() {
    local disk="/dev/${INSTALL_DISK}"

    info "Creating GPT partition table on ${disk}..."

    # Wipe existing partition table
    sgdisk --zap-all "${disk}"

    # Create partitions:
    #   1: EFI System Partition (512MB)
    #   2: Swap (96GB — matches RAM for hibernation)
    #   3: Root (remaining space)
    sgdisk -n 1:0:+512M -t 1:ef00 -c 1:"EFI System" "${disk}"
    sgdisk -n 2:0:+96G -t 2:8200 -c 2:"Linux Swap" "${disk}"
    sgdisk -n 3:0:0 -t 3:8300 -c 3:"Linux Root" "${disk}"

    # Determine partition device names
    if [[ "${INSTALL_DISK}" == nvme* ]]; then
        PART_ESP="${disk}p1"
        PART_SWAP="${disk}p2"
        PART_ROOT="${disk}p3"
    else
        PART_ESP="${disk}1"
        PART_SWAP="${disk}2"
        PART_ROOT="${disk}3"
    fi

    export PART_ESP PART_SWAP PART_ROOT

    ok "Partitions created:"
    echo "  ESP:  ${PART_ESP} (512MB)"
    echo "  Swap: ${PART_SWAP} (96GB)"
    echo "  Root: ${PART_ROOT} (remaining)"

    # Wait for kernel to recognize new partitions
    partprobe "${disk}" 2>/dev/null || true
    sleep 1
}

# --- Format partitions ------------------------------------------------------
format_partitions() {
    info "Formatting partitions..."

    # ESP — FAT32
    mkfs.fat -F 32 -n "ESP" "${PART_ESP}"
    ok "ESP formatted (FAT32)"

    # Swap
    mkswap -L "darkforge-swap" "${PART_SWAP}"
    ok "Swap formatted (96GB)"

    # Root — ext4 (user chose ext4)
    mkfs.ext4 -L "darkforge-root" -O ^metadata_csum_seed "${PART_ROOT}"
    ok "Root formatted (ext4)"
}

# --- Mount partitions -------------------------------------------------------
mount_partitions() {
    info "Mounting filesystems to ${MOUNT_POINT}..."

    mkdir -p "${MOUNT_POINT}"
    mount "${PART_ROOT}" "${MOUNT_POINT}"

    mkdir -p "${MOUNT_POINT}/boot/efi"
    mount "${PART_ESP}" "${MOUNT_POINT}/boot/efi"

    swapon "${PART_SWAP}"

    ok "Filesystems mounted"
}

# --- Generate fstab from current mounts ------------------------------------
generate_fstab() {
    info "Generating /etc/fstab..."

    local root_uuid=$(blkid -o value -s UUID "${PART_ROOT}")
    local esp_uuid=$(blkid -o value -s UUID "${PART_ESP}")
    local swap_uuid=$(blkid -o value -s UUID "${PART_SWAP}")

    cat > "${MOUNT_POINT}/etc/fstab" << EOF
# DarkForge Linux — /etc/fstab
# Generated by installer on $(date -u +%Y-%m-%d)

# Root filesystem
UUID=${root_uuid} /         ext4 defaults,noatime 0 1

# EFI System Partition
UUID=${esp_uuid} /boot/efi vfat defaults,noatime 0 2

# Swap (96GB for hibernation)
UUID=${swap_uuid} none      swap defaults 0 0

# Tmpfs
tmpfs /tmp tmpfs defaults,nosuid,nodev 0 0
EOF

    ok "fstab generated"
}

# --- Configure EFISTUB boot entry -------------------------------------------
configure_boot() {
    info "Configuring UEFI boot entry (EFISTUB)..."

    local root_uuid=$(blkid -o value -s UUID "${PART_ROOT}")
    local esp_dev=$(blkid -o value -s PARTUUID "${PART_ESP}")

    # Copy kernel to ESP
    if [ -f "${MOUNT_POINT}/boot/vmlinuz" ]; then
        # Ensure the target directory exists before copying
        mkdir -p "${MOUNT_POINT}/boot/efi/EFI/Linux"
        cp "${MOUNT_POINT}/boot/vmlinuz" "${MOUNT_POINT}/boot/efi/EFI/Linux/vmlinuz.efi"
        ok "Kernel copied to ESP"
    else
        warn "No kernel found — you'll need to install one before booting"
    fi

    # Create UEFI boot entry via efibootmgr
    if command -v efibootmgr >/dev/null 2>&1; then
        local disk_dev="/dev/${INSTALL_DISK}"
        efibootmgr --create \
            --disk "${disk_dev}" \
            --part 1 \
            --label "DarkForge Linux" \
            --loader "/EFI/Linux/vmlinuz.efi" \
            --unicode "root=UUID=${root_uuid} rw quiet" \
            2>/dev/null && ok "UEFI boot entry created" \
            || warn "Failed to create UEFI boot entry — you may need to set it manually in BIOS"
    else
        warn "efibootmgr not found — set boot entry manually in UEFI firmware"
    fi
}
52
src/install/modules/locale.sh
Executable file
@@ -0,0 +1,52 @@
#!/bin/bash
# ============================================================================
# DarkForge Linux Installer — Locale Module
# ============================================================================
# Configures locale, timezone, and keyboard layout.
# ============================================================================

configure_locale() {
    # --- Timezone ---
    info "Available timezones: /usr/share/zoneinfo/"
    echo "  Common: America/New_York, America/Chicago, America/Denver,"
    echo "          America/Los_Angeles, Europe/London, Europe/Berlin"
    echo ""
    ask "Timezone [America/New_York]: "
    read -r tz
    tz="${tz:-America/New_York}"

    if [ -f "${MOUNT_POINT}/usr/share/zoneinfo/${tz}" ]; then
        ln -sf "/usr/share/zoneinfo/${tz}" "${MOUNT_POINT}/etc/localtime"
        ok "Timezone set to ${tz}"
    else
        warn "Timezone '${tz}' not found — using UTC"
        ln -sf /usr/share/zoneinfo/UTC "${MOUNT_POINT}/etc/localtime"
        tz="UTC"
    fi

    # --- Locale ---
    echo ""
    ask "Locale [en_US.UTF-8]: "
    read -r locale
    locale="${locale:-en_US.UTF-8}"

    # Generate locale
    echo "${locale} UTF-8" > "${MOUNT_POINT}/etc/locale.gen"
    chroot "${MOUNT_POINT}" locale-gen 2>/dev/null || true
    echo "LANG=${locale}" > "${MOUNT_POINT}/etc/locale.conf"
    ok "Locale set to ${locale}"

    # --- Keyboard ---
    echo ""
    ask "Keyboard layout [us]: "
    read -r keymap
    keymap="${keymap:-us}"

    echo "KEYMAP=${keymap}" > "${MOUNT_POINT}/etc/vconsole.conf"
    ok "Keyboard layout set to ${keymap}"

    # Store for rc.conf generation
    export INSTALL_TIMEZONE="${tz}"
    export INSTALL_LOCALE="${locale}"
    export INSTALL_KEYMAP="${keymap}"
}
189
src/install/modules/packages.sh
Executable file
@@ -0,0 +1,189 @@
|
||||
#!/bin/bash
|
||||
# ============================================================================
|
||||
# DarkForge Linux Installer — Packages Module
|
||||
# ============================================================================
|
||||
# Installs the base system packages and optional package groups.
|
||||
# Uses dpack for package management if available, falls back to direct copy.
|
||||
# ============================================================================
|
||||
|
||||
install_base_system() {
|
||||
info "Installing base system packages..."
|
||||
|
||||
# Bind-mount essential virtual filesystems for chroot
|
||||
mkdir -p "${MOUNT_POINT}"/{dev,proc,sys,run}
|
||||
mount --bind /dev "${MOUNT_POINT}/dev"
|
||||
mount --bind /dev/pts "${MOUNT_POINT}/dev/pts"
|
||||
mount -t proc proc "${MOUNT_POINT}/proc"
|
||||
mount -t sysfs sysfs "${MOUNT_POINT}/sys"
|
||||
mount -t tmpfs tmpfs "${MOUNT_POINT}/run"
|
||||
|
||||
# Copy package repos into the target
|
||||
mkdir -p "${MOUNT_POINT}/var/lib/dpack/repos"
|
||||
cp -a "${REPOS_DIR}"/* "${MOUNT_POINT}/var/lib/dpack/repos/" 2>/dev/null || true
|
||||
|
||||
# Check if dpack is available
|
||||
if command -v dpack >/dev/null 2>&1; then
|
||||
info "Installing via dpack..."
|
||||
|
||||
# Install core packages
|
||||
local core_packages=(
|
||||
glibc gcc binutils linux bash coreutils util-linux
|
||||
sed grep gawk findutils diffutils tar gzip xz zstd bzip2
|
||||
ncurses readline file less make patch m4 bison flex
|
||||
autoconf automake libtool gettext texinfo
|
||||
perl python pkg-config cmake meson ninja
|
||||
gmp mpfr mpc zlib openssl curl git expat libffi
|
||||
eudev sysvinit dbus dhcpcd shadow procps-ng e2fsprogs
|
||||
kmod iproute2 kbd groff man-db man-pages
|
||||
)
|
||||
|
||||
for pkg in "${core_packages[@]}"; do
|
||||
echo -n " Installing ${pkg}... "
|
||||
dpack install "${pkg}" 2>/dev/null && echo "OK" || echo "SKIP"
|
||||
done
|
||||
else
|
||||
info "dpack not available — installing from live filesystem..."
|
||||
|
||||
# Direct copy from the live root to the target
|
||||
# This copies the base system that's already installed in the live env
|
||||
local dirs_to_copy=(
|
||||
usr/bin usr/sbin usr/lib usr/lib64 usr/include usr/share
|
||||
etc lib lib64 bin sbin
|
||||
)
|
||||
|
||||
for dir in "${dirs_to_copy[@]}"; do
|
||||
if [ -d "/${dir}" ]; then
|
||||
echo -n " Copying /${dir}... "
|
||||
mkdir -p "${MOUNT_POINT}/${dir}"
|
||||
cp -a "/${dir}"/* "${MOUNT_POINT}/${dir}/" 2>/dev/null || true
|
||||
echo "OK"
|
||||
fi
|
||||
done
|
||||
fi
|
||||
|
||||
# Create essential directories
|
||||
mkdir -p "${MOUNT_POINT}"/{boot,home,mnt,opt,srv,tmp}
|
||||
mkdir -p "${MOUNT_POINT}"/var/{cache,lib,log,lock,run,spool,tmp}
|
||||
chmod 1777 "${MOUNT_POINT}/tmp"
|
||||
|
||||
ok "Base system installed"
|
||||
}
|
||||
|
||||
# --- Install kernel to the target system ------------------------------------
|
||||
install_kernel() {
|
||||
info "Installing kernel..."
|
||||
|
||||
if [ -f "/boot/vmlinuz" ]; then
|
||||
cp "/boot/vmlinuz" "${MOUNT_POINT}/boot/vmlinuz"
|
||||
ok "Kernel installed to /boot/vmlinuz"
|
||||
elif [ -f "/boot/vmlinuz-6.19.8-darkforge" ]; then
|
||||
cp "/boot/vmlinuz-6.19.8-darkforge" "${MOUNT_POINT}/boot/vmlinuz"
|
||||
ok "Kernel installed"
|
||||
else
|
||||
warn "No kernel found in live environment"
|
||||
warn "You'll need to build and install the kernel manually:"
|
||||
warn " cd /usr/src/linux && make -j32 && make modules_install"
|
||||
warn " cp arch/x86/boot/bzImage /boot/vmlinuz"
|
||||
fi
|
||||
|
||||
# Install kernel modules
|
||||
if [ -d "/lib/modules" ]; then
|
||||
cp -a /lib/modules "${MOUNT_POINT}/lib/"
|
||||
ok "Kernel modules installed"
|
||||
fi
|
||||
|
||||
# Install AMD microcode (if available)
|
||||
if [ -f "/boot/amd-ucode.img" ]; then
|
||||
cp "/boot/amd-ucode.img" "${MOUNT_POINT}/boot/"
|
||||
ok "AMD microcode installed"
|
||||
fi
|
||||
}
|
||||
|
||||
# --- Optional package groups ------------------------------------------------
|
||||
select_additional_packages() {
|
||||
echo " Available package groups:"
|
||||
echo ""
|
||||
echo " 1. Desktop Environment (dwl + Wayland + foot + fuzzel)"
|
||||
echo " 2. Gaming Stack (Steam + Wine + Proton + gamemode + mangohud)"
|
||||
echo " 3. Development Tools (rust + extra compilers)"
|
||||
echo " 4. All of the above"
|
||||
echo " 5. Skip (install later)"
|
||||
echo ""
|
||||
ask " Select groups to install [4]: "
|
||||
read -r choice
|
||||
choice="${choice:-4}"
|
||||
|
||||
case "${choice}" in
|
||||
1) install_group_desktop ;;
|
||||
2) install_group_gaming ;;
|
||||
3) install_group_dev ;;
|
||||
4)
|
||||
install_group_desktop
|
||||
install_group_gaming
|
||||
install_group_dev
        ;;
        5) info "Skipping additional packages" ;;
        *) warn "Invalid choice — skipping" ;;
    esac
}

install_group_desktop() {
    info "Installing desktop environment..."
    if command -v dpack >/dev/null 2>&1; then
        dpack install wayland wayland-protocols wlroots dwl xwayland \
            foot fuzzel libinput libxkbcommon xkeyboard-config \
            pipewire wireplumber polkit seatd \
            fontconfig freetype harfbuzz firefox zsh \
            wl-clipboard grim slurp 2>/dev/null
    fi
    ok "Desktop environment installed"
}

install_group_gaming() {
    info "Installing gaming stack..."
    if command -v dpack >/dev/null 2>&1; then
        dpack install nvidia-open steam wine gamemode mangohud \
            sdl2 vulkan-loader vulkan-tools dxvk vkd3d-proton 2>/dev/null
    fi
    ok "Gaming stack installed"
}

install_group_dev() {
    info "Installing development tools..."
    if command -v dpack >/dev/null 2>&1; then
        dpack install rust wezterm 2>/dev/null
    fi
    ok "Development tools installed"
}

# --- Configure rc.conf with install-time values ----------------------------
configure_rc_conf() {
    info "Configuring rc.conf..."

    cat > "${MOUNT_POINT}/etc/rc.conf" << EOF
#!/bin/bash
# DarkForge Linux — System Configuration
# Generated by installer on $(date -u +%Y-%m-%d)

HOSTNAME="${INSTALL_HOSTNAME:-darkforge}"
TIMEZONE="${INSTALL_TIMEZONE:-America/New_York}"
KEYMAP="${INSTALL_KEYMAP:-us}"
LOCALE="${INSTALL_LOCALE:-en_US.UTF-8}"
FONT="ter-v18n"

DAEMONS=(eudev syslog dbus dhcpcd pipewire)

MODULES=(nvidia nvidia-modeset nvidia-drm nvidia-uvm)

MODULE_PARAMS=(
    "nvidia-drm modeset=1"
)

NETWORK_INTERFACE="enp6s0"
NETWORK_DHCP=yes

HARDWARECLOCK="UTC"
EOF

    ok "rc.conf configured"
}
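The DAEMONS and MODULES arrays written above are plain bash arrays, so the boot scripts can consume them simply by sourcing the file. A minimal sketch of that pattern, assuming nothing about the real rc.multi beyond what the generated rc.conf shows (the temp file and echo lines are illustrative only):

```shell
#!/bin/bash
# Sketch only: how an rc.multi-style script could consume the arrays
# that configure_rc_conf writes. Uses a temporary rc.conf so the
# example is self-contained; the real file lives at /etc/rc.conf.
RC_CONF="$(mktemp)"
cat > "${RC_CONF}" << 'EOF'
DAEMONS=(eudev syslog dbus dhcpcd pipewire)
MODULES=(nvidia nvidia-modeset nvidia-drm nvidia-uvm)
EOF

. "${RC_CONF}"

loaded=()
for mod in "${MODULES[@]}"; do
    loaded+=("${mod}")        # real boot would run: modprobe "${mod}"
done

started=()
for daemon in "${DAEMONS[@]}"; do
    started+=("${daemon}")    # real boot would run: /etc/rc.d/${daemon} start
done

echo "modules: ${loaded[*]}"
echo "daemons: ${started[*]}"
rm -f "${RC_CONF}"
```

Because the file is sourced rather than parsed, ordering in the DAEMONS array is also the start order, which is why eudev comes first.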
51
src/install/modules/user.sh
Executable file
@@ -0,0 +1,51 @@
#!/bin/bash
# ============================================================================
# DarkForge Linux Installer — User Module
# ============================================================================
# Creates the root password and a user account.
# Default: username 'danny', added to wheel/video/audio/input/kvm groups.
# ============================================================================

INSTALL_USERNAME=""
INSTALL_HOSTNAME=""

setup_users() {
    # --- Hostname ---
    ask "Hostname [darkforge]: "
    read -r INSTALL_HOSTNAME
    INSTALL_HOSTNAME="${INSTALL_HOSTNAME:-darkforge}"
    export INSTALL_HOSTNAME

    # --- Root password ---
    echo ""
    info "Set the root password:"
    chroot "${MOUNT_POINT}" /bin/bash -c "passwd root"

    # --- User account ---
    echo ""
    ask "Username [danny]: "
    read -r INSTALL_USERNAME
    INSTALL_USERNAME="${INSTALL_USERNAME:-danny}"
    export INSTALL_USERNAME

    info "Creating user '${INSTALL_USERNAME}'..."

    chroot "${MOUNT_POINT}" /bin/bash -c "
        useradd -m -G wheel,video,audio,input,kvm -s /bin/zsh '${INSTALL_USERNAME}'
    "

    info "Set password for '${INSTALL_USERNAME}':"
    chroot "${MOUNT_POINT}" /bin/bash -c "passwd '${INSTALL_USERNAME}'"

    # Install the user's shell profile
    if [ -f "/install/configs/zprofile" ]; then
        cp "/install/configs/zprofile" "${MOUNT_POINT}/home/${INSTALL_USERNAME}/.zprofile"
        chroot "${MOUNT_POINT}" chown "${INSTALL_USERNAME}:${INSTALL_USERNAME}" "/home/${INSTALL_USERNAME}/.zprofile"
    fi

    # Update inittab with the chosen username for auto-login
    sed -i "s/--autologin danny/--autologin ${INSTALL_USERNAME}/" \
        "${MOUNT_POINT}/etc/inittab"

    ok "User '${INSTALL_USERNAME}' created"
}
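One hardening worth noting: `read -r` accepts any string, including names that `useradd` will later reject. A hypothetical pre-check (not part of the module above; the helper name and regex are illustrative) could validate the input before calling `useradd`:

```shell
#!/bin/bash
# Hypothetical validation helper, not in user.sh. Accepts typical
# portable usernames: a leading lowercase letter or underscore,
# then up to 31 characters of [a-z0-9_-].
valid_username() {
    printf '%s' "$1" | grep -Eq '^[a-z_][a-z0-9_-]{0,31}$'
}

valid_username "danny" && echo "danny: ok"
valid_username "9bad" || echo "9bad: rejected"
```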
57
src/iso/README.md
Normal file
@@ -0,0 +1,57 @@
# DarkForge ISO Builder

Builds a bootable live USB/CD image containing the DarkForge installer and a minimal live environment.

## Overview

The ISO builder compresses the base system into a squashfs image, creates a UEFI-bootable ISO via xorriso, and includes the installer scripts for deploying DarkForge to disk.

## Requirements

- `mksquashfs` (squashfs-tools) — filesystem compression
- `xorriso` — ISO9660 image creation
- `mkfs.fat` (dosfstools) — EFI partition image
- `mcopy` (mtools) — copy files into FAT images
- A completed base system build (Phase 3)
- A compiled kernel at `kernel/vmlinuz`

## Usage

```bash
bash src/iso/build-iso.sh
```

Output: `darkforge-live.iso` in the project root.

## ISO Layout

```
darkforge-live.iso
├── EFI/BOOT/BOOTX64.EFI    # Kernel (EFISTUB boot)
├── boot/cmdline.txt        # Kernel command line
├── LiveOS/rootfs.img       # squashfs-compressed root
└── install/                # Installer scripts
```

## Boot Method

The ISO boots via UEFI only (El Torito with an EFI System Partition). There is no legacy BIOS support. The kernel loads directly via EFISTUB.

## Testing

Test the ISO in QEMU:

```bash
qemu-system-x86_64 \
    -enable-kvm \
    -m 4G \
    -bios /usr/share/ovmf/OVMF.fd \
    -cdrom darkforge-live.iso \
    -boot d
```

## Repository

```
git@git.dannyhaslund.dk:danny8632/darkforge.git
```
215
src/iso/build-iso.sh
Executable file
@@ -0,0 +1,215 @@
#!/bin/bash
# ============================================================================
# DarkForge Linux — ISO Builder
# ============================================================================
# Purpose: Build a bootable live USB/CD image containing the DarkForge
#          installer and a minimal live environment.
# Inputs:  A completed base system (Phase 3 packages installed)
# Outputs: darkforge-live.iso
#
# Requirements: squashfs-tools, xorriso, mtools, dosfstools
#
# The ISO layout:
#   /EFI/BOOT/BOOTX64.EFI — The kernel (EFISTUB boot)
#   /boot/cmdline.txt     — Kernel command line
#   /LiveOS/rootfs.img    — squashfs-compressed root filesystem
#   /install/             — Installer scripts
# ============================================================================

set -euo pipefail

# --- Configuration ----------------------------------------------------------
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "${SCRIPT_DIR}/../.." && pwd)"
BUILD_DIR="/tmp/darkforge-iso-build"
ISO_OUTPUT="${PROJECT_ROOT}/darkforge-live.iso"
ISO_LABEL="DARKFORGE"

KERNEL_PATH="${PROJECT_ROOT}/kernel/vmlinuz"
# If there is no pre-built kernel, a placeholder is written instead (see below).

# squashfs compression algorithm
SQFS_COMP="zstd"
SQFS_OPTS="-comp ${SQFS_COMP} -Xcompression-level 19 -b 1M"

# --- Colors -----------------------------------------------------------------
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
NC='\033[0m'

info() { echo -e "${CYAN}>>> ${1}${NC}"; }
ok()   { echo -e "${GREEN}>>> ${1}${NC}"; }
warn() { echo -e "${YELLOW}!!! ${1}${NC}"; }
die()  { echo -e "${RED}!!! ${1}${NC}"; exit 1; }

# --- Preflight checks -------------------------------------------------------
info "DarkForge Linux ISO Builder"
echo ""

for tool in mksquashfs xorriso mkfs.fat mmd mcopy; do
    command -v "${tool}" >/dev/null 2>&1 || die "Required tool not found: ${tool}"
done

# --- Clean previous builds --------------------------------------------------
info "Cleaning previous build artifacts..."
rm -rf "${BUILD_DIR}"
mkdir -p "${BUILD_DIR}"/{iso,rootfs,efi}
mkdir -p "${BUILD_DIR}/iso/LiveOS"

# --- Build the live root filesystem -----------------------------------------
info "Preparing live root filesystem..."

ROOTFS="${BUILD_DIR}/rootfs"

# Create the essential directory structure
mkdir -p "${ROOTFS}"/{bin,boot,dev,etc,home,lib,lib64,mnt,opt,proc,root,run}
mkdir -p "${ROOTFS}"/{sbin,srv,sys,tmp,usr/{bin,include,lib,sbin,share},var}
mkdir -p "${ROOTFS}"/var/{cache,lib,log,tmp}
mkdir -p "${ROOTFS}"/etc/{rc.d,sysconfig}
mkdir -p "${ROOTFS}"/usr/share/{man,doc}

# Copy the base system (installed via dpack or copied directly from the chroot).
# This expects the base system to exist in a staging area.
BASE_SYSTEM="${PROJECT_ROOT}/build/base-system"
if [ -d "${BASE_SYSTEM}" ]; then
    info "Copying base system from ${BASE_SYSTEM}..."
    cp -a "${BASE_SYSTEM}"/* "${ROOTFS}"/
else
    warn "No base system found at ${BASE_SYSTEM}"
    warn "The ISO will contain a minimal skeleton only."
    warn "Build the base system first (Phase 3), then re-run."

    # Create a minimal skeleton for testing.
    # These files would normally come from the base system packages.
    cp -a /bin/busybox "${ROOTFS}/bin/" 2>/dev/null || true
fi

# --- Install DarkForge configuration ----------------------------------------
info "Installing DarkForge configuration..."
cp "${PROJECT_ROOT}/configs/rc.conf" "${ROOTFS}/etc/rc.conf"
cp "${PROJECT_ROOT}/configs/inittab" "${ROOTFS}/etc/inittab"
cp "${PROJECT_ROOT}/configs/fstab.template" "${ROOTFS}/etc/fstab"
cp -a "${PROJECT_ROOT}/configs/rc.d/"* "${ROOTFS}/etc/rc.d/"

# Live-specific: override inittab for installer mode
cat > "${ROOTFS}/etc/inittab.live" << 'INITTAB'
# DarkForge Live — boots to installer prompt
id:3:initdefault:
si::sysinit:/etc/rc.d/rc.sysinit
l3:3:wait:/etc/rc.d/rc.multi
1:2345:respawn:/sbin/agetty --autologin root --noclear 38400 tty1 linux
2:2345:respawn:/sbin/agetty 38400 tty2 linux
ca::ctrlaltdel:/sbin/shutdown -r now
INITTAB
cp "${ROOTFS}/etc/inittab.live" "${ROOTFS}/etc/inittab"

# Live-specific: auto-launch installer prompt on login
cat > "${ROOTFS}/root/.bash_profile" << 'PROFILE'
echo ""
echo "  ╔══════════════════════════════════════════╗"
echo "  ║        DarkForge Linux Installer         ║"
echo "  ║                                          ║"
echo "  ║  Type 'install' to begin installation    ║"
echo "  ║  Type 'shell' for a live shell           ║"
echo "  ╚══════════════════════════════════════════╝"
echo ""

alias install='/install/install.sh'
alias shell='exec /bin/bash --login'
PROFILE

# --- Copy installer scripts -------------------------------------------------
info "Copying installer scripts..."
mkdir -p "${ROOTFS}/install"
cp -a "${PROJECT_ROOT}/src/install/"* "${ROOTFS}/install/" 2>/dev/null || true

# Copy dpack repos (for base package installation during install)
mkdir -p "${ROOTFS}/var/lib/dpack/repos"
cp -a "${PROJECT_ROOT}/src/repos/"* "${ROOTFS}/var/lib/dpack/repos/" 2>/dev/null || true

# --- Create the squashfs image ----------------------------------------------
info "Creating squashfs image (${SQFS_COMP})..."
# SQFS_OPTS is intentionally unquoted so it word-splits into options.
mksquashfs "${ROOTFS}" "${BUILD_DIR}/iso/LiveOS/rootfs.img" \
    ${SQFS_OPTS} \
    -noappend \
    -wildcards \
    -e 'proc/*' 'sys/*' 'dev/*' 'run/*' 'tmp/*'

ok "squashfs image created: $(du -sh "${BUILD_DIR}/iso/LiveOS/rootfs.img" | cut -f1)"

# --- Prepare EFI boot -------------------------------------------------------
info "Preparing UEFI boot..."

# Create the EFI boot directory structure
mkdir -p "${BUILD_DIR}/iso/EFI/BOOT"

# Copy the kernel as the EFI boot binary
if [ -f "${KERNEL_PATH}" ]; then
    cp "${KERNEL_PATH}" "${BUILD_DIR}/iso/EFI/BOOT/BOOTX64.EFI"
    ok "Kernel copied to EFI/BOOT/BOOTX64.EFI"
else
    warn "No kernel found at ${KERNEL_PATH}"
    warn "You'll need to copy the kernel manually before the ISO is bootable."
    # Create a placeholder
    echo "PLACEHOLDER — replace with real kernel" > "${BUILD_DIR}/iso/EFI/BOOT/BOOTX64.EFI"
fi

# Kernel command line (embedded via EFISTUB).
# The kernel reads its cmdline from a built-in default or from the EFI boot entry.
mkdir -p "${BUILD_DIR}/iso/boot"
echo "root=live:LABEL=${ISO_LABEL} rd.live.image rd.live.overlay.overlayfs=1 quiet" \
    > "${BUILD_DIR}/iso/boot/cmdline.txt"

# --- Create the EFI System Partition image (for El Torito boot) -------------
info "Creating EFI boot image for ISO..."
ESP_IMG="${BUILD_DIR}/efi/efiboot.img"

# Calculate the size needed (kernel + overhead; sizes in KiB)
KERNEL_SIZE=$(stat -c%s "${BUILD_DIR}/iso/EFI/BOOT/BOOTX64.EFI" 2>/dev/null || echo "1048576")
ESP_SIZE=$(( KERNEL_SIZE / 1024 + 2048 ))   # Add 2MB overhead
[ ${ESP_SIZE} -lt 4096 ] && ESP_SIZE=4096   # Minimum 4MB

dd if=/dev/zero of="${ESP_IMG}" bs=1K count=${ESP_SIZE} 2>/dev/null
mkfs.fat "${ESP_IMG}" >/dev/null   # let mkfs.fat pick a FAT type that fits
mmd -i "${ESP_IMG}" ::/EFI
mmd -i "${ESP_IMG}" ::/EFI/BOOT
mcopy -i "${ESP_IMG}" "${BUILD_DIR}/iso/EFI/BOOT/BOOTX64.EFI" ::/EFI/BOOT/BOOTX64.EFI

ok "EFI boot image created (${ESP_SIZE}K)"

# --- Build the ISO ----------------------------------------------------------
info "Building ISO image..."

# The ESP image must also live inside the ISO tree so that xorriso's -e
# option can reference it by its in-ISO path.
mkdir -p "${BUILD_DIR}/iso/efi"
cp "${ESP_IMG}" "${BUILD_DIR}/iso/efi/efiboot.img"

xorriso -as mkisofs \
    -o "${ISO_OUTPUT}" \
    -iso-level 3 \
    -full-iso9660-filenames \
    -joliet \
    -rational-rock \
    -volid "${ISO_LABEL}" \
    -e efi/efiboot.img \
    -no-emul-boot \
    -isohybrid-gpt-basdat \
    -append_partition 2 0xef "${ESP_IMG}" \
    "${BUILD_DIR}/iso"

# --- Summary ----------------------------------------------------------------
echo ""
ok "═══════════════════════════════════════════════"
ok "  DarkForge Linux ISO built successfully!"
ok ""
ok "  Output: ${ISO_OUTPUT}"
ok "  Size:   $(du -sh "${ISO_OUTPUT}" | cut -f1)"
ok ""
ok "  Boot:   UEFI only (EFISTUB)"
ok "  Root:   squashfs (${SQFS_COMP})"
ok "═══════════════════════════════════════════════"
echo ""

# --- Cleanup ----------------------------------------------------------------
info "Cleaning up build directory..."
rm -rf "${BUILD_DIR}"

ok "Done."
135
src/repos/README.md
Normal file
@@ -0,0 +1,135 @@
# DarkForge Package Repository

124 package definitions for the complete DarkForge Linux system. Each package is a TOML file describing how to download, build, and install a piece of software.

## Repository Layout

```
repos/
├── core/      67 packages — base system (toolchain, kernel, utilities, system daemons)
├── extra/     26 packages — libraries, frameworks, drivers
├── desktop/   19 packages — Wayland compositor, terminals, applications
└── gaming/    12 packages — Steam, Wine, Proton, game tools
```
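As a quick cross-check, the per-repo counts in the tree above sum to the total stated in the introduction:

```shell
# Per-repo counts from the layout above; the sum matches the
# 124-package figure in the introduction.
core=67; extra=26; desktop=19; gaming=12
echo $(( core + extra + desktop + gaming ))   # → 124
```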
## Package Format

Each package lives in `<repo>/<name>/<name>.toml`. See the dpack README for the full format specification.

Example (`core/zlib/zlib.toml`):

```toml
[package]
name = "zlib"
version = "1.3.1"
description = "Compression library implementing the deflate algorithm"
url = "https://zlib.net/"
license = "zlib"

[source]
url = "https://zlib.net/zlib-${version}.tar.xz"
sha256 = "38ef96b8dfe510d42707d9c781877914792541133e1870841463bfa73f883e32"

[dependencies]
run = []
build = ["gcc", "make"]

[build]
configure = "./configure --prefix=/usr"
make = "make"
install = "make DESTDIR=${PKG} install"
```
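Since every definition must carry the same four sections, a minimal lint pass can be scripted. The helper below is illustrative only (it is not a dpack subcommand); it simply greps for the section headers shown above:

```shell
#!/bin/bash
# Illustrative check: verify a package TOML declares all four
# required sections. The demo runs against a temporary file.
check_pkg_toml() {
    local toml="$1" status=0 section
    for section in package source dependencies build; do
        grep -q "^\[${section}\]" "${toml}" || { echo "missing [${section}]"; status=1; }
    done
    return ${status}
}

tmp="$(mktemp)"
printf '[package]\n[source]\n[dependencies]\n[build]\n' > "${tmp}"
check_pkg_toml "${tmp}" && echo "ok: all sections present"
rm -f "${tmp}"
```

A grep-based check like this obviously does not validate TOML syntax; a real linter would parse the file with the same parser dpack uses.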
## core/ — Base System (67 packages)

The complete base system needed to boot to a shell:

**Toolchain:** gcc, glibc, binutils, gmp, mpfr, mpc, linux (kernel)

**Utilities:** coreutils, util-linux, bash, sed, grep, gawk, findutils, diffutils, tar, gzip, xz, zstd, bzip2, ncurses, readline, file, less, make, patch, m4

**System:** eudev, sysvinit, dbus, dhcpcd, shadow, procps-ng, e2fsprogs, kmod, iproute2, kbd, amd-microcode

**Dev tools:** cmake, meson, ninja, python, perl, autoconf, automake, libtool, bison, flex, gettext, texinfo, pkg-config, gperf

**Libraries:** openssl, curl, git, zlib, expat, libffi, libxml2, pcre2, glib, libmnl, libpipeline, bc

**Docs:** groff, man-db, man-pages

## extra/ — Libraries and Frameworks (26 packages)

Libraries needed by the desktop and gaming stacks:

**Audio:** pipewire, wireplumber

**Graphics:** mesa, vulkan-headers, vulkan-loader, vulkan-tools, libdrm, nvidia-open

**Fonts:** fontconfig, freetype, harfbuzz, libpng

**UI:** pango, cairo, pixman, qt6-base, lxqt-policykit

**Security:** polkit, duktape, gnutls, nettle, libtasn1, p11-kit

**Other:** seatd, lua, rust

## desktop/ — Wayland Desktop (19 packages)

The complete desktop environment:

**Wayland:** wayland, wayland-protocols, wlroots, xwayland

**Compositor:** dwl (dynamic window manager for Wayland, dwm-like)

**Input:** libinput, libevdev, mtdev, libxkbcommon, xkeyboard-config

**Apps:** foot (terminal), fuzzel (launcher), firefox, zsh, wezterm, freecad

**Tools:** wl-clipboard, grim (screenshots), slurp (region select)

## gaming/ — Gaming Stack (12 packages)

Everything needed for gaming on Linux:

**Platform:** steam, wine, proton-ge, protontricks, winetricks

**Translation:** dxvk (D3D9/10/11→Vulkan), vkd3d-proton (D3D12→Vulkan)

**Tools:** gamemode, mangohud, sdl2

**Runtime:** openjdk (for PrismLauncher/Minecraft), prismlauncher

## Adding a New Package

1. Create the directory: `mkdir -p <repo>/<name>`
2. Create the definition: `<repo>/<name>/<name>.toml`
3. Fill in all sections: `[package]`, `[source]`, `[dependencies]`, `[build]`
4. Compute the SHA256: `sha256sum <tarball>`
5. Test: `dpack install <name>`

Alternatively, convert from CRUX or Gentoo:

```bash
dpack convert /path/to/Pkgfile -o repos/core/foo/foo.toml
dpack convert /path/to/foo-1.0.ebuild -o repos/extra/foo/foo.toml
```
## SHA256 Checksums

Most package definitions currently carry placeholder checksums (`aaa...`). These must be replaced with real checksums before building. To compute them:

```bash
for pkg in core/*/; do
    name=$(basename "${pkg}")
    toml="${pkg}${name}.toml"
    version=$(sed -n 's/^version = "\(.*\)"/\1/p' "${toml}" | head -1)
    # Take the tarball url from the [source] section, not the [package] homepage
    url=$(awk '/^\[source\]/{f=1} f && sub(/^url = "/,""){sub(/"$/,""); print; exit}' "${toml}" \
        | sed "s/\${version}/${version}/g")
    echo "Downloading ${name} from ${url}..."
    wget -q "${url}" -O "/tmp/${name}.tar" && sha256sum "/tmp/${name}.tar"
done
```
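Once a real checksum is computed, it can be spliced into the definition in place of the placeholder. The small helper below is illustrative only (not a dpack subcommand), and assumes one `sha256 =` line per file, as in the format above:

```shell
#!/bin/bash
# Illustrative helper: replace the placeholder sha256 in a package
# definition with a freshly computed checksum.
update_sha256() {
    local toml="$1" sum="$2"
    sed -i "s/^sha256 = \".*\"/sha256 = \"${sum}\"/" "${toml}"
}

# Demo against a temporary file:
tmp="$(mktemp)"
printf 'sha256 = "aaaa"\n' > "${tmp}"
update_sha256 "${tmp}" "38ef96b8"
grep '^sha256' "${tmp}"   # → sha256 = "38ef96b8"
rm -f "${tmp}"
```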
## Repository

```
git@git.dannyhaslund.dk:danny8632/darkforge.git
```

Package definitions live in `src/repos/` in the main DarkForge repo.
20
src/repos/core/amd-microcode/amd-microcode.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "amd-microcode"
version = "20261201"
description = "AMD CPU microcode updates"
url = "https://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git"
license = "Redistributable"

[source]
url = "https://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git/snapshot/linux-firmware-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = []
build = []

[build]
system = "custom"
configure = """"""
make = """make"""
install = """mkdir -p ${PKG}/lib/firmware/amd-ucode && cp amd-ucode/*.bin ${PKG}/lib/firmware/amd-ucode/ && mkdir -p ${PKG}/boot && cat ${PKG}/lib/firmware/amd-ucode/microcode_amd*.bin > ${PKG}/boot/amd-ucode.img"""
20
src/repos/core/autoconf/autoconf.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "autoconf"
version = "2.72"
description = "GNU autoconf build configuration"
url = "https://www.gnu.org/software/autoconf/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/autoconf/autoconf-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["perl", "m4"]
build = ["make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/automake/automake.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "automake"
version = "1.18"
description = "GNU automake Makefile generator"
url = "https://www.gnu.org/software/automake/"
license = "GPL-2.0"

[source]
url = "https://ftp.gnu.org/gnu/automake/automake-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["perl", "autoconf"]
build = ["make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/bash/bash.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "bash"
version = "5.3"
description = "GNU Bourne-Again Shell"
url = "https://www.gnu.org/software/bash/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/bash/bash-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "readline", "ncurses"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --without-bash-malloc --with-installed-readline"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
19
src/repos/core/bc/bc.toml
Normal file
@@ -0,0 +1,19 @@
[package]
name = "bc"
version = "7.0.3"
description = "Arbitrary precision calculator"
url = "https://git.gavinhoward.com/gavin/bc"
license = "BSD-2-Clause"

[source]
url = "https://github.com/gavinhoward/bc/releases/download/${version}/bc-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "readline"]
build = ["gcc", "make"]

[build]
configure = """./configure --prefix=/usr -O3 -r"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/binutils/binutils.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "binutils"
version = "2.46"
description = "GNU binary utilities"
url = "https://www.gnu.org/software/binutils/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/binutils/binutils-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "zlib"]
build = ["make", "texinfo"]

[build]
system = "autotools"
configure = """mkdir build && cd build && ../configure --prefix=/usr --enable-gold --enable-ld=default --enable-plugins --enable-shared --disable-werror --with-system-zlib --enable-default-hash-style=gnu"""
make = """make tooldir=/usr"""
install = """make DESTDIR=${PKG} tooldir=/usr install"""
20
src/repos/core/bison/bison.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "bison"
version = "3.8.2"
description = "GNU parser generator"
url = "https://www.gnu.org/software/bison/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/bison/bison-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "m4"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/bzip2/bzip2.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "bzip2"
version = "1.0.8"
description = "Block-sorting file compressor"
url = "https://sourceware.org/bzip2/"
license = "bzip2-1.0.6"

[source]
url = "https://sourceware.org/pub/bzip2/bzip2-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "custom"
configure = """"""
make = """make -f Makefile-libbz2_so && make clean && make"""
install = """make PREFIX=${PKG}/usr install"""
20
src/repos/core/cmake/cmake.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "cmake"
version = "4.2.3"
description = "Cross-platform build system generator"
url = "https://cmake.org/"
license = "BSD-3-Clause"

[source]
url = "https://cmake.org/files/v4.2/cmake-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "curl", "expat", "zlib", "xz", "zstd"]
build = ["gcc", "make"]

[build]
system = "custom"
configure = """./bootstrap --prefix=/usr --system-libs --no-system-jsoncpp --no-system-cppdap --no-system-librhash"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/coreutils/coreutils.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "coreutils"
version = "9.6"
description = "GNU core utilities"
url = "https://www.gnu.org/software/coreutils/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/coreutils/coreutils-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make", "perl"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --enable-no-install-program=kill,uptime"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/curl/curl.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "curl"
version = "8.19.0"
description = "URL transfer library and command-line tool"
url = "https://curl.se/"
license = "MIT"

[source]
url = "https://curl.se/download/curl-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "openssl", "zlib", "zstd"]
build = ["gcc", "make", "pkg-config"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --disable-static --with-openssl --enable-threaded-resolver --with-ca-path=/etc/ssl/certs"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/dbus/dbus.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "dbus"
version = "1.16.2"
description = "D-Bus message bus system"
url = "https://www.freedesktop.org/wiki/Software/dbus/"
license = "AFL-2.1"

[source]
url = "https://dbus.freedesktop.org/releases/dbus/dbus-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "expat"]
build = ["gcc", "make", "pkg-config", "meson", "ninja"]

[build]
system = "meson"
configure = """meson setup build --prefix=/usr --buildtype=release -Druntime_dir=/run -Dsystem_pid_file=/run/dbus/pid -Dsystem_socket=/run/dbus/system_bus_socket -Ddoxygen_docs=disabled -Dxml_docs=disabled"""
make = """ninja -C build"""
install = """DESTDIR=${PKG} ninja -C build install"""
20
src/repos/core/dhcpcd/dhcpcd.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "dhcpcd"
version = "10.3.0"
description = "DHCP client daemon"
url = "https://github.com/NetworkConfiguration/dhcpcd"
license = "BSD-2-Clause"

[source]
url = "https://github.com/NetworkConfiguration/dhcpcd/releases/download/v${version}/dhcpcd-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "eudev"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --sysconfdir=/etc --libexecdir=/usr/lib/dhcpcd --dbdir=/var/lib/dhcpcd --runstatedir=/run --disable-privsep"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/diffutils/diffutils.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "diffutils"
version = "3.10"
description = "GNU file comparison utilities"
url = "https://www.gnu.org/software/diffutils/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/diffutils/diffutils-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/e2fsprogs/e2fsprogs.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "e2fsprogs"
version = "1.47.4"
description = "Ext2/3/4 filesystem utilities"
url = "https://e2fsprogs.sourceforge.net/"
license = "GPL-2.0"

[source]
url = "https://downloads.sourceforge.net/project/e2fsprogs/e2fsprogs/v${version}/e2fsprogs-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "util-linux"]
build = ["gcc", "make", "pkg-config", "texinfo"]

[build]
system = "autotools"
configure = """mkdir -v build && cd build && ../configure --prefix=/usr --bindir=/usr/bin --with-root-prefix="" --enable-elf-shlibs --disable-libblkid --disable-libuuid --disable-uuidd --disable-fsck"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/eudev/eudev.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "eudev"
version = "3.2.14"
description = "Device manager (udev fork without systemd)"
url = "https://github.com/eudev-project/eudev"
license = "GPL-2.0"

[source]
url = "https://github.com/eudev-project/eudev/releases/download/v${version}/eudev-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "kmod", "util-linux"]
build = ["gcc", "make", "gperf", "pkg-config"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --bindir=/usr/sbin --sysconfdir=/etc --enable-manpages --disable-static"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/expat/expat.toml
Normal file
20
src/repos/core/expat/expat.toml
Normal file
@@ -0,0 +1,20 @@
|
||||
[package]
|
||||
name = "expat"
|
||||
version = "2.7.4"
|
||||
description = "XML parsing library"
|
||||
url = "https://libexpat.github.io/"
|
||||
license = "MIT"
|
||||
|
||||
[source]
|
||||
url = "https://github.com/libexpat/libexpat/releases/download/R_2_7_4/expat-${version}.tar.xz"
|
||||
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
|
||||
|
||||
[dependencies]
|
||||
run = ["glibc"]
|
||||
build = ["gcc", "make"]
|
||||
|
||||
[build]
|
||||
system = "autotools"
|
||||
configure = """./configure --prefix=/usr --disable-static"""
|
||||
make = """make"""
|
||||
install = """make DESTDIR=${PKG} install"""
|
||||
20
src/repos/core/file/file.toml
Normal file
20
src/repos/core/file/file.toml
Normal file
@@ -0,0 +1,20 @@
|
||||
[package]
|
||||
name = "file"
|
||||
version = "5.47"
|
||||
description = "File type identification utility"
|
||||
url = "https://www.darwinsys.com/file/"
|
||||
license = "BSD-2-Clause"
|
||||
|
||||
[source]
|
||||
url = "https://astron.com/pub/file/file-${version}.tar.gz"
|
||||
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
|
||||
|
||||
[dependencies]
|
||||
run = ["glibc", "zlib"]
|
||||
build = ["gcc", "make"]
|
||||
|
||||
[build]
|
||||
system = "autotools"
|
||||
configure = """./configure --prefix=/usr"""
|
||||
make = """make"""
|
||||
install = """make DESTDIR=${PKG} install"""
|
||||
20
src/repos/core/findutils/findutils.toml
Normal file
20
src/repos/core/findutils/findutils.toml
Normal file
@@ -0,0 +1,20 @@
|
||||
[package]
|
||||
name = "findutils"
|
||||
version = "4.10.0"
|
||||
description = "GNU file search utilities"
|
||||
url = "https://www.gnu.org/software/findutils/"
|
||||
license = "GPL-3.0"
|
||||
|
||||
[source]
|
||||
url = "https://ftp.gnu.org/gnu/findutils/findutils-${version}.tar.xz"
|
||||
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
|
||||
|
||||
[dependencies]
|
||||
run = ["glibc"]
|
||||
build = ["gcc", "make"]
|
||||
|
||||
[build]
|
||||
system = "autotools"
|
||||
configure = """./configure --prefix=/usr --localstatedir=/var/lib/locate"""
|
||||
make = """make"""
|
||||
install = """make DESTDIR=${PKG} install"""
|
||||
20
src/repos/core/flex/flex.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "flex"
version = "2.6.4"
description = "Fast lexical analyzer generator"
url = "https://github.com/westes/flex"
license = "BSD-2-Clause"

[source]
url = "https://github.com/westes/flex/releases/download/v${version}/flex-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "m4"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --disable-static"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/gawk/gawk.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "gawk"
version = "5.4.0"
description = "GNU awk text processing language"
url = "https://www.gnu.org/software/gawk/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/gawk/gawk-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "readline", "mpfr"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/gcc/gcc.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "gcc"
version = "15.2.0"
description = "The GNU Compiler Collection"
url = "https://gcc.gnu.org/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/pub/gnu/gcc/gcc-${version}/gcc-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "gmp", "mpfr", "mpc", "zlib"]
build = ["make", "sed", "gawk", "texinfo"]

[build]
system = "autotools"
configure = """mkdir build && cd build && ../configure --prefix=/usr --enable-languages=c,c++ --enable-default-pie --enable-default-ssp --disable-multilib --with-system-zlib"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/gettext/gettext.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "gettext"
version = "0.23.1"
description = "GNU internationalization utilities"
url = "https://www.gnu.org/software/gettext/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/gettext/gettext-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --disable-static"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/git/git.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "git"
version = "2.53.0"
description = "Distributed version control system"
url = "https://git-scm.com/"
license = "GPL-2.0"

[source]
url = "https://www.kernel.org/pub/software/scm/git/git-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "curl", "openssl", "zlib", "expat", "perl", "python"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --with-gitconfig=/etc/gitconfig --with-python=python3"""
make = """make"""
install = """make DESTDIR=${PKG} perllibdir=/usr/lib/perl5/5.40/site_perl install"""
20
src/repos/core/glib/glib.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "glib"
version = "2.84.1"
description = "GLib low-level core library"
url = "https://gitlab.gnome.org/GNOME/glib"
license = "LGPL-2.1"

[source]
url = "https://download.gnome.org/sources/glib/2.84/glib-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "libffi", "zlib", "pcre2"]
build = ["gcc", "meson", "ninja", "pkg-config", "python"]

[build]
system = "meson"
configure = """meson setup build --prefix=/usr --buildtype=release -Dman-pages=disabled"""
make = """ninja -C build"""
install = """DESTDIR=${PKG} ninja -C build install"""
20
src/repos/core/glibc/glibc.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "glibc"
version = "2.43"
description = "The GNU C Library"
url = "https://www.gnu.org/software/libc/"
license = "LGPL-2.1"

[source]
url = "https://ftp.gnu.org/gnu/glibc/glibc-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = []
build = ["gcc", "binutils", "make", "sed", "gawk"]

[build]
system = "autotools"
configure = """mkdir -v build && cd build && ../configure --prefix=/usr --disable-werror --enable-kernel=5.4 --enable-stack-protector=strong libc_cv_slibdir=/usr/lib"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/gmp/gmp.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "gmp"
version = "6.3.0"
description = "GNU Multiple Precision Arithmetic Library"
url = "https://gmplib.org/"
license = "LGPL-3.0"

[source]
url = "https://gmplib.org/download/gmp/gmp-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = []
build = ["gcc", "make", "m4"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --enable-cxx --disable-static"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
19
src/repos/core/gperf/gperf.toml
Normal file
@@ -0,0 +1,19 @@
[package]
name = "gperf"
version = "3.1"
description = "Perfect hash function generator"
url = "https://www.gnu.org/software/gperf/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/gperf/gperf-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/grep/grep.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "grep"
version = "3.14"
description = "GNU grep pattern matching"
url = "https://www.gnu.org/software/grep/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/grep/grep-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/groff/groff.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "groff"
version = "1.24.1"
description = "GNU troff typesetting system"
url = "https://www.gnu.org/software/groff/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/groff/groff-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "perl"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
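The `run`/`build` dependency split in these files implies dpack must compute a valid build order before installing anything — the gmp → mpfr → mpc chain later in this commit is the classic case. A minimal topological-sort sketch of that ordering (Kahn's algorithm, illustrative only; the actual dpack resolver is the Rust component named in the commit message and may differ):

```python
from collections import deque

def build_order(deps: dict[str, list[str]]) -> list[str]:
    """Return a package order in which every dependency precedes its dependents."""
    # indegree = number of not-yet-built dependencies per package
    indegree = {pkg: len(ds) for pkg, ds in deps.items()}
    dependents: dict[str, list[str]] = {pkg: [] for pkg in deps}
    for pkg, ds in deps.items():
        for d in ds:
            dependents[d].append(pkg)
    # Start from packages with no dependencies (sorted for determinism).
    ready = deque(sorted(p for p, n in indegree.items() if n == 0))
    order = []
    while ready:
        pkg = ready.popleft()
        order.append(pkg)
        for dep in dependents[pkg]:
            indegree[dep] -= 1
            if indegree[dep] == 0:
                ready.append(dep)
    if len(order) != len(deps):
        raise ValueError("dependency cycle detected")
    return order

# Run-dependency edges as declared by the gmp/mpfr/mpc definitions in this commit.
print(build_order({"gmp": [], "mpfr": ["gmp"], "mpc": ["gmp", "mpfr"]}))
# ['gmp', 'mpfr', 'mpc']
```

The cycle check matters here: a package set with mutually dependent `run` entries can never be ordered, and failing loudly beats looping forever.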
20
src/repos/core/gzip/gzip.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "gzip"
version = "1.14"
description = "GNU compression utility"
url = "https://www.gnu.org/software/gzip/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/gzip/gzip-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/iproute2/iproute2.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "iproute2"
version = "6.19.0"
description = "IP routing utilities"
url = "https://wiki.linuxfoundation.org/networking/iproute2"
license = "GPL-2.0"

[source]
url = "https://www.kernel.org/pub/linux/utils/net/iproute2/iproute2-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "libmnl"]
build = ["gcc", "make", "pkg-config", "bison", "flex"]

[build]
system = "custom"
configure = """"""
make = """make NETNS_RUN_DIR=/run/netns"""
install = """make DESTDIR=${PKG} SBINDIR=/usr/sbin install"""
20
src/repos/core/kbd/kbd.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "kbd"
version = "2.6.4"
description = "Keyboard utilities"
url = "https://kbd-project.org/"
license = "GPL-2.0"

[source]
url = "https://www.kernel.org/pub/linux/utils/kbd/kbd-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make", "autoconf", "automake"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --disable-vlock"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/kmod/kmod.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "kmod"
version = "34.2"
description = "Linux kernel module handling"
url = "https://github.com/kmod-project/kmod"
license = "GPL-2.0"

[source]
url = "https://github.com/kmod-project/kmod/archive/refs/tags/v${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "zlib", "xz", "zstd", "openssl"]
build = ["gcc", "make", "meson", "ninja", "pkg-config"]

[build]
system = "meson"
configure = """meson setup build --prefix=/usr --buildtype=release"""
make = """ninja -C build"""
install = """DESTDIR=${PKG} ninja -C build install"""
20
src/repos/core/less/less.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "less"
version = "692"
description = "Terminal pager"
url = "http://www.greenwoodsoftware.com/less/"
license = "GPL-3.0"

[source]
url = "https://www.greenwoodsoftware.com/less/less-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "ncurses"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --sysconfdir=/etc"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/libffi/libffi.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "libffi"
version = "3.5.2"
description = "Foreign function interface library"
url = "https://github.com/libffi/libffi"
license = "MIT"

[source]
url = "https://github.com/libffi/libffi/releases/download/v${version}/libffi-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --disable-static --with-gcc-arch=native"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
19
src/repos/core/libmnl/libmnl.toml
Normal file
@@ -0,0 +1,19 @@
[package]
name = "libmnl"
version = "1.0.5"
description = "Minimalistic Netlink library"
url = "https://netfilter.org/projects/libmnl/"
license = "LGPL-2.1"

[source]
url = "https://www.netfilter.org/projects/libmnl/files/libmnl-${version}.tar.bz2"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
configure = """./configure --prefix=/usr --disable-static"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
19
src/repos/core/libpipeline/libpipeline.toml
Normal file
@@ -0,0 +1,19 @@
[package]
name = "libpipeline"
version = "1.5.8"
description = "Pipeline manipulation library"
url = "https://gitlab.com/cjwatson/libpipeline"
license = "GPL-3.0"

[source]
url = "https://download.savannah.nongnu.org/releases/libpipeline/libpipeline-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
configure = """./configure --prefix=/usr --disable-static"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/libtool/libtool.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "libtool"
version = "2.5.4"
description = "GNU libtool generic library support script"
url = "https://www.gnu.org/software/libtool/"
license = "GPL-2.0"

[source]
url = "https://ftp.gnu.org/gnu/libtool/libtool-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/libxml2/libxml2.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "libxml2"
version = "2.15.2"
description = "XML C parser and toolkit"
url = "https://gitlab.gnome.org/GNOME/libxml2"
license = "MIT"

[source]
url = "https://download.gnome.org/sources/libxml2/2.15/libxml2-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "zlib", "xz", "readline"]
build = ["gcc", "make", "pkg-config", "python"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --disable-static --with-history --with-python=/usr/bin/python3"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/linux/linux.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "linux"
version = "6.19.8"
description = "The Linux kernel"
url = "https://www.kernel.org/"
license = "GPL-2.0"

[source]
url = "https://cdn.kernel.org/pub/linux/kernel/v6.x/linux-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = []
build = ["gcc", "make", "bc", "flex", "bison", "openssl", "perl"]

[build]
system = "custom"
configure = """"""
make = """make"""
install = """make INSTALL_MOD_PATH=${PKG} modules_install"""
20
src/repos/core/m4/m4.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "m4"
version = "1.4.20"
description = "GNU macro processor"
url = "https://www.gnu.org/software/m4/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/m4/m4-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/make/make.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "make"
version = "4.4.1"
description = "GNU make build tool"
url = "https://www.gnu.org/software/make/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/make/make-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/man-db/man-db.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "man-db"
version = "2.13.1"
description = "Manual page browser"
url = "https://man-db.nongnu.org/"
license = "GPL-2.0"

[source]
url = "https://download.savannah.nongnu.org/releases/man-db/man-db-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "groff", "less", "libpipeline"]
build = ["gcc", "make", "pkg-config"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --sysconfdir=/etc --disable-setuid --enable-cache-owner=bin --with-browser=/usr/bin/lynx --with-vgrind=/usr/bin/vgrind --with-grap=/usr/bin/grap"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/man-pages/man-pages.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "man-pages"
version = "6.16"
description = "Linux man pages"
url = "https://www.kernel.org/doc/man-pages/"
license = "GPL-2.0"

[source]
url = "https://www.kernel.org/pub/linux/docs/man-pages/man-pages-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = []
build = []

[build]
system = "custom"
configure = """"""
make = """make"""
install = """make DESTDIR=${PKG} prefix=/usr install"""
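Every `[source]` table above carries a `sha256` field (placeholder values in this commit, to be filled in once tarballs are fetched), so download verification reduces to comparing a computed digest against the declared one. A sketch of that check (illustrative Python with the stdlib `hashlib`; dpack itself uses the Rust `sha2` crate listed in its Cargo.toml):

```python
import hashlib

def verify_sha256(data: bytes, expected: str) -> bool:
    """Return True when the SHA-256 hex digest of data matches the declared checksum."""
    return hashlib.sha256(data).hexdigest() == expected.lower()

# Stand-in bytes for a downloaded tarball.
tarball = b"fake tarball bytes"
good = hashlib.sha256(tarball).hexdigest()

print(verify_sha256(tarball, good))      # True
print(verify_sha256(tarball, "a" * 64))  # False: the placeholder checksums above will never match
```

For real tarballs the digest should be computed in chunks rather than loading the whole file into memory, but the comparison itself is identical.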
20
src/repos/core/meson/meson.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "meson"
version = "1.10.2"
description = "High performance build system"
url = "https://mesonbuild.com/"
license = "Apache-2.0"

[source]
url = "https://github.com/mesonbuild/meson/releases/download/${version}/meson-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["python"]
build = ["python"]

[build]
system = "custom"
configure = """"""
make = """python3 setup.py build"""
install = """python3 setup.py install --root=${PKG}"""
20
src/repos/core/mpc/mpc.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "mpc"
version = "1.3.1"
description = "Multiple-precision complex number library"
url = "https://www.multiprecision.org/mpc/"
license = "LGPL-2.1"

[source]
url = "https://ftp.gnu.org/gnu/mpc/mpc-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["gmp", "mpfr"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --disable-static"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/mpfr/mpfr.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "mpfr"
version = "4.2.2"
description = "Multiple-precision floating-point library"
url = "https://www.mpfr.org/"
license = "LGPL-3.0"

[source]
url = "https://www.mpfr.org/mpfr-current/mpfr-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["gmp"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --disable-static --enable-thread-safe"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/ncurses/ncurses.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "ncurses"
version = "6.5"
description = "Terminal handling library"
url = "https://invisible-island.net/ncurses/"
license = "MIT"

[source]
url = "https://invisible-island.net/datafiles/release/ncurses-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --mandir=/usr/share/man --with-shared --without-debug --without-normal --with-cxx-shared --enable-pc-files --with-pkg-config-libdir=/usr/lib/pkgconfig"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/ninja/ninja.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "ninja"
version = "1.13.0"
description = "Small build system with a focus on speed"
url = "https://ninja-build.org/"
license = "Apache-2.0"

[source]
url = "https://github.com/ninja-build/ninja/archive/v${version}/ninja-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make", "python"]

[build]
system = "custom"
configure = """"""
make = """python3 configure.py --bootstrap"""
install = """install -Dm755 ninja ${PKG}/usr/bin/ninja"""
20
src/repos/core/openssl/openssl.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "openssl"
version = "3.6.1"
description = "Cryptography and TLS toolkit"
url = "https://www.openssl.org/"
license = "Apache-2.0"

[source]
url = "https://github.com/openssl/openssl/releases/download/openssl-${version}/openssl-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "zlib"]
build = ["gcc", "make", "perl"]

[build]
system = "custom"
configure = """./config --prefix=/usr --openssldir=/etc/ssl --libdir=lib shared zlib-dynamic"""
make = """make"""
install = """make DESTDIR=${PKG} MANSUFFIX=ssl install"""
20
src/repos/core/patch/patch.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "patch"
version = "2.8"
description = "GNU patch utility"
url = "https://www.gnu.org/software/patch/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/patch/patch-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
19
src/repos/core/pcre2/pcre2.toml
Normal file
@@ -0,0 +1,19 @@
[package]
name = "pcre2"
version = "10.45"
description = "Perl Compatible Regular Expressions v2"
url = "https://github.com/PCRE2Project/pcre2"
license = "BSD-3-Clause"

[source]
url = "https://github.com/PCRE2Project/pcre2/releases/download/pcre2-${version}/pcre2-${version}.tar.bz2"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "zlib", "readline"]
build = ["gcc", "make", "cmake"]

[build]
configure = """./configure --prefix=/usr --enable-unicode --enable-jit --enable-pcre2-16 --enable-pcre2-32 --enable-pcre2grep-libz --enable-pcre2grep-libbz2 --enable-pcre2test-libreadline --disable-static"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/perl/perl.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "perl"
version = "5.40.2"
description = "Practical Extraction and Report Language"
url = "https://www.perl.org/"
license = "Artistic-1.0"

[source]
url = "https://www.cpan.org/src/5.0/perl-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "zlib"]
build = ["gcc", "make"]

[build]
system = "custom"
configure = """sh Configure -des -Dprefix=/usr -Dvendorprefix=/usr -Dprivlib=/usr/lib/perl5/5.40/core_perl -Darchlib=/usr/lib/perl5/5.40/core_perl -Dsitelib=/usr/lib/perl5/5.40/site_perl -Dsitearch=/usr/lib/perl5/5.40/site_perl -Dvendorlib=/usr/lib/perl5/5.40/vendor_perl -Dvendorarch=/usr/lib/perl5/5.40/vendor_perl -Dman1dir=/usr/share/man/man1 -Dman3dir=/usr/share/man/man3 -Dpager='/usr/bin/less -isR' -Duseshrplib -Dusethreads"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/pkg-config/pkg-config.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "pkg-config"
version = "1.8.0"
description = "Package configuration helper tool"
url = "https://www.freedesktop.org/wiki/Software/pkg-config/"
license = "GPL-2.0"

[source]
url = "https://pkgconfig.freedesktop.org/releases/pkg-config-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --with-internal-glib --disable-host-tool"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/procps-ng/procps-ng.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "procps-ng"
version = "4.0.6"
description = "Process monitoring utilities (ps, top, free, etc.)"
url = "https://gitlab.com/procps-ng/procps"
license = "GPL-2.0"

[source]
url = "https://sourceforge.net/projects/procps-ng/files/Production/procps-ng-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "ncurses"]
build = ["gcc", "make", "pkg-config"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --disable-static --disable-kill"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/python/python.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "python"
version = "3.13.3"
description = "Python programming language"
url = "https://www.python.org/"
license = "PSF-2.0"

[source]
url = "https://www.python.org/ftp/python/${version}/Python-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "expat", "libffi", "openssl", "zlib", "xz", "ncurses", "readline"]
build = ["gcc", "make", "pkg-config"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --enable-shared --with-system-expat --enable-optimizations"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/readline/readline.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "readline"
version = "8.3"
description = "GNU readline library"
url = "https://tiswww.case.edu/php/chet/readline/rltop.html"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/readline/readline-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["ncurses"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --disable-static --with-curses"""
make = """make SHLIB_LIBS='-lncursesw'"""
install = """make SHLIB_LIBS='-lncursesw' DESTDIR=${PKG} install"""
20
src/repos/core/sed/sed.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "sed"
version = "4.9"
description = "GNU stream editor"
url = "https://www.gnu.org/software/sed/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/sed/sed-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/shadow/shadow.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "shadow"
version = "4.14"
description = "User and group management utilities"
url = "https://github.com/shadow-maint/shadow"
license = "BSD-3-Clause"

[source]
url = "https://github.com/shadow-maint/shadow/releases/download/${version}/shadow-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --sysconfdir=/etc --disable-static --with-group-name-max-length=32"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/sysvinit/sysvinit.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "sysvinit"
version = "3.15"
description = "System V style init programs"
url = "https://savannah.nongnu.org/projects/sysvinit/"
license = "GPL-2.0"

[source]
url = "https://github.com/slicer69/sysvinit/releases/download/${version}/sysvinit-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "custom"
configure = """"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/tar/tar.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "tar"
version = "1.35"
description = "GNU tar archiver"
url = "https://www.gnu.org/software/tar/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/tar/tar-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/texinfo/texinfo.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "texinfo"
version = "7.3"
description = "GNU documentation system"
url = "https://www.gnu.org/software/texinfo/"
license = "GPL-3.0"

[source]
url = "https://ftp.gnu.org/gnu/texinfo/texinfo-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "perl", "ncurses"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/util-linux/util-linux.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "util-linux"
version = "2.42"
description = "Miscellaneous system utilities"
url = "https://github.com/util-linux/util-linux"
license = "GPL-2.0"

[source]
url = "https://www.kernel.org/pub/linux/utils/util-linux/v2.42/util-linux-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "ncurses", "zlib"]
build = ["gcc", "make", "pkg-config"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --bindir=/usr/bin --libdir=/usr/lib --runstatedir=/run --sbindir=/usr/sbin --disable-chfn-chsh --disable-login --disable-nologin --disable-su --disable-setpriv --disable-runuser --disable-pylibmount --disable-static --disable-liblastlog2 --without-python ADJTIME_PATH=/var/lib/hwclock/adjtime"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
20
src/repos/core/xz/xz.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "xz"
version = "5.8.1"
description = "XZ Utils compression"
url = "https://xz.tukaani.org/xz-utils/"
license = "LGPL-2.1"

[source]
url = "https://github.com/tukaani-project/xz/releases/download/v${version}/xz-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make"]

[build]
system = "autotools"
configure = """./configure --prefix=/usr --disable-static"""
make = """make"""
install = """make DESTDIR=${PKG} install"""
31
src/repos/core/zlib/zlib.toml
Normal file
@@ -0,0 +1,31 @@
# DarkForge Linux — Package Definition: zlib
# This serves as the canonical example of the .dpack format.

[package]
name = "zlib"
version = "1.3.1"
description = "Compression library implementing the deflate algorithm"
url = "https://zlib.net/"
license = "zlib"

[source]
url = "https://zlib.net/zlib-${version}.tar.xz"
sha256 = "38ef96b8dfe510d42707d9c781877914792541133e1870841463bfa73f883e32"

[dependencies]
run = []
build = ["gcc", "make"]

[dependencies.optional]
static = { description = "Build static library", default = true }
minizip = { description = "Build minizip utility", deps = [] }

[build]
configure = "./configure --prefix=/usr"
make = "make"
install = "make DESTDIR=${PKG} install"

# Per-package flag overrides (empty = use global defaults)
[build.flags]
cflags = ""
ldflags = ""
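Every `[source].url` in these definitions relies on `${version}` interpolation, as the canonical zlib example shows. A minimal sketch of that expansion, assuming only a `version` variable (dpack's real parser may support more):

```rust
/// Expand `${version}` placeholders in a [source] url template.
/// Sketch only: a single hard-coded variable, no error handling for
/// unknown placeholders.
fn expand_url(template: &str, version: &str) -> String {
    template.replace("${version}", version)
}

fn main() {
    // The zlib definition's source URL with its pinned version.
    let url = expand_url("https://zlib.net/zlib-${version}.tar.xz", "1.3.1");
    assert_eq!(url, "https://zlib.net/zlib-1.3.1.tar.xz");
    println!("{url}");
}
```

Keeping the version out of the URL means a package bump only touches the `version` field (plus the checksum).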
20
src/repos/core/zstd/zstd.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "zstd"
version = "1.5.7"
description = "Zstandard fast real-time compression"
url = "https://facebook.github.io/zstd/"
license = "BSD-3-Clause"

[source]
url = "https://github.com/facebook/zstd/releases/download/v${version}/zstd-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "make", "cmake"]

[build]
system = "custom"
configure = """"""
make = """make prefix=/usr"""
install = """make prefix=/usr DESTDIR=${PKG} install"""
20
src/repos/desktop/dwl/dwl.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "dwl"
version = "0.7"
description = "Dynamic window manager for Wayland (dwm-like)"
url = "https://codeberg.org/dwl/dwl"
license = "GPL-3.0"

[source]
url = "https://codeberg.org/dwl/dwl/archive/v${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["wlroots", "wayland", "wayland-protocols", "libinput", "xwayland"]
build = ["gcc", "make", "pkg-config"]

[build]
system = "custom"
configure = """"""
make = """make"""
install = """make DESTDIR=${PKG} PREFIX=/usr install"""
20
src/repos/desktop/firefox/firefox.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "firefox"
version = "137.0"
description = "Mozilla Firefox web browser"
url = "https://www.mozilla.org/firefox/"
license = "MPL-2.0"

[source]
url = "https://archive.mozilla.org/pub/firefox/releases/${version}/source/firefox-${version}.source.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "dbus", "glib", "pango", "cairo", "freetype", "fontconfig", "libffi", "openssl", "zlib"]
build = ["gcc", "make", "python", "perl", "pkg-config", "autoconf", "rust", "cbindgen", "nodejs", "nasm"]

[build]
system = "custom"
configure = """"""
make = """./mach build"""
install = """DESTDIR=${PKG} ./mach install"""
20
src/repos/desktop/foot/foot.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "foot"
version = "1.21.1"
description = "Fast, lightweight Wayland terminal emulator"
url = "https://codeberg.org/dnkl/foot"
license = "MIT"

[source]
url = "https://codeberg.org/dnkl/foot/archive/${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "wayland", "fontconfig", "freetype", "pixman", "libxkbcommon"]
build = ["gcc", "meson", "ninja", "pkg-config"]

[build]
system = "meson"
configure = """meson setup build --prefix=/usr --buildtype=release"""
make = """ninja -C build"""
install = """DESTDIR=${PKG} ninja -C build install"""
20
src/repos/desktop/freecad/freecad.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "freecad"
version = "1.0.0"
description = "Parametric 3D CAD modeler"
url = "https://www.freecad.org/"
license = "LGPL-2.0"

[source]
url = "https://github.com/FreeCAD/FreeCAD/archive/refs/tags/${version}/FreeCAD-${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "python", "qt6-base", "opencascade", "boost", "xerces-c", "freetype", "zlib", "libpng"]
build = ["gcc", "cmake", "ninja", "pkg-config", "swig"]

[build]
system = "cmake"
configure = """cmake -B build -G Ninja -DCMAKE_INSTALL_PREFIX=/usr -DCMAKE_BUILD_TYPE=Release -DBUILD_QT5=OFF -DBUILD_FEM=ON"""
make = """ninja -C build"""
install = """DESTDIR=${PKG} ninja -C build install"""
20
src/repos/desktop/fuzzel/fuzzel.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "fuzzel"
version = "1.12.0"
description = "Application launcher for Wayland"
url = "https://codeberg.org/dnkl/fuzzel"
license = "MIT"

[source]
url = "https://codeberg.org/dnkl/fuzzel/archive/${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc", "wayland", "fontconfig", "freetype", "pixman", "libxkbcommon", "cairo"]
build = ["gcc", "meson", "ninja", "pkg-config"]

[build]
system = "meson"
configure = """meson setup build --prefix=/usr --buildtype=release"""
make = """ninja -C build"""
install = """DESTDIR=${PKG} ninja -C build install"""
20
src/repos/desktop/grim/grim.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "grim"
version = "1.4.1"
description = "Screenshot tool for Wayland"
url = "https://sr.ht/~emersion/grim/"
license = "MIT"

[source]
url = "https://git.sr.ht/~emersion/grim/archive/v${version}.tar.gz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["wayland", "wayland-protocols", "pixman", "libpng"]
build = ["gcc", "meson", "ninja", "pkg-config"]

[build]
system = "meson"
configure = """meson setup build --prefix=/usr --buildtype=release"""
make = """ninja -C build"""
install = """DESTDIR=${PKG} ninja -C build install"""
20
src/repos/desktop/libevdev/libevdev.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "libevdev"
version = "1.13.3"
description = "Input event device wrapper"
url = "https://freedesktop.org/wiki/Software/libevdev/"
license = "MIT"

[source]
url = "https://freedesktop.org/software/libevdev/libevdev-${version}.tar.xz"
sha256 = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

[dependencies]
run = ["glibc"]
build = ["gcc", "meson", "ninja"]

[build]
system = "meson"
configure = """meson setup build --prefix=/usr --buildtype=release"""
make = """ninja -C build"""
install = """DESTDIR=${PKG} ninja -C build install"""
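The `run` and `build` lists in these definitions form the dependency graph dpack has to order at install time. A naive sketch of that resolution; the helper name and the choice to treat deps outside the given set (e.g. glibc, assumed already installed) as satisfied are assumptions, not dpack's actual resolver:

```rust
use std::collections::{BTreeMap, BTreeSet};

/// Repeatedly pick packages whose run-deps are all satisfied.
/// Returns None on a dependency cycle. (Illustrative only.)
fn install_order(pkgs: &BTreeMap<&str, Vec<&str>>) -> Option<Vec<String>> {
    let mut done: BTreeSet<&str> = BTreeSet::new();
    let mut order = Vec::new();
    while done.len() < pkgs.len() {
        let ready: Vec<&str> = pkgs
            .iter()
            .filter(|(name, deps)| {
                !done.contains(*name)
                    // deps not in the set (e.g. glibc) count as satisfied
                    && deps.iter().all(|d| done.contains(d) || !pkgs.contains_key(d))
            })
            .map(|(name, _)| *name)
            .collect();
        if ready.is_empty() {
            return None; // cycle: nothing can make progress
        }
        for name in ready {
            done.insert(name);
            order.push(name.to_string());
        }
    }
    Some(order)
}

fn main() {
    // Toy subset of the desktop repo's run-deps.
    let mut pkgs = BTreeMap::new();
    pkgs.insert("grim", vec!["wayland", "pixman", "libpng"]);
    pkgs.insert("pixman", vec!["glibc"]);
    pkgs.insert("wayland", vec!["glibc", "libffi"]);
    let order = install_order(&pkgs).unwrap();
    assert_eq!(order.last().unwrap(), "grim"); // deps install before grim
    println!("{order:?}");
}
```

A real resolver would also merge `build` deps for source builds and detect conflicts, but the fixed-point loop above is the core of the ordering.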
Some files were not shown because too many files have changed in this diff.