(1) "should be able to" does not imply "must"; people are free to continue using whatever tools they see fit
(2) Most Debian work is of course already git-based, via Salsa [1], Debian's self-hosted GitLab instance. This is more about what is stored in git and how it relates to a source package (i.e. what .debs are built from). For example, most Debian git repositories currently base their work on "pristine-tar" branches built from upstream tarball releases, rather than using upstream branches directly.
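To make the distinction concrete, here's a minimal sketch of the pristine-tar round trip (the package name and version are made up):

```sh
# Record a small binary delta on the pristine-tar branch so the exact
# upstream tarball can be regenerated later from the upstream/1.2.3 tag.
pristine-tar commit ../foo_1.2.3.orig.tar.gz upstream/1.2.3

# Later, reproduce a byte-identical tarball without re-downloading it.
pristine-tar checkout ../foo_1.2.3.orig.tar.gz
```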
I really wish all the various open source packaging systems would get rid of the concept of source tarballs to the extent possible, especially when those tarballs are not sourced directly from upstream. For example:
- Fedora has a “lookaside cache”, and packagers upload tarballs to it. In theory they come from git as indicated by the source rpm, but I don’t think anything verifies this.
- Python packages build a source tarball. In theory, the new best practice is for a GitHub action to build the package and for a complex mess to attest that it really came from GitHub Actions.
- I’ve never made a Debian package, but AFAICT the maintainer kind of does whatever they want.
IMO this is all absurd. If a package hosted by Fedora or Debian or PyPI or crates.io, etc claims to correspond to an upstream git commit or release, then the hosting system should build the package, from the commit or release in question plus whatever package-specific config and patches are needed, and publish that. If it stores a copy of the source, that copy should be cryptographically traceable to the commit in question, which is straightforward: the commit hash is a hash over a bunch of data including the full source!
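To illustrate what "cryptographically traceable" could look like in the simplest case (the upstream URL, tag, and tarball name below are made up), you can just compare the tarball contents against a checkout of the tag:

```sh
# Check that a source tarball's contents match the upstream git tag it
# claims to come from (URL, tag, and tarball are illustrative).
tag=v1.2.3
tarball=foo-1.2.3.tar.gz

git clone --branch "$tag" --depth 1 https://example.org/upstream/foo.git foo-git
mkdir foo-tarball
tar -xzf "$tarball" -C foo-tarball --strip-components=1

# Any difference outside .git means the tarball is not a faithful export
# of the tagged commit.
diff -r --exclude=.git foo-git foo-tarball && echo "tarball matches tag"
```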
Just having a deterministic binary can be non-trivial, let alone a way to confirm "this output came from that source" without recompiling everything again from scratch.
I prefer rebasing git histories over messing with the patch quilting that debian packaging standards use(d to use). Though the last time I had to use the debian packaging mechanisms, I round-tripped them through git to work on them. I lost nothing during the export.
Please, please, stop the nonsense with the patch quilting -- it's really cumbersome, it adds unnecessary cognitive load, it raises the bar to contributions, it makes maintenance harder, and it adds _zero value_. Patch quilting is a lose-lose proposition.
I'd say that `quilt` the utility is pretty much abandoned at this point. The name `quilt` remains in the format name, but otherwise is not relevant.
Nowadays people who maintain patches do it via `gbp-pq` (the "patch queue" subcommand of the badly named `git-buildpackage` software). `gbp-pq switch` reads the patches stored in `debian/patches/`, creates an ephemeral branch on top of HEAD, and replays them there. Any changes made to this branch (new commits, removed commits, amended commits) are transformed by `gbp-pq export` into a valid set of patches that replaces `debian/patches/`.
This mechanism introduces two extra commands (one to "enter" and one to "exit" the patch-applied view) but it allows Debian to easily maintain a mergeable Git repo with floating patches on top of the upstream sources. That's impossible to do with plain Git and needs extra tools or special workflows even outside of Debian.
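A minimal sketch of that round trip (run from a git-buildpackage-style packaging checkout):

```sh
# From the Debian packaging branch: replay debian/patches/ onto an
# ephemeral patch-queue branch and switch to it.
gbp pq import

# ...add, drop, reorder, or amend commits on that branch as usual...

# Regenerate debian/patches/ from those commits and switch back to the
# packaging branch.
gbp pq export
```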
Rebase.
Merges would avoid those problems, but are harder to do if there are lots of conflicts, as you can't fix conflicts patch by patch.
Perhaps a workflow based on merges-of-rebases or rebase-and-overwrite-merge would work, but I don't think it's fair to say "oh just rebase".
Cherry-picks preserve that Commit-Id, and so do rebases, because it's just text in a commit message.
So you can track the history of patches that way, if you need to. Which you won't.
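For illustration (the trailer name and value here are made up), such an identifier is just another line in the message, so anything that copies the message copies it:

```sh
# A trailer in the commit message survives cherry-picks and rebases,
# since those carry the message along with the change.
git commit -m "parser: fix integer overflow

Commit-Id: Iabc123"

# After a rebase or cherry-pick, find the change again by searching
# messages instead of commit hashes.
git log --all --oneline --grep='Commit-Id: Iabc123'
```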
(PS some team at google didn't understand git or their true requirements, so they wasted SWE-decades at that point on some rebasing bullshit; I was at least able to help them make it slightly less bad and prevent other teams from copying it)
I've tried it
(I know that's not quite the Greenspun quote)
> Debian guarantees every binary package can be built from the available source packages for licensing and security reasons. For example, if your build system downloaded dependencies from an external site, the owner of the project could release a new version of that dependency with a different license. An attacker could even serve a malicious version of the dependency when the request comes from Debian's build servers. [1]
[1] https://wiki.debian.org/UpstreamGuide#:~:text=make%20V=1-,Su...
With Nix, any fetcher will download the source. It does so in a way that guarantees the shasum of what is fetched matches the hash declared in the package expression, and if you already have something in the Nix store with that shasum, it won't have to fetch it again.
However, with just a mirror of the debian source tree, you can build everything without hitting the internet. This is assuredly not true with just a mirror of nixpkgs.
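A quick sketch of the difference (the URL is made up; `hello` is just a convenient example package):

```sh
# Nix side: download once and print the sha256 that a fixed-output
# fetcher would be pinned to.
nix-prefetch-url https://example.org/foo-1.2.3.tar.gz

# Debian side: with only a mirror in sources.list (plus deb-src lines),
# fetch the source package and build it without any other network access.
apt-get source hello
cd hello-*/
dpkg-buildpackage -us -uc
```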
OK, I see how the Debian idea differs from the Portage/Nix/etc. idea. For Portage and Nix it is enough that the build proper be offline, but the source code is fetched at the beginning of the package build. Not only do I find this sufficient, I prefer it because IMO it makes the package easier to work with (since you're only wrangling the packaging code, not upstream's).
Nix isn't functional: it's a functional core that moved every bit of the imperative part to an even less parseable stage, labelled it "evaluation" and then ignored any sense of hygiene about it.
No: your dependency tree for packaging should absolutely not include an opaque binary from a cache server, or a link to a years old patch posted on someone else's bugzilla instance (frequently link rotted as well).
Nothing has made me appreciate the decisions of mainstream distributions more than dealing with an alternative like this.
What the OP was referring to is that Debian's tooling stores the upstream code along with the Debian packaging code. There is support tooling for downloading new upstream versions (uscan) and for incorporating the upstream changes into Debian's version control (uupdate) to manage this complexity, but it does mean that Debian effectively mirrors the upstream code twice: in its source management system (mostly salsa.debian.org nowadays) and in its archive, as Debian source archives.
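Roughly, that flow looks like this (the watch-file contents and tarball name are illustrative):

```sh
# debian/watch tells uscan where upstream publishes releases, e.g.:
#   version=4
#   https://example.org/foo/releases/ .*/foo-(\d[\d.]*)\.tar\.gz

# Check for and download the newest upstream tarball per debian/watch.
uscan --download --verbose

# Merge the new upstream tarball into the packaging tree (run from
# inside the unpacked source; see uupdate(1) for the details).
uupdate ../foo-1.2.3.tar.gz
```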
It's sad how much Linux stuff is moving away from apt to systems like snap and flatpak that ship directly from upstream.
Many packages have stopped shipping the whole source and just keep the debian directory in Git.
Notable examples are
- gcc-*
- openjdk-*
- llvm-toolchain-*
and many more.
But it's still nice to have when an upstream source goes dark unexpectedly, as occasionally still happens.
On the other hand, it makes for a far easier life when bumping compile-time or run-time dependency versions. There's only one source of truth providing both the application and the packaging.
It's just the same with Docker and Helm charts. So many projects insist on keeping them, sometimes all of them, in different repositories, making change proposals an utter PITA.
I don’t think it’s a bad move, but it also seems like they were getting by with patches and tarballs.
Debian may still be "getting by" but if they don't make changes like this Git transition they will eventually stop getting by.
Huh. I just learned to use quilt this year as part of learning debian packaging. I've started using it in some of my own forks so I could eventually, maybe, contribute back.
I guess the old quilt/etc recommendation in the debian build docs is part of the documentation updates that the linked page says are needed.
Is that a fair general read of the situation? (I have further comments to make but wanted to check my basic assumptions first).
At the moment, if one is not already accustomed to building Debian packages, it is nothing but pain even to get a local build of a package working.
If you want a "simple custom repository" you likely want to go in a different direction and explicitly do things that wouldn't be allowed in the official Debian repositories.
For example, dynamic linking is easy when you only support a single Debian release, or when the Debian build/pkg infrastructure handles it for you. But if you run a custom repository, you either need a package for each Debian release you care about (and an understanding of things like `~deb13u1` version suffixes to make sure your upgrade paths work correctly), or you use static binaries (which is what I do for my custom repository).
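For example, the `~` suffix sorts before everything else, so per-release rebuilds upgrade cleanly into the next plain revision; a quick check with dpkg (versions made up):

```sh
# A per-release suffix like ~deb13u1 sorts lower than the plain revision,
# so 1.0-1~deb12u1 -> 1.0-1~deb13u1 -> 1.0-1 is a valid upgrade path.
dpkg --compare-versions "1.0-1~deb12u1" lt "1.0-1~deb13u1" && echo ok
dpkg --compare-versions "1.0-1~deb13u1" lt "1.0-1" && echo ok
```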
I kind of appreciate that Debian made FOSS a core value very early on; in fact, it was the first distribution I used that forced me to learn the command line. The xorg-server, or rather the X11 server back then, was not working, so I only had the command line and a lean Debian handbook. I typed in the commands and learned from that. Before this I had SUSE, and it had a much thicker book with a fancypants GUI - and it was utterly useless. But that was in 2005 or so.
Now, in 2025, I have not used Debian or any Debian-based distribution in a long time. I either compile from source, loosely inspired by LFS/BLFS, or I typically use Manjaro these days, simply because it is the closest to a modern Slackware variant (despite systemd). I used Slackware for a long time, but sadly it slowed down too much in the last 10 years, even with modern variants such as alienbob's; Manjaro moves forward something like 100x faster and it also works at the same time, including when I want to compile from source. For some reason, many older distributions failed to adapt to the modern era. Systemd may be one barrier here, but the issue is much more fundamental than that. For instance, you have many more packages now, and many things take longer to compile, e.g. LLVM and whatnot, which in turn is needed for Mesa; then we have cmake, meson/ninja and so forth. There is a lot more software to handle nowadays.
Yeah definitely. I guess this is a result of their weird idea that they have to own the entire world. Every bit of open source Linux software ever made must be in Debian.
If you have to upgrade the entire world it's going to take a while...
How many Debian packages have patches applied to upstream?
I didn't know about this. Link?