Posted by birdculture 12/26/2025

Package managers keep using Git as a database, it never works out (nesbitt.io)
784 points | 465 comments
hk1337 12/26/2025|
I like Go, but its dependency management is weird and seems heavily centered around GitHub.
andreashaerter 12/26/2025||
It's mostly tradition rather than a hard requirement. Go has long supported vanity import paths: https://pkg.go.dev/cmd/go#hdr-Remote_import_paths

For example, we use Hugo to provide independent Go package URLs even though the code is hosted on GitHub. That makes migrating away from GitHub trivial if we ever choose to do so (Repo: https://github.com/foundata/hugo-theme-govanity; Example: https://golang.foundata.com/hugo-theme-dev/). Usage works as expected:

  go get golang.foundata.com/hugo-theme-dev
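For the curious, the mechanism behind this is just a go-import meta tag served at the import path. A minimal sketch of serving one by hand, without Hugo (handler path, port, and exact HTML are illustrative, not the real theme output):

  // Sketch: answer the go tool's "?go-get=1" request with a go-import meta
  // tag that points the vanity import path at the real GitHub repo.
  package main

  import (
      "fmt"
      "log"
      "net/http"
  )

  func main() {
      http.HandleFunc("/hugo-theme-dev", func(w http.ResponseWriter, r *http.Request) {
          fmt.Fprint(w, `<meta name="go-import" content="golang.foundata.com/hugo-theme-dev git https://github.com/foundata/hugo-theme-govanity">`)
      })
      log.Fatal(http.ListenAndServe(":8080", nil))
  }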
Edit: Formatting
Hendrikto 12/26/2025|||
There is nothing tying Go to GitHub.
rewgs 12/26/2025||
Not at all. It can grab git repos (as well as work with other VCSs). There's just a lot of stuff on GitHub, hence your impression.
themk 12/26/2025||
I think git is overkill, and probably a database is as well.

I quite like the Hackage index, which is an append-only tar file. Incremental updates are trivial using HTTP range requests, which makes hosting it trivial as well.
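The client side is basically one Range header. A minimal sketch in Go (the index URL is my assumption about the Hackage endpoint, and error handling is pared down):

  // Sketch: we already hold the first N bytes of the append-only index, so ask
  // the server for just the newly appended tail and tack it onto the local copy.
  package main

  import (
      "fmt"
      "io"
      "net/http"
      "os"
  )

  func main() {
      const indexURL = "https://hackage.haskell.org/01-index.tar" // assumed endpoint
      var have int64
      if fi, err := os.Stat("01-index.tar"); err == nil {
          have = fi.Size()
      }

      req, _ := http.NewRequest("GET", indexURL, nil)
      req.Header.Set("Range", fmt.Sprintf("bytes=%d-", have)) // only the new bytes
      resp, err := http.DefaultClient.Do(req)
      if err != nil {
          panic(err)
      }
      defer resp.Body.Close()
      if resp.StatusCode != http.StatusPartialContent {
          panic("server did not honor the range request")
      }

      f, err := os.OpenFile("01-index.tar", os.O_CREATE|os.O_WRONLY|os.O_APPEND, 0o644)
      if err != nil {
          panic(err)
      }
      defer f.Close()
      io.Copy(f, resp.Body) // local index is now up to date
  }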

nacozarina 12/26/2025||
successful things often have humble origins; it's a feature, not a bug

for every project that managed to outgrow ext4/git, there were a hundred that were well served and never needed to over-invest in something else

kuahyeow 12/26/2025||
GitLab employee here. We completed the move away from Gollum years ago (see https://gitlab.com/groups/gitlab-org/-/epics/2381).

It looks like that doc https://docs.gitlab.com/development/wikis/ was outdated; it has since been fixed to no longer mention Gollum.

PunchyHamster 12/26/2025||
The article's conclusion is just... not good. There are many benefits to using Git as a backend: you can point your project at any single commit as a version, which makes testing fixes or changes in libraries super easy; it has built-in integrity checking; and technically (sadly not in practice) you could sign commits and use that to verify that a package is authentic.

It being suboptimal bandwidth-wise is frankly just a technical hurdle to get over, and the benefits are well worth the drawback.
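For instance, with Go modules, pinning to an arbitrary commit is a one-liner (module name reused from upthread; the hash is a placeholder for whichever commit you want to test):

  go get golang.foundata.com/hugo-theme-dev@<commit-sha>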

aidenn0 12/26/2025||
As far as I know, Nixpkgs doesn't use git as a package database. The package definitions are stored and developed in git, but the channels certainly are not.
shellkr 12/27/2025||
I am not sure this is necessarily a git issue; it is mostly a GitHub issue. Just look at the AUR of Arch Linux, which works perfectly.
mikepurvis 12/26/2025||
The nix CLI almost exclusively pulls from GitHub as zipballs. Not perfect, but certainly far faster than a real git clone.
pxc 12/26/2025|
That it supports fetching via Git as well as via various forge-specific tarballs, even for flakes, is pretty nice. It means that if your org uses Nix, you can fall back to distribution via Git as a solution that doesn't require you to stand up any new infra or tie you to any particular vendor, but once you get rolling it's an easy optimization to switch to downloading snapshots.

The most pain probably just comes from the hugeness of Nixpkgs, but I remain an advocate for the huge monorepo of build recipes.

mikepurvis 12/26/2025||
Yes, agreed. It's possible to imagine some kind of cached-deltas scheme to get faster/smaller updates, but I suspect the folks who would have to build and maintain that are all on gigabit internet connections and don't feel the complexity is worth it.
pxc 12/26/2025||
> It’s possible to imagine some kind of cached-deltas scheme to get faster/smaller updates

I think the snix¹ folks are working on something like this for the binary caches: the greater granularity of the content-addressing offers morally the same kind of optimization as delta RPMs, in that you avoid re-downloading what you already have.

But I'm not aware of any current efforts to let people download the Nixpkgs tree itself more efficiently. Somehow caching Git deltas would be cool. But I'd expect that kind of optimization to come from a company that runs a Git forge, if it's generally viable, and to benefit many projects other than Nix and Nixpkgs.

--

1: https://snix.dev/

mikepurvis 12/27/2025||
Yes indeed. That said, Nix typically throws away the .git dir, so it would require some work to adapt a solution that operates at the git repo level to Nix.

The ideal for Nix would be "I have all content at commit X and need the deltas for content at commit Y", and I suspect Nix would be fairly unique in being able to benefit from that. To the point that it might actually make sense to just do actual git repo syncs and have a local client serve those tarballs to the nix daemon.
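A rough sketch of that last idea (entirely hypothetical; the clone path and port are made up): keep a plain local clone that you git fetch incrementally, and let a tiny server hand out "git archive" tarballs of whatever rev is requested.

  // Hypothetical sketch: serve snapshot tarballs of a local git clone so only
  // incremental fetches ever hit the network.
  package main

  import (
      "net/http"
      "os/exec"
  )

  func main() {
      http.HandleFunc("/archive", func(w http.ResponseWriter, r *http.Request) {
          rev := r.URL.Query().Get("rev") // e.g. a nixpkgs commit hash
          if rev == "" {
              http.Error(w, "missing rev", http.StatusBadRequest)
              return
          }
          w.Header().Set("Content-Type", "application/gzip")
          // Stream a tarball of the requested rev straight from the local clone.
          cmd := exec.Command("git", "-C", "/var/cache/nixpkgs.git", "archive", "--format=tar.gz", rev)
          cmd.Stdout = w
          if err := cmd.Run(); err != nil {
              http.Error(w, err.Error(), http.StatusInternalServerError)
          }
      })
      http.ListenAndServe("127.0.0.1:8090", nil)
  }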

drzaiusx11 12/26/2025||
I'd add Ruby's Gemfile git dependencies to the list of languages called out here as well. Bundler supports git repos, but in general it's a bad idea unless you are diligent with git tag use and disallow git tag mutability, which also assumes you have complete control of your git dependencies...
krbaccord941x 12/27/2025|
I understand the article is concerned with RFC2789, in cloning whole indexes for lang indexes, but /cargo/src shallow clones need another layer, where tertiary compilation or decompression takes place in mutex libraries, whether its SSL certificate is dependent on HTTP fetch.