Posted by gingerBill 9/8/2025
Fine, now what if you need to connect to a database, or parse a PDF, or talk to a gRPC backend? What a hilariously short-sighted example.
To me, this whole article just screams inexperience.
"be careful all the time" doesn't scale. Half of all developers have below-average diligence, and that's a low bar. No-one is always vigilant, don't think that you're immune to human error.
No, you need tooling and automation to assist, and it needs to be supported on the package-manager side. Managing a site where many files are uploaded and then downloaded many times is not a trivial undertaking. It comes with oversight responsibilities. If it's video, you have to check for CSAM. If it's executable code, you have to check for malware.
Package managers are not evil, but they are a tempting target and need to be secured. This can't just be an individual consumer responsibility.
I can't speak for other ecosystems, but some NuGet measures are here:
https://devblogs.microsoft.com/dotnet/building-a-safer-futur...
https://learn.microsoft.com/en-us/nuget/concepts/security-be...
I believe that there have been (a few) successful compromises of packages in NuGet, and that these have been mitigated. I don't know how intense the arms race is now.
Yes, this is the C attitude, where you provide no safety rails or poka-yokes or, indeed, package managers, and therefore you get a lot of fragile reimplementations of package managers (autoconf, anyone?). But you get to keep the satisfaction of blaming the users.
NuGet is pretty good. It helps that packages tend to be substantial things, not left-pad.
GNU Autoconf isn't a package manager; it's more an analogue of a setup executable on MS Windows: it detects where the user wants stuff installed, what the user already has installed, and which features the user wants.
Agreed, this is IMHO also the better pattern. One-liners or even 20-liners are not worth the overhead of extracting into a package, or of depending on one.
"developers, be more conscious" isn't going to fix all the issues. In general, there are not individual effort fixes to systemic issues.
NPM is also quite a wild west when it comes to publishing packages: any kid can make an account and publish 'left-pad'-style crap.
We already have a quite safe and working setup with APT and the software repositories for Debian, Ubuntu, etc. While it is not so easy to publish your software to Debian, you get a dedicated maintainer and all kinds of requirements you have to fulfill.
But this way the issues with trust are, if not eliminated, at least minimized: the XZ Utils hack, for example, took three years to prepare and pull off, and still didn't make it onto production systems.
I do not think that the two are cleanly separable. They are client and server ends of the same system.
And I think my point is that I view it as more of a server (registry) and governance problem than the OP author does.
Despite the fact that my employer also has an internal package feed, the security of nuget.org and the central public feed is intrinsic to the security of the whole system.
NuGet was closer to the NPM end of the spectrum, but has tightened up considerably over time. In particular, the "Package ID Prefix Reservations" feature tells me that package names starting with certain prefixes are owned by the relevant entity, be it "System." or "Azure." from Microsoft, or "AWS" from Amazon.
This is important as it's used to distribute SDKs and optional but standard library components and updates.
There is certainly junk on there, but not much load-bearing junk.
My argument was that this concept is not the problem.
The problem is in the governance of NPM, while NuGet and Maven are stricter; it is therefore a registry-governance problem.
But on the other hand, NPM is much more popular than any other registry.
Then we're in agreement that the article's author has the wrong end of the stick, by focusing on the client end of the file transfer connection.
The popularity of such repositories and package managers is due to their users.
And the concepts are trivially separable in my opinion. A package manager uses a repo of packages to download from. You don't need a package manager to use a repo. And a package manager could be just local to your machine and thus not need an external repo either. I know in practice the two are combined but that doesn't mean they are not distinct concepts.
Kind of bonkers this even needs to be said, and even then it's missed/ignored.
I'd prefer instead a more balanced title like "Remember to Consider the Costs When Using Package Managers", or whatever.
Yeah, but it's downright stupid to do so.
The title isn't even misleading or part of a Motte-and-bailey argument.
People just hear "Package Managers are Evil" and assume that the author means you shouldn't use third party dependencies. Which is NOT what's being argued.
But I guess you'd know that, if you read past the title.
I think you're splitting hairs if you're saying that these points from the article argue against package managers but don't argue against using third party dependencies.
I similarly think you're splitting hairs if you consider "are package managers useful?" and "are third-party dependencies useful?" to be distinct points.
Third party dependencies absolutely are liabilities. You are liable to vet them, inspect their licenses and keep them updated while ensuring that they continue working with your existing code.
This is not something package managers help you do. Package managers like NPM make it trivial to skip these steps entirely.
What is being argued for is a more thoughtful approach to handling third-party dependencies. Or at the very least, the need for people to realise that there are costs associated with bringing third-party dependencies into your codebase.
It's not splitting hairs at all. It's more of a presumption on the part of a large number of readers that the two points argued conflate into "package managers suck, because third-party dependencies suck, and you should write everything from scratch instead".
You should try reading the article before passing judgement.
It's not like the article is called "5 facts that will make you hate package managers. Number 5 will shock you".
I'm a little annoyed that the Hacker News post renamed it to "A critique of package managers", because that carries very different connotations. I'd read an article titled like that as "I have some criticisms that could be addressed", rather than "the entire concept is bad from the start".
What I'm saying is that you have failed in this argument. You hardly even attempt to make it. Thus clickbait.
You said "this is why I am saying it is evil, as it will send you to hell quicker."
Okay, so then it's up to you to prove this hell actually exists. But you don't. You just assert its existence -- "Dependency hell is a real thing which anyone who has worked on a large project has experienced." By framing it this way, you can dismiss anyone who claims to not have experienced this as not having sufficient experience. But reading the comments here, a lot of people have experienced a sort of "dependency hell" (the kind that's talked about in the wiki you link to) that is solved by package managers.
So that's why it's classed as clickbait -- you (admittedly) wrote a provocative headline that you don't even remotely back up.
FYI for the future, since you're lamenting in many comments that people are misinterpreting you: this is why. Given that you don't really attempt to prove that this dependency hell exists or that package managers are evil, and you don't acknowledge anything good about them, it's reasonable to assume your bias is just that dependencies are evil at their core. It's actually the most charitable reading, because otherwise you seem confused.
No it isn't.
Your "more balanced title" isn't even close to what I am saying. I am saying that Package Managers are just bad and should not be used. Not "remember to consider the costs". The net cost is bad for everyone, that's why I said "evil".
For NuGet or Maven, I don't think dependency hell is something you run into, and I don't need a package-manager manager for those languages.
There should be enough trust, just as there is when I do sudo apt install.
His take screams "I want to push my niche approach and promote my language from my ivory tower of language creator". He might not have any relevant experience building line-of-business software, just like I don't have experience building compilers or languages.
Actually, his perspective is quite reasonable. Go is at the other end of the spectrum from languages that encourage "left-pad"-type libraries, and this is a good thing.
As my psychology professor used to say: "Smart is how efficiently you use your intelligence. Or don't."
So someone with a pretty low IQ can be smart - Forrest Gump. Or someone with a high IQ can occasionally be dumb - a professor so attuned to his research topic that it comes at the expense of everything else.
In other words: when someone's knowledge is disproportionately localized/siloed to their respective subfield or domain of expertise, it does not necessarily generalize to others.
I'm certainly not saying this is the case with this particular individual, as I'm personally not familiar with their background. I'm simply stating that it's a plausible explanation for when experts in one domain make naive assertions about another domain they might not have the same experience in.
A guy designing and then implementing a programming language has a much bigger chance of putting a lot of rational thought into tooling like a dependency manager than a typical language consumer, who can and often does easily fall into the language emo wars.
Language designers, in general terms, will fall into the "more knowledgeable than the average developer" category, but let's not pretend they're anything but mere mortals like the rest of us.
NGL, I've somehow lost the thread and can't tell whether we're talking about language-integrated dependency managers in the abstract (in the OP), or specifically about Go, Odin, or something else. I don't know what the emo wars are specifically in reference to, but I think we jumped the shark here.
Yes, dependency hell is "bad", but we have several language and package-management systems today, from Ninja to uv, that make various obvious trade-offs. Optimizing developer time, ergonomics, reproducible builds, and configuration complexity are just some of the axes these pre-existing systems focus on.
If you're extremely lucky, you get to pick a system that aligns with your style of work and your ideals for how software should be built. If you're not, and are like the rest of us, you get stuck with everyone else's poor decisions and are forced to make do. All code is legacy code given the right time horizon, so think about software with all those manual dependencies included on disk and nowhere else. How do you safely apply required security fixes, etc.? Don't be user-hostile; that just leads back to our past sins, like the C of old.
From a purist perspective, you can forgo all software that you have not written in-house or that does not come with the standard library. This is the monk approach, but outside a few niche work environments it's untenable.
How is gingerBill excluded from this group? No one is more invested in a language than its creator(s).
Sure, he might have given it a lot of thought, but he came up with some completely bonkers conclusions. If you don't want dependencies, DON'T IMPORT DEPENDENCIES. Don't make your dependencies extremely hard to add.
Yawn.. saw it before...next, please
I'm glad you saw through me like Superman through a lead book. Which is to say, not at all. I wasn't even thinking of Go. Where did this come from? I never mentioned Go. I don't use it or know how it does its packaging.
Are you projecting your feelings onto me as a sort of substitute for the HN gestalt? The discussion was about package managers being evil.
Now please return to the topic at hand.
Let's say you have the NPM package manager. What prevents you, a rational individual, from saying:
{
  "dependencies": {}
}

So my snarky remark was about him, not about you. I think it's OK to rewind the thread to see what is about whom. I sincerely apologize that I put replies to two distinct human beings, you and that other commenter, in one paragraph. Honestly, I can see that could lead to confusion.
I think we can stop now..
Not clear-headed about this? https://old.reddit.com/r/programming/comments/1nbkwzt/packag...
> gingerbill[S] 1 point 2 hours ago
> So a tool that enables evil is not an evil tool?
See counterpoint: hammers, freezers, cars, arrows, guns, bombs, planes, etc. Each of them *can* enable evil. In the same way, a package manager *can* enable a sprawling dependency list.

> Let's put it this way, what does a package manager specifically (not the other distinctions I make in the article) do (other than enable bad laziness and lack of proper vetting) that is actually good?
https://old.reddit.com/r/programming/comments/1nbkwzt/packag...
And to reply to your next post:
> Getting to hell quicker is not a good thing. "Emerge on the other side quickly", the other side is still hell, you haven't emerged out of it.
Remaining stuck in limbo forever is worse than going to hell faster :) At least in hell you have decent company.

I'd rather use a hammer, even if there is a higher chance of smacking my fingers, than have to hit a nail repeatedly with my head.
Odin is "successful enough" so far. Also, you know about it, so that says something.
I have technically written more Odin than Hare (one Godbolt example, arguably two if you count my explaining how to modify the example to illustrate another problem) but that just means I have more justification to say I don't like it.
I've written a lot more Scheme and I had so thoroughly forgotten writing Scheme that I had to go read the source for myself when I got email about it decades later to be sure it wasn't just a coincidence of author names.
I'm not convinced there is space for any of the "C successor" languages in the twenty-first century, and in the event space is made or given for one, I doubt there'll somehow be room for more. So, with today's field, I would bet on Zig.
And there doesn't have to be "one winner". This isn't Highlander. It is just wonderful that there is now choice in this domain beyond just the old and obvious.
I don't see how a detailed comparison of language successfulness would bring anything valuable to the point being made about the author being experienced. That seemed like just noise.
Btw, the JS ecosystem also has quite a few good packages (and a ton of terrible ones, including some that everyone seems to consider the gold standard).
Anyone I want to work with on a project is going to have the same frustration and want to work on the project less. Only even more so, because they downloaded version 2.7.3-2 but the version I use is 2.7.3-1.
Odin's compiler knows what a package is and will compile it into your program automatically.
`base` is intrinsically necessary to port Odin. `core` seems to be its standard library, your `libc`, `xml`, etc.
And `vendor` is everything else. So you basically get Python's "`core` is where packages go to die" approach iff they take backwards compatibility seriously. Otherwise, they get breaking changes mid-version.
EDIT: Package collections, not packages, per gingerBill.
And we will take backwards compatibility seriously when we hit 1.0, and only "break" on major versions.
I'm talking about post 1.0 language choices:
- Choose backwards compatibility. Packages are frozen in time, and you get "packages go to std to die."
- Choose to break backwards compatibility. The ecosystem is split: some choose Odin 2, some Odin 3.
Then those people will have to manage dependencies, which is a hell of its own, and that will cause problems. People are super lazy, so they will automate it. In the end, the only thing having no package manager gets you is multiple package managers to juggle.
Many languages started without package managers and eventually got them: Java, JavaScript, Python, C, C++.
I know people are lazy and will automate hell. That's the entire point of the article: not everything that can be automated ought to be automated.
And the argument about multiple package managers to juggle only holds IFF there are multiple competing ones, which I honestly doubt would happen with Odin if we enforced what a package is in the language. I just don't want to officially endorse one, ever, because I do view them as evil.
And I don't care that many languages started without them; I am not going to give in.
I don't see how changing the package definition is going to help. JS had no concept of a package, and one was bolted on with NPM. If Odin becomes big enough, the community will override the will of the author.
Plus I don't see huge benefits to not having a package manager other than saving disk space.
Security isn't meaningfully better than with NPM.
The trust problem exists regardless of the package manager.
And people aren't far too trusting, they're far too lazy. And importing packages gets the job done quickly.
Dunno, hanging out in the Odin discord, it definitely attracts a crowd that thinks similarly to Bill. All the "automate everything" crowd have definitely gone to Zig, where you can create automated monstrosities with the comptime stuff and build.zig files. And the crowd that likes NPM gravitates to Rust. So Odin is just fine IMO. People on the discord share libraries that actually do things, versus an entire dependency to write a few basic procedures.
And speaking of JavaScript, nowadays ES6 does have an idea of what packages/modules/libraries are, and it's so much better. All my JS dependencies for my Rails projects are just .esm.js files. I choose modules carefully, don't pull in obfuscated files, and read the source, so I have 2 JS dependencies in one project and a single one in another. I write the rest myself in vanilla JS, and life is great.
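For what it's worth, the pattern is minimal (the file name and formatDate helper here are hypothetical, purely illustrative):

// vendor/date_fmt.esm.js -- a vendored, single-file dependency, checked into the repo
export function formatDate(d) {
  return d.toISOString().slice(0, 10); // "YYYY-MM-DD"
}

// app.js -- imported directly by path; no package manager involved
import { formatDate } from "./vendor/date_fmt.esm.js";
console.log(formatDate(new Date()));

No lockfile, no node_modules; updating the dependency means deliberately replacing one file you've already read.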
Sure, hence the "big enough" part. If you get big enough, you'll get people who are using it as a day-job language, not as their special darling. Having used JS and Java without package managers in a professional setting, I can say they sucked to use.
You import a package, run the main program, see compiler/browser errors, then search the local repo or the Net for the missing library. Essentially, you're the package manager. Which does little for bloat: you can still have a folder and import stuff en masse.
> Odin's compiler knows what a package is and will compile it into your program automatically.
...the word "automatically" should be dropped. Of course compilers compile any supplied dependency "automatically", but it is so obvious that we don't often use the adverb just for that.
They often don't, though. Rust, C, and C++ need either long command-line invocations or a build system for anything beyond hello world. Zig needs a build file for anything beyond hello world.
With Odin, you just invoke "odin build ." and all your dependencies are taken in without needing a build system, build file, makefile, etc.
So how do you explain such a system to someone? This is a genuine question I am not sure how to answer.
But yes, Odin builds it into the compiler. Rust doesn't, but does have Cargo. Both are easy, as far as typical usage goes. Rust automates dependency management; Odin doesn't automate it per se, but does make it easy. Which is what the whole discussion is about: a bunch of HNers whining that Odin makes it too hard, even though everyone sane uses Git anyway, you can add dependencies using Git, and Odin will compile them without a build tool.
So for a Rust project you use Cargo + Git, for an Odin project you use Odin + Git, for a C/C++ project you use Meson (or something else if you hate life) + Git. In the end it's mostly the same, Bill just doesn't seem to want to deal with an NPM or Crates.io situation (and fair enough!).
There is no technical reason the compiler can't do the job of a build system, but they are typically separated because of separation of concerns. Rustc need not be tightly coupled with Cargo---it just has to understand enough of the package concept (`--extern`) for the actual compilation, so the two can be independently managed and evolved. Odin's is the polar opposite, while Zig's approach is somewhere in the middle (compiler-as-a-library).
Both Zig and Rust supply all of what you need in the box, so all Odin is doing here is commingling these features inside a single executable - there's no end-user benefit that I can see.
Odin's compiler does more than rustc or clang, about the same as javac, and less than Zig's or Go's executables.
Some distros might try to support multiple versions of a library. That could require installing them to different prefixes instead of the default. Thus, the build system has to comprehend that.
Literally all code I write runs on Windows, macOS, Android, and Linux. In roughly that order of priority. No I do not and will not use WSL2, it’s an abomination.
To give a concrete example, you said that JavaScript does not have a definition of a "package" in its language. But what does that really mean, and why should it lead to package-manager managers? Because for me, a person who has worked with JavaScript just a little bit, I know package.json exists, and most of the package managers I've worked with agree on what the contents of this file mean. If we limit our understanding to just npm, yarn, and probably bun, I don't see how that causes or contributes to the dependency-hell problem (sure, the problem exists, but how?).
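For instance, a minimal package.json that all three tools read identically (the name and version range are made up for illustration):

{
  "name": "example-app",
  "version": "1.0.0",
  "dependencies": {
    "left-pad": "^1.3.0"
  }
}

npm, yarn, and bun all interpret that "^1.3.0" semver range the same way, which is rather the point: the file format itself doesn't seem to be where the hell comes from.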
You said that Go mitigates the issue of dependency hell to some degree, and this is an interesting thought: give it more exploration! Why does something like Go not have this problem, or at least not have it as severely as JavaScript?
I may not remember the details of what you said in the article and would like to check, but currently I can't access the site because it times out for me.
Dependencies do suck, but that's because managing a lot of complicated code sucks. You need some way to find issues over time and keep things up to date. Dependencies and package managers at least offer us a path to deal with problems. If you are managing your own dependencies, which I imagine would mean vendoring, then you aren't going to keep those dependencies up to date. You aren't going to find out about exploits in the dependencies and apply the fixes.
No, the alternative is to imagine how much trouble we would be in if every project pulled in 5 different SSL libraries. Having one that everybody uses and that is already installed on everyone's system is avoiding dependency hell. Even better if it's in the stdlib.
I sympathise with the arguments but IMO laziness will always win out. If Rust didn't have Cargo to automate dependency hell, someone would create a third party script to fill the gap.
When I worked at Google, every single dependency was strictly vendored (and not in the mostly useless way that Cargo vendors things). There was generally only one version of a dep in the monorepo, and if you wanted something, you generally got to own maintaining it, and you had to make sure it worked for every "customer" -- the giant CI system made sure that you knew if an upgrade would break things. And you reached out to stakeholders to manage the process. Giant trains of dependencies were not a thing. You can do that when you have a seemingly infinite budget.
But technology can indeed make it worse. I love Rust, but I'm not a fan of the loose approach in Cargo and especially Crates.io, which seems to have pulled inspiration from NPM -- which I think is more of a negative than a positive example. It's way too easy to make a mess. Crates.io is largely unmoderated, and its namespace is full of abandoned or lightly maintained projects.
It's quite easy to get away with a maze of giant transitive deps w/ Cargo because Rust links statically by default, so you don't usually end up in DLL hell. But just doing cargo tree on the average large Rust project is a little depressing -- seeing how many separate versions of random number generators, SHA256, MD5, etc. libs you end up with in a single linkage. It may not be the case that every single one is contributing to your binary size... but it's also kind of hard to know.
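As an aside, Cargo can at least make the duplication visible; this lists every crate that appears in the graph at more than one version:

cargo tree --duplicates

It doesn't fix anything, but it shows you the mess.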
Understanding the blast radius of potential issues that come from unmoderated third-party deps is, I think, something many engineers have to learn the hard way: when they deal with a security vulnerability, a fundamental incompatibility, or build-time and binary-size explosions.
I wish there were a far more mature approach to this in our industry. The trend seems to be going in the opposite direction.
I feel we need more of these kinds of distros, so you don't need to manage dependencies directly from upstream and deal with the integration effort yourself. What if we had a Rust distro following this same model, where there is only one version of each dep, some reasonable curation, and nice, clear release cycles? I feel that could be a real boon for the industry.
I dunno, maybe what is needed is a crates.io alternative that is highly, highly moderated and highly, highly selective. A subscription service with a paid staff that manages the packages and makes sure their deps are minimal, consistent with each other, secure, etc.
I can see that being a service that some corporations might pay for. I just came off a gig at a medical devices company that was using Rust and the software BoM side of things kept me up at night. The list of dependencies in the root workspace was long, and in my imagination, full of terrors.
Maybe they'll be fine, but it's not a practice I would recommend if I were starting such a project from scratch.
Possibly, but not guaranteed. Some other languages without a built-in package manager haven't had an external one manage to take over the ecosystem, most (in)famously C and C++, while others have.
I rather appreciate that C and C++ don't have a default package manager that took over - yes, integrating libraries is a bit more difficult, but we also have a lot of small, self-contained libraries that just "do the thing" without pulling in a library that does colored text for logging, which pulls in tokio, which pulls in mio, which pulls in wasi, which pulls in serde, which is insane.
RPM and APT packages are also usually not maintained by the upstream developer but by distro developers who care about making different packages work together so you don't get the dependency hell problem as a user.
But it’s annoying to have to deal with 3 different time libraries, 3 different error-creation libraries, and somehow 2 regex libraries in my dependency tree. Plus, many packages are named stuff like “anyhow” or “nom” or other nonsense words, where you need to google for a while to figure out what a package is supposed to do. That makes auditing more difficult than if the library were named structured-errors or parser-combinator.
I don’t like the Go programming language, but I do like the Go tooling and ecosystem. I wish there were a Rust with Go principles. Swift is kinda in the right ballpark: packages are typically named stuff that makes sense, and Swift is closer to Rust perf and Rust safety than to Go perf and Go safety. But Swift is a tiny ecosystem outside of stuff that depends on the Apple proprietary universe, and the actual APIs in packages can be very magical/clever. ¯\_(ツ)_/¯
I mostly write C++, whose committee is incompetent and sniffs glue. And I deal a lot with Khronos committees, who design pure garbage. “Design by Committee” is a pejorative for a reason.
Unless I am reading this (https://index.ros.org/?search_repos=true) wrong.
Something like OpenCV in the language stdlib is my idea of "batteries included". Like I said: what you consider batteries are not what people in my line of work consider batteries; Odin advertises "batteries included" but looking through the list I wouldn't use any of that in my day to day.
That's why Rust has a small stdlib of just the essentials, because doing so keeps things general, and everyone gets to choose their own idea of batteries.
Python is much older than Go, and has had more packages move from 3rd party into the stdlib to become a "battery", and then atrophy over the years while people move back to 3rd party alternatives with more features that are actually receiving maintenance. Eventually some of those modules were removed from core.
Perhaps the Go model only works when you have a very dedicated core group (for Go, mostly Google employees) around to continuously build and maintain the Cathedral of the standard library + toolchain together. Golang feels very much like UNIX (eg FreeBSD) for this reason, and Rust/Python more like Linux.
Slackware Linux does precisely that.
I'm a Slackware user. Slackware does have a package manager that can install or remove packages, and even a frontend that can use repositories (slackpkg), but dependency resolution is manual. Sure, there are third-party managers that can add dependency resolution, but they do not come with the distro by default.
This is a very personal opinion, but manual dependency management is a feature. Back in the day, I remember installing Mandrake Linux 9.2 and activating the (then new-ish) framebuffer console. The distro folks had no better idea than to force a background "9.2" image on framebuffer consoles, which I hated. I finally found the package responsible for that. Removing it with urpmi, however, meant removing all the graphical desktop components (including X11) because that stupid package was listed as a dependency of everything graphical.
That prompted me to seek alternatives to Mandrake, and I ended up using Slackware. Its simplicity had the added bonus of offering manual dependency resolution.
[0] https://en.wikipedia.org/wiki/Dependency_hell
I find it strange that they use a term with a common meaning, link to that meaning, and then talk about something else.
The second thing is that their version of dependency hell - having lots of dependencies introducing lots of bugs that you would not have written - is not my experience. 99% of the time, my bugs are in my own code, lol. Maybe once you become a much better programmer than me, you stop writing bugs in your own code and instead start having to deal with bugs in the PNG parsing library you depend on or something, and at that point writing your own PNG parsing library becomes a good use of your time. But I'm certainly not at that point.
I've had to fix bugs in dependencies of course. Here is one I fixed yesterday [0]. But it's much closer to the exception than the rule.
1) Have a problem that feels too complicated to hand-code.
2) Go on Internet/forums, find a library. The library is usually a small, flat collection of atomic functions.
3) A senior engineer vets the library and approves it for use.
4) Download the stable version: header file, and the lib file for our platform (on rare occasions, build it from source)
5) Place the .h file in the header path, and the lib file in the lib path; update the Makefile.
6) #include the header and call functions.
7) Update deployment scripts (bash) to scp the lib file to the target environment, or in some cases, use static linking.
8) Subscribe to a mailing list and very occasionally receive news of a breaking change that requires a rebuild.
This may sound like a lot of work, but somehow, it was a lot less stressful than dealing with NPM and node_modules today.
I find that it's the hell of transitive dependencies--you as a developer can reasonably vet a single layer of 10-30 standalone libraries. But if those libraries depend on other libraries, etc, then it balloons into hundreds or thousands of dependencies, and then you're sunk.
For what it's worth, I don't think much of this is essential complexity. Often a library is complicated because it supports 10 different ways of using it, but when you use the library, you're only using 1 of those ways. If everyone is only using 10% of thousands of transitive dependencies, the overall effect is incredibly complicated, but could have been achieved with 10-100% more short-term effort. Sure, "it took twice as long to develop but at least we don't have 10x the dependencies" is a hard sell to management (and often to ourselves), but that's because we usually choose to ignore the costs of depending on software we don't understand and don't control. We think that we're cleverly avoiding having to maintain and secure those libraries we outsourced, but most open-source developers aren't doing a great job of that anyway.
Often it really is easier to develop something from scratch, rather than learn and integrate a library. Not always though, of course.
The author asserts that most open-source projects don't hit the quality standard where their libraries can just be included and will do what they say.
I assert that this is because there's no serious product effort behind most libraries (as in, no dedicated QA/test/release cycle), and no large commercial products use them (or if they do, they either use them in a very limited fashion or just fork them).
Hobbyists do QA as long as it interests them/fits their use case, but only the big vendors do bulletproof releases (which in the desktop realm seems to mean only MS/Apple).
This might have to do with the domain the author chose - desktop development has unfortunately had the life sucked out of it: with every dev being a fullstack/cloud/ML/mobile dev, its mindshare and the resources going toward it have plummeted.
(I also have a sneaking suspicion the author might've encountered those bugs on desktop Linux, which, despite all the cheerleading (and policing of negative opinions), is as much of a buggy mess as ever.)
In my experience, it's quite likely that you'll run into a bug that nobody has ever written about on the internet.
I have an article on my unstructured thoughts on the problems of OSS/FOSS which goes into more depth about this: https://www.gingerbill.org/article/2025/04/22/unstructured-t...
I find myself nodding along to many of the technical and organizational arguments. But I get lost in the licensing discussion.
If it is a cultural problem that people insist on giving things away for free (and receiving them for free), then viral licenses can be very helpful, not fundamentally pernicious.
Outside of the megaprojects, my mental model for GPL is similar to proprietary enterprise software with free individual licenses. The developer gets the benefits of open projects: eyeballs, contributors, adoption, reputational/professional benefits, doing a good deed (if that motivates them) while avoiding permissively giving everything away. The idea that it's problematic that you can't build a business model on their software is akin to the "forced charity" mindset—"why did you make something that I can't use for free?"
If you see a GPL'd bit of code that you really want to use in your business, email the developers with an offer of $X,000 for a perpetual commercial license and a $Y,000/yr support contract. Most are not so ideologically pure to refuse. It's a win-win-win: your business gets the software, the developers don't feel exploited, noncommercial downstream users can enjoy the fruits of open software, and everybody's contributed to a healthier attitude on open source.
This is the wrong thing to automate. You can do this manually; however, it doesn't stop you getting into hell, it just slows you down, as you can still put yourself into hell (in fact, everyone puts themselves into hell voluntarily). The point is that it makes you think about how you get there: if you have to download manually, you will start thinking "maybe I don't want this" or "maybe I can do this instead". And when you need to update packages, being manual forces you to be very careful.
I sympathise with this, but I have to respond that we have to live within existing ecosystems. Getting rid of npm and doing things manually won't make SPAs have fewer dependencies; building them would just be incredibly slow and painful.
The other thing is that your package manager cannot go out to the internet randomly. You need it to download from a place you are comfortable with (which might or might not be the default), one that will exist as long as you need packages and will keep the versions of packages you want around. For a company project, that means an internal server/mirror, because otherwise something you depend on will disappear in the future. (Mostly they decide nobody is using it and delete it, but sometimes the thing is discovered to be an illegal copyright violation - then you have to ask your lawyers what this means for you - perhaps a license is easy to get.)
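With npm, for example, pointing a project at such a mirror is a single line in its .npmrc (the mirror hostname here is hypothetical):

registry=https://npm-mirror.internal.example.com/

Registry installs then flow through the internal server, which can retain whatever package versions you need for as long as you need them.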
Honestly, I don't think this is true in the slightest. Rather, I hypothesize that people want to use such tooling and think the alternatives are slower, which I don't think is true.
If people actually did use fewer dependencies, we would actually have websites that didn't take ages to load and were responsive.
So the existing ecosystems are just bad.
Part of my reproducing the build was to conduct all the library downloading in a miniconda environment, so at the end I had a reproducible recipe.
Is the original author seriously claiming that anybody was better off with the original, "pure" ad-hoc approach?
You don't think making adding dependencies incredibly slow and painful would make people have fewer of them?
But in the context of newer ecosystems, or ones that are more tightly controlled, things might be different. For example, if Apple massively expanded the Swift standard library and made dependency management painful, iOS apps might end up having fewer dependencies.
Same number of lines but in fewer dependencies.
I remember installing software in the early 90s: download the source code, read the README, find and download the dependencies, read their READMEs, repeat a few times. Sometimes a dependency could not compile because of some incompatibility or bug. Some could be fixed, some couldn't. Often everything ended with a successful compile and install, and in one day of work I could have what I now get in a few minutes.
Actually, those were small programs by today's standards. My take is that we would achieve less if we had to use fewer dependencies.
By the way, the last time I compiled something from source was yesterday. It was openvpn3 on Debian 13, which is still unsupported. TL;DR: it works, but the apt-get commands are a little different from the ones in BUILD.md.
It will at least massively help prevent things from breaking unexpectedly.
It won't prevent you from having to cascade a necessary upgrade (such as a security fix) across the entire project until resolution/new equilibrium is achieved.
My solution to the latter is simply to try to depend on as few things as possible. But eventually, the cancer will overtake the project if it keeps growing.
Source: Have worked on a million-LOC Ruby app
The solution is just to depend on fewer things and manage them manually.