Posted by varunsharma07 18 hours ago

Postmortem: TanStack NPM supply-chain compromise (tanstack.com)
https://github.com/TanStack/router/issues/7383
954 points | 400 comments
LelouBil 5 hours ago|
pull_request_target is really a landmine.
Hamuko 4 hours ago|
I'm shocked that big open-source projects are even using it. I was reading through the Actions documentation recently and it did make it pretty clear that you should not be using it for untrusted code.

>Running untrusted code on the pull_request_target trigger may lead to security vulnerabilities. These vulnerabilities include cache poisoning and granting unintended access to write privileges or secrets.

https://docs.github.com/en/actions/reference/workflows-and-a...

LelouBil 3 hours ago||
I feel like GitHub should deprecate it and replace it with pull_request_untrusted or something, and make every shareable aspect (like cache or secrets) an explicit boolean opt-in.
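For context, a minimal sketch of the landmine being discussed (hypothetical workflow; the dangerous part is combining pull_request_target with a checkout of the PR head):

```yaml
# pull_request_target runs in the context of the *base* repo,
# with secrets and a write-capable token available.
on: pull_request_target

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          # DANGEROUS: checks attacker-controlled PR code out into the
          # privileged context
          ref: ${{ github.event.pull_request.head.sha }}
      # install scripts from the PR now execute with secrets and the
      # shared Actions cache in scope
      - run: npm install && npm test
```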
sn0n 17 hours ago||
As Theo goes live…
philipwhiuk 6 hours ago||
GitHub Actions are insecure by default.

Episode #900

slopinthebag 18 hours ago||
My decision to abandon the JS ecosystem and language entirely continues to pay off. What a mess...

I am, however, concerned that this will pwn my workplace. We don't use Tanstack but this seems self-propagating and I doubt all of our dependencies are doing enough to prevent it.

nine_k 18 hours ago||
Abandon NPM in exchange for what? Cargo? Go get? Pip install?

Every package manager that does not analyze and run tests on the packages being uploaded (like Linux distros do) is vulnerable.

ljm 17 hours ago|||
The community decided it's too much effort to vet code before publishing it so here we are.

(I'm not being stupid, even ten years ago there were arguments on HN about whether you should audit your dependencies)

I landed on the 'yes, you should know what code you are getting involved with' side.

baq 6 hours ago||
'yes, you should' needs to be reconciled with 'it's f*g expensive' and 'risk is low'.

nowadays, 'risk is low' isn't true anymore and it's actually cheaper to have a robot spit out a reimplementation of the 5.4% of what you need out of your dependencies instead of auditing the 100%.

devttyeu 17 hours ago||||
Cargo is spiritually based on NPM so it's not much better.

Go Get is closer to always locking dependencies unless you explicitly upgrade them with a go get, so it's much much better in my view.

Yes, you can lock deps in NPM/Cargo/etc. but that's not the default. It is the default in Go.

In Go projects my policy for upgrading dependencies includes running a full AI audit of all code changed across all dependencies. It comes out to ~$200 in tokens every time, but it gives those warm 'not likely to get pwned' vibes. And it comes with a nice report of likely breaking changes etc.
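For illustration, the Go default being described looks like this (hypothetical module path and dependency):

```go
// go.mod — the version is exact and only changes when you explicitly run
// `go get github.com/some/dep@v1.5.0`; go.sum additionally pins the
// content hash of every module in the build.
module example.com/app

go 1.22

require github.com/some/dep v1.4.2
```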

nine_k 17 hours ago|||
> comes out to ~$200 in tokens every time

BTW a curated mirror of <whatever ecosystem> packages, where every package is guaranteed to have been analyzed and tested, could be an easy sell now. Also relatively easy to create, with the help of AI. A $200 every time is less pleasant than, say, $100/mo for the entire org.

Docker does something vaguely similar for Docker images, for free though.

AgentME 17 hours ago|||
People are already scanning npm constantly. You can limit yourself to pre-scanned packages by setting npm's minimum release age setting to 1 or 2 days (a timeframe that all the recent high-profile malicious package versions were unpublished within).
nine_k 17 hours ago||
Note to self: the test suite for vetting a package should include setting the system date some time in the future, to check if an exploit is trying to sleep long enough to defeat the age limit.
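A rough sketch of that check, assuming libfaketime is available (the faketime tool and its offset syntax are real; wiring it into a package-vetting harness is the hypothetical part):

```shell
# Run the package's lifecycle scripts with the clock advanced 90 days,
# inside a throwaway sandbox, to flush out time-delayed payloads that
# sleep past a minimum-release-age window.
faketime '+90d' npm install some-package --foreground-scripts
```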
chickensong 7 hours ago|||
https://www.chainguard.dev/
voxl 17 hours ago||||
It's insane to me you spend $200 on a report you likely rarely read in detail or double check for correctness, yet you're doing it to feel good about security.
devttyeu 17 hours ago||
If it runs in a harness that will alert me when something dodgy is detected I'm fine to stay at that level.

I don't read it in detail because reading in detail is precisely what I delegate to the harness. The alternative is that I delegate all this trust to package managers and the maintainers which quite clearly is a bad idea.

Whether the $$ pricetag is worth it is.. relative. Also in Go you don't update all that often, really when something either breaks or there is a legitimate security reason to do so, which in deep systems software is quite infrequent.

Funnily enough for frontend NPM code our policy was to never ever upgrade and run with locked dependencies, running few years old JS deps. For internal dashboards it was perfectly fine, never missed a feature and never had a supply chain close call.

crab_galaxy 16 hours ago||
> running few years old JS deps

What do you do when a critical vulnerability gets discovered and you have to update a package? How many critical/high-severity vulnerabilities are you running with in production every day to avoid supply chain attacks?

devttyeu 14 hours ago|||
For the stuff in more sensitive deployments it's really quite simple: just set up CORS etc. properly and don't do anything overly fancy on the frontend. Worst case, the user may force some internal function to eval some JS by pasting scripts into the browser's debug console.

Critical severity vulnerabilities are only critical when they are reachable, but are completely meaningless if your application doesn't touch that code at all. It's objectively more risky to "patch" those by updating dependencies than just let them be there.

throawayonthe 15 hours ago|||
they said internal dashboards
nine_k 15 hours ago||
Anyone who gets into the security perimeter may be in for a feast then.
n_e 17 hours ago|||
> Yes, you can lock deps in NPM/Cargo/etc. but that's not the default. It is the default in Go.

How is it not the default in npm?

chuckadams 16 hours ago||
It is the default in both cargo and npm, but "npm install" stupidly enough still updates the lockfile, and you need "npm ci" to actually respect it. I think there's some flag to make install work sanely, but long-term I find the best approach is to use anything other than npm.

I ditched npm for yarn years ago because it had saner dependency resolution (npm's peer dependency algorithm was a constantly moving target), and now I've switched from yarn to bun because it doesn't run hooks in dependencies by default. It also helps that it installs dependencies 10x faster.

cluckindan 16 hours ago||
”npm install” does not update the lockfile in any current major version.

At least not if you haven’t edited your package.json manually.

chuckadams 16 hours ago||||
> Abandon NPM in exchange for what? Cargo? Go get? Pip install?

pnpm, deno, or bun, none of which will run the malicious "prepare" hook in the first place unless specifically allowed.
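For plain npm, the equivalent protection is opt-in rather than the default; a one-line .npmrc setting (note it disables the project's own lifecycle scripts too):

```ini
# .npmrc — never run preinstall/install/postinstall/prepare scripts
# from dependencies (or from the project itself)
ignore-scripts=true
```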

vsgherzi 18 hours ago||||
Even Linux was subjected to an attack, in xz-utils. Granted, it is much harder there, and they have a much better auditing process (something npm should learn from). There really isn't a silver bullet here, unfortunately. The industry as a whole needs to get more serious about this.
nine_k 17 hours ago|||
There's no silver bullet, but getting an exploit into xz took extraordinary effort, a long time, and bespoke code, because it needed to slip under the radar of actual humans reading the code. A Shai-Hulud-style attack won't work with any reasonable Linux distro, like it does with npm.
kelvinjps10 15 hours ago|||
But it was caught by the existing release model, where packages first go to testing and are seen by many people before reaching production systems in the stable release. Debian, for example.
m4rtink 12 hours ago||||
Distro packages maintained (and hopefully audited on update) by separate maintainers?
jadbox 17 hours ago||||
Exactly. The only real way to escape this madness is if we move back to "standard libs", where your project only depends on 1-3 core libraries. For example, .NET and Java are almost entirely 'kitchen sink' ecosystems. Arguably, for simple projects Go has a fairly large standard lib.
spartanatreyu 16 hours ago||
This is exactly why I love Deno so much, it has a standard lib AND a security model that's secure by default.
TZubiri 17 hours ago||||
Just writing the actual code that you are being paid to write
vinyl7 16 hours ago||
The only correct answer
slopinthebag 17 hours ago||||
Both Cargo and Go's package manager are a lot better. Can you name comparable security incidents they've had in the last 5 years?

Idk about Python, I refuse to use that language for other reasons.

pier25 17 hours ago||
It makes more sense to attack packages in NPM since it's by far the most popular package manager.
gitaarik 9 hours ago||
Yeah indeed, you can move to a less popular ecosystem and have less risk. Back in the day when I moved from PHP ecosystem to Python, that was a big improvement. But with NPM I feel mixed; there's a lot of crap, but there's also genuinely good stuff. So you have to be a bit more conscious and alert when you make decisions on packages etc. With more mature ecosystems you have that problem less, and you don't have to spend so much time on package research and can rely more on the community. But still there's always a risk there too, so you have to stay alert.
hans-l 17 hours ago|||
[dead]
febusravenga 7 hours ago|||
This is GitHub FU.

The key issue here is cache poisoning, a feature/bug that exists in the utility functions/actions provided by GitHub.

Even if there was a misconfiguration on TanStack's side, the root cause is on GH for even allowing insecure workflows to interfere with secure ones.

Here people are trying to fix the defaults, so the cache isn't written in an insecure context -> https://github.com/actions/cache/issues/1756

(Even if a sufficiently smart attacker could find the key somewhere and skip this kind of protection. Not sure where, but a write-allowing key must exist somewhere at runtime if actions/cache can use it.)

Someone else on this thread:

> On GitLab even if you set the same cache key it will not cross between unprotected and protected runs.

Havoc 18 hours ago|||
Yeah it's a dumpster fire, but I also don't think the other major ecosystems like say python's pypi are any safer structurally
gred 17 hours ago||
There are npm supply chain exploits in the news every other day. I'm honestly surprised that something as decentralized as Go Modules is more reliable, but here we are. The fact that we're not seeing these stories about e.g. Maven is not at all surprising, given the limited need for third party libraries and the culture of careful upgrades in the Java ecosystem. If npm proponents want the ecosystem to survive, they need to demand / create better and stop making excuses.
bakugo 18 hours ago||
I highly recommend enforcing a minimum dependency release age of at least a week across all package managers used at your workplace. Most package managers support it now, and it will save you from the vast majority of these attacks.

https://news.ycombinator.com/item?id=47582632

AgentME 17 hours ago||
Highly recommend using the minimum release age setting, though I think a week is probably overkill. Did any of the recent supply-chain attacks have a malicious version up for more than a day?
bakugo 16 hours ago||
Maybe not, but how much of that was luck? I think it's only a matter of time until a similar compromise happens but nobody notices it for a few days, better safe than sorry.
shevy-java 5 hours ago||
NPM is a never-ending joy of daily what-the-fudges.

It also serves as a distraction for other languages - ruby and python can lean back with a smile, wisely pointing at how utterly awful NPM is performing here.

idoxer 17 hours ago||
Ah shit, here we go again
anonymousab 10 hours ago||
Yet another day where 'pull_request_target` is allowed to exist and cause tons of pain. They really ought to kill it off by now.
rvz 17 hours ago||
Once again, Shai-Hulud is wreaking havoc in the JavaScript and TypeScript ecosystems via NPM.

One of the worst ecosystems brought into the software industry, and the attacks almost always come via NPM. Not even Cargo (Rust) or go mod (Golang) get as many attacks, because at least those ecosystems encourage you to use the standard library.

Both Javascript and Typescript have none and want you to import hundreds of libraries, increasing the risk of a supply chain attack.

At this point, JS and TS are considered harmful.

robertjpayne 17 hours ago||
I don't really buy this. NPM is targeted because it's the largest attack surface with the biggest payoff for a successful attack.

Other ecosystems package managers are really no different in a lot of ways.

NPM's biggest fault is just it allows post/pre install scripts by default without user intervention.

devilsdata 16 hours ago|||
Look I love Rust and hate Typescript. But if NPM didn't exist, wouldn't the attackers just hit the next most popular supply chain? Cargo isn't immune to this, as much as I love Rust and wish more shops used it.
squidsoup 17 hours ago|||
If cargo was as popular as npm, the same issues would surface.
febusravenga 7 hours ago|||
It's not failure of npm/js ecosystem. It's Github Actions failure that allowed this to happen.
pier25 16 hours ago|||
> Both Javascript and Typescript have none and want you to import hundreds of libraries

There are plenty of very popular packages with zero dependencies like Hono or Zod. If you decide to blindly install something with hundreds of deps it's on you.

That said, I do agree the JS standard library should provide a lot more than it does now.

AlotOfReading 17 hours ago|||
I wonder whether NPM has surpassed the costs of the billion dollar mistake, null references. NPM hasn't been around as long, but the industry is much bigger today than it was when systems languages were dominant.
silverwind 16 hours ago|||
Python had these too, no ecosystem is safe.
skydhash 17 hours ago||
The standard C library is also very small. Even though there’s POSIX, for anything that’s not system programming you will be using libraries.

The difference is that the usual C libraries don’t split the project into small molecules for no good reason. You have to be as big as GTK before splitting a library makes sense, in my opinion.

gajus 18 hours ago||
Reminder to secure your npm environments.

https://gajus.com/blog/3-pnpm-settings-to-protect-yourself-f...

Just a handful of settings to save a whole lot of trouble.

jdxcode 14 hours ago||
In aube you get all this out of the box, plus a lifecycle jail (the next major version will have that on by default), and it defaults to trustPolicy=no-downgrade (which would not have helped here, but is still a good default).

It has the strongest security posture of any node pm.

https://aube.en.dev/security.html#jailed-lifecycle-scripts

9dev 10 hours ago|||
Heads up: Your website at en.dev says you're a one-person open source company. That immediately ruled out any of your tools for me and my team; no matter how great they may be, a single developer is a supply chain risk. I wholeheartedly recommend enlarging the team.
Imustaskforhelp 13 hours ago|||
What a pleasant surprise to see jdx in the comments! I was actually using mise when I found aube and decided to post it on Hacker News; I found it really cool!

Though it's a bit sad that it hadn't received traction back then, I must admit, jdx, that a lot of the work you do is really cool.

Also, I am happy to know that you are finally able to work on open source full time. I am glad that I can use open source software created by (in my opinion generous) people like you too. mise is awesome :-D

https://news.ycombinator.com/item?id=48012248

arcza 17 hours ago|||
Wild claim that setting the minimum age to 7 days will result in me "never" getting a supply chain npm vuln.
andix 17 hours ago|||
In this case it would have, because the compromised packages were pulled within 3 hours.
saghm 16 hours ago|||
This sort of mitigation seems like it makes sense in the short term, but it seems like it would only work as long as most people don't do it. If everyone has this set to seven days, it will take seven days plus three hours to get things yanked, and then there will be people who will set to 14 days...
worble 16 hours ago|||
No, it's still a very useful mitigation tool.

1. Package owners will often realise they've been hacked quickly, since there are releases they never authorised. This gives them plenty of time to raise the alarm and yank the packages.

2. Independent security researchers and other automated vulnerability scans will still be checking the latest releases even if users aren't using them.

Yes, it's not a perfect defense, but it would help a lot.

bmandale 10 hours ago||||
Some people would set up tooling to look for compromises the moment they get published. What's neat about this is that as an attacker you have no way to determine beforehand whether you'll get caught by this. So you would run your attack, it would lead to a compromised package being published, then the world would get a chance to look at it and see if they can detect the issue with it. This would of course lead to attackers being a lot sneakier. But I think due to the opaque nature of what checks people are running against packages and what they might notice, a much smaller number of attacks would make it through. Of course the ones that did by definition would be the ones that were impossible to detect and would thus stick around a lot longer.
omcnoe 15 hours ago||||
These malicious packages are being caught by the authors, and by automated package security scanners, not just by end users. npm should start setting this 7 day cooldown as default.
andix 15 hours ago||
Even 12 hours would probably be enough. Those automatic malware scanning companies are getting really fast.
conradkay 12 hours ago|||
Mine's set to 1 day (seems to be enough from all the cases we've learned about), I got you.

Also seems like this attack and most others were caught by automated tooling from 3rd parties

mayama 13 hours ago|||
You are betting that the package is popular and has enough eyes to mitigate the attack in 7 days. Attackers could also target unpopular packages for the long game.
pastel8739 16 hours ago||||
There is a “fresh” in there
Narretz 17 hours ago|||
Isn't this article wrong about npm minimum release age? 1. The config is min-release-age. 2. For some reason they have chosen to make it days instead of minutes: https://docs.npmjs.com/cli/v11/using-npm/config#min-release-...

Completely unforced fragmentation of the dependency manager space imo

bakugo 17 hours ago||
This confused me too, until I realized that the article is about pnpm, not npm (pnpm reads .npmrc for some reason, despite not having the same options as npm)

On a related note, it seems to be impossible to find the documentation of min-release-age by googling it. Very annoying.

davnicwil 16 hours ago||
I just set this up for npm, here's the command that worked for me:

npm config set min-release-age 7

The '7' is days. This is the only format that worked for me, just a single integer number of days.

Confirmed by trying to install the latest version of React, 19.2.6 (published 5 days ago as of the time of this comment). It failed with a message confirming that it could not find such a version published before a week ago.

arkon_hn 15 hours ago|||
Also `allow-git=none` for npm v11+: https://github.blog/changelog/2026-02-18-npm-bulk-trusted-pu...
mebcitto 6 hours ago|||
Unfortunately there is currently an issue in pnpm that makes `minimumReleaseAge` difficult: https://github.com/pnpm/pnpm/issues/11068
rvz 17 hours ago||
And absolutely pin, pin, pin, ALL your dependencies.

If I see a package version dependency that looks like this: ^1.0.0 or even this: "*", then stop reading, pin it to a secure version immediately.
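As a sketch with hypothetical package names, the difference looks like this in package.json (the first two entries float to whatever the registry serves, the last is pinned to an exact version):

```json
{
  "dependencies": {
    "floating-lib": "^1.0.0",
    "anything-goes-lib": "*",
    "pinned-lib": "1.4.2"
  }
}
```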

AgentME 17 hours ago|||
Npm's package-lock.json already handles pinning everything to exact versions, including subdependencies. Pinning exact versions in package.json doesn't affect your subdependencies.
beart 13 hours ago||
You aren't wrong. However, this article does offer some additional advice on this matter, and some potential reasons why it might still be desirable to pin your deps in package.json.

https://docs.renovatebot.com/dependency-pinning/#pinning-dep...

Some excerpts:

> If a lock file gets out of sync with its package.json, it can no longer be guaranteed to lock anything, and the package.json will be the source of truth for installs.

> provides much less visibility than package.json, because it's not designed to be human readable and is quite dense.

> If the package.json has a range, and a new in-range version is released that would break the build, then essentially your package.json is in a state of "broken", even if the lock file is still holding things together.

eqvinox 16 hours ago||||
Or help distributions do the manual process of packaging - which involves at least rudimentary security checks - so they can ship newer versions faster.

And then use distro packages.

(I'm not accepting distro fragmentation as counterargument. With containerization the distro is something you can choose. Choose one, help there, and use it everywhere.)

losvedir 16 hours ago||||
Are you talking about in package.json? What's your threat model? That's what the lock file is for, which also pins transitive dependencies, which is just as crucial. Now what's actually insecure is if you don't commit the lockfile, and if you don't do `npm ci`.

I think `npx` might pull down new versions, too? I wish npm worked more like Elixir where updating the lock file was an explicit command, and everything else used the lock file directly.

jonchurch_ 17 hours ago||||
it's so wild to have seen this advice reverse course over the past year.

it used to be that projects that pinned deps were called out as being less secure due to not being able to receive updates without a publish.

different times, different threat model I suppose

n_e 17 hours ago||
> it used to be that projects that pinned deps were called out as being less secure due to not being able to receive updates without a publish.

This is still the right advice for libraries. For security it doesn’t matter a whole lot anymore, as package managers can force transitive dependency versions, but it allows for much better transitive dependency deduplication.

For non-libraries it doesn’t matter as the exact versions get pinned in the package-lock.

captn3m0 17 hours ago|||
I've been collecting things you can't pin:

- Python inline dependencies in PEP-0723, which you can pin with a==1.0, but can't be hash-pinned afaik.

- The bin package manager lets you pin binaries, but they aren't hash-pinned either.

- The pants build tool suggests vendoring a get-pants.sh script[0] but it downloads the latest. Even if you pass it a version, it doesn't do any checks on the version number and just installs it to ~/.local/bin

[0]: https://github.com/pantsbuild/setup/blob/gh-pages/get-pants....
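For reference, the PEP 723 inline metadata in question looks like this (hypothetical script): versions can be pinned exactly with `==`, but the spec defines no field for a content hash, so the downloaded artifact itself is not integrity-pinned.

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "requests==2.31.0",
# ]
# ///
# Exact version above, but no way to express a sha256 for the artifact.
```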

vorsken 2 hours ago|
[dead]