Posted by varunsharma07 18 hours ago
>Running untrusted code on the pull_request_target trigger may lead to security vulnerabilities. These vulnerabilities include cache poisoning and granting unintended access to write privileges or secrets.
https://docs.github.com/en/actions/reference/workflows-and-a...
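To make the quoted warning concrete, the dangerous shape looks roughly like this (an illustrative sketch, not TanStack's actual workflow; the job and step names are made up):

```yaml
# pull_request_target runs in the context of the BASE repo, with secrets
# and write permissions, even though the PR itself is untrusted.
name: ci
on: pull_request_target
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          # Checks out the attacker-controlled PR head...
          ref: ${{ github.event.pull_request.head.sha }}
      # ...then runs its install scripts and tests with access to
      # secrets and the shared Actions cache.
      - run: npm install && npm test
```

The safe variants are to use plain `pull_request` (no secrets, read-only token) or, if `pull_request_target` is genuinely needed, to never check out and execute the PR's code.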
Episode #900
I am, however, concerned that this will pwn my workplace. We don't use Tanstack but this seems self-propagating and I doubt all of our dependencies are doing enough to prevent it.
Every package manager that does not analyze and run tests on the packages being uploaded (like Linux distros do) is vulnerable.
(I'm not being stupid, even ten years ago there were arguments on HN about whether you should audit your dependencies)
I landed on the 'yes, you should know what code you are getting involved with' side.
Nowadays 'the risk is low' isn't true anymore, and it's actually cheaper to have a robot spit out a reimplementation of the 5.4% of what you need from your dependencies than to audit the 100%.
Go's tooling is closer to always locking dependencies unless you explicitly upgrade them with a `go get`, so it's much, much better in my view.
Yes, you can lock deps in NPM/Cargo/etc. but that's not the default. It is the default in Go.
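For npm specifically, exact pins can be made the default with a one-line config; a minimal sketch (`save-exact` is a real npm setting, the surrounding workflow choice is mine):

```ini
# .npmrc
save-exact=true   ; "npm install foo" now records "foo": "1.2.3" instead of "^1.2.3"
```

Pair it with `npm ci` in CI, which installs strictly from package-lock.json and fails if the lock file is out of sync with package.json.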
In Go projects, my policy for upgrading dependencies includes running a full AI audit of all code changed across all dependencies. It comes out to ~$200 in tokens every time, but it gives those warm 'not likely to get pwned' vibes. And it comes with a nice report of likely breaking changes, etc.
BTW, a curated mirror of <whatever ecosystem> packages, where every package is guaranteed to have been analyzed and tested, could be an easy sell now. It's also relatively easy to create with the help of AI. $200 every time is less pleasant than, say, $100/mo for the entire org.
Docker does something vaguely similar for Docker images, for free though.
I don't read it in detail because reading in detail is precisely what I delegate to the harness. The alternative is that I delegate all this trust to package managers and the maintainers which quite clearly is a bad idea.
Whether the $$ price tag is worth it is... relative. Also, in Go you don't update all that often; really only when something breaks or there is a legitimate security reason to do so, which in deep systems software is quite infrequent.
Funnily enough, for frontend NPM code our policy was to never ever upgrade and to run with locked dependencies, i.e. JS deps a few years old. For internal dashboards it was perfectly fine; we never missed a feature and never had a supply chain close call.
What do you do when a critical vulnerability gets discovered and you have to update a package? How many critical/high severity vulnerabilities are you running with in production every day to avoid supply chain attacks?
Critical severity vulnerabilities are only critical when they are reachable, but are completely meaningless if your application doesn't touch that code at all. It's objectively more risky to "patch" those by updating dependencies than just let them be there.
How is it not the default in npm?
I ditched npm for yarn years ago because it had saner dependency resolution (npm's peer dependency algorithm was a constantly moving target), and now I've switched from yarn to bun because it doesn't run hooks in dependencies by default. It also helps that it installs dependencies 10x faster.
At least not if you haven’t edited your package.json manually.
pnpm, deno, or bun, none of which will run the malicious "prepare" hook in the first place unless specifically allowed.
Idk about Python, I refuse to use that language for other reasons.
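In pnpm 10+, dependency build scripts are blocked by default and have to be allowlisted per package in package.json; a sketch (`esbuild` is just an example of a package that legitimately needs its build step):

```json
{
  "pnpm": {
    "onlyBuiltDependencies": ["esbuild"]
  }
}
```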
The key issue here is cache poisoning, which is a feature/bug in the utility functions/actions provided by GitHub.
Even if there was a misconfiguration on the TanStack side, the root cause is on GH for even allowing insecure workflows to interfere with secure ones.
Here people are trying to fix the defaults so that caches aren't written in an insecure context -> https://github.com/actions/cache/issues/1756
(Even if a sufficiently smart attacker would find the key somewhere and skip this kind of protection; not sure where, but a write-allowing key must exist somewhere in the runtime if actions/cache can use it.)
Someone else on this thread:
> On GitLab even if you set the same cache key it will not cross between unprotected and protected runs.
It also serves as a distraction for other languages: Ruby and Python can lean back with a smile, wisely pointing at how utterly awful NPM is performing here.
It's one of the worst ecosystems brought into the software industry, and the attacks almost always come via NPM. Not even Cargo (Rust) or go mod (Go) see as many attacks, because those ecosystems at least encourage you to use the standard library.
JavaScript and TypeScript have no real standard library and want you to import hundreds of libraries, increasing the risk of a supply chain attack.
At this point, JS and TS are considered harmful.
Other ecosystems' package managers are really no different in a lot of ways.
NPM's biggest fault is just that it allows pre/post-install scripts by default, without user intervention.
There are plenty of very popular packages with zero dependencies like Hono or Zod. If you decide to blindly install something with hundreds of deps it's on you.
That said, I do agree the JS standard library should provide a lot more than it does now.
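Stock npm can at least be made to behave similarly by disabling lifecycle scripts; a sketch using the real `ignore-scripts` setting (the caveats noted are mine):

```ini
# .npmrc
ignore-scripts=true
```

Caveats: packages that genuinely need a postinstall step (native builds, etc.) will need that step run explicitly, and the flag also skips your own project's pre/post scripts, though an explicit `npm run <script>` still works.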
The difference is that the usual C libraries don't split the project into small molecules for no good reason. You have to be as big as GTK before splitting a library makes sense, in my opinion.
https://gajus.com/blog/3-pnpm-settings-to-protect-yourself-f...
Just a handful of settings to save a whole lot of trouble.
It has the strongest security posture of any Node package manager.
It's a bit sad that it didn't get traction back then, but I must admit, jdx, that a lot of the work you do is really cool.
Also, I'm happy to know that you're finally able to work on open source full time. I'm glad I can use open source software created by (in my opinion, generous) people like you; mise is awesome :-D
1. Package owners will often realise they've been hacked quickly, since there are releases they never authorised. This gives them plenty of time to raise the alarm and yank the packages.
2. Independent security researchers and other automated vulnerability scans will still be checking the latest releases even if users aren't using them.
Yes it's not a perfect defense but it would help a lot.
Also, it seems like this attack and most others were caught by automated tooling from third parties.
Completely unforced fragmentation of the dependency manager space imo
On a related note, it seems to be impossible to find the documentation of min-release-age by googling it. Very annoying.
npm config set min-release-age 7
The '7' is days. This is the only format that worked for me, just a single integer number of days.
Confirmed by trying to install the latest version of React 19.2.6 (published 5 days ago as of the time of this comment). It failed with a comment confirming that it could not find such a version published before a week ago.
If I see a package version dependency that looks like this: ^1.0.0, or even this: "*", I stop reading and pin it to a known-good version immediately.
https://docs.renovatebot.com/dependency-pinning/#pinning-dep...
Some excerpts:
> If a lock file gets out of sync with its package.json, it can no longer be guaranteed to lock anything, and the package.json will be the source of truth for installs.
> [The lock file] provides much less visibility than package.json, because it's not designed to be human readable and is quite dense.
> If the package.json has a range, and a new in-range version is released that would break the build, then essentially your package.json is in a state of "broken", even if the lock file is still holding things together.
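Concretely, the difference the excerpts are pointing at looks like this (versions are illustrative; the comments are annotations only and not valid in a real package.json):

```jsonc
// Range: a newly published in-range release changes what a fresh install gets
{ "dependencies": { "react": "^19.2.0" } }

// Pinned: package.json alone determines the version, lock file or not
{ "dependencies": { "react": "19.2.0" } }
```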
And then use distro packages.
(I'm not accepting distro fragmentation as counterargument. With containerization the distro is something you can choose. Choose one, help there, and use it everywhere.)
I think `npx` might pull down new versions, too? I wish npm worked more like Elixir, where updating the lock file is an explicit command and everything else uses the lock file directly.
It used to be that projects that pinned deps were called out as less secure, since they couldn't receive updates without a publish.
Different times, different threat model, I suppose.
This is still the right advice for libraries. For security it doesn't matter a whole lot anymore, as package managers can force the version of transitive dependencies, but ranges allow for much better transitive dependency deduplication.
For non-libraries it doesn’t matter as the exact versions get pinned in the package-lock.
- Python inline dependencies in PEP 723, which you can pin with `a==1.0` but can't hash-pin, AFAIK.
- The bin package manager lets you pin binaries, but they aren't hash-pinned either.
- The Pants build tool suggests vendoring a get-pants.sh script[0], but it downloads the latest version. Even if you pass it a version, it doesn't do any checks on the version number and just installs it to ~/.local/bin.
[0]: https://github.com/pantsbuild/setup/blob/gh-pages/get-pants....
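For comparison, pip does support hash-pinning via its hash-checking mode, just not (AFAIK) for PEP 723 inline metadata; a requirements.txt sketch (the digest below is a placeholder, not a real hash):

```
# requirements.txt -- install with: pip install --require-hashes -r requirements.txt
requests==2.32.3 \
    --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000
```

With `--require-hashes`, pip refuses to install anything, including transitive dependencies, that isn't pinned with a matching hash; files like this can be generated with pip-tools' `pip-compile --generate-hashes`.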