Posted by varunsharma07 15 hours ago
Given the recent LPE vulns, Docker 100% won’t cut it.
And containers were never meant primarily as a security boundary anyway.
Also: in addition to isolation and capability-based security (https://en.wikipedia.org/wiki/Capability-based_security) between processes, consider capability security within processes; see languages like E (https://web.archive.org/web/20260506035108/https://erights.o...) or Monte (https://monte.readthedocs.io/en/latest/index.html).
Is it no longer the right idea?
It would limit the blast radius, which at least is an improvement.
Is there evidence that downstream packages which pulled in or bundled the TanStack packages should be considered safe?
TanStack’s compromise infected a bunch of other packages; resolving TanStack’s own issue doesn’t fix the widespread one.
This is too reductive of the situation.
If it ain’t broke, don’t fix it. Except, in this case, unless someone tells you it’s broken you won’t even know you need to fix it.
And this is where asymmetry comes into play. Attackers are free to test and break as much as they want as long as they stay silent, whereas maintainers don’t know whether the fix an LLM proposes will actually address the issue or cause a regression elsewhere.
IMO, if Microsoft actually wants good PR around GitHub for once, they would offer free LLM security audits on all Actions for at least the X most popular repos…
Well, one of the simplest mitigations is that `pull_request_target` jobs shouldn't have write access to the cache; they can read it for performance, but not write.
To extrapolate the rule: `pull_request_target` jobs shouldn't have any way to invoke external side effects.
In the strictest scenario, they shouldn't have network access at all ... or only GET <safeUrl>, where safe URLs are somehow vetted previously on main, derived from yarn.lock and similar manifests. A pain to set up; no wonder nobody does that.
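A sketch of what the read-but-don't-write cache rule could look like in a workflow. This assumes the official `actions/cache/restore` sub-action, which restores a cache without ever saving one; note that cache access isn't governed by the `GITHUB_TOKEN` `permissions` block, so the restore-only sub-action is the practical lever here. The job name, cache path, and key are made up for illustration.

```yaml
# Hypothetical pull_request_target workflow: read-only token, restore-only cache.
name: pr-target-example
on: pull_request_target

permissions: read-all   # no write scopes on the GITHUB_TOKEN

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # restore-only sub-action: can read a cache hit, never writes one back
      - uses: actions/cache/restore@v4
        with:
          path: ~/.npm
          key: npm-${{ hashFiles('**/package-lock.json') }}
      - run: npm ci --ignore-scripts
```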
Okay, it's a security issue, but the stance is "just mitigate it; we won't fix it."
In a recent thread, people asked me why I don't count GitHub Actions as a positive feature added since the MS acquisition.
PSA: npm/bun/pnpm/uv now all support setting a minimum release age for packages. I also have `ignore-scripts=true` in my ~/.npmrc; based on the analysis, that alone would have mitigated the vulnerability (bun and pnpm do not execute lifecycle scripts by default). Here's how to set a global minimum release age of 7 days:

~/.config/uv/uv.toml

  exclude-newer = "7 days"

~/.npmrc

  min-release-age=7 # days
  ignore-scripts=true

~/Library/Preferences/pnpm/rc

  minimum-release-age=10080 # minutes

~/.bunfig.toml

  [install]
  minimumReleaseAge = 604800 # seconds

If you do need to override the global setting, you can do so with a CLI flag:

  npm install <package> --min-release-age 0
  pnpm add <package> --minimum-release-age 0
  uv add <package> --exclude-newer "0 days"
  bun add <package> --minimum-release-age 0
I should add one extra note. There seems to be some concern that mass adoption of dependency cooldowns will lead to vulnerabilities being caught later, or that using dependency cooldowns is some sort of free-riding. I disagree with that. What you're trading by using dep cooldowns is time preference, and some people will always have a higher time preference than you.

One caveat on ignore-scripts, from the npm docs [0]:

  Note that commands explicitly intended to run a particular script, such as
  npm start, npm stop, npm restart, npm test, and npm run-script will still
  run their intended script if ignore-scripts is set, but they will not
  run any pre- or post-scripts.

[0]: https://docs.npmjs.com/cli/v8/commands/npm-run-script#ignore...

If you don't have min-release-age set, remember that you can still pull in affected packages via indirect dependencies.
And ideally pin your package manager version too.
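To audit those indirect pulls you can walk the lockfile. A rough sketch for npm's package-lock.json v2/v3 layout, where keys under `"packages"` are node_modules paths; the function is illustrative, not a real tool:

```python
import json

def resolved_versions(lockfile_path: str, package: str) -> dict:
    """Map each install path of `package` in a package-lock.json (v2/v3)
    to the version it resolved to, including nested copies."""
    with open(lockfile_path) as f:
        lock = json.load(f)
    versions = {}
    for path, meta in lock.get("packages", {}).items():
        # Keys look like "node_modules/a/node_modules/b"; match the full
        # trailing segment so "foo" does not match "foobar".
        if path == "node_modules/" + package or \
                path.endswith("/node_modules/" + package):
            versions[path] = meta.get("version")
    return versions
```

Compare the returned versions against the advisory's affected range: a clean direct dependency can still carry a compromised nested copy.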
~/.config/pip/pip.conf

  [install]
  uploaded-prior-to = P3D
Interesting days.