Posted by sohkamyung 12/22/2025
Something that I find to be a frustrating side effect of malware issues like this is that it seems to result in well-intentioned security teams locking down the data in apps.
The justification is quite plausible -- in this case WhatsApp messages were being stolen! But the thing is, if this isn't what they steal, they'll steal something else.
Meanwhile locking down those apps so the only apps with a certain signature can read from your WhatsApp means that if you want to back up your messages or read them for any legitimate purpose you're now SOL, or reliant on a usually slow, non-automatable UI-only flow.
I'm glad that modern computers are more secure than they have been, but I think that defense in depth by locking down everything and creating more silos is a problem of its own.
So users go through the same steps as if they were connecting another client to their WhatsApp account, and the client gets full access to all data of course.
From what I understand WhatsApp is already fairly locked down, so people had to resort to this sort of thing – if WA had actually offered this data via a proper API with granular permissions, there might have been a lower chance of this happening.
A great microcosm of this is the automation permission on macOS right now: there's a separate allow dialog for every single app. If you try to use a general-purpose automation app, it needs to request permission for every single app on your computer individually the first time you use it. Having experienced that in practice... it absolutely sucks.
At this point it makes me feel like we need something like an async audit API. Maybe the OS just tracks and logs all of your apps' activity and then:
1) You can view it of course.
2) The OS monitors for deviations from expected patterns for that app globally (kinda like Microsoft's SmartScreen?)
3) Your own apps can get permission to read this audit log if you want to analyze it your own way and/or be more secure. If you're more paranoid maybe you could use a variant that kills an app in a hurry if it's misbehaving.
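None of this exists today, but point 2 is easy to sketch. A minimal, purely illustrative model (all types and function names invented) of flagging activity that deviates from an app's previously observed baseline:

```typescript
// Hypothetical sketch only: no OS ships an audit API like this today.
type AuditEvent = { app: string; action: string };

// Build a per-app baseline from previously observed activity.
function buildBaseline(history: AuditEvent[]): Map<string, Set<string>> {
  const baseline = new Map<string, Set<string>>();
  for (const e of history) {
    if (!baseline.has(e.app)) baseline.set(e.app, new Set());
    baseline.get(e.app)!.add(e.action);
  }
  return baseline;
}

// Surface events whose action has never been seen before for that app.
function flagDeviations(
  baseline: Map<string, Set<string>>,
  events: AuditEvent[]
): AuditEvent[] {
  return events.filter((e) => !baseline.get(e.app)?.has(e.action));
}
```

The hard part, of course, is defining "action" and "expected" at a useful granularity; the lookup itself is trivial.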
Sadly you can't even implement this as a third party thing on macOS at this point because the security model prohibits you from monitoring other apps. You can't even do it with the user's permission because tracing apps requires you to turn SIP off.
The problem here is that, like so many social-media apps, the first thing the app will do is scrape as much as it possibly can from the device, lest it lose access later, at which point auditing it and restricting its permissions is already too late.
Give an inch, and they’ll take a mile. Better to make them justify every millimetre instead.
We're not in 1980 anymore. Most people need zero apps with full access to the disk, and even power users need at most one or two.
In macOS, for example, the sandbox and the file dialog already allow opening any file, bundle or folder on the disk. I haven't really come across any app that does better browsing than this dialog, but if there's any, it should be a special case. Funny enough, WhatsApp on iOS is an app that reimplements the photo browser, as a dark pattern to force users to either give full permission to photos or suffer.
The only time where the OS file dialog becomes limited is when a file is actually "multiple files". Which is 1) solvable by bundles or folders and 2) a symptom of developers not giving a shit about usability.
I don't think we should be handing more power to OS makers and away from users. There has to be a middle ground between walled gardens and open systems. It would be much better for node & npm to come up with a solution than to lock down access.
Currently OSs are a free-for-all, where the user must blindly trust third-party apps, or they enforce it clumsily like in macOS.
This was fine in 1980 but isn't anymore.
...and this gives them more control, so they can profit from it. Corporate greed knows no bounds.
I'm glad that modern computers are more secure than they have been
I'm not. Back when malware was more prevalent among the lower class, there was also far more freedom and interoperability.
Yeah, “the lower class” had the freedom of having their IM accounts hacked and blast spam/scam messages to all contacts all the time. How nostalgic.
Live internet popups you didn't ask for, live tracking of everything you do, new buttons suddenly appearing in every toolbar. All of it slowing down your machine.
It is already a major security and privacy risk for users to rely on the beneficence and competence of developers (let alone corporations and their constant shady practices/rug-pulls), as all the recent malware and large scale supply chain compromises have shown. I find the only acceptable solution would be to use AI to help users (and devs, for that matter) navigate and manage the exponential complexity of privacy and security.
For a practical example, imagine your iOS AI Agent notifying you that as you had requested, it is informing you that it adjusted the Facebook data sharing settings because the SOBs changed them to be more permissive again after the last update. It may even then suggest that since this is the 5685th shady incident by Facebook, that it may be time to adjust the position towards what to share on Facebook.
That could also extend to the subject story; where one’s agent blocks and warns of the behavior of a library an app uses, which is exfiltrating WhatsApp messages/data and sending it off device.
Ideally such malicious code will soon also be identified way sooner as AI agents can become code reviewers, QA, and even maintainers of open source packages/libraries, which would intercept such behaviors well before being made available; but ultimately, I believe it should all become a function of the user’s agent looking out for their best interests on the individual level. We simply cannot sustain “trust me, bro” security and privacy anymore…especially since as has been demonstrated quite clearly, you cannot trust anyone anymore in the west, whether due to deliberate or accidental actions, because the social compact has totally broken down… you’re on your own… just you and your army of AI agents in the matrix.
It's all well and good for us all to use Linux to side-step this, but sometimes (shock, horror), we even want to _share_ those hacks with other people!
As such, it's kinda nice if the Big Tech software on those devices didn't lock all of our friends in tiny padded cells 'for their own safety'.
NPM and NPM-style package managers that are designed to late-fetch dependencies just before build-time are already fundamentally broken. They're an end-run around the underlying version control system, all in favor of an ill-considered, half-baked scheme to implement an alternative approach to version control of the package manager project maintainers' devising.
And they provide cover for attacks like this, because they encourage a culture where, because one's dependencies are all "over there", the massive surface area gets swept under the rug and they never get reviewed (because 56K NPM users can't be wrong).
The issue I have is that I don't really have a good idea for a solution to this problem - on one hand, I don't expect everyone to roll the entire modern stacks by hand every time. Killing collaborative software development seems like literally throwing the baby out with the bath water. On the other hand, I feel like nothing I touch is "secure" in any real sense - the tick boxes are there, and they are all checked, but I don't think a single one of them really protects me against anything - most of the time, the monster is already inside the house.
Is NPM really collaborative? People just throw stuff out there and you can pick it up. It's the lowest common denominator of collaboration.
The thing that NPM is missing is trust and trust doesn't scale to 1000x dependencies.
Yes, but IMO a tooling change is needed first. There just isn't good infrastructure for doing this.
If, for code, there were a parallel "state" document recording the intent behind each line of code and each function, and that state document were in turn connected to a "higher layer of abstraction" document (recursively, as needed) to tie in higher layers of intent...
Such a thing would make it easier to surface weird behavior imo, alongside general "spec driven design" perks. More human readable = more eyes, and potential for automated LLM analysis too.
I'm not sure it'd be _Perfect_, but I think it'd be loads better than what we've got now
Speed of development and development experience are not metrics to be minimized/discarded lightly. If you were to start a company/product/project tomorrow, a lot of the things you want to be doing in the beginning are not related to these tools. You probably, most of the time, want to be exploring your solution space. Creating a development and CI/CD environment that can fully take advantage of these tools capabilities (like hermeticity and reproducibility) is not straightforward - in most cases setting up, scaling and maintaining these often requires a whole team with knowledge that most developers won't have. You don't want to gatekeep the writing of new software behind such requirements. But I do agree that the default should be closer to this, than what we have today. How we get there - now that is the million dollar question.
Yes it is. Git doesn't operate based on package.json.
You're still trying to devise a scheme where, instead of Git tracking the source code of what you're building and deploying and/or turning into a release, you're excluding parts of that content from Git's purview. That's doing an end-run around the VCS.
But okay, let's go further and use git submodules so that package.json is out of the picture. Even in that case we have the same problem.
Or, let's go even further and vendor the dependency so it is now copied into our source code. Even in that case too, we still have the same problem.
The dependency has been malicious all along, so if we use it in any way the game is already over.
Not "hardly". That's very literally an end-run around the VCS.
This is not a productive discussion.
Your claim is that no matter whether dependencies' source code is acquired by git-clone or npm-install, everything related to this attack unfolds exactly as it did in the timeline where we live. But as I said in my first comment in this thread, going along with The NPM Way changes how people interact with third-party code.
My contention is that in the universe where dependencies get checked into version control, this package (assuming it ever got created at all) would have been less successful in conscripting others to choose it as a dependency, and that in whatever remaining instances it was approved for check-in, the mere act of checking it into version control, and the fact of its sitting there in the repository, would have had a non-zero effect on its being discovered sooner.
Should the OS prevent you from making API calls to WhatsApp's servers? What about the actual library this is based on, should that be blocked as well?
The root of the problem is that users and developers may have legitimate reasons to want API access to a service like WhatsApp. That just comes with a level of risk, especially in a world where we're not used to auditing our dependencies. The only sort-of-maybe solution I can see is the operating system prompting you when an application wants to make an outgoing request, but in this case the messages might just go to AWS and an S3 bucket, or be sent via WhatsApp itself to the attacker. How would you spot that in the operating system without built-in knowledge of WhatsApp specifically?
I don't know, it just seems like every tech area has these problems and I honestly don't understand why there aren't more 'standardized' solutions here
I assume by "underlying version control system" you mean apt, rpm, homebrew and friends? They don't solve this problem either. Nobody in the open-source world is auditing code for you. Compromised xz still made it into apt. Who knows how many other packages are compromised in a similar way?
Also, apt and friends don't solve the problem that npm, cargo, pip and so on solve. I'm writing some software. I want to depend on some package X at version Y (eg numpy, serde, react, whatever). I want to use that package, at that version, on all supported platforms. Debian. Ubuntu. Redhat. MacOS. And so on. Try and do that using the system package manager and you're in a world of hurt. "Oh, your system only has official packages for SDL2, not SDL3. Maybe move your entire computer to an unstable branch of ubuntu to fix it?" / "Yeah, we don't have that python package in homebrew. Maybe you could add it and maintain it yourself?" / "New ticket: I'm trying to run your software in gentoo, but it only has an earlier version of dependency Y."
Hell. Utter hell.
It's not perfect and bad things still make it through, but just look at your example - XZ. This never made it into Debian stable repositories and it was caught remarkably quickly. Meanwhile, we have NPM vulnerability after vulnerability.
Even OpenBSD, famous for auditing their code, doesn't audit packages. Only the base system.
I’m not really sure what you think a maintainer adds here. They don’t audit the code. A well written npm or cargo or pip module works automatically on all operating systems. Why would we need or want human intervention? To what? Manually add each package to N other operating systems? Sounds like a huge waste of time. Especially given the selection of packages (and versions of those packages) in every operating system will end up totally different. It’s a massive headache if you want your software to work on multiple Linux distros. And everyone wants that.
Npm also isn't perfect. But npm also has 20x as many packages as apt does on Ubuntu (3.1M vs 150k). I wouldn't be surprised if there is more malicious code on npm. Until we get better security tools, it's buyer beware.
No. Git.
That's still true of nix. Whether you should trust a package is on you. But nix solves everything else listed here.
> I want to use that package, at that version, on all supported platforms...
Nix derivations will fail to build if their contents rely on the FHS (https://refspecs.linuxfoundation.org/FHS_3.0/fhs/index.html), so if a package blindly trusts that `/bin/bash` is in fact a compatible version of what you think it is, it won't make it into the package set. So we can each package our own bash scripts, and instead of running on whatever "bash" happens to be on the PATH, each will run on the precise version of bash we packaged it with. This goes for everything: compilers, linkers, interpreters, packages you might otherwise have installed with pip or npm or cargo... nix demands a hash for each up front. It could still have been malicious the whole time, but it can't suddenly become malicious at a later date.
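The pinning idea itself is tiny; stripped of everything nix-specific, it reduces to a content-hash check (a sketch of the principle, not nix's actual implementation):

```typescript
// nix-style pinning, reduced to its core: a dependency's contents are
// accepted only if they hash to the value recorded up front, so "the same
// name" can never silently become different bytes later.
import { createHash } from "node:crypto";

function verifyPinned(contents: string, expectedSha256: string): boolean {
  const actual = createHash("sha256").update(contents).digest("hex");
  return actual === expectedSha256;
}
```

Lockfiles in npm, cargo and friends record the same kind of hash; the difference is that nix applies it to the entire toolchain, not just your direct dependencies.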
> ... Debian. Ubuntu. Redhat. MacOS. And so on. Try and do that using the system package manager and you're in a world of hurt.
If you're on NixOS, nix is your system package manager. If you're not, you can still install nix and use it on all of those platforms (not Windows; certain heroic folk are working on that, though WSL works).
> "Oh, your system only has official packages for SDL2, not SDL3. Maybe move your entire computer to an unstable branch of ubuntu to fix it?"
I just installed SDL3, nix put it in `/nix/store/yla09kr0357x5khlm8ijkmfm8vvzzkxb-sdl3-3.2.26`. Then I installed SDL2, nix put it in `/nix/store/a5ybsxyliwbay8lxx4994xinr2jw079z-sdl2-compat-2.32.58`. If I want one or the other at different times, nix will add or remove those from my path. I just have to tell nix which one I want...
$ nix shell nixpkgs#sdl2-compat
$ # now I have sdl2
$ exit
$ nix shell nixpkgs#sdl3
$ # now I have sdl3
> "Yeah, we don't have that python package in homebrew. Maybe you could add it and maintain it yourself?"

All of the major languages have some kind of foo2nix adapter package. When I want to use a python package that's not in nixpkgs, I use uv2nix and nix handles enforcing package sanity on them (i.e. it maps uv.lock, a python thing, into flake.lock, a nix thing). I've been dabbling with typescript lately, so I'm using pnpm2nix to map typescript libraries in a similar way.
The learning curve is no joke, but if you climb it, only the hard problems will remain (deciding if the package is malicious in the first place).
Also, you'll have a new problem. You'll be forever cursed to watch people shoot themselves in the foot with inferior packaging, you'll know how to help them, but they'll turn you down with a variant of "that looks too unfamiliar, I'm going to stick with this thing that isn't working".
If apt's DNA was to download package binaries straight from Github, then I would blame it on the package manager for making it so inherently easy to download malware, wouldn't I?
You're making assumptions that I am making assumptions, but I wasn't making assumptions. I understand the attack.
> Package manager doesn’t really play into this.
It does, for the reasons I described.
Kind of a terrifying statement, right there.
The industry runs on a lot more unexamined trust than people think.
They’re deployed automatically by machine, which definitionally can’t even give it a second thought. The upstream trust is literally specified in code, to be reused constantly automatically. You could get owned in your sleep without doing anything just because a publisher got phished one day.
Well, I should qualify that. I do use quite a few dependencies, but they are ones that I wrote.
I don't know what the solution here is, other than to stop using npm.
Personally I think we need to start adding capability based systems into our programming languages. Random code shouldn't have "ambient authority" to just do anything on my computer with the same privileges as me. Like, if a function has this signature:
function add(a: int, b: int) -> int
Then it should only be able to read its input, and return any integer it wants. But it shouldn't get ambient authority to access anything else on my computer. No network access. No filesystem. Nothing.

Philosophically, I kind of think of it like function arguments and globals. If I call a function foo(someobj), then function foo is explicitly given access to someobj. And it also has access to any globals in my program. But we generally consider globals to be smelly. Passing data explicitly is better.
But the whole filesystem is essentially available as a global that any function, anywhere, can access. With full user permissions. I say no. I want languages where the filesystem itself (or a subset of it) can be passed as an argument. And if a function doesn't get passed a filesystem, it can't access a filesystem. If a function isn't passed a network socket, it can't just create one out of nothing.
I don't think it would be that onerous. The main function would get passed "the whole operating system" in a sense - like the filesystem and so on. And then it can pass files and sockets and whatnot to functions that need access to that stuff.
If we build something like that, we should be able to build something like npm but where you don't need to trust the developers of 3rd party software so much. The current system of trusting everyone with everything is insane.
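For illustration, here's what that discipline looks like in ordinary TypeScript. Nothing here is enforced by the language (a real capability-safe language would refuse to compile an `add` that touched the filesystem), and the `ReadableDir` interface is invented for the example:

```typescript
// Capability-style dependency passing: functions receive exactly the
// resources they are allowed to touch, as arguments.
interface ReadableDir {
  read(path: string): string;
}

// No capability in the signature: under a capability system this function
// could not reach the filesystem or network even if it wanted to.
function add(a: number, b: number): number {
  return a + b;
}

// Filesystem access is declared explicitly by taking a capability argument.
function countLines(dir: ReadableDir, path: string): number {
  return dir.read(path).split("\n").length;
}

// main() would receive the real filesystem and pass narrowed views down;
// tests (or audits) can substitute a fake.
const fakeDir: ReadableDir = { read: () => "a\nb\nc" };
```

A nice side effect: code written this way is also trivially testable, since every capability can be faked.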
1. Capabilities given to a program by the user. Eg, "This program wants to access your contacts. Allow / deny". But everything within a program might still have undifferentiated access. This requires support from the operating system to restrict what a program can do. This exists today in iOS and Android.
2. Capabilities within a program. So, if I call a function in a 3rd party library with the signature add(int, int), it can't access the filesystem or open network connections or access any data that's not in its argument list. Enforcing this would require support from the programming language, not the operating system. I don't know of any programming languages today which do this. C and Rust both fail here, as any function in the program can access the memory space of the entire program and make arbitrary syscalls.
Application level permissions are a good start. But we need the second kind of fine-grained capabilities to protect us from malicious packages in npm, pip and cargo.
Look at a mobile program such as Gadgetbridge, which synchronizes data between a mobile device and a watch, and the number of permissions it requires: contacts, bluetooth pairing, notifications, yadda yadda, the list goes on.
Systems like E-Lang wouldn't bundle all these up into a single application. Your watch would have some capabilities, and those would interact directly with capabilities on the phone. I feel like if you want to look at our current popular mobile OS's as capability systems the capabilities are pretty coarse grained.
One thing I would add about compilers, npm, pip, cargo: compilers are transformational programs; they really only need read and write access to a finite set of inputs and outputs. In that sense, even capabilities are overkill, because honestly they only need the bare minimum of IO; a batch processing system could do better than our mainstream OS security model.
Ironically, any c++ app I've written on windows does exactly this. "Are you sure you want to allow this program to access networking?" At least the first time I run it.
I also rarely write/run code for windows.
If I can't yum (et al.) install it, I absolutely review the past major point releases for an hour and do my research on the library.
- Random numbers
- Timezones, date formatting
- JSON parsing & serialization
- Functional programming tools (map, filter, reduce, Object.fromEntries, etc)
- TypedArrays
And if you use bun or nodejs, you also have out of the box access to an HTTP server, filesystem APIs, gzip, TLS and more. And if you're working in a browser, almost everything in jquery has since been pulled into the browser too. Eg, document.querySelector.
Of course, web frameworks like react aren't part of the standard library in JS. Nor should they be.
What more do you want JS to include by default? What do java, python and go have in their standard libraries that JS is missing?
But of course it fucking doesn't because it's a scripting language for the web. It has what it needs, and to do that it doesn't need much.
It does though! The JS stdlib even includes an entire wasm runtime. It's huge!
Seriously. I can barely think of any features in the C++ stdlib that are missing from JS. There's a couple - like JS is missing std::priority_queue. But JS has soooo much stuff that C++ is missing. It's insane.
Also not many people seem to know this, but in the aftermath of leftpad being pulled from npm, npmjs changed their policy to disallow module authors from ever pulling old packages, outside a few very exceptional circumstances. The leftpad fiasco can’t happen again.
And no programming language's stdlib includes e. g. WhatsApp API libraries
I guess if you ship it you are still passing along contagion
... So you're saying there is a blueprint for mitigating this already, and it just isn't followed?
Also means you can put an end to a popular antipattern that has grown in recent years: letting your production infrastructure talk to whatever it likes to download whatever it likes from the Internet.
I've heard rumor of a few 100k people laid off in tech over the past few years that might be interested.
I've watched developers judge dependencies by GH stars, and "shiny" quotient.
On a completely unrelated tangent, I remember reading about a "GH Stars as a Service" outfit. I don't see any way that could be abused, though.../s
> They also left helpful comments in their code marking the malicious sections - professional development practices applied to supply chain attacks. Someone probably has a Jira board for this.
> The package has been available on npm for 6 months and is still live at the time of writing.
> (...) malware that steals your WhatsApp credentials, intercepts every message, harvests your contacts, installs a persistent backdoor, and encrypts everything before sending it to the threat actor's server.
Containerize all of your dev environments and lock dependency files to only resolve to a specific version of a dependency that is known safe.
Never do global installs directly, ideally don't even install node outside of a container.
Lag dependency updates by a couple weeks, and enable automated security scans like dependabot on GH. Do not allow automated updates, and verify every dependency prior to updating.
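The lagging policy can be automated. A sketch of a cooldown check (the `oldEnough` helper is invented here; per-version publish dates are available in the npm registry's package metadata):

```typescript
// Accept a dependency version only if it has been public for at least
// `lagDays` days, giving the ecosystem time to catch a malicious release.
function oldEnough(
  publishedIso: string,
  lagDays: number,
  now: Date = new Date()
): boolean {
  const ageMs = now.getTime() - new Date(publishedIso).getTime();
  return ageMs >= lagDays * 24 * 60 * 60 * 1000;
}
```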
If you work on anything remotely sensitive, especially crypto adjacent, expect to be a target and use a dedicated workstation that you wipe regularly.
Sounds tedious, but that's the job.
Alternatively you could find a job outside the JS ecosystem, you'll likely get a pay bump too.
In this economy? I'll take any job lol.
I think I'm gonna skip the containers and go straight for a VPS. And keep everything completely sandboxed. My editor's can work via SSH anyways.
Containers are convenient because they work locally and you are likely using a containerized solution to deploy to production anyways.
What would have helped is if the dev/user had the ability to confirm before the code connected to a new domain or IP: api.WhatsApp.com? Approve. JoesServer.com or a random IP? Block. Such functionality could live at the OS or Docker level, etc.
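Neither the OS nor Docker ships exactly that hook today, but the decision itself is just an allowlist lookup. A sketch (with the allowed hosts invented for the example) of what an egress filter would check:

```typescript
// Egress allowlist: approve known-good hosts, block everything else.
const allowedHosts = new Set(["api.whatsapp.com", "web.whatsapp.com"]);

function shouldAllow(url: string): boolean {
  // URL lowercases the hostname, so the comparison is case-insensitive.
  return allowedHosts.has(new URL(url).hostname);
}
```

In practice you'd enforce this at a proxy or firewall rather than in the app itself, since malicious code won't politely ask.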
Containers still have some risk since they share the host kernel, but they're a pretty good choice for protection against the types of attacks we see in the JS ecosystem. I'll switch to VMs when we start seeing container escape exploits being published as npm packages :)
When I first started doing development this way it felt like I was being a bit too paranoid, but honestly it's so fast and easy it's not at all noticeable. I often have to work on projects that use outdated package managers and have hundreds of top-level dependencies, so it's worth the setup in my opinion.
Haven't seen any in the wild, but i built a few poc's just to prove to myself that I wasn't being overly paranoid.
With LXC, any changes you make to the os/filesystem are persisted thereafter across container boots and shutdowns. So I don't have to worry about ephemeral storage or changes being lost. It feels more like a "computer", if that makes sense.
That's what's needed and I am seriously surprised NPM is trusted like it is. And I am seriously surprised developers aren't afraid of being sued for shipping malware to people.
Which when compared to NPM, which has no meaningful controls of any sort, is an enormous difference.
Yeah, that's the entire point.
Realistically, this is impossible.
And, if you do need a lib because it's too much work otherwise, like maybe you have to parse some obscure language, just vendor the package. Read it, test it, make sure it works, and then pin the version. Realistically, you should only have a few dozen packages like this.
I ran a little experiment recently, and it does take longer than just pulling in npm dependencies, but not that much longer for my particular project: logging, routing, rpc layer with end-to-end static types, database migrations, and so on. It took me a week to build a realistic, albeit simple app with only a few dependencies (Preact and Zod) running on Bun.
At least they seemed to have policies:
You can mitigate it by fully containerizing your dev env, locking your deps, enabling security scans, and manually updating your deps on a lagging schedule.
Never use npm global deps, pretty much the worst thing you can do in this situation.
Users should know better as well but you can’t really blame them.
Nothing wrong with that if the official API has less features.
> Authentication uses a shared secret and it’s obvious that you as a third party obtaining this secret from your users
What do you mean? Usually, you install such a package to automate WhatsApp for your own account.
There is no public WhatsApp API. You need to sign up for "WhatsApp Business Platform" to be able to use an API to interact with WhatsApp.
If there was a real API for WhatsApp, this probably wouldn't have happened.