Posted by bearsyankees 1 day ago
This feels like the core of the article, but it doesn’t prove the need for open source.
One downside I am experiencing right now is somebody just copying my repo without crediting me; they didn't even try to change the README. It's pretty discouraging.
The other is security. The premise that volunteers will report vulnerabilities only really matters if you are big enough for a small portion of users to dedicate themselves to it. For the most part people take an open source tool, use it, and then forget about it; they just want stuff fixed.
Lastly, open source development kind of sucks so far. I've been working on a few different tools, and the amount of trolling and bad-faith actors I've had to deal with is exhausting. On top of that there is a constant stream of people just demanding stuff be fixed quickly.
This makes the assumption that the same amount of money needed to exploit a critical vulnerability is also required to find and fix it.
Let's say we have a project with 100 modules, and it costs us $100,000 to check those modules for vulnerabilities. What is stopping an attacker from spending the same amount of money to scan, say, only 10 modules, but with 10x the number of tokens per module that the defender had when hardening the software?
Some things just can't be truly secure, either. DDoS protection is mostly a guessing/preventive game, and exposing your firewall configs/scripts will make you more vulnerable, not less.
If your codebase isn't exposed, attackers are constrained by the network and other external restrictions, which greatly reduces the number of possible trials. Even with a swarm of residential proxies, that's nothing like inspecting a codebase in depth with thousands of agents across all the models.
- it’s not open vs closed anymore, it’s more like bug finding going from a few devs poking around to basically infinite parallel scanners
- so now you don’t get a couple of thoughtful reports, you get many edge cases and half-real junk. fixing capacity didn’t change though
- closing the repo doesn’t really save you, it just switches from white-box to black-box… and that’s getting pretty damn good anyway
real problem is: vuln discovery scaled, patching didn’t. now everything is a backlog game
Cal.com folks are getting a red team for free; wouldn't that further convince them their closed source software is strong enough?
Isn't Strix's business companies paying for scans regardless of whether the software scanned is open source or closed?
At $WORK we had a system which, if you traced its logic, could not possibly experience the bug we were seeing in production. This was a userspace control module for an FPGA driver connected to some machinery you really don't want to fuck around with, and the bug had wasted something like three staff+ engineer-years by the time I got there.
Recognizing that the bug was impossible in the userspace code if the system worked as intended end-to-end, the engineers started diving into Verilog and driver code, trying to find the issue. People were suspecting miscompilations and all kinds of fun things.
Eventually, for unrelated reasons, I decided to clean up the userspace code (deleting and refactoring things unlocks additional deletion and refactoring opportunities, and all said and done I deleted 80% of the project so that I had a better foundation for some features I had to add).
For one of those improvements, my observation was just that if I had to write the driver code to support the concurrency we were abusing I'd be swearing up a storm and trying to find any way I could to solve a simpler problem instead.
Long story short, I still don't know what the driver bug was, but the actual authors must've felt the same way, since when I opted for userspace code with simpler concurrency demands the bug disappeared.
Tying it back to AI and hacking, the white box approach here literally didn't work, and the black box approach easily illuminated that something was probably fucky. Given that AI can de-minify and otherwise spot patterns from fairly limited data, I wouldn't be shocked if black-box hacking were (at least sometimes) more token-efficient than white-box.
This seems to be extremely common. Been a very long time since I looked at Linux kernel stuff, but there were numerous drivers that disabled hardware acceleration or offloading features simply because they became unreliable if they were given heavy loads or deep queues.