Posted by dbalatero 9/3/2025
As an analogy, can you imagine being a startup that hired a developer, and months later finding out that the bulk of the new Web app they "coded" for you was actually copy-pasted open source code, loosely obfuscated, which they passed off as something they developed and to which the company had IP rights?
You'd immediately convene the cofounders and a lawyer to figure out how to make this have never happened.
First you realize that you need to hand the lawyer the evidence (against the employee), and otherwise remove all traces of that code and activity from the company.
Simultaneously, you need to get real developers rushing to rewrite everything without obvious IP taint.
Then one of you will delicately ask whether firing and legal action against the employee is sufficient, or whether the employee needs to sleep with the fishes to keep them quiet.
The lawyer will say this kind of situation isn't within the scope of their practice, but here's the number of a person they refer to only as 'the specialist'.
Soon, not only are you losing the startup, and the LLC is being pierced to go after your personal assets, but you're also personally going to prison. Because you were too cheap to pay the professional fee for 'the specialist', and instead asked ChatGPT to arrange a freak industrial shredder accident for the employee.
All this because you cheaped out, and spent $20 or $200 to get some kind of code to appear in your repo while pretending you didn't know where it came from.
Oh, and then you have the actual tech giants offering legal commitments to protect you against any copyright claims:
https://blogs.microsoft.com/on-the-issues/2023/09/07/copilot...
https://cloud.google.com/blog/products/ai-machine-learning/p...
You might be right, but the point needs to be made.
https://techcrunch.com/2024/09/30/y-combinator-is-being-crit...
Maybe investors will care, but for now they stand to make more money from "AI" gold rush startups, and don't want to be a wet blanket on "AI" at all by bringing up concerns.
From my experience, it's much easier to get an LLM to generate code for a React/Tailwind CSS web app than a mobile app, and that's why we're seeing so many of these apps showing up in the SaaS space.
In fact, it looks like fewer products were launched on PH last month than in the same period a year ago.
https://hunted.space/monthly-overview
It's a bit hard to tell, as they're not summing by month, but from a quick scan it looks like fewer to me.
And as Claude Code has only really been out for three or four months, you'd expect launches to be shooting up week by week right about now as all the vibe-coded products get finished.
They're not; see the 8-week graph.
Well... no significant effect shows up except for a few projects. It was really hard torturing the data to reach my manager's desired conclusion.
It's sometimes helpful when writing an email but otherwise hasn't touched any of my productive work.
Changing domain to writing, images, and video: you can see LinkedIn is awash with everyone generating everything with LLMs. The posting cadence has quickened too, as people shout louder to raise their AI-assisted voices over other people's.
We've all seen and heard the tsunami of AI images and video.
So why not software (yet, but soon)?
Firstly, software often has a function, and AI tool creations can't reliably make that function work. Lovable, Bolt, etc. are too flaky to live up to their text-to-app promises. A shedload of horror debugging or a lottery win of luck is required to fashion a working app out of that. This will improve over time, but the big question is: by enough?
And secondly, as with LinkedIn above: perhaps users' standards will drop? LinkedIn readers now tolerate LLM posts; it is no longer a mark of shame. Will the same lowering of standards among software users open the door to good-enough shovelware?
How much of it is to be blamed on AI, and how much on a culture of making users test their products, I do not know.
Everything has been enshittified so much that nothing fazes them anymore.
It'll increase incremental development manyfold: a non-programmer spending a few hours with AI to make their workflow better, easier, and faster. This is what everyone here keeps missing. It's not the programmers who should be using AI; it's 'regular' people.
If AI enables regular folks to make programs, even the worst-quality shovelware, there should have been an explosion in quantity. All the programs that people couldn't make before, they would have started making in the past two years.
I'm not sure how that challenges the point of the article, which is that metrics of the total volume of publicly released code are not increasing. If LLMs are opening the door to software development for many people whose existing skills aren't yet sufficient to publish working code, then we'd expect a vast expansion in the code output by such people. If that's not happening, why not?
On my computer. Once I've built something I often realize the problems with the idea and abandon the project, so I never ship it.
In fact, being able to throw it out like this is a big time saver in itself. I've always built a lot of projects, but when you've spent weeks or months on something you get invested in it, so you ship it even though you no longer believe in it. Now that it only takes a couple of days to build something, you don't have the same emotional attachment to it.