Posted by dbalatero 9/3/2025
Where I've found them very useful is for one-off scripts and quick-and-dirty tasks that aren't too complex and are easily verifiable (so I can catch the mistakes it makes, and it does make them!), especially in languages I don't know well or don't like (e.g., bash, PowerShell, JavaScript).
No need to learn a programming language, wow, anyone can be a programmer now. A few projects come out of it, people marvel at how efficient it was, and it fizzles out and programmers continue writing code.
If anything, things like visual programming did more than AI does now. For games, if you want to see the shovelware, look at Flash, RPG Maker, etc., not AI. On the business side of things, Excel is king. Can you get your vibe-coded app out faster than you could with Flash or Excel?
So I'm not holding my breath for something like Linux on smartphones arriving any time soon.
I guess someone could try a prompt of "generate a patch set from Linux tree X to apply to mainline Linux for this CPU".
I see pseudo-scientific claims from both sides of this debate, but this is a bit too far for me personally. "We all know" sounds like Eternal September [1] kind of reasoning. I've been in the industry about as long as the article author, and I think he might be looking at the past through rose-tinted glasses. Every aging generation looks down at the new cohort as if they didn't go through the same growing pains.
But in defense of this polemic, and laying out my cards as an AI maximalist and massive proponent of AI coding, I've been wondering the same. I see articles all the time about people writing this and that software using these new tools and it so often is the case they never actually share what they built. I mean, I can understand if someone is heads-down cranking out amazing software using 10 Claude Code instances and raking in that cash. But not even to see one open source project that embraces this and demonstrates it is a bit suspicious.
I mean, where is: "I rewrote Redis from scratch using Claude Code and here is the repo"?
This is one of my big data points for the skepticism: there are all these articles about individual developers doing amazing things, but almost no data showing an actual increase in productivity as a result.
Meanwhile I see WhatsApp sunsetting their native clients and making everything a single web-based client. I guess they must not be using LLMs to code if they can’t cope with maintaining the existing codebases, right?
There's really a lot to get from this "tool". Because in the end it's a tool, and knowing how to use it is the most important aspect of it. It takes time, iteration, and practice to understand how to use it effectively.
When you consider this, "generate me a whole repo" is trivially garbage and fails any reasonable metric. But having AI autocomplete "getUser(..." clearly IS productive.
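To make the autocomplete point concrete, here's a hypothetical sketch of the kind of completion meant: the `User` type, the in-memory `users` store, and the function body are all made up for illustration, but this is the shape of routine code an autocomplete finishes well.

```typescript
// Hypothetical types and data standing in for a real data source.
interface User {
  id: number;
  name: string;
}

const users: Map<number, User> = new Map([
  [1, { id: 1, name: "Ada" }],
  [2, { id: 2, name: "Grace" }],
]);

// Typing `getUser(` is exactly where autocomplete shines: the rest is
// short, conventional, and easy to verify at a glance.
function getUser(id: number): User | undefined {
  return users.get(id);
}
```

Each completion like this saves seconds, not hours, which is why the aggregate gain is so hard to pin down.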
Now is that a 0.1% increase, 1%, or 10%? That I can't tell you.
Even if AI made them more productive, it's on a person to decide what to build and how to ship, so the number (and desire) of humans is a bottleneck. Maybe at some point AI will start buying up domains and spinning up hundreds of random indiehacker micro-SaaS, but we're not there. Yet.
This isn't likely to happen -- if the problem is very specific, you won't be able to sufficiently express it in natural language. We invented programming languages precisely because natural languages are unsuited to specifying a problem precisely.
So how are you going to express the specificity of the problem to the LLM in natural language? Try, and you'll discover their shortcomings for yourself. Then you'll reinvent programming languages.
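A toy illustration of that ambiguity (the function names and the example request are hypothetical): the English instruction "remove duplicates from the list" admits at least two precise readings, and code forces you to pick one.

```typescript
// Reading 1: keep the first occurrence of each value, preserving order.
function dedupeKeepFirst(xs: number[]): number[] {
  return [...new Set(xs)];
}

// Reading 2: keep only the values that appear exactly once.
function keepOnlyUnique(xs: number[]): number[] {
  return xs.filter((x) => xs.indexOf(x) === xs.lastIndexOf(x));
}
```

For the input `[1, 2, 2, 3]` the first reading yields `[1, 2, 3]` and the second yields `[1, 3]` -- a prompt that doesn't distinguish them has already under-specified the problem.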