Posted by vbtechguy 4/7/2025
A new dev tech shows up. Old devs say: That's not real programming. Real programmers use the old ways. You take all the skill out of it. You'll never learn to do it the "right way". And then it becomes standard, and no one wants to go back but a few hobbyists.
It's been this way with switching from assembly to more human-readable languages.
It's been this way with syntax highlighting.
... and with IDEs.
I remember when we scoffed at IntelliSense over the water cooler because them kids didn't have to memorise stuff anymore.
I kept cursing at Docker and npm insanity, using colourful language for people who hid behind abstraction because they did not understand basic functionality.
And today, it is AI. Right now, it divides: those who love it, and those who consider it 'cheating' or 'stealing other people's code'. In reality, it is just another level of abstraction in the development stack. Tomorrow, it'll just be 'the standard way to do things'.
I wonder what comes next.
This is neither innovative nor creative IMO, and reminds me more of Twitter hype-bro posts than something truly HN front-page worthy.
Y'all are gonna be blindsided by AI if you don't turn around and see what is happening.
Programming is getting democratized. People who have never written a CLI command in their life will get the ability to tell a computer what they want it to do. Instead of having to get bent over with a stupid feature-packed $9.99/mo app, they can say "Make me a program that saves my grocery list and suggests healthier alternatives via AI analyses." And get a program that is dead simple and does exactly the simple task they want it to do.
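To make the "dead simple" part concrete, the whole of such a program might be little more than the sketch below. This is only an illustration of the scale involved, not a real product: the storage file name is made up, and the "healthier alternatives" step is left as a stub because no particular AI provider is specified.

    # Minimal grocery-list keeper of the sort described above -- a sketch only.
    import json
    import sys

    LIST_FILE = "grocery_list.json"  # hypothetical storage location

    def load_items():
        try:
            with open(LIST_FILE) as f:
                return json.load(f)
        except FileNotFoundError:
            return []

    def save_items(items):
        with open(LIST_FILE, "w") as f:
            json.dump(items, f, indent=2)

    def suggest_alternative(item):
        # Placeholder for the "AI analysis" step; a real version would call an LLM API here.
        return f"(ask your model of choice for a healthier swap for '{item}')"

    if __name__ == "__main__":
        items = load_items()
        if len(sys.argv) > 1 and sys.argv[1] == "add":
            items.extend(sys.argv[2:])   # e.g. `python grocery.py add milk chips`
            save_items(items)
        for item in items:
            print(item, "-", suggest_alternative(item))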
Keep in mind that Excel '98 probably covers 90% of Excel uses for 90% of users. Yet here we are in 2025, with society having spent billions over the years to upgrade and now subscribe to MS Office so Amy in HR can add up columns of payroll.
You have completely lost sight of reality if you think seeing a program like this is dumb because "anyone can just go and copy the git from a million different clones, load it into their programming environment of choice, get the relevant dependencies, compile it and be playing in no time!". Maybe you live in a bubble where those words are English to everyone, but it is not reality.
For SWEs, I have great sympathy and camaraderie for engineers solving hard problems all day.
But for the software industry as a whole? I hope it burns to hell. The tolls to cross the moat of "telling the computer what to do" have been egregious and predatory for years.
I have no idea what that is supposed to mean but I keep hearing the same about art, music and other creative fields and it sure sounds like contempt for creative people.
I personally don't lose any sleep over LLMs being powerful wizards for getting started on new projects... that's what LLMs are good at: pulling together bits of things they've seen on the internet. There's a chasm between that and maintaining and iterating on a complex project. Things that require actual intelligence.
Instead of dealing with the costs associated with using, developing and printing from film, as well as the skill of knowing what a photo would look like before it was developed, digital cameras allowed new photographers to enter the industry relatively cheaply and shoot off a few thousand photos at a wedding at negligible cost. Those photographers rapidly developed their skills and left studios with massive million-dollar Kodak digital chemical printers in the dust. I know because I was working at one.
If you remember, this was at a time when the studio owned your negatives ostensibly forever, and you had to pay for reprints or enlargements. People who had been amateur photographers could enter this high-margin market, produce images of an acceptable quality, charge far less and provide far more.
I'm not able to say whether this will happen to software development, but the democratization of professional photography absolutely shook the somewhat complacent industry to its core.
In that case it had nothing to do with contempt for creative people; it was the opposite: anyone who wanted to be creative now could be.
I can give you a real example: I recently needed to translate ancient 90's-era manufacturing files into modern ones, while also generating companion automation files from them (which needs to be done manually, but with tooling to facilitate the process).
I found a company that sells software capable of doing this. A license is $1000/yr/usr.
The next day I was able to get Claude 3.7 to make a program that does all that: translate the files, then present a GUI that renders them so an engineer can go through and manually demarcate points, which are then used to calculate and output the automation file. This took about 45 minutes and is what the department is using now. We have thousands of these files that will need to get modernized as they get used.
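For a sense of what that kind of tool looks like, here is the rough shape it might take. This is only an illustrative sketch, not the program described above: the legacy format (assumed here to be one "X Y" coordinate pair per line) and the JSON output are hypothetical stand-ins for whatever the real files contain.

    import json
    import tkinter as tk

    def parse_legacy_file(path):
        # Hypothetical legacy format: one "X Y" coordinate pair per line.
        points = []
        with open(path) as f:
            for line in f:
                parts = line.split()
                if len(parts) >= 2:
                    points.append((float(parts[0]), float(parts[1])))
        return points

    class MarkupApp:
        # Render the imported geometry and let an engineer click to demarcate points.
        def __init__(self, root, points):
            self.canvas = tk.Canvas(root, width=800, height=600, bg="white")
            self.canvas.pack()
            self.marked = []
            for x, y in points:  # draw the legacy geometry as grey dots
                self.canvas.create_oval(x - 2, y - 2, x + 2, y + 2, fill="grey")
            self.canvas.bind("<Button-1>", self.mark)  # clicks record demarcation points

        def mark(self, event):
            self.marked.append((event.x, event.y))
            self.canvas.create_oval(event.x - 4, event.y - 4, event.x + 4, event.y + 4, fill="red")

        def export_automation_file(self, path):
            # Hypothetical output for the downstream automation tooling.
            with open(path, "w") as f:
                json.dump({"demarcation_points": self.marked}, f, indent=2)

    if __name__ == "__main__":
        root = tk.Tk()
        app = MarkupApp(root, parse_legacy_file("legacy_part.dat"))
        root.mainloop()
        app.export_automation_file("automation.json")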
I see this everywhere now, and I have been building bespoke programs left and right. Whether it be an audio spectrum analyzer that lets me finely tune my stereo equalizer visually based on feedback for any given track, or an app on my phone that instantly calculates futures prices and margin requirements for given position sizes and predicted market movements.
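The futures app in particular wraps only a little arithmetic once it's written down. A rough sketch of the kind of calculation involved, where the contract multiplier and margin rate are made-up example values rather than real exchange figures:

    def futures_position(price, contracts, multiplier, margin_rate, predicted_move_pct):
        # Notional value, margin requirement, and P&L for a predicted move.
        # multiplier and margin_rate are example inputs; real values come from
        # the contract spec and the broker/exchange, not from this sketch.
        notional = price * multiplier * contracts
        margin_required = notional * margin_rate
        predicted_pnl = notional * (predicted_move_pct / 100.0)
        return {"notional": notional,
                "margin_required": margin_required,
                "predicted_pnl": predicted_pnl}

    # Example: 2 hypothetical contracts at 5000.0 with a 50x multiplier,
    # a 5% initial margin rate, and a predicted 1.5% move.
    print(futures_position(price=5000.0, contracts=2, multiplier=50,
                           margin_rate=0.05, predicted_move_pct=1.5))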
People think LLMs will be a paradigm shift. I can tell you the shift has already happened; it's just a matter of awareness now.
That sounds like something for which one should be spending the money on professionally developed and well-tested software. What's the expression? Penny wise, pound foolish.
It’s taken her a month to get up and running.
Everyone needs to realize this is not a matter of AI stealing dev jobs, but of the supply of low-skilled developers exploding and eating into the market.
https://pluralistic.net/2023/01/21/potemkin-ai/#hey-guys
I don't doubt that LLMs can make programmers more productive. It's happening today, and I expect it will continue to improve, but it requires knowing what code they should generate, what the actual goals are, and how it should be tested. They can generate standard solutions to standard problems with standard bugs. That's fine; they're a tool.
What the inexperienced expect them to do is read their mind and implement what they want without testing (other than "did it crash the first time I used it?"). Unfortunately, knowing the questions to ask is at least half of the problem, which by definition the inexperienced don't know how to do. You can already see that with vibecoding prompts to "write clear comments", "don't write bugs", and "use best practices".
So why does it lead to the enshittification of the programming experience? Because regular folks will be led to believe (Star Trek movie, WarGames-hacker style) that this is how things are done. They will accept and expect rehashed garbage UIs and implementations without security or corner-case checking, because that's what they always get when they press a button and wait a minute. Now, why can't YOU, stupid programmer, get the same results faster? I told you I wanted a cool game that would make me lots of money fast with no bugs!
I do have hope that some people will learn to be clearer in their descriptions of things, but guess what, English isn't really the language for that.
I'm talking about people talking in English to an AI on one screen, and compiled, functioning programs appearing on the other. An "app playground" where you just tell your phone what you need an app to do, and a new bespoke app is now in your app drawer.
Forget about UIs too. They won't be that important. You don't need a tree of options and menus, toolbars and buttons. You would just tell the program what you want it to do: "Don't put my signature on this email"... "wrap the text around this image properly" (cough MS Word cough)... "split these two parts and move the red one to the top layer"... or even "Create a button that does this and place it over there".
I think part of what you want is voice applications, because deleting your signature by hand is probably easier than trying to build a program that does it. Maybe the app could just search help and tell you what feature already does what you're asking for. Certainly, context sensitive voice recognition has gotten a LOT better with the latest LLMs. Not sure I'm looking forward to the guy on the train narrating to his laptop for an excel page, though.
But using AI to generate something in that style doesn't make you an artist. It isn't art, it's just a product.
Celebrating the 'democratization' of these skills is just showing an aversion to basic learning and thinking. I'm not gonna celebrate a billion-dollar corp trying to replace fundamentals of being human.
The reality is that you cannot become an expert in everything. I have songs I'd love to compose in my head, but it would be totally impractical for me to go through the hundreds/thousands of hours of training that would be needed to realize these songs in reality. Nor am I particularly motivated to pay someone else to sit there for hours trying to compose what I am telling them.
This is true for hundreds of activities. Things I want to do, but cannot devote the time to learn the intermediate steps to get there.
So the alternative is that you'll pay a tech company instead -- to use their model trained on unlicensed and uncredited human works to generate a mishmash of plagiarized songs, the end result of which nobody will ever want to listen to?
You don't have to though. Anyone who's spent a decent amount of time in a creative hobby will tell you they sucked when they started but they enjoyed the process of learning and exploring. I think you're depriving yourself of the mental benefits of learning a new skill and being creative. It flexes your mind in new ways.
If you just want something to exist, sure, but when you can press buttons and have a magic box spit out whatever you want with no effort, how much are you actually going to value it?
This is probably how people felt at the advent of calculators.
But... using a calculator doesn't make you a mathematician either. And one could argue that society has borne real negative consequences from the inability of most people to do even basic math because of the ubiquity of calculators. There is a big difference between using a tool and having the tool do everything for you.
Do you really believe that society will benefit when most people don't know how to express themselves creatively in any way other than asking the magic box to make a pretty thing for them?
Generative AI forces us to reconsider what original means because it's producing a "remix" of what it has seen before with no credit going to those who created those original works.
Our current laws are made to handle AI.
They can for sure maintain and iterate on smaller projects -- which millions of people are happily feeding into them as training data.
Going from a project with 1,000 to 1,000,000 lines of code is a tiny leap compared to going from 0 to 1,000. Once your argument is based on scale, you've pretty much lost to the robots.
I'm not saying they are going to invent some free energy source (other than using humans as batteries) anytime soon, but when the argument is "you're going to be blindsided" and the response shows (and I'm not trying to be insulting or anything like that) willful ignorance of the facts on the ground, I'm going to say you probably will be blindsided when they take your app-writing job or whatever.
I'm not attacking you at all, just saying that there are a bunch of people who choose to keep their heads in the sand and hope this all just goes away.
Are you sure the leap is tiny? It's a much easier problem to get only 1,000 lines of code to be correct, because the lines only have to be consistent with each other.
And yet I feel more secure in my job today than I did a year ago because I'm constantly hitting the limits of what a language model can do. I've realized that the decisions I make every day as a senior engineer aren't mostly about what lines of code usually come after each other.
Can you please explain this contradiction?
And 90% of what in business software? Ideas? Features? Implementation? I doubt 90% is the number for any of that, or that LLMs are good at any of those. Based on my own recent experience of LLMs (yes, switching between several different models) and seeing their disastrous performance on writing code for business logic that is just a little bit more specific, I am not convinced.
Another thing to consider is that if you are copying software from another business, you need to compete in some way. Yours needs to have more polish (LLMs are bad at this), or a unique feature, or a different focus. An LLM copying another business will only allow you to compete on price, not on those other things.
The fallacy people are making is that they look at the current state of things as 'the way it'll always be', while the AI companies, and I mean all the AI companies, are looking to corner the market and so have no moral issues with taking out a whole swath of skilled jobs.
The cost to compete with another business's software is a high-end GPU.
1. Used https://gemini.google.com Gemini 2.5 Pro canvas mode on a Gemini Advanced subscription account to ask it to build an Atari Missile Command game in HTML5. Tweeted about it at https://x.com/George_SLiu/status/1906236287931318747 - it almost one-shotted it, but after I passed level 1 it jumped to level 182 and pulverised my cities and bases LOL
2. Iterated in subsequent chat sessions in the Gemini web UI to fix it up and add an in-game store, a leaderboard and end-game AI analysis, and when Gemini threw 'something went wrong' messages, jumped over to Claude 3.7 Sonnet via the Claude Pro subscription web interface to finish off the task
3. A few days ago, I tried Cline with VS Code for the first time with an OpenRouter AI API key, and the free Gemini 2.5 Pro LLM model has helped me with further development. Also now trying out Google Firebase for more Gemini 2.5 Pro access https://studio.firebase.google.com/ ^_^
The bar is not that high though, and is not the same for everyone depending on the content.
It's a similar religious matter?
> I also checked your profile and again, wondered why you are putting this: "Only admins see your email below. To share publicly, add to the 'about' box." as your bio!
Why not? I have "Error retrieving about text" in my "About" field on a certain SMS-replacement chat app...