
Posted by jannesan 4/12/2025

The Bitter Prediction (4zm.org)
215 points | 178 comments | page 3
DeathArrow 4/13/2025|
> Why bother playing when I knew there was an easier way to win?

> This is the exact same feeling I’m left with after a few days of using Claude Code.

For me what matters is the end result, not the mere act of writing code. What I enjoy is solving problems and building stuff; writing code is only one part of that.

I would gladly use a tool to speed up that part.

But from my testing, unless the task is very simple and trivial, using AI isn't always a walk in the park, nor simple and efficient.

weinzierl 4/13/2025||
"But I predict software development will be a lot less fun in the years to come, and that is a very bitter prediction indeed."

Most professional software development hasn't been fun for years, mostly because of all the required ceremony around it. But it doesn't matter, for your hobby projects you can do what you want and it's up to you how much you let AI change that.

coolThingsFirst 4/12/2025||
I still think that amazement at AI tools, as harsh as it sounds, signals incompetence on the part of the user. They are useful, don't get me wrong, but just today Claude wrote code that literally wouldn't run.

It thought it was OK to use `new` with an object literal in JS.
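For concreteness, a minimal reconstruction of that failure mode (the actual snippet isn't shown, so the literal here is made up): `new` applied to an object literal parses fine, but throws at runtime, because a plain object is not a constructor.

```javascript
// Hypothetical reconstruction of the described mistake: `new` on an
// object literal. This parses, but a plain object has no [[Construct]]
// internal method, so calling it throws a TypeError at runtime.
const makeConfig = () => new { name: "config" };

let threw = false;
try {
  makeConfig();
} catch (e) {
  threw = e instanceof TypeError; // "... is not a constructor"
}
console.log(threw); // true
```

This is exactly the kind of error a type checker or even a quick run catches immediately, which is why "wouldn't run" code from an LLM is more embarrassing than dangerous.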

gadilif 4/12/2025||
I can really relate to the feeling described after modifying save files to get more resources in a game, but I wonder if it's the same kind of 'cheating'. Doing better in a game has its own associated feeling of achievement, and cheating definitely robs you of that, which to me explains why playing will be less fun. Moving faster on a side project or at work doesn't feel like the same kind of shortcut/cheat. Most of us no longer program in assembly language, and we still maintain a sense of achievement using high-level languages, which naturally abstract away a lot of the details. Isn't using AI to hide away implementation details just the natural next step, where instead of lengthy, error-prone machine-level code, you have a few modern language instructions?
lloeki 4/12/2025|
> Moving faster on a side project or at work doesn't feel like the same kind of shortcut/cheat.

Depends whether you're in it for the endgame or the journey.

For some the latter is a means to the former, and for others it's the other way around.

gadilif 4/12/2025||
I see your point, and tend to agree. However, at least for the time being, I see the AI tools as not inherently different from the refactoring tools that were available over a decade ago. They help me move faster, and I feel like this is one more tool I need to master, so it will be useful in my toolbox.
jwblackwell 4/12/2025||
The author is essentially arguing that fewer people will be able to build software in the future.

That's the opposite of what's happened over the past year or two. Now many more non-technical people can (and are) building software.

walleeee 4/12/2025||
> The author is essentially arguing that fewer people will be able to build software in the future.

Setting aside the fact that the author nowhere says this, it may in fact be plausible.

> That's the opposite of what's happened over the past year or two. Now many more non-technical people can (and are) building software.

Meanwhile half[0] the students supposed to be learning to build software in university will fail to learn something important because they asked Claude instead of thinking about it. (Or all the students using LLMs will fail to learn something half the time, etc.)

[0]: https://www.anthropic.com/news/anthropic-education-report-ho...

> That said, nearly half (~47%) of student-AI conversations were Direct—that is, seeking answers or content with minimal engagement.

wobfan 4/12/2025||
No, he never states this, and it is not true.

The author recounts his experience of the joy of programming and figuring things out. In the end he says that AI made him lose this joy, and he compares it to cheating in a game. He does not say one word about societal impact or the number of engineers in the future; that's your own interpretation.

jwblackwell 4/12/2025||
“ In some countries, more than 90% of the population lives on less than $5 per day. If agentic AI code generation becomes the most effective way to write high-quality code, this will create a massive barrier to entry”
wobfan 4/13/2025||
> The author is essentially arguing that fewer people will be able to build software in the future.

Your comment talks about the ability to build software, whereas the article (in the single sentence that touches this topic, while the other 99% circles around something else) talks about the job market. If what you wanted to say was "The author is arguing that people will probably have a harder time getting a job in software development", that would have been correct.

> That's the opposite of what's happened over the past year or two. Now many more non-technical people can (and are) building software.

You're (based on the new comment) explicitly saying that people without technical knowledge are getting jobs in the software development sector. Where did you get that info? It would be an interesting read for sure, if it's actually true.

freb3n 4/12/2025||
The financial barrier point is really great.

I feel the same with a lot of points made here, but hadn't yet thought about the financial one.

When I started out with web development, that was one of the things I really loved. Anyone can just read about HTML, CSS and JavaScript and get started with any kind of free-to-use code editor.

Though you can still do just that, it seems like you would always lag behind the 'cool guys' using AI.

M4v3R 4/12/2025||
You still don’t need AI to write software, but investing in it will make you more productive. More money enables you to buy better tools; that has always been true for any trade. My friend is a woodworker and his tools are 5-10x more expensive than what I have in my shack, but they are also more precise, more reliable and easier to use. AI is the same, and I would even argue it gives you a bigger productivity boost for less money (especially given that local models are getting better literally every week).
zwnow 4/12/2025||
Incredible take, considering that using AI robs new learners of real learning. There is a reason lots of experienced devs are dropping it from their editors. Using AI will not make you a better dev; it simply accelerates you building a failing product faster, because ultimately you won't understand your own product. Most devs that use AI blindly trust it instead of questioning what it produces.
falcor84 4/12/2025||
> Most devs that use AI blindly trust it instead of questioning what it produces.

Without the punctuation, I first read it tautologically as "Most devs that use AI blindly, trust it instead of questioning what it produces". But even assuming you meant "Most devs that use AI, blindly trust it instead of questioning what it produces", there's still a negative feedback loop. We're still at the early experimentation phase, but if/when AI capabilities eventually settle down, people will adapt, learning when they can trust the AI coder and when to take the reins - that would be the skill that people are hired for.

Alternatively, we could be headed towards an intelligence explosion, with AI growing in capabilities until it surpasses human coders at almost all types of coding work, except perhaps for particular tasks which the AI dev could then delegate to a human.

zwnow 4/12/2025||
A dystopia in which I'll look for a new career. Using AI to generate code sucks the joy out of the job.
tasuki 4/12/2025||
> A dystopia in which ill look for a new career.

What makes you think that will be necessary?

zwnow 4/12/2025||
Because I don't want to work with AI agents? I like my work to be fun, as in "I could bear working 8 hours a day with this." I like thinking about the problems and solutions and how they translate to code. I like implementing it with my own hands. Substitute that with writing prompts and I'll look for a different career that's actually fun.
tasuki 4/14/2025||
What makes you think you having a career will be a thing by that point?
zwnow 4/14/2025||
Because it already is a thing? Some companies already force AI on their devs. And weirdly enough, I still need to pay my bills, so yeah, I need a career to do so.
qingcharles 4/13/2025||
These platforms all feel like they are being massively subsidized right now. I'm hoping that continues and they just burn investor cash in a race to the bottom.
pornel 4/12/2025||
AI will be cheap to run.

The hardware for AI is getting cheaper and more efficient, and the models are getting less wasteful too.

Just a few years ago GPT-3.5 was the secret sauce running on the most expensive GPU racks, and now models beating it are available with open weights and run on high-end consumer hardware. A few iterations down the line, good-enough models will run on average hardware.

When that XCOM game came out, filmmaking, 3D graphics, and machine learning required super-expensive hardware out of reach of most people. Now you can find objectively better hardware literally in the trash.

cardanome 4/12/2025|
I wouldn't be so optimistic.

Moore's law is withering away due to physical limitations. Energy prices are going up because of the end of cheap fossil fuels and rising climate-change costs. Furthermore, the global supply chain is under attack from rising geopolitical tension.

Depending on US tariffs and how the Taiwan situation plays out and many other risks, it might be that compute will get MORE expensive in the future.

While there is room for optimization on the generative-AI front, we still have not reached the point where generative AI is actually good at programming. We have promising toys, but for real productivity we need orders-of-magnitude bigger models. Just look at how GPT-4.5 is already barely economically viable at its price per token.

Sure if humanity survives long enough to widely employ fusion energy, it might become practical and cheap again but that will be a long and rocky road.

pornel 4/12/2025|||
LLMs on GPUs have a lot of computational inefficiencies and untapped parallelism. GPUs have been designed for more diverse workloads with much smaller working sets. LLM inference is ridiculously DRAM-bound. We currently have 10×-200× too much compute available compared to the DRAM bandwidth required. Even without improvements in transistors we can get more efficient hardware for LLMs.

The way we use LLMs is also primitive and inefficient. RAG is a hack, and in most LLM architectures the RAM cost grows quadratically with the context length, in a workload that is already DRAM-bound, on a hardware that already doesn't have enough RAM.
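To put rough numbers on that quadratic growth, here's a back-of-the-envelope sketch. The layer/head counts below are illustrative assumptions, not any specific model: naive attention materializes an n-by-n score matrix per head per layer, so doubling the context quadruples that memory.

```javascript
// Naive attention builds an n-by-n score matrix per head per layer.
// Dimensions are made-up illustrative values, not a real model.
const layers = 32, heads = 32, bytesPerScore = 2; // fp16 scores

const attnScoreBytes = (n) => layers * heads * n * n * bytesPerScore;

const at4k = attnScoreBytes(4096); // ~32 GiB of score memory
const at8k = attnScoreBytes(8192); // 4x as much, not 2x
console.log(at8k / at4k); // 4
```

Fused kernels and attention variants avoid ever materializing this matrix, which is exactly the kind of untapped efficiency headroom described above.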

> Depending on US tariffs […] end of fossil fuels […] global supply chain

It does look pretty bleak for the US.

OTOH China is rolling out more than a gigawatt of renewables a day, has the largest and fastest growing HVDC grid, a dominant position in battery and solar production, and all the supply chains. With the US going back to mercantilism and isolationism, China is going to have Taiwan too.

joshjob42 4/13/2025||||
Costs for a given amount of intelligence, as measured by various benchmarks etc., have been falling by 4-8x per year for a couple of years, largely from smarter models via better training at a given size. I think there's still a decent amount of headroom there, and as others have mentioned, dedicated inference chips are likely to be significantly cheaper than running inference on GPUs. I would expect to see Gemini Pro 2.5 levels of capability in models that cost <$1/Mtok by late next year or plausibly sooner.
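Compounding that quoted decline rate shows why sub-$1/Mtok predictions aren't a stretch. (The $10 starting price is a made-up illustration; only the 4-8x/year factors come from the comment above.)

```javascript
// Price after `years` of a steady annual cost-decline factor.
const costAfter = (startPerMtok, annualFactor, years) =>
  startPerMtok / Math.pow(annualFactor, years);

console.log(costAfter(10, 4, 2)); // 0.625   $/Mtok after 2 years at 4x/yr
console.log(costAfter(10, 8, 2)); // 0.15625 $/Mtok after 2 years at 8x/yr
```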
jamil7 4/13/2025|||
I think there’s a huge amount of inefficiency all the way through the software stack due to decades of cheap energy and rapidly improving hardware. I would expect with hardware and energy constraints that we will need to look for deeper optimisations in software.
HarHarVeryFunny 4/13/2025||
Coding itself can be fun, perhaps especially when one is trying to optimize in some way (faster, less memory usage, more minimal, etc.), but at least for me (a software engineer for 45+ years) the real satisfaction is conquering the complexity and challenges of the project, and ultimately the ability to dream about something and conjure it up into reality. Maybe coding itself was more fun back in the day of 8-bit micros, where everything was a challenge (not enough speed or memory), but nowadays that is typically not the case - it's more about the complexity of what is being built (unless it's some boilerplate CRUD app where there is no fun or challenge at all).

With today's AI, driven by code examples it was trained on, it seems more likely to be able to do a good job of optimization in many cases than to have gleaned the principles of conquering complexity, writing bug-free code that is easy and flexible to modify, etc. To be able to learn these "journeyman skills" an LLM would need to either have access to a large number of LARGE projects (not just Stack Overflow snippets) and/or the thought processes (typically not written down) of why certain design decisions were made for a given project.

So, at least for the time being, as a developer wielding AI as a tool, I think we can still have the satisfaction of the higher-level design (which may be unwise to leave to the AI, until it is better able to reason and learn), while leaving the drudgework (and a little bit of the fun) of coding to the tool. In any case we can still have the satisfaction of dreaming something up and making it real.

JKCalhoun 4/13/2025||
> In some countries, more than 90% of the population lives on less than $5 per day. If agentic AI code generation becomes the most effective way to write high-quality code, this will create a massive barrier to entry … Don't even get me started on the greenhouse gas emissions of data centers...

My (naive?) assumption is that all of this will come down: the price (eventually free) and the energy costs.

Then again, my daughters know I am Pollyanna (someone has to be).

gitfan86 4/12/2025|
I'm not following the logic here. There are tons of free-tier AI products available. That makes the world fairer for people in very poor countries, not less fair.
ben_w 4/12/2025||
Lots of models are free, and useful even, but the best ones are not.

I'm not sure how much RAM is on the average smartphone owned by someone earning $5/day*, but it's absolutely not going to be the half a terabyte needed for the larger models whose weights you can just download.

It will change, but I don't know how fast.

* I kinda expect that to be around the threshold where they will actually have a smartphone, even though the number of smartphones in the world is greater than the number of people
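The "half a terabyte" figure is roughly parameter count times bytes per weight; a quick sketch (the model sizes and precisions here are illustrative, not tied to any particular release):

```javascript
// Weight memory ~= parameter count x bytes per parameter
// (ignores KV-cache and activation overhead).
const weightGB = (params, bytesPerParam) =>
  Math.round((params * bytesPerParam) / 1e9);

console.log(weightGB(405e9, 2));  // 810 GB: a 405B-param model at fp16
console.log(weightGB(405e9, 1));  // 405 GB: the same model at 8-bit
console.log(weightGB(8e9, 0.5));  //   4 GB: an 8B model at 4-bit
```

Which is why, today, only small and heavily quantized models fit on phone-class hardware.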
