
Posted by mltvc 4 hours ago

You Are Here (brooker.co.za)
63 points | 80 comments
stephenlf 3 hours ago|
> The cost of turning written business logic into code has dropped to zero

Didn’t realize this was science fiction.

geetee 3 hours ago||
I appreciate the author making that the first sentence.
AstroBen 2 hours ago|||
I swear all of these are coming from the prompt "hey chatgpt rewrite this article that got a lot of views"

I've seen non-technical people vibe code with agents. They're capable of producing janky, very basic CRUD apps. Code cost definitely ain't zero

bopbopbop7 2 hours ago|||
I think the author forgot that code has to compile and be useful.

And how much is technical debt worth?

simonw 1 hour ago|||
What coding agent are you using where the code doesn't even compile!?
bopbopbop7 1 hour ago|||
The one that cursor used to build their famous browser.
whynotminot 1 hour ago|||
The account you're replying to appears to be used entirely for bad-faith takes on LLMs.

I'm not sure carpet bombing every thread with AI misinformation is going to stop the forward progress of this technology, but they're giving it their best shot.

bopbopbop7 1 hour ago||
AI misinformation? Please do provide some examples.

Your whole history is AI psychosis btw, seek help.

whynotminot 1 hour ago||
> AI misinformation? Please do provide some examples.

Like saying they can't generate compiling code in this very thread?

bopbopbop7 1 hour ago||
Go look at the 40k failing CI/CD runs on the famous cursor browser.
whynotminot 1 hour ago||
Ah, it couldn't generate a 100% complete working browser from scratch in a week. I guess the technology is cooked.
canadiantim 2 hours ago|||
Depends if I can bundle the technical debt, get a triple-A rating on it and then sell it
heliumtera 2 hours ago||
But it is true, the cost is effectively zero. There will be, for a long time, free models available and any one of them will give you code back, always!

They never refuse. Worst case scenario the good models ask for clarification.

The cost for producing code is zero and code producers are in a really bad spot.

dt3ft 2 hours ago|||
I beg to differ. Let's say you're right. Code producers should turn to agriculture and let their managers and product owners prompt AI to produce code. How about code maintainers? Ever heard the mantra "You build it, you run it"? Let's say that AI can build it. Can it run it, though? All alone, safely, securely and reliably? No. It can't. We can keep dreaming, though. And when will AI code-production services turn profitable? Is there a single one that has?
heliumtera 2 hours ago||
Calm down buddy, maybe you're confusing code producers with something else. It's 2026 we don't bother with maintenance no more, we /new to keep context clean and start over. Just don't forget to comment - never delete - old code. Always keep dead code around to please shareholders, line numbers up always. We produce code, that is the main thing, never forget.

One could argue we could achieve the same goals by appending \n to a file in a loop, but this is inefficient nowadays with generous token offerings (though that could change in the future, in which case I highly suggest just outputting \n to a file and calling it a productivity increase)

I didn't understand your point about product owners. Who the fuck would ever need one when code produces itself?

dullcrisp 1 hour ago|||
Right but memory is expensive now so where do I keep all of this new code that I’ve produced??
bopbopbop7 2 hours ago||||
Because who cares about correct and compilable code, any code will do!
heliumtera 2 hours ago||
Exactly!
ThrowawayR2 1 hour ago||||
Good grief, it's like watching people consuming Soylent meal substitute shakes and proclaiming that chefs and cooking are obsolete.
autoexec 2 hours ago|||
> The cost for producing code is zero

Zero as long as your time is worth nothing, and bad code and security issues cost you nothing maybe.

"Getting code" has always been dead simple and cheap. Getting actually good code that works and doesn't turn into a problem for you down the road is the expensive part

chasd00 2 hours ago||
> Zero as long as your time is worth nothing

I can't remember who said it, but a long time ago I remember reading "Linux is free if your time is worthless". Now we all use Linux one way or another.

autoexec 2 hours ago||
That's still very much true, but at least in the case of Linux the cost keeps getting lower. For many, the time investment has dropped to about the same as the cost of using Windows, and as a result we see more and more people using Linux. At this point it's a perfectly viable gaming platform!

Maybe one day LLMs will make good code at a low cost, and that will allow non-programmers to write programs with few problems, but the cost will never be zero, and I think we're a long, long way from making human programmers obsolete.

All of the intelligence that LLMs mimic came directly from the work of human minds which got fed into them, but what LLMs output is a lossy conversion filled with error and hallucination.

My guess is that the LLMs producing code will improve for a short time, but as they start to slurp up more and more of their own slop they'll start performing worse.

mohsen1 3 hours ago||
I am thinking about this a lot right now. Pretty existential stuff.

I think builders are gonna be fine. The type of programmer whom people would put up with just because they could go into their cave for a few days and come out with a bug fix that nobody else on the team could figure out is going to have a hard time.

Interestingly AI coding is really good at that sort of thing and less good at fully grasping user requirements or big picture systems. Basically things that we had to sit in meetings a lot for.

ericpauley 2 hours ago||
This has been my experience too. That insane race condition inside the language runtime that is completely inscrutable? Claude one-shots it. Ask it to work on that same logic to add features and it will happily introduce race conditions that are obvious to an engineer but a local test will never uncover.
wiseowise 2 hours ago|||
> The type of programmer whom people would put up with just because they could go into their cave for a few days and come out with a bug fix that nobody else on the team could figure out is going to have a hard time.

Amen. It was a good time while it lasted.

oytis 2 hours ago|||
All software engineers become pretty much the same in this world though. Anyone can sit in the meetings.
falloutx 2 hours ago|||
Meetings hardly get anywhere. Most of the details are eventually figured out by developers while interacting with the code. If all the ideas from PMs were implemented, the software would turn into bloatware before even reaching the MVP stage.
diob 1 hour ago||
Not really, in my experience you still have to be good at solving problems to use it effectively. Claude (and other AI) can help folks find a "fix", but a lot of times it's a band-aid if the user doesn't understand how to debug / solve things themselves.

So the type of programmers you're talking about, who could solve complex problems, are actually just enhanced by it.

ossa-ma 3 hours ago||
With all due respect to the author, this is a lot of words for not much substance. Rehashing the same thoughts everyone already thinks but not being bold enough to make a concrete prediction.

This is the time for bold predictions, you’ve just told us we’re in a crucible moment yet you end the article passively….

YZF 2 hours ago||
Predictions

- Small companies using AI are going to kick the sh*t out of large companies that are slow to adapt.

- LLMs will penetrate more areas of our lives, closer to the ST:TNG computer. They will be agents in the real-life sense, and possibly in the physical world as well (robots).

- ASICs will eat Nvidia's lunch.

- We will see an explosion of software and we will also see more jobs for people who are able to maintain all this software (using AI tools). There is going to be a lot more custom software for very specific purposes.

falloutx 2 hours ago||
> Small companies using AI are going to kick the sh*t out of large companies that are slow to adapt.

Big companies are sales machines and their products have been terrible for ages. Microsoft enjoys the top spot in software sales only due to their sales staff pushing impossible deals every year.

YZF 2 hours ago||
It's true the big company products have been terrible but they also enjoyed a moat that made it harder for competitors to enter.

With this moat reduced I think you'll find this approach doesn't work any more. The smaller companies will also hire the good sales people away.

jdub 56 minutes ago||
History suggests otherwise, and there's nothing particularly special about this moment.

Microsoft survived (and even, for a little while, dominated) after missing the web. Netscape didn't eat its lunch.

Then Google broke out on a completely different front.

Now there's billions of dollars of investment in "AI", hoping to break out like the next Google... while competing directly with Google.

(This is why we should be more ambitious about constraining large companies and billionaires.)

jdjdndbdhsjsb 3 hours ago||
Here is my bold prediction: 2026 is the year when companies start the layoffs.

2026 is the year where we all realise that we can be our own company and build the stuff in our dreams rather than the mundane crap we do at work.

Honestly I am optimistic about computing in general. LLMs will open things up for novices and experts alike. We can move into the fields where we can use our brain power... But all we need is enough memory and compute to control our destiny....

IhateAI_2 3 hours ago|||
The one-shotted mind is truly hilarious.
jdjdndbdhsjsb 2 hours ago||
I'm human?
jdjdndbdhsjsb 2 hours ago|||
Oytis: I can't reply to you directly, but yes I am sure I am human.

Not sure how to prove it to you.

oytis 2 hours ago|||
Are you sure?
Muromec 3 hours ago||||
>Here is my bold prediction: 2026 is the year when companies start the layoffs.

Start? Excuse moi

jdjdndbdhsjsb 2 hours ago||
Yeah, fair... But now it is different, i.e. they won't regret it
layer8 1 hour ago||
What gives you the idea that they regretted it?
AIorNot 3 hours ago||||
I don't know, it's a bit of a hellscape in tech right now, as thousands of people with deep domain knowledge, people knowledge, and business knowledge (i.e. experienced engineers, managers, and product owners) were laid off by C-suites desperate to keep the AI-funded mandates going.

Do you know how hard it is to make a successful company, or even make money? It's like saying any actor can go to Hollywood and be a star.

VCs won't fund everyone

Nobody is sure of anything

jdjdndbdhsjsb 2 hours ago||
Yes, it is. But I am an optimist about human nature. I personally believe smaller companies doing different things are the future... scaling as they need. It is a hellscape, but people can and will adapt.

> Do you know how hard it is to make a successful company or even make money?

Yes I have failed to do it before. I get this.

> VCs wont fund everyone

And? Do you need VCs? Economics means that scale matters, but what if we don't need it? What if we can make efficient startups with our own funding??

falloutx 2 hours ago|||
Except it started in 2023, we are in the middle of layoff waves.
singpolyma3 1 hour ago||
If all you were doing was taking requirements from someone else and poorly coding them up (and yes, I know a decent % of the industry matches close to this), then yes, you are obsolete. Something just as useless but much faster is now here.

If you are part of the requirements process. If you find problems to solve and solve them. If you push back on requirements when they are not reasonable. Etc. Then you still have a career and I don't see anything coming for you soon.

lbreakjai 1 hour ago||
> If all you were doing is taking requirements from someone else and poorly coding them up

So, in your entire career, you've always worked in companies where you were a subject matter expert on everything the company did? Always knew the business domain inside out? You were running the numbers, sitting with customers, and determining yourself what they really wanted?

> If you push back on requirements when they are not reasonable. Etc

I did, because the requirements had a cost, which I had to balance with limited resources.

If widget A would make 10 customers happy, but would cost two weeks of work, that could be better spent making widget B that'd make 20 customers happy, then it would not be reasonable.

If widget A and B are free, then it becomes unreasonable to say no.

haolez 1 hour ago||
Recent models are pretty good at pointing out unreasonable requirements.
bopbopbop7 2 hours ago||
Has there been any good and useful software created with LLMs or any increase in software quality that we can actually look at?

So far it's just AI doom posting, hype bloggers that haven't shipped anything, anecdotes without evidence, increase in CVEs, increase in outages, and degraded software quality.

oytis 2 hours ago|
Software quality has been degrading for decades without LLMs though.

I only have anecdotal evidence from some engineers I know that they don't write software by hand any more. Provided the software they are working on was useful before, we can say that LLMs are writing useful software now.

szczepano 1 hour ago||
For sure, because copy-pasting from Stack Overflow was already too difficult. Everyone loves writing business logic in computer-understandable words. Call it an LLM, a programming language, a calculator; there is still a parser and an interpreter. The only switch is the attempt to replace deterministic machines with non-deterministic machines by reducing the machine error rate to an acceptable percentage.
Xiol 3 hours ago||
> The cost of turning written business logic into code has dropped to zero

Tokens are free now?

TheCoreh 3 hours ago||
> Or, at best, near-zero.
blamarvt 2 hours ago||
I mean, lots of numbers are near zero depending on your definition of near.
heliumtera 2 hours ago||
Yes. Every platform offers free tokens generously.

That is a true statement. It might not be much, but it's enough for you to produce some code, shit out a README, and then show on Hacker News that you're capable of pushing to git with the help of LLMs

input_sh 1 hour ago||
It's pretty much like drugs or gambling.

Sure, the tiny rushes of dopamine are free, but that's just enough to get your foot in the door, and pretty soon they won't hit the same. If you don't quit while you're ahead, it's either gonna financially ruin you or turn you into a zombie that never uses its brain for extended periods, and you end up writing posts like this one to self-rationalise your completely irrational behaviour, because you've been scared into thinking your career is gonna be ruined if you don't keep diving head-first into this addiction.

Not a coincidence that this is happening at the same time as "bet on anything" apps.

tolerance 1 hour ago||
There is something poignant about the change taking place in the software industry occurring alongside mass deportations. I can’t put my finger on just what it is that makes it so...

The Pollyannas have a point but overstate it. The naysayers should be more cautious, though.

pvtmert 2 hours ago||
> The cost of turning written business logic into code has dropped to zero

It hasn't. Large enterprises are currently footing the bill, essentially subsidizing AI for now.

I constantly see comparisons between the $200/month Claude Code Max subscription and the six-figure ($100k) salary of an engineer.

The comparison here is, first of all, not apples-to-apples. Let's convert the CC subscription to a yearly amount first: 12 × $200 = $2,400. That's still roughly a 40x difference compared to the human engineer.
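
A quick back-of-the-envelope sketch in Python, using the $200/month and $100k/year figures above (both rough assumptions, not real billing data):

```python
# Hypothetical cost comparison: coding-agent subscription vs. engineer salary.
# Figures are the rough assumptions from this comment, not vendor pricing.
monthly_subscription = 200      # $/month for the subscription
engineer_salary = 100_000       # $/year for the engineer

annual_subscription = 12 * monthly_subscription
ratio = engineer_salary / annual_subscription

print(f"Annual subscription: ${annual_subscription}")   # $2400
print(f"Salary-to-subscription ratio: {ratio:.1f}x")    # 41.7x
```

Of course, the ratio only compares sticker prices; it ignores everything below about liability, IP ownership, and hidden costs.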

Although when you have the human engineer, you also pay for experience and responsibility, and you somewhat transfer liability (especially when regulations come into play).

Moreover, what a human engineer creates, unless it was stolen IP or plagiarized, is owned by you as the employer/company. Meanwhile, whatever the AI generated is guaranteed to be somewhat plagiarized in the first place. The ownership of the IP is questionable.

This is like when a layman complains when the electrician comes to their house, identifies the breaker problem, replaces the breaker that costs $5, and charges $100 for a 10-minute job. That is a complete underestimation of skill, experience, and safety. A wrong breaker may cause constant circuit breaks, causing malfunctions in a multitude of electronic devices in the household. Or worse, it may cause a fire. When you think you paid $100 for 10 minutes, in fact it was years of education, exams, certification, and experience you paid for, for your future safety.

The same principle applies to AI. It seems like it has accumulated more and more experience, but it fails at the first prompt injection. It seems like it's getting better at benchmarks because they are now part of its dataset. All these are hidden costs that 99% don't talk about. All these hidden costs are liabilities.

You may save an engineer's yearly salary today, at the cost of losing ten times more to civil lawsuits tomorrow. (Of course, depending on the field/business.)

If your business was not critical enough to get a civil lawsuit in the first place, then you probably didn't need to hire an engineer yourself. You could have hired an agency/contractor to do it much more cheaply, while still sharing liability...

ausbah 2 hours ago|
blogs like this seem like they’re in the right direction with LLMs being “here to stay” and a near indispensable part of people’s daily toolkit, but the near certainty that programming as a job or skillset is dead in the water seems just wrong?

like ok, the cost for anyone to generate almost-always-working code has dropped to zero, but how does a layperson verify the code satisfies the business logic? asking the same systems to generate tests for it just seems to move the goalposts

or like, what happens when the next few years of junior engineers (or whatever replaces programming as a field), who've been spoon-fed coding through LLMs, need to actually decipher LLM output and pinpoint something the machine can't get right after hours of prompting? a whole generation blindly following a tool they can't reliably control?

but maybe I am just coping because it feels like the ladder is being pulled up on the rest of my already short career, but some humility might be warranted

hippo22 1 hour ago|
I think you’re ignoring the growth trajectory. Surex AI generates almost always working code now, but what till it generate in 1-5 years?