Posted by milkglass 12 hours ago

The West forgot how to make things, now it’s forgetting how to code(techtrenches.dev)
884 points | 553 comments
j45 1 hour ago|
One thing you can't really rule out is American ingenuity, once America decides to do something.

How quickly America developed shale oil into a viable industry is one example.

jmull 5 hours ago||
I don’t see it as so dire.

Software developers have been learning what they needed to know to do the job the whole time. That’s pretty much the job description.

What you need to know has changed a lot recently. Like always.

> The combination of technical skill and the judgment to know when the AI is wrong barely exists in the market anymore.

That’s certainly not true. I’d take a hard look at my hiring process if it was performing this inefficiently.

netfortius 10 hours ago||
This is why a comprehensive computer science degree is necessary. Seeing and working only with the trees leads to destroying some forests, eventually.
bit1993 10 hours ago||
Yes. Just like globalization created companies like TSMC, AI will do the same. Software engineers who don't rely on LLM code generators will have a moat because they can do it cheaply and sustainably.

Another reason is that LLMs train on the existing code we already know; don't expect new programming languages or frameworks. This means the software engineering skills that exist today will be relevant for a long time.

zelphirkalt 9 hours ago|
I am not so convinced by your last point, about new languages and frameworks. I think the cutoff date is closing in on the present. If models cannot easily become bigger, they will likely advertise "up-to-date-ness" instead. Maybe they will be merely a few days behind. Or bigger models will make use of smaller but more up-to-date models.

I think engineering skills will remain relevant due to taste and proper judgement. A model trained on everything and the kitchen sink probably doesn't have the right bias for the specific problems in my project. Accepting too much AI-generated code without steering the ship will cause some drift of taste and ultimately produce a mediocre project, like one done by people without good domain knowledge or good taste. It might even work as a business in the short term, but it lacks the long-term excellence that sets projects with good judgement apart from the common rabble.

bit1993 9 hours ago||
> I think the cutoff date is closing in on our current now. If models cannot easily become bigger, they will likely advertise using "up-to-date-ness". Maybe they will be merely a few days behind. Or bigger models will make use of smaller but more up-to-date models

But they will still rely on assembly, C, Rust, Linux, HTML, TCP/IP... It doesn't matter how up to date they are; they rely on existing code they have been trained on, and they can't just create new languages without the training data.

matwood 5 hours ago||
Except they already have invented new languages.

https://medium.com/@trezzescience/the-ai-that-invented-its-o...

bit1993 3 hours ago||
That's not what I mean. Even I can invent my own language that only I use, but if that language is not widely deployed and used by other people or by LLMs, it's just a toy language.
eolgun 8 hours ago||
The Fogbank example is the most chilling part. It's not just that they lost the people — they lost the ability to know what they didn't know. Nobody could even write down what was missing because the knowledge was never formalized in the first place.

The junior hiring collapse compounds this. Senior engineers develop judgment partly by watching juniors make mistakes and correcting them. Remove that loop and you don't just lose future seniors — you quietly degrade the current ones.

The 0.18% recruiting conversion rate mentioned here tracks with what I see in compliance and security engineering too. "Can you tell when the AI is confidently wrong?" is now the most important interview question, and almost nobody can answer it well.
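For a sense of scale, a 0.18% conversion rate is brutally sparse. The thread doesn't give the underlying funnel numbers, so the counts below are purely hypothetical, chosen only to reproduce that rate:

```python
# Illustrative recruiting-funnel arithmetic. All counts are made up;
# only the final 0.18% figure comes from the comment above.
applicants = 10_000
screened = 600      # hypothetical: pass resume screen
interviewed = 90    # hypothetical: pass technical screens
hired = 18          # hypothetical: offers accepted

conversion = hired / applicants
print(f"{conversion:.2%}")  # prints "0.18%"
```

In other words, at that rate a company has to process thousands of candidates to land a handful who can both write code and catch a confidently wrong model.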

muragekibicho 8 hours ago|
The junior hiring collapse is all so bizarre. I graduated recently and my career prospects are jarringly limited.

I thought I'd go back for a Masters/PhD but then Trump mercurially defunded lots of STEM grad programs. Ngl, I found myself stuck. Zero job openings, zero PhD program openings. It's all so frustrating.

eolgun 8 hours ago||
[dead]
Scroll_Swe 11 hours ago||
"the west" ?

You mean the world?

Deepseek was being glazed here; I'm sure Chinese programmers use it like CC.

Terr_ 9 hours ago|
To be charitable to TFA, there is a dearth of accurate, well-understood labels for the kind of distinction between national economies they want to make.

Even "First/Third World" has been fraying at the edges for decades, since it was originally about political alignment.

Liftyee 8 hours ago||
I wonder if the real problem is short-term thinking, embedded in culture and incentivised by markets. When next quarter's profits are optimised over long-term growth and capability, things like this happen.
alecco 11 hours ago||
Speak for yourself. I now dare to code much harder problems and learning is bliss. No more having to sit down to dig needle-in-haystack through horrible documentation or random Stack Overflow posts.

LLMs are a magnificent tool if you use them correctly. They enable deep work like nothing before.

The problem is an education system focused on passivity (obedience), memorization, and standardized testing. And worst of all, aiming for the lowest common denominator. So most people are mentally lazy and go for the easy win, which is almost cheating. You get school cheating, interview cheating, and vibecoders.

But it's not the only way to use LLMs.

Similarly, in Wikipedia you can spend hours reading banal pop-slop content or instead spend that time reading amazing articles about history, literature, arts, and science.

arexxbifs 4 hours ago||
Perhaps the approach to, and leverage from, using AI differs between someone who's been active on HN for two decades and junior devs who've been brought up on iPhones in the flawed school system you're describing?

As TFA says, the problem is that accumulating knowledge takes time and effort, and the AI hype and the expectations around LLM-assisted coding help rationalize ever more short-sighted decisions that squander or hinder that process.

eszed 8 hours ago|||
> in Wikipedia you can spend hours reading banal pop-slop content or instead spend that time reading amazing articles about history, literature, arts, and science.

I'm not saying you're personally doing anything wrong, but there's a parallel here: smart and curious people reading articles about history and literature and art and science, rather than engaging directly with the real thing.

Or, one level down, creating amazing work in any of those domains depends on there being enough "slack" in the system for people to pursue deep work that will not be immediately profitable.

Do you see where I'm going with that? We (and I'm very much including myself: here I am on HN, instead of reading something more substantial) skim the (Wikipedia) surface, instead of diving truly deep. AIs (right now) are the ultimate surface-skimmers, and our fascination with and growing reliance on them reflects something in our current surface-skimming cultural mindset.

alecco 8 hours ago||
I meant it as an easy-to-understand parallel. Absolutely, deep reading and thought are much better than Wikipedia or an LLM chat.
rglullis 9 hours ago||
> Speak for yourself.

Even if you are the absolute unicorn who gets paid to "code much harder problems" and "learning", the rest of the industry exists to deliver actual products and services.

So unless you nurture some type of https://xkcd.com/208/ fantasy, this is not just about you. The industry as a whole needs to find a way to work with LLMs without automating programming away entirely, and the industry as a whole needs to find a way to ensure that newcomers are able to be productive even if code-generation tools are taken away from them.

ianberdin 8 hours ago|
I don’t know. Partly true. I came to web development when the low-level things were already solved: frameworks, ORMs, OSes, databases. I don’t know SQL or C++ well, but I can create a system, a value, on top of those abstractions. Everyone told me: Ruslan, you don’t know SQL, what a shame! Well, I don’t have problems because of that, and I never did.

Probably we are going to be fine with the AI abstraction too. People will use it, get stuck on problems, dig deeper, learn, and improve, the same as we did with frameworks and their source code.
