
Posted by milkglass 10 hours ago

The West forgot how to make things, now it’s forgetting how to code (techtrenches.dev)
773 points | 463 comments | page 4
eolgun 7 hours ago|
The Fogbank example is the most chilling part. It's not just that they lost the people — they lost the ability to know what they didn't know. Nobody could even write down what was missing because the knowledge was never formalized in the first place.

The junior hiring collapse compounds this. Senior engineers develop judgment partly by watching juniors make mistakes and correcting them. Remove that loop and you don't just lose future seniors — you quietly degrade the current ones.

The 0.18% recruiting conversion rate mentioned here tracks with what I see in compliance and security engineering too. "Can you tell when the AI is confidently wrong?" is now the most important interview question, and almost nobody can answer it well.

muragekibicho 7 hours ago|
The junior hiring collapse is all so bizarre. I graduated recently and my career prospects are jarringly limited.

I thought I'd go back for a Masters/PhD but then Trump mercurially defunded lots of STEM grad programs. Ngl, I found myself stuck. Zero job openings, zero PhD program openings. It's all so frustrating.

eolgun 6 hours ago||
[dead]
Liftyee 6 hours ago||
I wonder if the real problem is short-term thinking, embedded in culture and incentivised by markets. By optimising next quarter's profits over investing in long-term growth and capability, things like this happen.
Scroll_Swe 9 hours ago||
"the west" ?

You mean the world?

DeepSeek was being glazed here; I'm sure Chinese programmers use it like CC

Terr_ 8 hours ago|
To be charitable to TFA, there is a dearth of accurate and well-understood labels for the kind of X-versus-Y distinction they want to make between national economies.

Even "First/Third world" has been fraying at the edges for decades since it was originally about political alignment.

booleandilemma 1 hour ago||
I'm not forgetting how to code. They're not gonna get me.

Did I forget everyone's phone numbers when cell phones came out? Yes.

But this is different. Coding is my passion. I was doing it before I got paid to do it, and I'll be doing it after they stop paying people to do it.

ianberdin 6 hours ago||
I don’t know. Partly true. I came to web development when the low-level things were already solved: frameworks, ORMs, OSs, databases. I don't know SQL or C++ well, but I can create a system, a value, built on those abstractions. Everyone told me: Ruslan, you don't know SQL, what a shame! Well, I don't have problems because of it, and never did.

Probably we are going to be fine with the AI abstraction too. People will use it, get stuck on problems, dig deeper, learn, improve, same as we did with frameworks and their source code.

scotty79 2 hours ago||
I think comments on posts like this have a bimodal distribution. On one end are people who see the utility of AI models for programming and are generally eager to see more capable models and better ways of using them. On the other are people who see AI destroying programming and have no idea how AI could change to be a force for good.

I have an idea about what the difference between the groups might be. I think for the latter group, the code is an important part of the goal. They see software as ends rather than means. Not entirely, of course.

And the first group considers the artifacts that the software produces to be the goal. So as long as AI-written software is capable of producing a valuable artifact, they are willing and eager to go with it. And AI does that.

If the result of my code is the fine-tuning of a neural network, I don't really care how it happened. I can benchmark it afterwards and know whether the code the AI wrote for this purpose was good or not. I can inspect the code, investigate it, pinpoint ideas I don't like, suggest ideas I believe could give better results. I can restart, or try the same thing a few times in parallel with different harnesses and models. All in service of the result, which is not code.

If you have a program that needs to do something and you're willing to let AI write it, think foremost about how to rephrase the problem so that the output of the AI-written program becomes an artifact that can be independently verified: how to turn the desired behavior into an artifact you can evaluate.
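A minimal sketch of that idea (all names hypothetical): suppose an AI produced a sorting routine. Rather than auditing its code line by line, treat the routine as a black box and check properties of its output that can be verified independently of how the code was written.

```python
import random
from collections import Counter

def ai_sort(xs):
    # Stand-in for AI-generated code under evaluation; we treat it
    # as a black box and only judge the artifacts it produces.
    return sorted(xs)

def verify_artifact(original, result):
    # Independent checks on the output itself, not the implementation:
    # the result must contain the same multiset of elements, and it
    # must be in nondecreasing order.
    if Counter(original) != Counter(result):
        return False
    return all(a <= b for a, b in zip(result, result[1:]))

# Evaluate the black box against randomized inputs.
data = [random.sample(range(100), 10) for _ in range(50)]
assert all(verify_artifact(xs, ai_sort(xs)) for xs in data)
```

The point is that `verify_artifact` would catch a wrong implementation (dropped elements, unsorted output) without anyone reading the generated code, which is exactly the "artifact, not code" framing above.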

mahrain 7 hours ago||
I don't agree with "the West forgot how to make things". It moved supply chains for cheap consumer goods to Asia, but in the B2B space a lot is still manufactured in Europe: companies like Bosch, Volkswagen, ASML, Alstom and Airbus are cranking out extremely complicated machines that last many years in demanding environments. It's just a different level of value-add vs. low-cost electronics, for instance.
joker99 6 hours ago|
I remember Covid and the supply chain crisis that unfolded in Europe and the West. Most of the companies you've mentioned weren't cranking out anything during that time, as all of them realised that "low-cost electronics" are not always readily available, and that we've forgotten how to make them or no longer have the capacity to produce them ourselves in significant numbers. A lot of basic electronic components were unavailable then, and we still haven't fully grasped the complexity of our supply chains or where they begin.

I also remember that EEs for a while stopped using the term "jellybean parts". Turns out that most jellybeans are produced in Asia.

alecco 9 hours ago||
Speak for yourself. I now dare to code much harder problems, and learning is bliss. No more having to dig, needle-in-haystack, through horrible documentation or random Stack Overflow posts.

LLMs are a magnificent tool if you use them correctly. They enable deep work like nothing before.

The problem is an education system focused on passivity (obedience), memorization, and standardized testing, and worst of all, aimed at the lowest common denominator. So most people are mentally lazy and go for the easy win, almost cheating. You get school cheating, interview cheating, and vibe coders.

But it's not the only way to use LLMs.

Similarly, on Wikipedia you can spend hours reading banal pop-slop content, or instead spend that time reading amazing articles about history, literature, arts, and science.

arexxbifs 2 hours ago||
Perhaps the approach to, and leverage from, using AI is different for someone who's been active on HN for two decades than for junior devs who've been brought up on iPhones in the flawed school system you're describing?

As TFA says, the problem is that accumulating knowledge takes time and effort, and the AI hype and the expectations around LLM-assisted coding help rationalize ever more short-sighted decisions that squander or hinder that process.

eszed 6 hours ago|||
> in Wikipedia you can spend hours reading banal pop-slop content or instead spend that time reading amazing articles about history, literature, arts, and science.

I'm not saying you're personally doing anything wrong, but there's a parallel here: smart and curious people reading articles about history and literature and art and science rather than engaging directly with the real thing.

Or the next level down, where creating amazing work in all of those domains depends on there being enough "slack" in the system for people to pursue deep work that will not be immediately profitable.

Do you see where I'm going with that? We (and I'm very much including myself: here I am on HN, instead of reading something more substantial) skim the (Wikipedia) surface, instead of diving truly deep. AIs (right now) are the ultimate surface-skimmers, and our fascination with and growing reliance on them reflects something in our current surface-skimming cultural mindset.

alecco 6 hours ago||
I meant it as a simple-to-understand parallel. Absolutely, deep reading and thought are much better than Wikipedia or an LLM chat.
rglullis 7 hours ago||
> Speak for yourself.

Even if you are the absolute unicorn who gets paid to "code much harder problems" and "learning", the rest of the industry exists to deliver actual products and services.

So unless you nurture some type of https://xkcd.com/208/ fantasy, this is not just about you. The industry as a whole needs to find a way to work with LLMs without automating programming away entirely, and the industry as a whole needs to find a way to ensure that newcomers are able to be productive even if code-generation tools are taken away from them.

agentbc9000 6 hours ago||
Chinese models run around $2 to $8 per million tokens; Claude is 10x that cost. When will the bean counters move to Chinese models? The USA bans those models for national security reasons, and Anthropic, OpenAI, Meta, and X all move to China, where the models will be cheaper.
user2722 7 hours ago|
If the system treats you as a number, you should become a mercenary.

I love these articles that all the coders read but none of the management does.

If possible, be a mercenary and put a high number on your expertise, so we can solve this management blind spot faster.

If you can't, let your life's and work's passion be "not starving to death", and try to change things on the politics side.
