
Posted by milkglass 16 hours ago

The West forgot how to make things, now it’s forgetting how to code (techtrenches.dev)
1039 points | 712 comments
alecco 15 hours ago|
Speak for yourself. I now dare to take on much harder problems, and learning is bliss. No more having to sit down and dig, needle-in-haystack, through horrible documentation or random Stack Overflow posts.

LLMs are a magnificent tool if you use them correctly. They enable deep work like nothing before.

The problem is an education system focused on passivity (obedience), memorization, and standardized testing. And worst of all, aiming for the lowest common denominator. So most people are mentally lazy and go for the easy win, almost cheating. You get school cheating, interview cheating, and vibecoders.

But it's not the only way to use LLMs.

Similarly, in Wikipedia you can spend hours reading banal pop-slop content or instead spend that time reading amazing articles about history, literature, arts, and science.

arexxbifs 8 hours ago||
Perhaps the approach to, and leverage from, using AI is different for someone who's been active on HN for two decades, and junior devs who've been brought up on iPhones in the flawed school system you're describing?

As TFA says, the problem is that accumulating knowledge takes time and effort, and the AI hype and expectations around LLM-assisted coding help rationalize ever more short-sighted decisions that squander or hinder that process.

rglullis 13 hours ago|||
> Speak for yourself.

Even if you are the absolute unicorn who gets paid to "code much harder problems" and "learning", the rest of the industry exists to deliver actual products and services.

So unless you nurture some type of https://xkcd.com/208/ fantasy, this is not just about you. The industry as a whole needs to find a way to work with LLMs without automating programming away entirely, and the industry as a whole needs to find a way to ensure that newcomers are able to be productive even if code-generation tools are taken away from them.

eszed 12 hours ago||
> in Wikipedia you can spend hours reading banal pop-slop content or instead spend that time reading amazing articles about history, literature, arts, and science.

I'm not saying you're personally doing anything wrong, but there's a parallel here: smart and curious people reading articles about history and literature and art and science, rather than engaging directly with the real thing.

Or then the next level down, where creating amazing work in all of those domains depends on enough "slack" in the system for people to pursue deep work that will not be immediately profitable.

Do you see where I'm going with that? We (and I'm very much including myself: here I am on HN, instead of reading something more substantial) skim the (Wikipedia) surface, instead of diving truly deep. AIs (right now) are the ultimate surface-skimmers, and our fascination with and growing reliance on them reflects something in our current surface-skimming cultural mindset.

alecco 12 hours ago||
I meant it as a simple-to-understand parallel. Absolutely, deep reading and thought are much better than Wikipedia or an LLM chat.
bsder 15 hours ago||
> Optimized for minimum cost with zero margin for surge. On paper, efficient. In practice, one bad day away from collapse.

I'm going to steal that one and add it to Stross': "Efficiency is the reciprocal of resilience."

californical 15 hours ago|
Yes, that is one key point that resonated with me. The author did a great job of putting these recurring concepts into their own words.

The other that really resonated was something that I read before along the lines of… we think that once humanity learns something, that knowledge stays and we build on it. But it’s not true, knowledge is lost all the time. We need to actively work to keep knowledge alive

That’s why libraries and the internet archive are so important. Wikipedia, too

andai 6 hours ago||
Basically, we forgot that human neural nets must also be trained.
crusty 6 hours ago||
I feel sad that everything I read now has to pass through my "is this LLM slop?" filter, and if it gets caught, the content loses my focus and the worthless puzzle of truth takes over.

So there was this: "I run engineering teams in Ukraine. My people lived the other side of this equation. Not the factory floor. The receiving end. While Raytheon was struggling to restart production from forty-year-old blueprints, the US was shipping thousands of Stingers to Ukraine. RTX CEO Greg Hayes: ten months of war burned through thirteen years’ worth of Stinger production. I’ve seen this pattern before. It’s happening in my industry right now."

The filter flashed the warning on the telltale signs and I stopped reading. Now I've got the puzzle I don't want to do. Did someone trying to argue against "AI assisted" coding use an LLM to author that argument?

But this is HN, I can also just move on to the next story.

Meirambek_VIDI 16 hours ago||
Do you think this is a tooling problem or more about incentives and how engineers are trained now?
great_psy 15 hours ago|
I think the article is making the point that it is a cultural problem about cost cutting and short term thinking.
Meirambek_VIDI 14 hours ago||
Yeah, agreed - short-term incentives seem to drive a lot of this. Do you think tools can help, or is it mostly cultural?
great_psy 8 hours ago||
Give me a Python script that takes a string representing the output of a sha256 algorithm and a plain string, and checks whether the sha256 of the plain string matches the sha256 provided.
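Whether that request is tongue-in-cheek or not, here's roughly what such a script might look like. This is a minimal sketch; the function name and the use of a constant-time comparison are my choices, not anything specified in the thread:

```python
import hashlib
import hmac

def sha256_matches(expected_hex: str, plain: str) -> bool:
    """Return True if sha256(plain) equals the provided hex digest."""
    actual_hex = hashlib.sha256(plain.encode("utf-8")).hexdigest()
    # hmac.compare_digest does a constant-time comparison, which avoids
    # timing side channels when the digest guards anything sensitive.
    return hmac.compare_digest(actual_hex, expected_hex.lower())

# Quick check against a known digest
digest = hashlib.sha256(b"hello").hexdigest()
print(sha256_matches(digest, "hello"))  # True
print(sha256_matches(digest, "world"))  # False
```

A plain `==` would also work for a throwaway script; `compare_digest` only matters if the check is security-relevant.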
skybrian 15 hours ago||
There was a time when companies had terrible development practices and could forget how to build, test, and deploy software, but is anyone seeing that now? We have much better development practices nowadays.

It doesn’t seem much like defense industry problems.

disgruntledphd2 14 hours ago||
This still happens. Lots of my career has been figuring out what code is actually running in prod, and determining if it even works.
IronyMan100 12 hours ago||
IMHO, it's a people thing. People developed better practices, talked about it at conferences, maybe left the company. As a result, the knowledge spread. On the other hand, if the places where a skilled individual can work and hone their skills disappear, knowledge becomes scarce, can no longer spread, and will vanish. If you only program with AI and 5 people do the work of 100, then you end up in exactly that scenario.
imrozim 15 hours ago||
How do you become a senior engineer if no one hires you as a junior anymore?
hkt 14 hours ago|
Talk confidently in your interview with non-technical managers when the last senior has left and there's nobody there to check your work.
blitzar 14 hours ago||
So the same as it is now, be a good salesperson.
bakugo 3 hours ago||
The West apparently also forgot how to write articles without AI.
wg0 15 hours ago||
>The combination of technical skill and the judgment to know when the AI is wrong barely exists in the market anymore.

I see a talent pipeline collapse in the next 5 years. "Software engineering is over, coding is a solved problem," as chanted by semi-literate media and the AI grifters' marketing departments, will further scare human capital away from software engineering, which will then easily command a 3x rise in salaries due to the resulting shortage.

wiseowise 14 hours ago|
First it was “learn to code” and bazillion videos of TikTok schmucks showing off slacking at work, now everything is solved. The puzzle is complete.
chromacity 8 hours ago|
I just can't get over the incredible irony of so many of these "AI is bad for you, mmkay" articles being LLM-generated.

If the author sincerely believes the thesis that AI makes you vulnerable and dumb, they are incredibly hypocritical. But more likely, they're just cynical and trying to drive traffic to their website. And you're not getting back the time you spent reading this and arguing with it.
