Posted by milkglass 16 hours ago
LLMs are a magnificent tool if you use them correctly. They enable deep work like nothing before.
The problem is that the education system focuses on passivity (obedience), memorization, and standardized testing. And worst of all, it aims for the lowest common denominator. So most people are mentally lazy and go for the easy win, which is almost cheating. You get cheating in school and in interviews, and you get vibecoders.
But it's not the only way to use LLMs.
Similarly, in Wikipedia you can spend hours reading banal pop-slop content or instead spend that time reading amazing articles about history, literature, arts, and science.
As TFA says, the problem is that accumulating knowledge takes time and effort, and the AI hype and the expectations around LLM-assisted coding help rationalize ever more short-sighted decisions that squander or hinder that process.
Even if you are the absolute unicorn who gets paid to "code much harder problems" and keep "learning", the rest of the industry exists to deliver actual products and services.
So unless you nurture some type of https://xkcd.com/208/ fantasy, this is not just about you. The industry as a whole needs to find a way to work with LLMs without automating programming away entirely, and the industry as a whole needs to find a way to ensure that newcomers are able to be productive even if code-generation tools are taken away from them.
I'm not saying you're personally doing anything wrong, but there's a parallel here to smart and curious people reading articles about history and literature and art and science rather than engaging directly with the real thing.
Or, one level further down: creating amazing work in all of those domains depends on there being enough "slack" in the system for people to pursue deep work that will not be immediately profitable.
Do you see where I'm going with that? We (and I'm very much including myself: here I am on HN, instead of reading something more substantial) skim the (Wikipedia) surface, instead of diving truly deep. AIs (right now) are the ultimate surface-skimmers, and our fascination with and growing reliance on them reflects something in our current surface-skimming cultural mindset.
I'm going to steal that one and add it to Stross's: "Efficiency is the reciprocal of resilience."
The other point that really resonated was something I read before, along the lines of: we think that once humanity learns something, that knowledge stays and we build on it. But it's not true; knowledge is lost all the time. We need to actively work to keep knowledge alive.
That's why libraries and the Internet Archive are so important. Wikipedia, too.
So there was this: "I run engineering teams in Ukraine. My people lived the other side of this equation. Not the factory floor. The receiving end. While Raytheon was struggling to restart production from forty-year-old blueprints, the US was shipping thousands of Stingers to Ukraine. RTX CEO Greg Hayes: ten months of war burned through thirteen years’ worth of Stinger production. I’ve seen this pattern before. It’s happening in my industry right now."
My filter flashed a warning on the telltale signs and I stopped reading. Now I've got a puzzle I don't want to solve: did someone trying to argue against "AI-assisted" coding use an LLM to author that argument?
But this is HN, I can also just move on to the next story.
It doesn't seem to have much in common with the defense industry's problems.
I see a talent pipeline collapse in the next 5 years. "Software engineering is over, coding is a solved problem," as chanted by semi-literate media and the AI grifters' marketing departments, will further scare human capital away from software engineering, eventually driving a 3x rise in salaries due to the resulting resource shortage.
If the author sincerely believes the thesis that AI makes you vulnerable / dumb, they are either incredibly hypocritical or, more likely, just cynical and trying to drive traffic to their website. And you're not getting back the time you spent reading this and arguing with it.