Posted by obscurette 2 days ago
But the ~1980s corporation is no more, and it was driven by the hype cycle too; it's just not a hype cycle we recognize today. You can google the adverts or read The Soul of a New Machine.
If it really were just division of labor, beneficial abstraction, shoulders of giants, etc., shouldn't we be able to distinguish genuinely new concepts from things we already had 40 years ago in a different context?
My Vectrex still worked last I checked.
The methods and algorithms powering advances in modern science, medicine, communications, entertainment, etc. would be impossible to develop, much less run, on something so rudimentary as a TI-99/4A. The applications we harness our technology for have become much more sophisticated, and so too must the technology stacks underpinning them, to the point that no single individual can understand everything. Take something as simple as real time video communication, something we take for granted today. There is no single person in the world who deeply understands every single aspect, from the semiconductor engineering involved in the manufacture of display and image sensors, to the electronics engineering behind the communication to/from the display/sensor, to the signal processing and compression algorithms used to encode the video, to the network protocols used to actually transmit the video, to the operating system kernel's scheduler capable of performing at sufficiently low-latency to run the videochat app.
By analogy, one can understand and construct every component of a mud hut or log cabin, but no single person is capable of understanding, much less constructing, every single component of a modern skyscraper.
He's criticizing the act of _not building_ on previous learnings. _It's in the damn title_.
Repeating mistakes from the past leads to a slowdown in such advancements.
This has nothing to do with learning everything by yourself (which, by the way, is a worthy goal; everyone who tries it knows full well that it cannot be done, and the point is the trying, not the doing).
This idea that we don’t understand the internals of anything anymore and nothing is reliable is a mix of nostalgic cherry-picking and willful ignorance of a lot of counter-examples.
Sure, a bunch of consumer appliances are nebulous, but they are designed for those tradeoffs. It’s not like your old VHS player was designed specifically to be easy to repair either.
The author is complaining about an advanced networking feature breaking on a router intended for consumers. Why they haven't upgraded to a prosumer setup is a mystery; OpnSense on a mini PC combined with some wireless access points is one way to go that offers a lot more configurability and control.
Complaining that not everyone can understand low level hardware is ignorant of all the really cool low level hardware and maker communities that have exploded in recent years, and it’s ignorant of the fact that specialization existed back in the “good old days” as well. For example, we had separate transmission and body shop specialists in the mid-century, you couldn’t just go to any mechanic to fix any problem with your car.
I’d like to see someone in the VHS era design a printed circuit board using CAD software and get it printed on-demand, then design an enclosure and 3D print it in their house for pennies. You can design your own keyboards and other electronic gadgets and basically own a little factory in your own home these days. You can share designs with ease and many of the tools are open source. The amount of sophistication accessible to the average person is incredible these days.
Just yesterday I used an LLM to write some docs for me, and for a little bit where I mistakenly thought the docs were fine as they were (they weren't, but I had to read them closely to see this) it felt like, "wow if the LLM just writes all my docs now, I'm pretty much going to forget how to write docs. Is that something I should worry about?" The LLM almost fooled me. The docs sounded good. It's because they were documenting something I myself was too lazy to re-familiarize with, hoping the LLM would just do it for me. Fortunately the little bit of my brain that still wanted to be able to do things decided to really read the docs deeply, and they were wrong. I think this "the LLM made it convincing, we're done let's go watch TV" mentality is a big danger spot at scale.
There's an actual problem forming here and it's that human society is becoming idiocracy all the way down. It might be completely unavoidable. It might be the reason for the Fermi paradox.
>every extension is also an amputation
that said, it is up to society, and to a lesser extent individuals to determine which skills will be preserved --- an excellent example of a rational preservation of craft teaching in formal education is the northern European tradition of Sloyd Woodworking:
https://rainfordrestorations.com/tag/sloyd/
>Students may never pick up a tool again, but they will forever have the knowledge of how to make and evaluate things with ... hand and ... eye and appreciate the labor of others.
There's an education gap that needs to be addressed, but I don't know how it will get addressed. A lot of the web in the past few decades came from industry so industry had a way of training up people. Most of this ML stuff is coming from academia, and they aren't really the best at training up an army at all.
It's hard to know who to blame for all of this because it's kind of like not having an early-warning asteroid detection system. HN and various other communities had no discussions of the impending upheaval even five years before GPT (no early warning at all). If you just take HN, we sat around here discussing a million worthless things across Rust/Javascript/startup b.s. for years like headless chickens (right here on the frontpage) without realizing what was really to come.
Makes me wonder if the places I go for tech news are enough to be prepared. Which brings me back to what I quoted:
>We’re creating a generation of developers and engineers who can use tools brilliantly but can't explain how those tools work.
We aren't creating them. They are the existing devs who had no idea AI was going to be a thing. Had anyone known it would become such a thing, everyone would have skipped the web development bootcamps of the mid-2010s.
This is called “good journalism”. It would be great if Elektor tried practicing it.