Posted by obscurette 2 days ago

We're not innovating, we're just forgetting slower (www.elektormagazine.com)
147 points | 129 comments | page 2
thadk 2 days ago|
idk, if you're going to center that good old TI computer, you gotta contrast the Conway's law of that company, and the semi-solo person sharing 'grammable tidbits. The desired carrying capacity lies in that institution. Today it's in Shenzhen.

But the ~1980s corporation is no longer and it was driven by the hype cycle too, it's just not a recognizable one. You can google the adverts or read Soul of a New Machine.

xg15 1 day ago||
All the usual rebuttals in this thread, but I don't see anyone engaging with his assertion that we're forgetting and reinventing things.

If it really were just division of labor, beneficial abstraction, shoulders of giants, etc., shouldn't we be able to distinguish genuinely new concepts from things we already had 40 years ago in a different context?

skybrian 2 days ago||
If it works, he's lucky. For example, Commodore 64s often had a dodgy power supply that would eventually damage the computer.

My Vectrex still worked last I checked.

forinti 2 days ago|
About every 15 years I have to change the caps on my Beeb. The third job is coming soon.
MontyCarloHall 2 days ago||
>That TI-99/4A still boots because it was designed by people who understood every component, every circuit, every line of code. It works because it was built to work, not to generate quarterly revenue or collect user data or enable some elaborate software-as-a-service business model.

The methods and algorithms powering advances in modern science, medicine, communications, entertainment, etc. would be impossible to develop, much less run, on something so rudimentary as a TI-99/4A. The applications we harness our technology for have become much more sophisticated, and so too must the technology stacks underpinning them, to the point that no single individual can understand everything. Take something as simple as real time video communication, something we take for granted today. There is no single person in the world who deeply understands every single aspect, from the semiconductor engineering involved in the manufacture of display and image sensors, to the electronics engineering behind the communication to/from the display/sensor, to the signal processing and compression algorithms used to encode the video, to the network protocols used to actually transmit the video, to the operating system kernel's scheduler capable of performing at sufficiently low-latency to run the videochat app.

By analogy, one can understand and construct every component of a mud hut or log cabin, but no single person is capable of understanding, much less constructing, every single component of a modern skyscraper.

alganet 2 days ago|
You're misdirecting.

He's criticizing the act of _not building_ on previous learnings. _It's in the damn title_.

Repeating mistakes from the past leads to a slowdown in such advancements.

This has nothing to do with learning everything by yourself (which, by the way, is a worthy goal; every single person who tries knows by heart that it cannot be done, and it's not about doing it).

fusionadvocate 2 days ago||
Abstractions hide details; that does not mean the details cease to exist. The problem with abstractions is that they make it easier to create conflicts when making changes. Lots of hidden details are affected by a high-level change.
dangus 2 days ago||
Ugh, another one of these articles.

This idea that we don’t understand the internals of anything anymore and nothing is reliable is a mix of nostalgic cherry-picking and willful ignorance of a lot of counter-examples.

Sure, a bunch of consumer appliances are nebulous, but they are designed for those tradeoffs. It’s not like your old VHS player was designed specifically to be easy to repair either.

The author is complaining about their advanced networking feature breaking on a router intended for consumers. Why they haven't upgraded to a prosumer setup is a mystery - OpnSense on a mini PC combined with some wireless access points is one way to go that offers a lot more configurability and control.

Complaining that not everyone can understand low level hardware is ignorant of all the really cool low level hardware and maker communities that have exploded in recent years, and it’s ignorant of the fact that specialization existed back in the “good old days” as well. For example, we had separate transmission and body shop specialists in the mid-century, you couldn’t just go to any mechanic to fix any problem with your car.

I’d like to see someone in the VHS era design a printed circuit board using CAD software and get it printed on-demand, then design an enclosure and 3D print it in their house for pennies. You can design your own keyboards and other electronic gadgets and basically own a little factory in your own home these days. You can share designs with ease and many of the tools are open source. The amount of sophistication accessible to the average person is incredible these days.

zzzeek 2 days ago||
I see a bunch of "nobody knows everything, this old man needs to appreciate modern technology stacks" comments, and in some ways I blame the post for this because it kind of meanders into that realm where it gets into abstractions being bad and kids not knowing how to make op-amp circuits (FTR, I am from the "you have to know op-amps!" generation and I intentionally decided deep hardware hacking was not going to be my thing), but the actual core thing I think is important here is that working hard is being devalued - putting in the time to understand the general underpinnings of the software and the hardware, using trial and error to solve an engineering problem as opposed to "sticking LEDs on fruit", the entire premise of knowing how things work and achieving some deep expertise is no longer what people assume they should be striving for, and LLMs, useful or not, are only accelerating this.

Just yesterday I used an LLM to write some docs for me, and for a little bit where I mistakenly thought the docs were fine as they were (they weren't, but I had to read them closely to see this) it felt like, "wow if the LLM just writes all my docs now, I'm pretty much going to forget how to write docs. Is that something I should worry about?" The LLM almost fooled me. The docs sounded good. It's because they were documenting something I myself was too lazy to re-familiarize with, hoping the LLM would just do it for me. Fortunately the little bit of my brain that still wanted to be able to do things decided to really read the docs deeply, and they were wrong. I think this "the LLM made it convincing, we're done let's go watch TV" mentality is a big danger spot at scale.

There's an actual problem forming here and it's that human society is becoming idiocracy all the way down. It might be completely unavoidable. It might be the reason for the Fermi paradox.

WillAdams 1 day ago|
Marshall McLuhan called this out ages ago:

>every extension is also an amputation

that said, it is up to society, and to a lesser extent individuals to determine which skills will be preserved --- an excellent example of a rational preservation of craft teaching in formal education is the northern European tradition of Sloyd Woodworking:

https://rainfordrestorations.com/tag/sloyd/

>Students may never pick up a tool again, but they will forever have the knowledge of how to make and evaluate things with ... hand and ... eye and appreciate the labor of others.

bitwize 2 days ago||
I love how he talks about knowing thermal characteristics, etc. and then cites the TI-99/4A as an example of something designed by people who Really Knew What They Were Doing. The TI-99/4A was notorious for being prone to overheat due to its power supply. Munch Man and Parsec were complete non-starters for me when it got hot in July. This was even mentioned, specifically, in Halt and Catch Fire. The early microcomputer engineers were spitballing. You want to talk about the number of bodge wires that were in every TRS-80? Or the Apple III having no thermal vents per order of Steve Jobs, and the "you're holding it wrong" of the 80s being "drop it on your desk to fix it"? We know how to build better, more reliable computers for cheaper today than we ever did in the 80s. Then we fuck them up with things like Windows, but still.
ivape 2 days ago||
> We’re creating a generation of developers and engineers who can use tools brilliantly but can't explain how those tools work.

There's an education gap that needs to be addressed, but I don't know how it will get addressed. A lot of the web in the past few decades came from industry so industry had a way of training up people. Most of this ML stuff is coming from academia, and they aren't really the best at training up an army at all.

It's hard to know who to blame for all of this because it's kind of like not having an early warning asteroid detection system. HN or various communities did not have discussions even five years prior to GPT about the impending doom (no early warning at all). If you just take HN, we sat around here discussing a million worthless things across Rust/JavaScript/startup b.s. for years like headless chickens (right here on the frontpage) without realizing what was really to come.

Makes me wonder if the places I go for tech news are enough to be prepared. Which brings me back to what I quoted:

> We’re creating a generation of developers and engineers who can use tools brilliantly but can't explain how those tools work.

We aren't creating them. They are the existing devs that had no idea AI was going to be a thing. Had anyone known it was to be such a thing, everyone would have ditched going to web development bootcamps in the mid 2010s.

hluska 2 days ago|
> The same publications that use “disruptive AI” unironically are the ones that need to Google “what is a neural network” every time they write about machine learning.

This is called “good journalism”. It would be great if Elektor tried practicing it.
