Posted by obscurette 1 day ago
And the ones who can design an op-amp circuit can't manufacture the laminate their circuit is going to be printed on. And the ones who know how to manufacture the laminate probably don't know how to refine or synthesize the material from the minerals. And probably none of them knows how to grow and fertilize the crops to feed themselves.
No one knows everything. Collaboration has been how we manage complexity since before we were even biologically H. sapiens.
> To take an example, therefore, from a very trifling manufacture, but one in which the division of labour has been very often taken notice of, the trade of a pin-maker...a workman not educated to this business...could scarce, perhaps...make one pin in a day, and certainly could not make twenty. I have seen a small manufactory...where ten men only were employed...Those ten persons, therefore, could make among them upwards of forty-eight thousand pins in a day.
- An Inquiry into the Nature and Causes of the Wealth of Nations, Adam Smith, 1776
When you divide and specialize design, you get design by committee.
If you design a desk lamp, it wasn't designed by a committee just because one person designed the screws, another designed the plate-stamping machine, another designed the bulb socket, and so on.
If you start designing hardware for AI, together with AI designed to run only on that hardware, and tie those design cycles together, you'll get design by committee. It is very likely that the requirements will produce overall bad hardware (but slightly better for that AI), and an overall bad AI (but slightly better on that hardware).
Eventually, these shortcuts lead to huge committees. Sometimes they're not even formally defined.
The screw company should make good screws, not good screws for a specific desk lamp model. A good designer then _chooses_ to use those screws _because they are good_, not because they can impose specific design requirements on the screw company.
The desk lamp can be designed around common parts to reduce the total number of unique parts in the world, so everything is reusable, replaceable, and manufacturable in larger quantities, which leaves more resources for optimizing the process, etc.
I am not in favor of ignoring those cognitive limitations. That wouldn't make the limitations go away; it would only keep us from taking measures to cope with them, such as dividing and specializing design into modular generic parts.
It's bad communication. You need to follow up the vague analogies with real examples to make a message stick. If you don't, people have no idea what you are saying.
All I see in the words is algebra, where the concrete terms authors use to refer to others are variables that could sometimes be set to you.
In your counter-example, the design was not divided, and thus it is not a counter-example at all.
The lamp design clearly was divided -- the final designer did not design the screws, lightbulb, wiring, and perhaps many other components. Someone had to design those components that were then combined creatively into a lamp.
Dividing design into components that can be effectively reused is tricky, but it remains essential.
Last week I was learning about Itanium. It was a processor Intel designed specifically with and for HP. Its goal was to replace both x86 and HP's PA-RISC.
HP would design the new systems to run on Itanium, and Intel would design the chip.
There was an attempt at specializing design here, with each company operating under design constraints from the other. They formed a design committee.
This was like the screw company making screws _specifically_ for one kind of desk lamp. It's division and specialization of design.
A natural specialization (one company getting very good at designing certain things) is not divided up or orchestrated by a central authority.
In manufacturing, it's the other way around. If you already have a good design, the more division of labor under a central coordinator, the better. You can get tighter tolerances, timing benefits, etc.
My argument is that these aspects are not transferable from one domain to the other. Design is different from manufacturing, and it gets worse when we try to apply the optimizations that work for manufacturing to design.
It doesn't work in practice. CS graduates from my school are trained on Git and the Linux command line. CE teaches none of this, and students discover in their 3rd year that they cannot get an internship because they share all their code as IDE screenshots in Google Docs.
But we do know how the entire process of building a computer works, from quantum physics and semiconductor doping, through p-n junctions, CMOS logic, logic gates, and hardware description languages, up to assembly, C, and Java.
If only all of this "important" knowledge didn't crowd out basic skills.
* the EEs need to learn MATLAB or NumPy, to use as a tool
* so do the computer engineering students, probably
* the computer engineering students also need to learn low level stuff, because they might reasonably end up writing drivers or embedded code
* separating out what should be in which classes is kind of tricky; keeping in mind that the students don’t necessarily know anything about programming at all, you probably need some kind of “intro to the general idea of programming” class
* realistically when they start the program they don’t know much about what the jobs look like, so it is good if they are able to switch paths, for the first couple years
* realistically a lot of people just want to get a STEM degree and then go be a programmer anyway
https://ocw.mit.edu/courses/6-001-structure-and-interpretati...
but they've since switched to Python for various reasons.
The misses remind me of technical debt vs shiny features. I wrote embedded assembly and did a bunch of labs with it, but due to time constraints and difficulty, I didn't go into much detail on interrupt-based programming. And our OS course focused more on rote memorization of concepts such as locks, mutexes, semaphores, deadlock-avoidance algorithms, and memory management.
But there's little detail on why I might use Linux syscalls over libc.
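For what it's worth, here's a minimal sketch of the kind of distinction I mean (my own illustration, not anything from a course): printf goes through libc's stdio layer, which formats and buffers in user space, while write(2) is a thin wrapper over the kernel's syscall and hands the bytes over directly, which is why it's the safer choice in places where stdio isn't allowed, such as signal handlers.

    /* libc path vs. raw(ish) syscall path for writing to stdout */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void) {
        const char *msg = "hello via write(2)\n";

        /* stdio: formatted output, buffered in user space until flushed */
        printf("hello via printf\n");
        fflush(stdout);

        /* write(2): unbuffered, async-signal-safe, returns bytes written */
        if (write(STDOUT_FILENO, msg, strlen(msg)) < 0) {
            perror("write");
            return 1;
        }
        return 0;
    }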
Which CE program did you study at? I've worked with Waterloo, UBC, and UT ECE grads, and they have similar levels of knowledge of programming fundamentals as their CS grads. I would be shocked if a first- or second-year BS ECE student cannot use Git or some alternative VCS - that means there are more fundamental issues with your university's engineering curriculum.
> I can design a simple op-amp circuit and deploy to a Kubernetes cluster because Canada has a "Computer Engineering" degree that's a hybrid between CS/Electrical Engineering.
Same in the States: ECE and EECS programs tend to teach both fairly equally, and there are plenty of top programs with a strong reputation for this (Cal, MIT, CMU, UIUC, UT Austin, UW, UCSD, UCLA, GT, etc.)
The issue I have noticed though is the decline of "CSE" programs - CS programs with an added CompArch or OS internals focus. CS programs are increasingly making OS internals and CompArch optional at the undergrad level, and it is having an impact on the pipeline for adjacent fields like Cybersecurity, Distributed Systems, Database Internals, etc.
I've harped about this skills gap multiple times on HN.
As a farmer and software developer with an electronics hobby (and electronics being part of the job of farming these days), I can check off growing crops, op-amp circuits, and Kubernetes deployments.
I don't own, or have reasonable access to, the capital necessary for laminating circuit boards and synthesizing materials from minerals.
> No one knows everything.
But, really, access to capital is the real limiting factor. Getting to know something isn't usually all that difficult in and of itself, but if you have no way to do it then you're never going to get to know it. Collaboration is less important to efficiency than optimizing the use of capital. It's just that we don't have many good ideas about how to optimize the use of capital without also introducing collaboration.
You don't have to know everything, but a BASIC understanding of what is underneath would be nice.
The Little House on the Prairie books fictionalize the childhoods of Laura Ingalls Wilder and Almanzo Wilder in the US in the late 19th century. They expected their readers, whose grandparents had grown up in similar conditions, to believe that one or more of their parents knew how to shoot a bear, build a house, dig a well, poultice wasp stings, cast bullets, fertilize and grow crops, make cheese, whitewash walls, drive horses, run a business, read a book, play the fiddle, dance a jig, sing, keep bees, clear fields in forests, harvest honey, spin thread, weave cloth, thresh wheat, and many other activities. There were "store-bought" goods produced by the kind of specialization you're talking about, but Laura's family had a few durable goods of that sort (Pa's rifle and ax, the family Bible) and mostly they just did without.
More recently the Lykov family survived 40 years of total isolation from society, missing World War II completely, but did suffer some heartbreaking losses in material standard of living because they didn't know, for example, how to make ceramic or iron. Agafia Lykova is still living there on her parents' homestead, nearly a century later.
Specialization is indeed very efficient, but that answers the questions, "What can I do for others?" and "How can we survive?" Historical answers bespeaking specialization are archived in many of our surnames in the West: Cooper, Fuller, Goldschmidt, Herrero, Nailer, Roper, Molnar, and, of course, Potter.
But for those questions to matter, we also need to answer the questions, "How can I be happy?" and "How can we be happy?", and for thousands of years it has been at least widely believed that devoting your entire self to specialization runs counter to those goals, among other things because it can open doors to the kinds of exploitation, unfreedom, and insecurity the article is lamenting. And sometimes regional specialization leads not to prosperity for every region but to impoverishment, and regaining the lost skills is the path out of the kind of abysmal poverty that produces regular famines; that's why Gandhi put a charkha on the flag of the Indian independence movement.
TI was no exemplar here; you can't even write your own machine code to run on the TI-99/4A, but the situation with Nest is in many ways far worse. I think it's worth distinguishing between situations where someone chooses not to learn about, modify, or repair artifacts, and situations like these where they are not permitted to learn, especially when the prohibition is established in order to exploit them economically, as in both the TI case and the Nest case, or as in medieval guilds.
Some specializations are thousands of years old; tin mining in Cornwall supported much of the Bronze Age, and silicosis was already known as an occupational disease of potters in Classical times. But 80 hours a week breaking rocks in a tin mine is not a path to human flourishing, nor to economic prosperity for the person doing it. Neither is buying thermostats you aren't allowed to understand. We shouldn't idealize it just because it's profitable.
For a mechanical approach to this, see the "Gingery" books, which start with the basics of metal casting in the first volume, then use those castings to make a lathe in the second (operating on the premise that a lathe is the only tool in a machine shop which can replicate itself), then use the lathe to make the balance of tools needed in a machine shop.
Well no, civilizations like the Maori are the exception, not the norm. Rigid class roles and specialization have featured prominently in essentially every Eurasian civilization from Egypt to Han China, which held the bulk of humanity and its developments. Nor did questions of individual happiness matter; what concerned people at the time were questions of martial duty or religious worship.
And most people lived outside civilization entirely. They had very diverse lifestyles, but we can make some generalizations. Even when they didn't leave diaries for us to read, we can infer that they had much less specialization, both from economic principles and from archaeological evidence.
It's certainly true that people in civilizations are, and have always been, focused on martial duty, and everyone everywhere is concerned with religious worship, though they may call it something else for social reasons. But people have always been strongly interested in individual happiness, even in civilizations. The Buddha founded one of the most important religions 2500 years ago on the basis of individual happiness, to the point that after he died, one of the most contentious issues among his followers was whether holy people had any duty to help other people achieve happiness as well, the origin of the Mahayana bodhisattva vows. Epicurus's philosophy and Marcus Aurelius's writings are also centered on the pursuit of individual happiness, as is much of Plato and of course the Mohists. Even religions and philosophies that preached lifelong duty and sacrifice above all else, like Christianity and Islam, offer it as a path to individual happiness in the afterlife.
Economic productivity is an important means to happiness, because it sucks to go blind or subsist on soft foods because you can't get the necessary medical and dental treatments. And it's terrible to never see your parents again because you don't have the material means to visit them. But there's a point of diminishing returns beyond which sacrificing more of your happiness for continued economic gains amounts to cutting off your nose to spite your face.
John Economaki's "Pencil Precision" from Bridge City Tool Works:
https://bridgecitytools.com/products/pp-1-pencil-precision
I have the preceding "Chopstick Master v2" and it is a delight to use (and if there were a suitable set of instructions for collecting the materials for making a pencil lead and baking them, I'd probably have the successor).
https://www.youtube.com/@AppliedScience
If he can, what's stopping you?
There are extraordinary people doing extraordinary things all around you. Aiming for these things is important, and we need those kinds of people with ambitious learning goals.
Before, you said people _can't_ (in general, that anyone who knows how to code cannot possibly learn how circuits work).
Now, you're saying that _you don't want to learn_. That's on you, buddy. Don't project your insecurities on the whole IT field. People can, and will, learn across many layers of abstraction.
People can learn across layers of abstraction, but specialisation is generally a good thing and creates wealth; a Scottish guy wrote a good book on it.
> That is because you are replying to two different people.
He chose to follow the argument of the previous dude, so it's all the same to me. Everything I said still applies.
I think I made an excellent counterpoint that is not against specialization, but complementary.
This counterpoint is particularly important in an age where specialization is being oversold and mixed with snake oil.
This is like saying old software is so simple that updating a line of code can break an entire application. It's a silly thing to say. No matter how complex or how simple a piece of software is, you can easily break it. If you have a program that prints out "hello world", guess what? Updating a single character can break the entire application!
The world is more complex now. We've stood on the shoulders of giants who stood on the shoulders of giants. A few centuries ago a renaissance man could make advances in multiple fields. Now people are specialized. It's the same thing with software. Of course, people take it to an extreme. However, you go ahead and write your own crypto library, I'll use a vetted one created by experts.
To lend some credence to other folks' points of view, there are arguments I can agree with that are adjacent:
- "We don't need that complex framework for our needs, we stick to a simpler, older library."
- "We decided to not use <shiniest_new_toolkit> it had performance issues that the maintainers are still sorting out."
- "Working with the new framework showed a lot of promise, but there is still a lot of instability in the API since it's so new. We couldn't commit to using a toolkit that hasn't been nailed down yet."
These are actual concerns, and they show caution about adopting new things until they match your use case, dev timelines, and performance requirements.
“I don’t have time to learn a new framework, I have things to do.” Everybody’s cool new abstraction is a cognitive burden for someone else.
Now if npm breaks it, or Claude breaks it, a developer might not even know what was broken.
He's talking about that kind of thing, not the resilience of code to take random character deletions.
IT is very much non-specialized compared to older disciplines. It's so young. Every single one of us is still a jack of all trades to some degree.
You're relying on the "don't roll your own crypto" popular quote to drop the mic. That's misguided. This advice comes from an argument of strength by numbers, not anything related to abstractions. It tells me you don't understand it.
This UI trend of denying access to under-the-hood complexity is deeply unsettling. It creates a cliff-edge failure mechanism where the system (which is often essential) works and then suddenly doesn't. No warning as you approach the failure state, no recourse on the far side. It's completely baffling how this became an industry standard.
Now we are actually employing large numbers of people just to babysit these half-assed things, and everyone is fine with it because we all used 'industry standard' components, and that's really the best we can do, isn't it? Armies of on-call system resetters are just part of the picture now.
This seems like a pretty weird example, right? WiFi routers don't connect to the internet. If your modem can't connect to the internet, something has probably broken outside your house. That's the sort of locally-unsolvable problem that everybody last century was familiar with: the crappy copper telephone wire that was never designed to let the Internet blast through it will eventually rebel and start giving you noise.
If your router doesn’t work, I don’t know. Cheap routers are not a new invention or sign of the times, I think.
VHS players, if I remember correctly, often died in mysterious ways (they have all sorts of little motors and finicky sensors in them).
When I used to use Google Wifi, it regularly struggled to connect or establish/maintain connectivity to the outside world, even though my modem was successfully connected. Similar to Nest devices, you often have to power cycle them several times to get them into a good state.
> If your modem can’t connect to the internet, something has probably broken outside your house
Of all the internet-only connectivity outages I’ve had that lasted for longer than a few minutes, nearly all of them were resolved by a modem or router reboot. These are ordinary, non-customized modem/routers from 3-4 ordinary ISPs serving ordinary apartments in a major US city, using ordinary mediums like DSL, cable, fiber.
The fact that a reboot resolved the issue means that the problem wasn’t outside the house. Of all the remaining, long and not-fixed-by-reboot outages, one was a hurricane, one was a bad amp … and all the remaining dozens were arcane technical issues with the modem, router, or devices on the network that required ISP troubleshooting to fix.
I suspect that this is not an uncommon distribution, which means that this isn’t the same problem folks in the last century faced; today, the shitware is coming from inside the house.
The Romans progressed tremendously despite using inferior "math" compared to the Greeks, and the Americans did the same compared to the English and French. Note how willingly China ships its top mathematicians to US grad schools while retaining its best engineers at all costs.
Only technologies that make it into tools/hardware (realware) will survive; the rest are destined to be e-wasted.
If the thesis is that we should understand the systems we work on, then sure, I can get behind that. At the same time, I wouldn't expect a mechanic to know how to process iron ore into an ingot.
They definitely don't "vibe machine" without thinking about underlying concepts.
Kind of a weird opposite meaning of book-burning.
https://x.com/WallStreetApes/status/1940924371255939236
Our software is like that. A small system will have a crazy number of packages and dependencies. It's not healthy and it's expensive and it's stupid.
Culture tends to drive drunk, swinging wide to extremes and then over-correcting. We're already fully in the wrong lane when we start to have discussions about thinking about the possibility of change.
https://www.youtube.com/watch?v=5ODzO7Lz_pw
It's not a software thing, it's just how humanity works.
The physical world is bound by rules that are unchanging (more or less). Above this layer we've also devised and agreed upon standards that remain unchanging, though they vary by region: voltage, screw/bolt sizes, tolerance levels, materials, material measurements, etc.
At this layer, we’ve commoditized and standardized because it’s useful: It makes the components cost-effective and predictable.
In software and computing, I can only think of the low-level networking standards that remain stable. And even those have to be reinvented somewhat for each OS or each new language.
Everything else seems to be reinvented or rewritten, and then versioned.
Imagine having to upgrade the nuts and bolts in your car to v3.01 or lose support.
Ingredients in the cookies? Yes. 100? No.