Posted by aphyr 10 hours ago
If we haven't already, we will soon lose the ability to think, given that AI is now helping humans (an overwhelming majority of them, not a handful) and considering how we are steaming ahead on this path!
And that should be the core question. There is a new, emergent technology: should we throw everything away and embrace it, or are there structural reasons why it should be treated with big warning labels? Avoiding these tools because they do their work too well may make sense as a whole-system view, but decision makers optimize locally, for their own budget/productivity/profit. If the risks are merely perceived ones, because the tools are not perfect, that is another thing entirely.
The reason you can't beat index funds is that the people who built the market built a system that benefits them and them alone; the index fund is the pitchfork dividend (what you pay to avoid getting pitchforked). The reason you can't get your congressperson on the line is (mostly) that they built a system where the only way to influence them is to enrich them; voting is the pitchfork dividend.
The way to build a society that runs on reality is to build it by whatever means possible, then defend it by any means necessary. The only societies that matter are the ones that survive.
I want to build it. I don't wanna build a fuckin crypto app, a stupid ass agent harness, or yet another insipid analytics platform. I want to build a society that furthers the liberation of humankind from the vicissitudes of nature, the predation of tyranny and the corruption of greed. I believe it is possible, and I want to prove it out.
"What do such machines really do? They increase the number of things we can do without thinking. Things we do without thinking: there's the real danger." ― Frank Herbert, God Emperor of Dune
I always preferred this take:
“Civilization advances by extending the number of important operations which we can perform without thinking of them.” ― Alfred North Whitehead
It's both opposite and complementary to your Frank Herbert quote.
> “There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'” ― Isaac Asimov
The easier society makes it for people to be unaware of the complexity of everything around them, the easier it becomes for them to assume everything really is as simple as their surface-level understanding suggests.
That said, there is no obvious reason to posit that the intergalactic feudal system, CHOAM, or the Empire came to be because of the Butlerian Jihad. The concrete side effects of the Jihad were in fact hyper-specialization of cognitive faculties in humans: mentats, guild navigators, and soldiers all possess superhuman specialized abilities.
I don't think feudalism is going away one way or another. It persists [in various forms] because of certain biological realities, ranging from genetics to loyalty engendered by familial relations. [This is merely an observation.]
In sum, the argument against current AI trends isn't that once addressed we will wake up in utopia. No. The point is that these natural tendencies of humans are hugely amplified and set in generational stone once the elite have control over thinking machines and lord it over a population that has experienced generationally diminished independent cognitive abilities.
p.s. All this somehow reminded me of the 'Spock's Brain' episode of Star Trek /g Note: the elite there were overcome because Kirk and his landing team were cognitive high performers...
At the level of the text, none of those things you mentioned strike me as positive developments. They just siloed computation to a biological track and those biological resources are employed by those in power, which is the same problem in a different form.
This is an aside, but feudalism is not inevitable. The vestiges of it still exist, but capitalism largely upended it.
It may not prove to be effective or as momentous as the fictional one, but it began when I saw stickers slapped onto utility poles that read:
DEATH TO CLANKERS
BUTLERIAN JIHAD NOW
And I stopped to read them (because they were posted in a neighborhood where my people's cultural center is) and I pondered the intents and methods of those who were slapping up stickers. Surely this was more than just an in-joke or coy sci-fi reference?

The next time I fell victim to the jihad was with a crop of Lime e-scooters, again on a block where my people have established businesses. I wanted to rent a Lime. I found one with a full battery. I located it and tried to scan the QR. Guess what? The QR had been sanded completely clean. There was no code, no serial number, nothing to scan and no way to uniquely ID the conveyance. There was only a sticker slapped prominently onto its side:
DEATH TO CLANKERS
BUTLERIAN JIHAD NOW
At this point I began to suspect the initial aims and methods of the "real-life Butlerian Jihadis". It is somewhat ironic that they should start so small, by denying micro-mobility to innocent consumers, but perhaps they will graduate to lighting Waymos and Teslas on fire.

The only thing I've really taken from what Herbert himself said, not something a character in one of his books said, is distrust of messiahs, and of centralized power being an inherently corrupting force, even in the hands of good people.
Unfortunately, I would have to say that right now my bet on the most plausible fictional future becoming reality is WALL-E.
On one hand I intuitively think this is correct, on the other hand these very concerns about technology have been around since the invention of... writing.
Here is an excerpt of Socrates speaking on the written word, as recorded in Plato's dialogue Phaedrus: "For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom."
Just because the loss of one skill to a supplanting technology led to one kind of societal change, does not mean that the loss of any skill to a supplanting technology will lead to the same kind of societal change. Assuming that to be true is a faulty generalization.
I think it wouldn't be hard to argue that writing has changed human society more profoundly than any other invention. Whether or not the change was positive is a matter of taste and likely unanswerable. The point, though, is that there are plenty of other examples of new technologies that deskilled humans, both mentally and physically, yet changed society in radically different ways compared to writing (looms, tractors, sails, calculators, computers, guns, and so on).
There's certainly a case to be made that, of major past technological advancements, the kind of deskilling we'd see due to heavy AI use is most comparable to the deskilling due to writing: presumably there were many day-to-day and essential activities that made use of the mental acuity people would lose due to reading, just as there are many day-to-day activities that one can imagine people becoming less skilled in due to AI use.
To me, the most dangerous difference though may be in what gets deskilled. If we only relinquish our ability to do certain menial and intellectual drudgery, that is one thing. But if what we actually relinquish and deskill is our agency and discernment, as a result of constant "delegation" to AI systems, I think we're in for a much worse time.
Hell, I would never have had the pleasure of arguing with you without it! :)
Having the "call your representatives" link be to your website as well isn't particularly helpful... I already can't get to it
That's the rub: if we build it later, our economy crashes in the meantime.