
Posted by aphyr 13 hours ago

The future of everything is lies, I guess: Where do we go from here? (aphyr.com)
497 points | 551 comments
analog8374 12 hours ago||
We've recreated pre-enlightenment intellectual culture. Authority and logical consistency matter. Reality doesn't.
chungus_amongus 7 hours ago||
"carbon emissions" sneed
jimt1234 9 hours ago||
One of the "lies" that concerns me is AI-generated music and its deterioration of the personal connection between musician and listener. As MCA from the Beastie Boys said, "If you can feel what I feel then it's a musical masterpiece." The listener feels a connection to the musician (and other people) with sad songs because everyone has felt sad, or with love songs because everyone has fallen in love, and so on. The listener can still get a feeling from AI-gen'ed music, but is it the same? What is the connection? Or, has that "connection" between musician and listener always been bullshit? That is, has it always been just about music triggering your brain to make you feel a certain way, and the source of that feeling really isn't what people care about - just give me a feeling?
merb 7 hours ago||
What doomsayers and tech bros never really understand is that you can't be rich without an economy. If 90% of people lose their jobs and their homes, the system itself will collapse, including the things the rich depend on.

AI will basically do one of three things: enrich our lives like the loom did, outright kill the current economic system of the world (which might end poverty altogether), or start a big collapse in which people suffer at the beginning but which still has a positive outcome in the end.

Humankind has always found a solution in the past, and it will in the future too.

MrBuddyCasino 11 hours ago||
The Industrial Revolution - the greatest thing ever to happen - required the British govt to deploy more troops against Luddites than they had fighting Napoleon at the same time.

Damaging machinery was made a capital offense, and there were dozens of executions and hundreds of deportations.

At every stage, the steady progress of civilization is fragile and in danger of being suffocated. Its opponents cloak themselves in moral righteousness, calling themselves Luddites, the Green Party, or AI safety rationalists. It's all the same corrosive thing underneath.

throw4847285 11 hours ago||
This kind of black-and-white moral thinking is corrosive to one's intelligence. You're allowed to talk about who benefits from massive societal change and who suffers. You're allowed to talk about the ways that technology is implemented and how that leads to pros and cons. An attitude of "if we ever stop moving forward and think, then the evil bad people win" is deeply anti-intellectual.
MrBuddyCasino 11 hours ago||
[flagged]
Gooblebrai 10 hours ago||
> The Industrial Revolution - the greatest thing ever to happen - required the British govt to deploy more troops against Luddites than they had fighting Napoleon at the same time

Source of this claim?

MrBuddyCasino 8 hours ago||
E.P. Thompson, "The Making of the English Working Class".

It is admittedly a cherry-picked point in time at which this was true, but it's useful to illustrate the issue.

mcguire 7 hours ago||
Out of curiosity, what if the "can be useful" part is Gell-Mann Amnesia?
nipponese 12 hours ago||
The conclusion was the takeaway. Everyone is getting bumped up a skill notch, not just bozo liars.
0xbadcafebee 6 hours ago||
> Some of our possible futures are grim, but manageable. Others are downright terrifying, in which large numbers of people lose their homes, health, or lives. I don’t have a strong sense of what will happen, but the space of possible futures feels much broader in 2026 than it did in 2022, and most of those futures feel bad.

Well, yes, the entire world order is currently being upended. The USA is unwinding its place in the global order and becoming isolationist (and soon an authoritarian single-party state). The Petrodollar is either dying or being converted to a Northwestern-Hemisphere-Petrodollar, with the Yuan in the ascendancy (so there goes the strong economy powering VC money). China, the EU, and Russia are the new global leaders. The Middle East and its oil are being taken over by Israel. Taiwan will fall to China, and thus the whole technological world follows. Countries that are friendly with China will have good renewable tech; countries that aren't will be doubling down on oil and coal. Fresh water will become as valuable as oil. A world war will decimate global productivity for decades. Most of the democracies in the world will be gone by the end of the century.

But none of that has to do with AI.

Bad things will always happen in the world. Good things will happen too. But you're only focusing on the bad. That's not good for your health, or others'.

> Refuse to insult your readers: think your own thoughts and write your own words. Call out people who send you slop. Flag ML hazards at work and with friends. Stop paying for ChatGPT at home, and convince your company not to sign a deal for Gemini. Form or join a labor union, and push back against management demands that you adopt Copilot [..] Call your members of Congress and demand aggressive regulation which holds ML companies responsible [..] Advocate against tax breaks for ML datacenters. If you work at Anthropic, xAI, etc., you should think seriously about your role in making the future. To be frank, I think you should quit your job.

He's freaking out, and rejecting AI completely, out of fear. And that's okay; we all get a little freaked out sometimes. But please try not to make other people freaked out as well? Just because you are scared of something doesn't mean the fear is justified or realistic.

What's going to happen now is the same thing that happened during the pandemic. A bunch of irrationally fearful people will decide that the only way they can cope with their fear, is to reject the basis of it. COVID deniers and anti-maskers/anti-vaxxers were essentially so terrified of the loss of control they had, that they refused to acknowledge it. They instead went full-bore in the opposite direction, defying government mandates and health warnings, in order to try to regain some semblance of control over their lives. And it did not go well.

That's what's going to happen now with AI deniers. They're so freaked out about AI that they're going to reject it en masse, not because it is actually doing anything to them, but because they're afraid it might. And the outcome is going to be similar: extreme people do extreme things, and it doesn't end well. So please try to rein in the doomerism a bit, for all our sakes.

peacemosaic 2 hours ago|
[dead]