
Posted by Tenoke 4/3/2025

AI 2027(ai-2027.com)
949 points | 621 comments | page 7
snackernews 4/5/2025
> Other companies pour money into their own giant datacenters, hoping to keep pace.

> estimates that the globally available AI-relevant compute will grow by a factor of 10x by December 2027 (2.25x per year) relative to March 2025 to 100M H100e.

Meanwhile, back in the real March 2025, Microsoft and Google slash datacenter investment.

https://theconversation.com/microsoft-cuts-data-centre-plans...

greybox 4/4/2025
I'm troubled by the number of people in this thread partially dismissing this as science fiction. Given the current rate of progress, and the rate at which that progress is accelerating, this future seems entirely plausible.
moktonar 4/4/2025
Catastrophic predictions of the future are always good news, because predictions of the future are usually wrong. I won't be scared as long as most predictions involving AI are catastrophic.
jsight 4/4/2025
I think some of the takes in this piece are a bit melodramatic, but I'm glad to see someone breaking away from the "it's all a hype-bubble" nonsense that seems to be so pervasive here.
bigfishrunning 4/4/2025
I think the piece you're missing here is that it actually is all a hype bubble
h1fra 4/4/2025
Had a hard time finishing it. It's a mix of fantasy, factual errors, American imperialism, and extrapolation of what happened in recent years (or even just a reuse of that timeline).
Falimonda 4/4/2025
We'll be lucky if "World peace should have been a prerequisite to AGI" is engraved on our proverbial gravestone by our forthcoming overlords.
scotty79 4/4/2025
I think the idea of AI suddenly wiping out humanity is a bit far-fetched. AI will have total control over human relationships and fertility through means as innocuous as entertainment. It won't have to wipe us out. It will have little trouble keeping us alive without inconveniencing us too much. And the reason to keep humanity alive is that biologically evolved intelligence is rare, and disposing of it without a very important need would be a waste of data.
turtleyacht 4/4/2025
We have yet to read about fragmented AGI, or factionalized agents. AGI fighting itself.

If consciousness is spatial and geography bounds energetics, latency becomes a gradient.

yonran 4/3/2025
See also Dwarkesh Patel's interview with two of the authors of this post (Scott Alexander & Daniel Kokotajlo), also released today: https://www.dwarkesh.com/p/scott-daniel (video: https://www.youtube.com/watch?v=htOvH12T7mU)