Posted by mikece 6 days ago

Microsoft doubles down on small modular reactors and fusion energy (www.techradar.com)
208 points | 427 comments
ChrisArchitect 6 days ago||
Announcement release: https://world-nuclear.org/news-and-media/press-statements/wo...
spamjavalin 6 days ago||
Relevant and worth a listen; it changed my mind about nuclear, and it's beautifully and compellingly read by the author.

'Going Nuclear: How the Atom Will Save the World' by Tim Gregory https://open.spotify.com/show/7l8wXWfPb3ZAUmd1pfUdv3?si=52fe...

Animats 6 days ago||
The article mentions Helion. Those guys were supposed to have their Polaris demo machine running by now. But they've become very quiet about that. Press releases about it in 2024, but not much in 2025.

Polaris is supposed to pass theoretical breakeven, and maybe even technical breakeven - more electrical power out than they put in. That would be a huge event.
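
A rough sketch of the two notions, with invented placeholder numbers (nothing below is a Helion figure):

    # Scientific ("theoretical") vs engineering ("technical") breakeven.
    # All numbers are made up for illustration.
    p_heating_in = 50.0   # MW of power delivered to the plasma
    p_fusion_out = 60.0   # MW of fusion power produced
    p_elec_in    = 120.0  # MW of wall-plug electricity consumed overall
    p_elec_out   = 100.0  # MW of electricity generated

    q_scientific  = p_fusion_out / p_heating_in  # > 1 passes scientific breakeven
    q_engineering = p_elec_out / p_elec_in       # > 1 passes engineering breakeven
    print(q_scientific, q_engineering)           # 1.2 0.83: fusion gain, but no net power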

ted_dunning 6 days ago|
Polaris has been operating at low power already. They have recently installed shielding in the walls to allow higher power operation.

There seem to be a fair number of progress notes from them. They aren't obligated to tell us anything, of course.

Animats 5 days ago||
Here are their funding rounds.[1] Their funders, which include both YC and DoD, had better be getting info.

[1] https://tracxn.com/d/companies/helion/__fS6qGKScel2LE9EV85bG...

panick21_ 5 days ago||
Fusion is dumb; I don't understand why people care about it. Fission is safe, has cheap inputs, and its energy density is already absurdly high. Fusion adds 1000x the complexity for little gain.
user____name 5 days ago||
Probably a good idea to have sufficient nuclear reactors in place before we geoengineer stratospheric albedo and the output of renewables drops as a result.
rhdhfjej 6 days ago||
Remember when they told us in CS class that it's better to design more efficient algorithms than to buy a faster CPU? Well here we are building nuclear reactors to run our brute force "scaled" LLMs. Really, really dumb.
JumpCrisscross 6 days ago||
> Remember when they told us in CS class that it's better to design more efficient algorithms than to buy a faster CPU?

No? The tradeoff is entirely one between the value of labour and the value of industry. If dev hours are cheap and CPUs expensive, you optimise the code. If it's the other way, which it is in AI, you buy more CPUs and GPUs.
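
A toy version of that cost comparison (every number here is invented):

    # Labour vs hardware: optimise the code, or just buy more compute?
    # All figures are made up for illustration.
    dev_rate_usd_per_hr = 150.0     # cost of an engineer's time
    hours_to_optimise   = 400.0     # effort to make the workload 2x faster
    gpu_cost_usd        = 30_000.0  # price of doubling throughput with hardware

    cost_optimise = dev_rate_usd_per_hr * hours_to_optimise  # $60,000
    cost_buy_gpu  = gpu_cost_usd                             # $30,000
    print("buy hardware" if cost_buy_gpu < cost_optimise else "optimise")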

estimator7292 6 days ago|||
This makes sense if and only if you entirely ignore all secondary and tertiary effects of your choices.

Things like massively increased energy costs, strain on the grid, depriving local citizens of resources for your datacenter, and, let's not forget, e-waste, pollution from higher energy use, pollution from manufacturing more and more chips, and the pollution and cost of shipping more and more chips across the planet.

Yeah, it's so cheap as to be nearly free.

yannyu 6 days ago|||
> Things like massively increased energy cost, strain on the grid

This is a peculiarly USA-localized problem. For a large number of reasons, datacenters are going up all over the world now, and proportionally more of them are outside the US than has been the case historically. And a lot of these places have easier access to cheaper, cleaner power with modernized grids capable of handling it.

> pollution from higher energy use

Somewhat coincidentally, energy costs in China and the EU are projected to go down significantly over the next 10 years due to solar and renewables, whereas it's not so clear that will happen in the US.

As for the rest of the arguments around chip manufacturing and shipping and everything else, well, what do you expect? That we would just stop making chips? We only stopped using horses for transportation when we invented cars. I don't yet see what's going to replace our need for computing.

zekrioca 6 days ago||
Almost everything you wrote is incorrect, which is why you don’t provide sources for anything.

And in the end, the cherry: “yes, the world is ending, so what can we do? I guess nothing, let’s just continue burning it so it dies faster.”

JumpCrisscross 6 days ago||||
> it's so cheap as to be nearly free

Both chips and developer time are expensive. Massively so, both in direct cost and secondary and tertiary elements. (If you think hiring more developers to optimise code has no knock-on effects, I have a bridge to sell you.)

There isn't an iron law about developer time being less valuable than chips. When chip progress stagnates, we tend towards optimising. When the developer pipeline is constrained, e.g. when a new frontier opens, we tend towards favouring exploration over optimisation.

If a CS programme is teaching someone to always try to optimise an algorithm rather than consider whether hardware might be the limitation, it's not a very good one. In this case, when it comes to AI, there is massive investment going into finding more efficient training and inference algorithms. Research which, ironically enough, generally requires access to energy.

codingrightnow 6 days ago|||
[flagged]
utyop22 6 days ago||||
"Which it is in AI, you buy more CPUs and GPUs."

Ermmm... what?

rhdhfjej 6 days ago|||
[flagged]
tootie 6 days ago|||
It's the difference between computer science and software engineering.
logicchains 6 days ago|||
And you would have been mocked by your peers for being so conceited that you'd dare to look down on other people for not inventing an algorithm that doesn't exist, and for which there's no evidence it's even possible for one to exist.
rhdhfjej 6 days ago||
[flagged]
oatsandsugar 6 days ago||
Dig up
wmf 6 days ago|||
There's a ton of research into more efficient AI algorithms. We've also seen that GPT-5 has better performance despite being no bigger than previous models. GPU/ASIC vendors are also increasing energy efficiency every generation. More datacenters will be needed despite these improvements because we're probably only using 1% of the potential of AI so far.
zekrioca 6 days ago||
Interesting. If GPUs/ASICs have been improving in energy efficiency, then why has total consumption been increasing exponentially?
bobthepanda 6 days ago||
Because so many people want to run the models.

You see this in other sectors where demand outstrips improvements in economy. Individual planes use substantially less fuel than they did 50 years ago, because there are now fewer engines on planes and the remaining engines are also more efficient; but the growth in air travel has substantially outpaced that.
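
A toy illustration of demand outstripping efficiency (all numbers invented):

    # Per-unit efficiency improves 4x, but demand grows 20x,
    # so total consumption still rises 5x. Numbers are made up.
    energy_per_query_then = 1.00   # arbitrary units
    energy_per_query_now  = 0.25   # 4x more efficient per query
    queries_then = 1e9
    queries_now  = 2e10            # 20x more demand

    total_then = energy_per_query_then * queries_then  # 1e9
    total_now  = energy_per_query_now  * queries_now   # 5e9
    print(total_now / total_then)  # 5.0: total use quintuples anyway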

keepamovin 6 days ago|||
You have a point. But it doesn't make sense to chase the next unrealized AI breakthrough (low-energy, brain-comparable power consumption) yet, when existing products are already so transformative.

It will come, give it time.

DeepYogurt 6 days ago|||
Big O? More like Big H2O (heavy water).... I'll see myself out.
infecto 6 days ago|||
Pretty exciting to me. Constraints breed innovation and it’s possible that the wave of AI leads to new breakthroughs on the green energy front.

Edit: Amazing how anti-innovation and anti-science folks on HN are.

logicchains 6 days ago||
There's no efficient algorithm for simulating a human brain, and you certainly haven't invented one, so you've got absolutely no excuse to act smug about it. LLMs are already within an order of magnitude of the energy efficiency of the human brain; it's probably not possible to make them much more efficient algorithmically.
rhdhfjej 6 days ago|||
Your brain has a TDP of 15W while frontier LLMs require on the order of megawatts. That's 5-6 orders of magnitude difference, despite our semiconductors having a lithographic feature size that's also orders of magnitude smaller than our biological neurons. You should do some more research.
adrian_b 6 days ago|||
The TDP of a typical human brain is not 15 W, but 25 W, so about the same as for many notebook or mini-PC CPUs, but otherwise your argument stands.

The idle power consumption of a human is around 100 W.
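
For what it's worth, the orders-of-magnitude arithmetic with these figures (the megawatt number is an assumption for illustration, not a published figure):

    import math
    brain_w = 25.0   # W, brain TDP as stated above
    llm_w   = 1.0e6  # W, assuming a megawatt-scale frontier deployment
    print(math.log10(llm_w / brain_w))  # ~4.6 orders of magnitude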

logicchains 6 days ago|||
>Your brain has a TDP of 15W while frontier LLMs require on the order of megawatts.

You should do some basic maths; the megawatts are used for serving many LLM instances at once. The correct comparison is the cost of just a single LLM instance.

rhdhfjej 5 days ago||
Yes, the cost figures published by LLM labs imply power consumption measured in megawatts for each instance of top-performing frontier models. Take the L.
irjustin 6 days ago|||
I'm really sad the core argument for the Matrix's existence doesn't hold up (it never did; it just did for me as a kid, is all).
juliangamble 6 days ago||
They started with a different, more brilliant idea, of using human brains as a giant neural net, then backed away from that: https://news.ycombinator.com/item?id=12508832
irjustin 6 days ago||
Oh wow, that's cool. I do understand why they moved away from it, though. A battery is waaaayyyyy easier to understand for the layman and lay-kid (me).
deater 6 days ago||
Dating myself here, but I remember reading a really funny spoof article in the 90s about Microsoft announcing they had developed nuclear weapons. It didn't even seem that implausible at the time.

I would have linked it here, but none of the search engines are turning up anything at all; in fact, I don't think it's even possible to find stuff like that with search engines anymore.

deater 6 days ago||
I had thought maybe it was on the old 0xdeadbeef mailing list, but no luck. It was probably this post from rec.humor.funny, which in the end isn't quite as clever as I remembered.

https://groups.google.com/g/rec.humor.funny/c/4zIyBq1-1_E/m/...

lumost 6 days ago||
The funny part about our 1990s memes on big tech is that today's big tech is 100-1000x larger.

Nvidia is worth more than Germany.

krautsauer 6 days ago||
We're not for sale, but still… numbers? Nvidia's stock price isn't even a 10th of the gold reserve, as far as I can see?
umeshunni 6 days ago||
Also, comparing GDP (rate of production) to valuation (area under the curve) is silly. Like comparing velocity to distance.
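
To make the dimensional point concrete (both figures below are rough assumptions that fluctuate):

    # Market cap is a stock ($); GDP is a flow ($/year). Their ratio is a time.
    nvidia_market_cap_usd  = 4.0e12  # $, assumed, changes daily
    germany_gdp_usd_per_yr = 4.5e12  # $/year, assumed
    print(nvidia_market_cap_usd / germany_gdp_usd_per_yr)  # ~0.9 years, not a ranking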
flimflamm 6 days ago||
I wonder when Elon will get into the nuclear business, now that green values have fallen by the wayside (solar tiles, anyone?).
talkingtab 6 days ago|
All I can think of is Microsoft === blue screen of death. Coming soon to your neighborhood.