Posted by mikece 6 days ago
'Going Nuclear - How the Atom will save the world' by Tom Gregory https://open.spotify.com/show/7l8wXWfPb3ZAUmd1pfUdv3?si=52fe...
Polaris is supposed to pass theoretical breakeven, and maybe even technical breakeven - more electrical power out than they put in. That would be a huge event.
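For anyone unsure what those two thresholds mean, here is a rough sketch with made-up numbers (not Helion's figures); the usual definitions are fusion power out versus heating power into the plasma for the first, and electricity out versus total electricity in for the second:

    # Illustrative only: invented numbers, not Helion/Polaris data.
    heating_power_in   = 10.0   # MW delivered to the plasma
    fusion_power_out   = 12.0   # MW of fusion power released
    wall_plug_power_in = 30.0   # MW of total electricity the plant draws
    electricity_out    = 25.0   # MW of electricity actually generated

    q_theoretical = fusion_power_out / heating_power_in    # "theoretical" breakeven: > 1
    q_technical   = electricity_out / wall_plug_power_in   # "technical" breakeven: > 1

    print(f"theoretical breakeven ratio: {q_theoretical:.2f}")
    print(f"technical breakeven ratio:   {q_technical:.2f}")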
There seem to be a fair number of progress notes from them. They aren't obligated to tell us anything, of course.
[1] https://tracxn.com/d/companies/helion/__fS6qGKScel2LE9EV85bG...
No? The tradeoff is entirely one between the value of labour and the value of hardware. If dev hours are cheap and CPUs expensive, you spend the hours optimising. If it's the other way around, which it is in AI, you buy more CPUs and GPUs.
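As a back-of-the-envelope illustration of that tradeoff (every number below is invented):

    # Invented numbers: when does paying developers to optimise beat buying more hardware?
    dev_hours          = 500         # engineering hours the optimisation takes
    dev_rate_usd       = 150         # fully loaded cost per developer hour
    compute_saved_frac = 0.20        # fraction of compute the optimisation saves
    yearly_compute_usd = 2_000_000   # yearly compute spend

    labour_cost   = dev_hours * dev_rate_usd
    yearly_saving = compute_saved_frac * yearly_compute_usd

    # If the payback period is short, optimising wins; if it's long, buy more GPUs.
    print(f"payback period: {labour_cost / yearly_saving:.2f} years")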
Things like massively increased energy costs, strain on the grid, depriving local residents of resources for your datacenter, and let's not forget e-waste, pollution from higher energy use, pollution from manufacturing more and more chips, and the pollution and cost of shipping more and more chips across the planet.
Yeah, it's so cheap as to be nearly free.
This is a peculiarly USA-localized problem. For a large number of reasons, datacenters are going up all over the world now, and proportionally more of them are outside the US than has been the case historically. And a lot of these places have easier access to cheaper, cleaner power with modernized grids capable of handling it.
> pollution from higher energy use
Somewhat coincidentally, energy costs in China and the EU are projected to go down significantly over the next 10 years due to solar and renewables, whereas it's not so clear that's going to happen in the US.
As for the rest of the arguments around chip manufacturing and shipping and everything else, well, what do you expect? That we would just stop making chips? We only stopped using horses for transportation when we invented cars. I don't yet see what's going to replace our need for computing.
And in the end, the cherry: “yes, the world is ending, so what can we do? I guess nothing, let’s just continue burning it so it dies faster.”
Both chips and developer time are expensive. Massively so, both in direct cost and secondary and tertiary elements. (If you think hiring more developers to optimise code has no knock-on effects, I have a bridge to sell you.)
There isn't an iron law about developer time being less valuable than chips. When chip progress stagnates, we tend towards optimising. When the developer pipeline is constrained, e.g. when a new frontier opens, we tend towards favouring exploration over optimisation.
If a CS programme teaches someone to always try to optimise the algorithm rather than consider whether hardware might be the limitation, it's not a very good one. In this case, when it comes to AI, there is massive investment going into finding more efficient training and inference algorithms, research which, ironically enough, itself requires access to a lot of energy.
Ermmm. what?
You see this in other sectors where demand outstrips improvements in economy. Individual planes use substantially less fuel than they did 50 years ago, because there are now fewer engines on planes and the remaining engines are also more efficient; but the growth in air travel has substantially outpaced that.
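A crude compounding sketch of why the totals still go up (the rates below are ballpark guesses, not exact aviation statistics):

    # Ballpark rates: per-flight fuel burn improves ~1.5%/year, traffic grows ~4%/year.
    efficiency_gain_per_year = 0.015
    traffic_growth_per_year  = 0.04
    years                    = 50

    total_fuel_multiplier = ((1 - efficiency_gain_per_year)
                             * (1 + traffic_growth_per_year)) ** years
    print(f"total fuel burned after {years} years: x{total_fuel_multiplier:.1f}")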
It will come, give it time.
Edit: Amazing how anti-innovation and anti-science folks are on HN.
The idle power consumption of a human is around 100 W.
You should do some basic maths; the megawatts are used for serving many LLM instances at once. The correct comparison is the power draw of a single LLM instance, not the whole datacenter.
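A quick version of that division, with invented numbers:

    # Invented numbers: a cluster's total draw amortised over concurrent inference streams.
    cluster_power_mw    = 10        # total datacenter draw, MW
    concurrent_requests = 50_000    # inference streams served at once
    human_idle_watts    = 100

    watts_per_request = cluster_power_mw * 1_000_000 / concurrent_requests
    print(f"{watts_per_request:.0f} W per concurrent request, "
          f"vs ~{human_idle_watts} W for an idle human")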
I would have linked it here, but none of the search engines are turning up anything at all, and in fact I don't think it's even possible to find stuff like that with search engines anymore.
https://groups.google.com/g/rec.humor.funny/c/4zIyBq1-1_E/m/...
Nvidia's market cap is higher than Germany's annual GDP.