Posted by troglo-byte 12/22/2025

Satellites reveal heat leaking from largest US cryptocurrency mining center(www.space.com)
162 points | 162 comments
theamk 12/22/2025|
"leaking" is the wrong word here - it implies some sort of inefficiency, process which is not working as well as it needs to. Leaky bucket, leaky faucet...

That's not the case here; that center is __dumping__ heat into the environment - it is by design, all that electricity is being converted into heat. By design, it's an enormous electric heater.

dangalf 12/23/2025||
Technically it is an inefficiency. The electricity should be doing computer things. Heat is wasted electricity. It's just that there's not much the data centre could do about it.
stouset 12/23/2025|||
Even if the computer does perfectly-efficient computer things with every Joule, every single one of those Joules ends up as one Joule of waste heat.

If you pull 100W of power out of an electric socket, you are heating your environment at 100W of power completely independent of what you use that electricity for.

spyder 12/23/2025|||
Only true for our current computers and not true with reversible computing. With reversible computing you can use electricity to perform a calculation and then "push" that electricity back into a battery or a capacitor instead of dumping it to the environment. It's still a huge challenge, but there is a recent promising attempt:

"British reversible computing startup Vaire has demonstrated an adiabatic reversible computing system with net energy recovery"

https://www.eetimes.com/vaire-demos-energy-recovery-with-rev...

https://vaire.co/

Short introduction video to reversible computing:

https://www.youtube.com/watch?v=rVmZTGeIwnc

flave 12/23/2025||
Actually pretty cool - I was about to comment “nice perpetual motion machine” but looked into it a bit more and it’s much more interesting than that (well, a real perpetual motion machine would be interesting but…)

Thanks for posting. Pretty cool.

tatjam 12/23/2025||
This kind of stuff could trigger the next revolution in computing, as the theoretical energy consumption of computing is pretty insignificant. Imagine if we could make computers with near-zero energy dissipation! A "solid 3D" computer would then become possible, and Moore's law may keep going until we exhaust the new dimension ;)
thegrim000 12/23/2025||||
I read it as: the inefficient part isn't the compute efficiency; the inefficient part is dumping all the resulting heat into the environment without capturing it and using it in some way to generate electricity or do work.

On a related side note, when there's talk about SETI and Dyson spheres, and detecting them via infrared waste heat, I also don't understand that. Such an alien civilization is seemingly capable of building massive space structures/projects, but then lets the waste heat just pour out into the universe in such insane quantities that we could see it tens or hundreds of light years away? What a waste. Why wouldn't they recover that heat and make use of it instead? And keep recovering it until the final waste output is too small to bother with, at which point we would no longer be able to detect it.

oasisaimlessly 12/23/2025|||
All energy inevitably changes into heat eventually, and in the steady state, power in = power out.

There is no way to get rid of heat. It has to go somewhere; otherwise, the temperature of the system will increase without bound.

thegrim000 12/24/2025||
For example, why couldn't you use the waste heat like a power plant? Use it to boil water, to turn turbines, to generate electricity, which gets sent and consumed elsewhere? Adding to the heat wherever the electricity is finally consumed. (Ignoring various losses along the way).
stouset 12/24/2025||
“Elsewhere” is still somewhere on the Dyson sphere.

Or if you magically beam 100% of the captured energy somewhere else, now that place gets to deal with shedding the heat from however many 1e26W+ of power were consumed. God help the poor planet you aim that ray of death at.

stouset 12/23/2025|||
> but then lets the waste heat just pour out

There is no other alternative! If I build a perfect Dyson sphere and capture the energy output of a star, all of that energy will become heat. The equilibrium surface temperature of my Dyson sphere will be (IIRC) the star's effective surface temperature scaled by the fourth root of the ratio of the star's surface area to the sphere's.
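
A quick sanity check of that scaling, as a sketch (solar figures are rough, and the 1 AU sphere radius is an assumption):

    import math

    # Steady state: the sphere re-radiates the star's whole luminosity from its outer surface,
    #   4*pi*R_sphere^2 * sigma * T_sphere^4 = 4*pi*R_star^2 * sigma * T_star^4
    # => T_sphere = T_star * sqrt(R_star / R_sphere)  (i.e. fourth root of the area ratio)
    T_star = 5772.0      # K, Sun's effective temperature (rough)
    R_star = 6.96e8      # m, solar radius (rough)
    R_sphere = 1.496e11  # m, assumed sphere radius of 1 AU

    T_sphere = T_star * math.sqrt(R_star / R_sphere)
    print(f"{T_sphere:.0f} K")  # ~394 K, i.e. roughly 120 C for a 1 AU shell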

"Recovering heat and making use of it" requires a heat differential. You need a cold side and a hot side to use energy. Using that energy causes the cold side to heat and the hot side to cool, until they reach equilibrium. The further the difference, the more usable work you can do. The closer the two sides are, the less work you can do.

Someone else here said it best: waste heat is the graveyard of energy. Once you have used energy, it will become high-entropy, low-grade, diffuse heat which is difficult-to-impossible to extract further work from.

tasuki 12/23/2025||||
> every single one of those Joules ends up as one Joule of waste heat.

Yes it ends up as heat, but with some forethought, it could be used to, e.g., heat people's homes rather than go to waste.

TheSpiceIsLife 12/23/2025|||
You can say that about any waste heat.

In reality, it’s not convenient to move all waste heat to where it’s more needed.

m4rtink 12/23/2025|||
Modern industrial scale insulated hot water district heating systems can do dozens of kilometers with the water cooling down only by a degree Celsius.
tremon 12/23/2025|||
It's always more convenient to ignore externalities. That doesn't mean we should be okay with only bottom-of-the-barrel solutions.
agumonkey 12/23/2025|||
These days it's not rare to have data-center-heated buildings. I guess crypto bros are just not thinking about this. But technically it could be done there too.
KellyCriterion 12/23/2025||
There was a startup in the EU which explicitly sold heat from crypto mining to the local energy provider. IIRC it was also here on Hacker News some time ago.
agumonkey 12/23/2025||
Qarnot maybe
KellyCriterion 12/23/2025||
I meant this team:

https://terahash.space/en/

agumonkey 12/23/2025||
oh nice, i didn't know about them
robkop 12/23/2025||||
Interesting question - how much will end up as sound, or in the ever smaller tail of things like storing a bit in flash memory?
Workaccount2 12/23/2025|||
Heat is the graveyard of energy. Everything that uses energy, or is energy, is actually just energy on its way to the graveyard.

The energy of the universe is a pool of water atop a cliff. Water running off this cliff is used to do stuff (work), and the pool at the bottom is heat.

The "heat death of the universe" refers to this waterfall running dry, and all the energy being in this useless pool of "heat".

devsda 12/23/2025||
Do thermophotovoltaic cells operate on a different kind of heat?

Is it impossible to convert heat into other forms of energy without "consuming" materials, as in the case of steam, geothermal, or even the ones that need a cold body to utilize the thermoelectric effect?

LiamPowell 12/23/2025|||
TPVs don't rely solely on the temperature of an object being high; they instead rely on the two objects on either side having different temperatures. As heat moves[1] from one side to the other, some of the energy from that movement is turned into electricity.

[1]: Technically the movement itself is heat, the objects don't contain heat, rather they contain internal energy, but the two get mixed up more often than not.

supermatt 12/23/2025||
That movement is effectively “consuming” the differential.
ajuc 12/23/2025|||
What thermal energy sources actually exploit is temperature difference, not heat. And in the end that difference averages out.
phil21 12/23/2025||||
Almost none. A long time ago a friend and I did the math for sound, photons (status LEDs), etc and it was a rounding error of 1% or something silly like that.

And that’s ignoring that sound and photon emissions typically hit a wall or other physical surface and get converted back to heat.

It all ends up as heat in the end, it just depends on where that heat is dumped and whether you need to cool it or not. In practice each watt of compute ends up producing even more than a watt of heat overall, once you add the energy spent on said cooling.

There is literally no way around the fact that every watt you burn for compute ends up as a watt of waste heat. The only factor you can control is how many units of compute you can achieve with that same watt.

Terr_ 12/23/2025||
Well, at least until somebody devises a system that transports or projects it so that the heat ends up somewhere not-Earth. It'd still be heating the universe in general, of course, even in the form of sprays of neutrinos.

That reminds me of a sci-fi book, Sundiver by David Brin, where a ship is exploring the sun by firing a "refrigerator laser" to somehow pump-away excess heat and balance on the thrust.

mrDmrTmrJ 12/23/2025|||
All sound will end up as heat.
csomar 12/23/2025||||
Theoretically, if your computation is energy efficient, you won't need any electricity at all since the real computation costs zero energy.
majoe 12/23/2025||
That's not correct. For ordinary computers there is Landauer's principle, which gives a theoretical lower limit for the energy needed for computation [0].

I say "ordinary computers" because other comments mentioned "reversible computers" for which this limit doesn't apply.

According to the linked Wikipedia page, this theoretical limit is around a billion times smaller than what current computers use for an operation, so you may call me pedantic.

[0]: https://en.wikipedia.org/wiki/Landauer%27s_principle
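
For scale, a rough room-temperature calculation (the per-operation energy of current hardware is an assumed ballpark, not a measured figure):

    import math

    k_B = 1.380649e-23  # J/K, Boltzmann constant
    T = 300.0           # K, roughly room temperature

    landauer_j = k_B * T * math.log(2)  # minimum energy to erase one bit
    modern_op_j = 1e-12                 # J, assumed ~1 pJ per operation ballpark

    print(f"Landauer limit: {landauer_j:.2e} J per bit")  # ~2.9e-21 J
    print(f"Ratio: {modern_op_j / landauer_j:.1e}")       # ~3.5e8 with this assumption, broadly consistent with "around a billion"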

anthonj 12/23/2025||||
This violates energy conservation principles. Some power will be "wasted" as heat; some will be used for other work.
stouset 12/23/2025||
If I use energy to move a block one foot over, I have performed useful work. But 100% of the energy used to perform that work is either already heat or shortly will be.
anthonj 12/25/2025||
If you launch a rocket at escape velocity the momentum and potential energy you create never dissipates.

Certain endothermic chemical reactions require energy to start. This energy is absorbed to form molecular bonds.

Also in the generation and absorption of high energy radiation there are non-thermal processes that can transfer energy.

Even something like bending a metal bar is not 100% a thermal process.

usrnm 12/23/2025|||
If I turn my fan on and 100% of the electricity is converted to heat, where does the kinetic energy of moving fan blades come from? Even the Trump administration cannot just repeal the law of conservation of energy.
numb7rs 12/23/2025|||
Even if most of the energy goes into kinetic energy of the air, that air will lose momentum via turbulence and friction with the surrounding air, which will end up as... heat.
jo909 12/23/2025|||
While spinning, the blades store a minuscule amount of kinetic energy.

After removing power, even that small amount ends up as heat through friction (both in the bearing, but mostly in the air turbulence). And the blades end up in the same zero-energy state: sitting still.

So it is correct that 100% "ends up" as heat.

usrnm 12/23/2025||
Most of that energy gets transferred to the air that's being moved by the blades, and who knows what that air does eventually. And we're not even talking about the plant-growing light that might be sitting in my room near my houseplants, literally creating new life from electricity.
stouset 12/23/2025||
> who knows what that air does eventually

We do know what that air does eventually. Given no further inputs of energy, it swirls around generating friction, raising its temperature (heat!) as the currents slow down to nearly nothing.

mr_toad 12/23/2025||||
There’s a minimum level of energy consumption (and thus heat) that has to be produced by computation, just because of physics. However, modern computers generate billions of times more heat than this minimum level.

https://en.wikipedia.org/wiki/Landauer's_principle

RhysU 12/23/2025||
It'd be super fun to take that as an axiom of physics then to see how far upwards one could build from that. Above my skills by far.
UltraSane 12/23/2025|||
The minimum amount of energy needed to compute decreases asymptotically to 0 as the temperature of space goes to 0. This is the reason for a common sci-fi trope in which advanced civilizations hibernate for extremely long times so that they can do more computation with the available energy.
ctmnt 12/23/2025||
That’s a common trope? Can’t say I’ve run into it. But I’d like to! What are some good examples?
Supermancho 12/23/2025|||
In the book Calculating God, a character notes that this is a common civilization-wide choice. Living in virtual reality, rather than trying to expand into the vast expanses of space, is a common trope as much as it's a logical choice. It neatly explains the Fermi Paradox. In some fiction, like The Matrix, the choice might be forced due to cultural shifts, but the outcome is the same. A relatively sterile low-energy state civilization doing pure processing.
ithkuil 12/23/2025||
I wonder if it's illogical to think that all civilizations must always pick the most logical of the options
Supermancho 12/25/2025|||
Logical and optimum are not the same.
yetihehe 12/23/2025|||
Those civilisations that make too many illogical choices probably die off.
ithkuil 12/23/2025||
True. But it's not a binary choice. All it takes is one sub-optimal choice for the universe to be filled up with von Neumann probes in all star systems.
triMichael 12/23/2025||||
Kurzgesagt just made a video on it a couple months back: https://www.youtube.com/watch?v=VMm-U2pHrXE
UltraSane 12/23/2025||||
https://en.wikipedia.org/wiki/Aestivation_hypothesis

https://www.youtube.com/watch?v=v9sh9NpL4i8

https://mindmatters.ai/2020/10/researchers-the-aliens-exist-...

https://aleph.se/andart2/space/the-aestivation-hypothesis-po...

CamperBob2 12/23/2025|||
Here you go: https://pastebin.com/raw/SUd5sLRC

And it only cost 0.006 rain forests!

ruined 12/23/2025|||
it's called the first law of thermodynamics
RhysU 12/23/2025||
The first law involves work. The axiom I am thinking of involves information.
geoffschmidt 12/23/2025||||
Heat is not by itself waste. It's what electricity turns into after it's done doing computer things. Efficiency is a separate question - how many computer things you got done per unit electricity turned into heat.
anon84873628 12/23/2025||
How many computer things you got done per unit of electricity, and how many mechanical things you do with the temperature gradient between the computer and its heat sink.

For example, kinda wasteful to cook eggs with new electrons when you could use the computer heat to help you denature those proteins. Or just put the heat in human living spaces.

(Putting aside how practical that actually is... Which it isn't)

eimrine 12/23/2025||
Good luck with collecting that heat from air.
YetAnotherNick 12/23/2025||||
No it's not. It would be waste only if there were a high temperature gradient, which is minimized in a mining operation through proper cooling.

It's that computation requires electricity. And almost all of the heat in bitcoin mining comes from computation, technically from changing transistor states.

anon84873628 12/23/2025||||
I think what they mean is that there is not a Carnot engine hooked up between the heat source and sink. Which, theoretically, is something the data center could do.
charcircuit 12/23/2025||||
The electricity is doing computer things, building bitcoin blocks.
sixtyj 12/23/2025|||
They could make a second floor with eggs and newborn chicks. /s
kelnos 12/23/2025|||
I would definitely call that an inefficiency. Heat is wasted energy that in theory could be turned into useful work. The electricity used that created that heat (that is, not including the electricity that "went to" the computations themselves) ended up serving no useful purpose.

It would be wonderful if we could capture that waste heat and give it a useful purpose, like heating homes, or perhaps even generating new electricity.

(And this is before getting into the fact that I believe mining cryptocurrency is a wasteful use of electricity in the first place.)

tbrownaw 12/23/2025|||
> The electricity used that created that heat (that is, not including the electricity that "went to" the computations themselves) ended up serving no useful purpose.

Computational results do not contain stored potential energy. There is no such thing as energy being "used up" doing computation such that it doesn't end as waste heat.

stouset 12/23/2025||||
> wasted energy that in theory could be turned into useful work

Even if turned into useful work, the end result of that work is still ultimately heat.

cousin_it 12/23/2025||
Right, but if it's noticeably hotter than the environment, then that temperature difference could be used to drive a heat engine and get some more useful work. So the knee-jerk response "omg, we see the heat from space? it's gotta be wasteful!" is kind of correct, in theory.
anon84873628 12/23/2025|||
Some people are saying "waste heat" in the technical sense of "the heat my industrial process created and I need to get rid of" and others are saying "waste heat" as "heat humans are emitting into space without slapping at least one Carnot engine on it yet".
stouset 12/23/2025|||
If the heat being generated were economically worthwhile, the miners would be incentivized to use it to offset their costs. Since they aren't, we can somewhat reasonably assume that it would cost more to recapture than it's probably worth.
kiba 12/23/2025|||
All computation eventually becomes heat. There are no computers that don't generate heat.

We can generate less heat per computation, but it ultimately cannot be avoided.

userbinator 12/23/2025|||
I wonder if there's enough heat being produced for it to act as a district heating plant.
hephaes7us 12/23/2025|||
There absolutely is, but of course there's a nonzero cost to capture it.
adonovan 12/23/2025||
Also, the temperature is not high enough (compared to the steam coming out of a gas/oil/nuclear plant) to obtain much work from the waste heat.
tonyarkles 12/23/2025||
That is 100% the issue. This is really low quality heat. Making it better would require even more energy input (e.g. a heat pump) because we can’t safely run electronics hot enough to generate high quality process heat.
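
A rough Carnot bound shows why low-grade heat is hard to use (the exhaust and ambient temperatures below are assumed for illustration):

    # Carnot efficiency is the ceiling on work extractable from a heat flow:
    #   eta_max = 1 - T_cold / T_hot, with temperatures in kelvin.
    T_hot = 273.15 + 60   # K, assumed ~60 C exhaust air from the miners
    T_cold = 273.15 + 25  # K, assumed ~25 C ambient

    eta_max = 1 - T_cold / T_hot
    print(f"Max theoretical efficiency: {eta_max:.1%}")  # ~10.5%, before any real-world losses
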
duskwuff 12/23/2025|||
Rockdale is a small town of ~5000 residents. Even if it were practical to install district heating - which I don't think it is - there certainly isn't demand for hundreds of megawatts of it.
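
Rough numbers back that up (household size and per-home heat demand are assumed round figures):

    residents = 5000
    people_per_home = 2.5     # assumed
    avg_heat_demand_kw = 5.0  # kW, assumed generous average winter heat demand per home

    town_demand_mw = residents / people_per_home * avg_heat_demand_kw / 1000
    print(f"~{town_demand_mw:.0f} MW of heat demand")  # ~10 MW, vs hundreds of MW of waste heat
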
Lerc 12/23/2025|||
Well, if it's using solar power it's just moving heat from one place to another.

I guess if it's using fossil fuel to generate power it's also just moving heat from one place to another, but really, really slowly. The relevant factor there is that the long-term storage was performing an important secondary function of holding a lot of CO2.

It's in Texas, surely that's an area amenable to solar production. What are they actually using there?

EGreg 12/23/2025|||
Seriously, the only way I would accept Bitcoin mining is as a winter heating source that pays for itself. Why can't people sell those?
wmf 12/23/2025|||
Those exist but they're too expensive to pay back their cost. And heat pumps are 3x the efficiency of resistive heating.
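
In concrete terms (the daily heat demand is an assumed example figure; the COP of 3 is the ratio from the comment above):

    heat_needed_kwh = 30.0  # kWh of heat per day, assumed example home
    cop = 3.0               # heat pump coefficient of performance (~3x resistive)

    resistive_kwh = heat_needed_kwh / 1.0  # resistive: 1 kWh electricity -> 1 kWh heat
    heat_pump_kwh = heat_needed_kwh / cop  # heat pump moves additional heat from outdoors
    print(resistive_kwh, heat_pump_kwh)    # 30.0 vs 10.0 kWh of electricity
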
yellowcake0 12/23/2025||||
The economics of bitcoin mining dictate that the work must have no other utility. If you increase its profitability by using it for auxiliary winter heating, then more people will mine bitcoin until there is an oversupply of heating and we return to the current equilibrium amount of "wasted" heat.
EGreg 12/23/2025||
This seems to indicate a serious misunderstanding of both bitcoin's economics as well as the feasibility of having an "oversupply of heating". Hundreds of millions to over a billion people globally lack reliable energy for lighting, heating, and cooking. And bitcoin's economics don't dictate that mining never have a side benefit.
anon84873628 12/23/2025|||
Unfortunately, Bitcoin mining is pretty hot as far as regular computer use goes, but not very hot at all compared to burning some fuel.
fainpul 12/23/2025|||
> By design, it's enormous electric heater.

You're right, it's not leaking, it's dumping excess heat on purpose.

However, I get triggered whenever someone uses the term "by design" wrongly. The generation of heat is not by design. It's an undesired side-effect of the computing being done. "By design" would mean that someone decided that there should be a certain amount of heat generation and made sure that it happens.

Most often I see this term misused by developers who explain bugs as being "by design". It happens when two features interact in an undesired way that creates problems (a bug). Developers like to look at feature A in isolation, determine that it works as designed, then look at feature B, determine that it also works as designed, then they look at and understand the interaction between feature A and B, and since they now understand what is happening, they claim it's "by design". However, nobody ever decided that feature A and B should interact this way. It was clearly an oversight, and every normal person would agree that the interaction is undesired and a bug. But the developer says "won't fix, this is by design". Infuriating!

baruchel 12/23/2025||
When you compute some nice and elegant result, dissipated heat is an undesired side effect. But let's face it: we are speaking about proof of work. Proof of work means that a computer has run for some "required" time. In other words, you have to prove that enough heat has been dissipated. Waste of energy actually is "by design" here.
fainpul 12/23/2025||
I'm not sure if you're trolling. Of course that's nonsense. The work is entirely the (artificially complex) computations necessary to get to the result. If someone were to invent a 100% efficient computer, based on superconductors, which produces no heat at all, the proof of work (the final hash value) would still be equally valid. As I said, heat is an undesired, unavoidable side-effect. You don't show anybody the heat you produced to convince them that you did the work; you show them the hash value. Otherwise you could just burn some wood.
akimbostrawman 12/23/2025|||
this obviously disingenuous framing is clearly intended to manipulate
timeon 12/23/2025||
Not sure what your point is. With PoW, inefficiency is by design.
edoceo 12/23/2025|||
You got the point. It's "by design" - you've both said it.
nh23423fefe 12/23/2025|||
A home can leak heat to the environment because of bad insulation. A datacenter doesn't "leak" heat, because "leaking" implies something normatively bad.
andai 12/23/2025||
So I don't have any context for this. The article says it uses as much power as 300,000 homes. Is that a little? Is that a lot? How much does one steel foundry use?

Edit: One steel foundry uses about 3,000x more than that, according to my napkin math

wat10000 12/23/2025||
3,000x would be 2.1 terawatts. That would require about 100 Three Gorges Dams, currently the largest generating station in the world, to power one foundry. Or about 2,100 nuclear power plants of typical size. I think your napkin math might be a bit off.
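
Spelling out that arithmetic (the ~700 MW facility figure is what the 2.1 TW number implies; generator sizes are rough):

    facility_w = 700e6       # W, implied by 3,000x = 2.1 TW above
    three_gorges_w = 22.5e9  # W, Three Gorges Dam capacity (rough)
    nuclear_unit_w = 1e9     # W, a typical ~1 GW nuclear plant (rough)

    claim_w = 3000 * facility_w
    print(f"{claim_w / 1e12:.1f} TW")                      # 2.1 TW
    print(f"{claim_w / three_gorges_w:.0f} Three Gorges")  # ~93, i.e. about 100
    print(f"{claim_w / nuclear_unit_w:.0f} nuclear units") # ~2,100
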
sowbug 12/23/2025|||
You can multiply your own power bill by 300,000 if money is a more relatable unit.
lostlogin 12/23/2025|||
These discussions often turn into an ‘is this per hour? Per year?’ discussion, which I find irritating. And now I’m that guy.

Is this a daily usage thing? I'm testing against my home usage and the numbers seem way out. I use about 25kWh per day.

pests 12/23/2025|||
Isn't 3000x more per day the same as 3000x more per year? Or 3000x more per unit of price?

I have a smaller house; we use about 13kWh per day. 4kW is the highest spike during the day, around 5-7pm when people are cooking and doing laundry.

lostlogin 12/23/2025||
I was trying to work backwards to see what they were using as their ‘per house’ measure.

Your usage is very low. Do you use electricity for water heating and cooking? If so, that’s impressive.

We do, and charge a car.

pests 12/23/2025||
Oh I see okay yeah.

Used to have one electric car but it was on a separate meter with unlimited charging for $40/mo (just looked, now it's $46). Added a few hundred to the charger install originally.

We really don't do too much around the house. Three people. One TV running maybe two sometimes. Two desktops (well one is laptop with a dock). A random PC as a server. Everything electric (oven, range, water heater, filtration, etc) besides furnace (nat gas), although I will say they are all new and pretty energy efficient. Random lights (all LED, Hue). Someone turns on an electric heater or blanket here or there. Some outside heated cat house and water heaters and stuff. In Michigan so its pretty cold right now.

I recently bought a bunch of (used) solar panels and was doing our load calculations for peak draw and selecting battery size.

How much of your usage is the car? I could imagine that would be a lot. A single model 3 refill (57kwh) would be almost 5 days of my usage.

edit: I'm dumb. We replaced our electric water heater a few months ago with a tankless gas one. I don't feel like rewriting this reply, but just keep that in mind.

I would eventually like to replace the furnace and the water heater with electric so I can end gas service to my house. I do feel it's the safest, and in the future we will be looked back on as backwards. "They used to pump a flammable gas directly into their houses!"

lostlogin 12/23/2025||
The car isn’t heavily used but averages about 6-8kWh per day.

With solar we are making a ton more power than we are using at the moment, it’s a sunny summer and we are managing to export something like 8x the power we are drawing from the grid.

pests 12/24/2025||
How much solar do you have? Any battery? What rate does your power company give you?

I recently bought 30 used 600W panels; apparently they used to be on a GM parking lot canopy but got uninstalled during the covid lockdown and were in storage for ~4 years. I got a great price, and I've tested ~30% of them and they are all in spec.

We pay ~$0.23 per kWh all in (supply, distribution, etc). Our provider (DTE) only pays ~$0.08 per kWh we supply, and it's a credit maxed out at our bill amount. So if we spend $200 on energy, we can only get $200 off. Which does mean free electricity, but also means no profit.

We use about ~1000kWh a month; our 30 panels can generate about ~3000kWh a month. I could deploy just 10 panels and resell, or do better with batteries, but the resell market does not make sense for me.
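
The panel math roughly checks out (the peak-sun-hours figure is an assumed rough annual average):

    panels = 30
    watts_each = 600
    peak_sun_hours_per_day = 4.5  # assumed rough annual average for Michigan

    kwh_per_month = panels * watts_each * peak_sun_hours_per_day * 30 / 1000
    print(f"~{kwh_per_month:.0f} kWh/month")  # ~2,400 kWh, in the ballpark of the ~3,000 quoted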

manuelmoreale 12/23/2025|||
I know it’s not the point you’re making but I’ll never not be surprised by how much electricity some people consume. The highest month I have recorded over the past 3 years is ~9kWh/day.

Last month it was 7. And we're in the winter. Over the summer it's more like 5.

NewJazz 12/23/2025|||
A third of my bill is the "base services" charge.
dirkt 12/23/2025|||
Isn't "you can waste energy and heat up our planet and make life for everybody else harder just to make money for a few" enough?
jesse__ 12/23/2025|||
I think a large steel foundry uses approximately 1/10 of the power of this facility.

Arc furnace foundry : 500 kw/tonne

Production : 150 t/hr

500*150 = 75 MW/h

NewJazz 12/23/2025||
I'm not 100% sure, but I think you may have confused watts and watt-hours somewhere in there. Wouldn't it be X kWh/ton?
jesse__ 12/23/2025||
Yeah, I think you're right. Maff still maffs though. Units work out to MWh/h.. which makes sense to me
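
Spelling out the units with the figures from upthread:

    energy_per_tonne_kwh = 500  # kWh per tonne, arc furnace figure from upthread
    production_t_per_hr = 150   # tonnes per hour, figure from upthread

    power_mw = energy_per_tonne_kwh * production_t_per_hr / 1000  # kWh/h is just kW
    print(f"{power_mw:.0f} MW")  # 75 MW of continuous draw, roughly 1/10 of the facility per the thread
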
nullc 12/23/2025|||
Home energy usage is knocked down by people who don't do anything at home and whose energy use is almost entirely externalized (at the places that make the goods they use, or the other places where they spend most of their waking hours).

So it's a useful figure if you want to make a shocking headline. "Uses as much power as infinity of something that uses no power!"

driggs 12/23/2025||
Have you considered that it's used as a unit to represent the capacity of our power grid?

As in, we now have the energy capacity for 300,000 fewer homes given this operating data center.

So not only is it a relatable unit, but it's an incredibly meaningful unit for those who care about ensuring that energy availability actually supports something of value (families) rather than something wasteful (crypto mining).

epolanski 12/23/2025|||
To solve crypto sudokus I would say it's a lot.
msisk6 12/23/2025||
This was previously the location of an Alcoa aluminum smelter which used something around 1000+ MW. And that's why the crypto farm is there -- it already had sufficient electrical capacity to the site.

Folks should be happy since the crypto operation is using far less power and dumping less heat into the environment than the industrial operation that was previously there, but datacenters seem to be a trendy thing to complain about at the moment, so here we are.

dajt 12/23/2025|||
Where is the upside here? An alu plant probably provided more jobs and produced something of actual utility. This is burning power for no benefit to society.

It's burning less power than before, but it's not producing anything of value.

The world cannot reasonably run without alu; it got along better without cryptocurrencies.

msisk6 12/23/2025||
Oh, I agree. I lived nearby (working for ERCOT; the Texas Power Grid operator) when Alcoa was still there and was planning the shutdown. It seems about half the people in Rockdale worked for either Alcoa, the nearby coal power plant, or the nearby coal mine that fed the power plant.

I remember the local press going on about the crypto mining operation and how folks were going to get high-tech jobs in this rural area of Texas. Of course it didn't go that way.

Aluminum smelting is an incredibly energy intensive operation. A lot of places in the US that used to host aluminum smelters now host large datacenters, including the Google data center in The Dalles, Oregon on the Columbia river near a hydro dam. It's a shame that Rockdale didn't get something useful like these other places.

As far as Al smelting in the US goes, I don't know. I'd imagine it produces a lot of air pollution by itself and uses huge amounts of power that is usually generated by cheap methods like burning rocks (coal) or large hydro operations nearby to minimize transmission costs. Then you gotta get ore to the site. The only Al smelter I recall being left in the US is up near Puget Sound in Bellingham, WA, and I think it's currently shut down.

bigiain 12/23/2025|||
I've heard of aluminium referred to as "Frozen Electricity". (yes, I know, but that's the .au spelling)

Relatively speaking, bauxite is practically worthless. But mix it in with a few gigawatt hours and you get out a fairly valuable commodity.

duskwuff 12/23/2025|||
> I remember the local press going on about the crypto mining operation and how folks were going get high-tech jobs in this rural area of Texas. Of course it didn't go that way.

That's a disappointingly common crypto industry lie. Cryptocurrency mining involves very little labor beyond initial construction; it's certainly not a major source of permanent employment.

troglo-byte 12/23/2025||||
A cryptominer is a "datacenter" in the same way that a chop shop is an automotive parts supplier.
msisk6 12/23/2025||
Well, yeah. Both crypto and AI require places with cheap power to rack and stack compute and GPUs.

It remains to be seen if AI will end up being about as useful as crypto in the long run.

drivingmenuts 12/23/2025||||
How many people did the smelter employ vs how many people do the bitcoin miners employ?

The smelter was providing jobs that fed money into the local economy. I'm sure much less money is coming out of the mining operation.

cruffle_duffle 12/23/2025||
Not to mention the aluminum plant was making something actually useful to society at large. What is there now is a giant space heater used to scam people.
lolc 12/23/2025||||
I do get utility out of aluminium.
whatsupdog 12/23/2025||||
Yes, and I'm assuming the power plant that was providing electricity to the aluminum plant also wants to recover their investment.

It would be cool if all this residual heat could be concentrated to smelt aluminum!

mmooss 12/23/2025|||
A mercury refining plant or uranium enrichment facility would also be worse neighbors, but that has nothing to do with the benefits and costs of the crypto farm.
tempodox 12/23/2025||
Climate pollution on this scale should be rated a crime against humanity.
t0bia_s 12/23/2025||
Then what?
ajjahs 12/24/2025||
[dead]
JungleGymSam 12/23/2025||
[flagged]
epolanski 12/23/2025||
Why such a stance? We're going through global warming; waste on such a scale to solve crypto sudokus can definitely raise eyebrows.
e-dant 12/23/2025||
This life is needlessly absurd.

Why does this even exist?

And yes, I get what mining is and I get what the blockchain does. I’m saying that proof of work is absurd.

rixed 12/23/2025||
Those datacenters are the moai of our time; if you step back, it's interesting to watch.
Aerbil313 12/23/2025||
Now that's certainly a take.
epolanski 12/23/2025|||
Worst of all, many different coins have proved you can have different proof methods, and those could be applied to Bitcoin, but core developers will pull any shenanigan to avoid this.
phil21 12/23/2025|||
The authoritarian state of KYC/AML might be even more absurd to some.

Welcome to our cashless society. It’s naive to think no one would fight back.

It may have devolved to useless speculation and gambling for now, but the genie cannot be put back into the bottle very easily at this point.

jesse__ 12/23/2025||
well put
0xbadcafebee 12/23/2025||
We stare at screens full of text and pictures every day. We had screens full of text and pictures 20 years ago. Yet somehow we have justified re-creating every single component multiple times over, spending hundreds of trillions of dollars, to get the same thing we had 20 years ago.

We've been able to talk to machines, have them understand that speech, and do work based on it, for decades. But we're all still typing into keyboards.

We've had devices which can track our eyes to move a mouse pointer for 37 years, but we all still use our hands/thumbs to move a mouse.

We had mobile devices which had dedicated keys for input which allowed us to input without looking, and we replaced those with mobile devices with no dedicated keys (so we have to look to provide input) and bodies made of glass so they would shatter when dropped and required additional plastic coverings to protect them. Even automobiles, where safety is a high priority, also adopted input devices which require looking away from the road.

Our world includes a government which is intended to be led via decisions from all the people, and could easily be overthrown by all the people, but only a select few people actually get to make decisions, and they don't have to listen to the people, and basically do whatever they want (wrt the other few people who get to make decisions).

Yes, life is needlessly absurd. It's best not to think about it unless you wanna end up in a padded room.

SauntSolaire 12/23/2025||
Very mundane explanations for all of these things -- you could basically make a similar argument about anything at any time if you were to phrase it similarly.
kanemcgrath 12/23/2025||
I always wondered if anybody has calculated how much of our global heating could be attributed, if any at all, to every electronic device, server, and engine outputting heat as a byproduct.
hetspookjee 12/23/2025||
I have the feeling that particular energy output doesn't matter much, really. For example, the plant in the image is about 700x400m, and when you multiply that by the sun's peak output you already get an incident solar power of about 280MW. And this site's draw almost triples that. The sun shines practically everywhere, though.

Humans produce about 20TW globally at this time (ChatGPT), while the sun delivers about 174,000TW to the Earth.

I guess you could argue that our waste heat does something, but I think the greenhouse gases that trap this enormous energy more effectively have a far bigger effect.

dehrmann 12/23/2025||
I think that works out to 0.01%? There's some hand-waving around solar radiation in the atmosphere vs. on the surface and double counting some that goes to solar power, but the number looks smaller than the variation in solar output over the solar cycle.
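
The ratio from the figures upthread, taken at face value:

    human_heat_tw = 20        # TW, total human energy use (figure from the parent comment)
    solar_input_tw = 174_000  # TW, solar power reaching Earth (figure from the parent comment)

    print(f"{human_heat_tw / solar_input_tw:.4%}")  # ~0.0115%, i.e. roughly 0.01%
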
csomar 12/23/2025|||
Negligible, almost invisible. Now the emissions (CO2) coming from the gas/oil/coal energy generation so you can run your device in the first place, that's very high.
quickthrowman 12/23/2025|||
It’s 0%. Solar radiation is about 1.4kW per square meter.

We use a tiny fraction of the sun’s total energy output each year; orders of magnitude more energy is in sunlight radiating onto the Earth than in the heat rejected from buildings with air conditioning.

TGower 12/23/2025||
I did a quick analysis and it actually matched the ~1.5 degree Celsius rise pretty accurately. It required a bunch of incorrect simplifying assumptions, but it was still interesting how close it comes.

Estimated energy production from all combustion and nuclear from the industrial revolution onwards, assumed that heat was dumped into the atmosphere evenly at once, and calculated the temperature rise based on the atmosphere's makeup. This ignores the impact of some of that heat getting sunk into the ground and ocean, and the increased thermal radiation out to space over that period. In general, heat flows from the ground and ocean into the atmosphere instead of the other way around, and the rise in thermal radiation isn't that large.

On the other hand, this isn't something that the smart professionals ever talk about when discussing climate change, so I'm sure that the napkin math working out so close to explaining the whole effect has to be missing something.
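
For reference, the heat-capacity side of that estimate (a rough sketch; the ~600 EJ/yr figure for current global primary energy use is an assumed round number):

    atmosphere_mass_kg = 5.15e18  # kg, mass of Earth's atmosphere
    cp_air = 1005.0               # J/(kg*K), specific heat of air at constant pressure
    delta_t = 1.5                 # K, the temperature rise in question

    energy_needed_j = atmosphere_mass_kg * cp_air * delta_t
    print(f"{energy_needed_j:.1e} J")                    # ~7.8e21 J to warm a closed atmosphere by 1.5 K
    print(f"~{energy_needed_j / 6e20:.0f} years of use") # ~13 years at an assumed ~600 EJ/yr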

marcosdumay 12/23/2025||
Your math is completely wrong.

We use ~20 TW, while solar radiation is ~500 PW and just the heating from global warming alone is 460TW (that is, how much heat is being accumulated as increased Earth temperature).

TGower 12/23/2025||
Well, the math is correct; the methodology has obvious flaws, some of which I pointed out. If you took all the energy that has been released by humanity burning things since the industrial revolution and dumped it into a closed system consisting of just the atmosphere, it would rise by about 1.5 C.
quickthrowman 12/23/2025||
The discussion thread (and original question) you are participating in is about heat being rejected to the atmosphere through vapor-compression refrigeration or evaporative cooling, not CO2 or emissions from combustion. Reread the top level comment.

The amount of heat rejected to the atmosphere from electronic devices is negligible.

arprocter 12/23/2025||
The article mentions Riot Platforms, but the image seems to show Bitdeer Technologies

They're pretty much adjacent, but the orientation of the buildings is different:

https://www.google.com/maps/place//@30.5678809,-97.0740152,2...

vondur 12/23/2025||
Wow. Is there still enough money to be made in crypto to justify this kind of investment?
whatsupdog 12/23/2025|
You are kidding, right? I'm sorry for assuming. But you have to be kidding. Bitcoin has been around 100k USD all this year. And about 450 Bitcoin are mined every day, and this doesn't even include the Bitcoin that miners get from transaction fees. So there's about 45 million USD worth of Bitcoin being mined per day. I can go into profit calculations, but at the end of the day it comes down to how much you are paying for electricity.
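
The issuance side of that, for anyone checking the numbers (the block subsidy and block rate are protocol facts; the price is the comment's assumption):

    btc_per_block = 3.125    # current block subsidy (post-April-2024 halving)
    blocks_per_day = 144     # ~one block every 10 minutes on average
    btc_price_usd = 100_000  # assumed, per the comment above

    daily_usd = btc_per_block * blocks_per_day * btc_price_usd
    print(f"${daily_usd:,.0f} per day")  # $45,000,000 in subsidy alone, before fees
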
sershe 12/23/2025||
New business idea: can they mine crypto in my kitchen? It's an old house and the heating is uneven. Also, there are whole countries that run on central heating, where hot water is pumped from a central power-plant-like facility to houses and apartments. Probably inefficient, but something they could do.
mnemotronic 12/23/2025||
Can infrared energy be reflected like light? What would a good reflector be? (THEORETICALLY) Would a parabolic dish over this facility be able to focus the heat to a single point and, if so, what would the temp be? Is it additive? Like X joules of Y degrees over Z square meters focused down to 1 Sq cm?
doctor_phil 12/23/2025|
Yes. Metal. No.

You want to read about "conservation of etendue" for a technical explanation. For an easier explanation, look for xkcd's excellent "Fire from Moonlight".

More comments...