Posted by delaugust 11 hours ago

AI is a front for consolidation of resources and power (www.chrbutler.com)
168 points | 125 comments
tptacek 4 hours ago|
My new thing with articles like these: just search for the word "water".

> I think that what is really behind the AI bubble is the same thing behind most money, power, and influence: land and resources. The AI future that is promised, whether to you and me or to the billionaires, requires the same thing: lots of energy, lots of land, and lots of water. Datacenters that outburn cities to keep the data churning are big, expensive, and have to be built somewhere. The deals made to develop this kind of property are political — they affect cities and states more than just about any other business run within their borders.

philipkglass 10 hours ago||
> I think that what is really behind the AI bubble is the same thing behind most money, power, and influence: land and resources. The AI future that is promised, whether to you and me or to the billionaires, requires the same thing: lots of energy, lots of land, and lots of water.

If you just wanted land, water, and electricity, you could buy them directly instead of buying $100 million of computer hardware bundled with $2 million worth of land and water rights. Why are high end GPUs selling in record numbers if AI is just a cover story for the acquisition of land, electricity, and water?

bix6 10 hours ago||
But with this play they can inflate their company holdings and cash out in new rounds. It's the ultimate self-enrichment scheme! Nobody wants that crappy piece of land, but now it's got GPUs and we can leverage that into a loan for more GPUs and cash out along the way.
exceptione 9 hours ago|||
Valid question. What the OP talks about, though, is that these things were not normally for sale. My takeaway from his essay is that a few oligarchs get a pass to take over all energy by means of a manufactured crisis.

  When a private company can construct what is essentially a new energy city with no people and no elected representation, and do this dozens of times a year across a nation to the point that half a century of national energy policy suddenly gets turned on its head and nuclear reactors are back in style, you have a sudden imbalance of power that looks like a cancer spreading within a national body. 

He could have explained that better. Try not to look at the media drama the political actors give you each day, but look at the agenda the real powers have laid bare:

- Trump is threatening an oil-rich neighbor with war. An expensive-as-hell army is blowing up 'drug boats' (so they claim) to help the press sell it as a war on drugs. Yeah right.

- Green energy projects, even running ones, get cancelled. Energy from oil and nuclear is capital intensive and at the same time completely outshone by solar and battery tech. So the energy card is a strong one for directing policy towards your interests.

If you can turn the USA into a resource economy like Russia, then you can rule like a Russian oligarch. That is also why the admin sees no problem in destroying academia or other industries via tariffs; controlling resources is easier and more predictable than having to rely on an educated populace that might start to doubt the promise of the American Dream.

amunozo 8 hours ago||
I did not think about it that way, but it makes perfect sense. And it is really scary. It hasn't even been a year since Trump's second term started. We still have three more years left.
kjkjadksj 10 hours ago||
Because then you can buy calls on the GPU companies.
w_for_wumbo 8 hours ago||
This is what I wonder too: what is the end game? Advance technology so that we can have anything that we want, whenever we want it. Fly to distant galaxies. Increase the options available to us and our offspring. But ultimately, what will we gain from that? Is it to say that we did it, or is it for the pleasure of the process? If it's for pleasure, then why have we made our processes so miserable for everyone involved? If it's to say that we did it, couldn't we not and say that we did? That's the whole point of fantasy. Is Elon using AI to supplement his own lack of imagination?

I could be wrong, this could be nonsense. I just can't make sense of it.

JohnMakin 8 hours ago||
> Fly to distant galaxies

Unless AI can change the laws of physics, extremely unlikely.

w_for_wumbo 6 hours ago||
I see, Fly was perhaps the wrong word to use here. Phase-shift to new galaxies is probably the right term: you change your entire system's resonant frequency to match what exists in the distant galaxy. Less a matter of transportation, and more a change of focus.

Like the way we can daydream about a galaxy, then snap back to work. It's the same mechanism, but with enhanced focus you go beyond just visualising, to feeling > embodying > grounding in the new location.

We do it all the time. However, because we require belief that it's possible in order to maintain our location, whenever we question where we are, we're pulled back into the reality that questions things (it's a very Earth-centric way of seeing reality).

walterbell 2 hours ago||
Any favorite movies or TV episodes on the above themes?
akomtu 3 hours ago||
If things were left to their own devices, the end game would be a civilization like Stroggos: the remaining humans will choose to fuse with machines, as it would give them an advantage. The first tactical step will be to nudge people to give up more and more agency to AI companions. I doubt this future will materialise, though.
Animats 9 hours ago||
It's pretty clear that the financialization aspect of AI is a bubble. There's way too much market cap created by trading debt back and forth. How well AI will work remains an open question at this point.
milesskorpen 9 hours ago|
It's a big number - but still less than tech industry profits.
Octoth0rpe 9 hours ago||
That is true, but not evenly distributed. Oracle for example: https://arstechnica.com/information-technology/2025/11/oracl...

Also, it may be true that these companies theoretically have the cash flow to cover the spending, but that doesn't mean that they will be comfortable with that risk, especially as that risk becomes more likely in some kind of mass extinction event amongst AI startups. To concretize that a bit, the remote possibility of having to give up all your profits for 2 years to pay off DC investment is fine at a 1% chance of happening, but maybe not so OK at a 40% chance.
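A rough way to make that asymmetry concrete is a back-of-the-envelope expected-loss calculation. The sketch below is purely illustrative: the profit figure, the two-year horizon, and the probabilities are invented assumptions, not numbers from the article or this thread.

  # Hypothetical numbers, for illustration only.
  annual_profit = 100.0        # arbitrary units, e.g. yearly profit in $B
  years_at_risk = 2            # profits forfeited if the datacenter bet goes bad
  downside = annual_profit * years_at_risk

  for p_bad in (0.01, 0.40):   # 1% vs 40% chance the investment must be eaten
      expected_loss = p_bad * downside
      print(f"p={p_bad:.0%}: expected loss = {expected_loss:.1f} units "
            f"({expected_loss / annual_profit:.0%} of one year's profit)")

At a 1% chance the expected hit is a rounding error (2% of a year's profit); at 40% it is most of a year's profit, which is why the same bet starts to look uncomfortable.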

qoez 10 hours ago||
Best case is hardly a bubble. I definitely think this is a new paradigm that'll lead to something, even if the current iteration won't be the final version and we've probably overinvested a bit.
layer8 8 hours ago||
The author thinks that the bubble is a given (and doesn’t have to spell doom), and the best case is that there isn’t anything worse in addition.
threetonesun 9 hours ago||
Same as the dot-com bubble. Fundamentals were wildly off for some businesses, but you can also find almost every business that failed then running successfully today. Personally I don't think sticking AI in every software is where the real value is, it's improving understanding of huge sets of data already out there. Maybe OpenAI challenges Google for search, maybe they fail, I'm still pretty sure the infrastructure is going to get used because the amount of data we collect and try to extract value from isn't going anywhere.
coffeebeqn 9 hours ago||
Something notable like pets.com is literally Chewy, just 20 years earlier.
carlosjobim 10 hours ago||
Let's take the highest perspective possible:

What is the value of technology which allows people to communicate clearly with other people in any language? That is what these large language models have achieved. We can now translate pretty much perfectly between all the languages in the world. The curse from the Tower of Babel has been lifted.

There will be a time in the future when people will not be able to comprehend that you couldn't exchange information regardless of personal language skills.

So what is the value of that? Economically, culturally, politically, spiritually?

Herring 10 hours ago||
Language is a lot deeper than that. It's like if I say "we speak the same language", it means a lot more than just the ability to translate. It's talking about a shared past and worldview and hopefully future which I/we intend to invest in.
carlosjobim 9 hours ago||
Then are you better off by not being able to communicate anything?
blauditore 9 hours ago|||
You could make the same argument about video conferencing: Yes, you can now talk to anyone anywhere anytime, and it's amazing. But somehow all big companies are convinced that in-person office work is more productive.
4ndrewl 10 hours ago|||
Which languages couldn't we translate before? Not you, the individual. We, humanity?
carlosjobim 9 hours ago||
Machine translation was horrible and completely unreliable before LLMs. And human translators are very expensive and slow in comparison.

LLMs are to translation what computers were to calculating. Sure, you could do without them before. They used to have entire buildings with office workers whose job it was to compute.

gizajob 9 hours ago||
Google Translate worked great long before LLMs.
doug_durham 9 hours ago|||
I disagree. It worked passably and was better than no translation. The depth, correctness, and nuance are much better with LLMs.
verdverm 9 hours ago||||
LLMs are not the only "AI"
Kiro 9 hours ago||||
I don't think you understand how off that statement is. It's also pretty ignorant considering Google Translate barely worked at all for many languages. So no, it didn't work great and even for the best possible language pair Google Translate is not in the same ballpark.
dwedge 8 hours ago||||
Not really long before, although I suppose it's relative. Google Translate was pretty garbage until around 2016-2017, and then it started really improving.
carlosjobim 9 hours ago|||
It really didn't. There were many languages which it couldn't handle at all, just making completely garbled output. It wasn't possible to use Google Translate professionally.
bix6 10 hours ago||
We could communicate with people before LLMs just fine though? We have hand gestures, some people learn multiple languages, and Google Translate was pretty solid. I got by just fine in countries where I didn’t know the language because hand gestures work or someone speaks English.

What is the value of losing our uniqueness to a computer that lies and makes us all talk the same?

Kiro 9 hours ago|||
Incredible that we happen to be alive at the exact moment humanity peaked in its interlingual communication. With Google Translate and hand gestures there is no need to evolve it any further.
carlosjobim 9 hours ago|||
You can maybe order in a restaurant or ask the way with hand gestures. But surely you must be able to take a higher perspective than your own, and realize that there are enormous amounts of exchange between nations with differing languages, and all of this relies on some form of translation. Hundreds of millions of people all over the world have to deal with language barriers.

Google Translate was far from solid; the quality of translations was so bad before LLMs that it simply wasn't an option for most languages. It would sometimes even translate numbers incorrectly.

Profan 8 hours ago||
LLMs are here and Google Translate is still bad (surely, if it were as easy as just plugging the miraculous perfect LLMs into it, it would be perfect now?). I don't think people who believe we've somehow solved translation actually understand how much it still handles extremely poorly.

And as others have said, language is more than just "I understand these words, this other person understands my words" (in the most literal sense, ignoring nuance here), but try getting that across to someone who believes you can solve language with a technical solution :)

carlosjobim 5 hours ago||
What argument are you making? LLM translation is available for anybody to try right now, and you can use services like Kagi Translate or DeepL to see for yourself that they make excellent translations. I honestly don't care what Google Translate does, because nobody who is serious about translation uses it.

> And as others have said, language is more than just "I understand these words, this other person understands my words" (in the most literal sense, ignoring nuance here), but try getting that across to someone who believes you can solve language with a technical solution :)

The kind of deeply understood communication you are demanding is usually impossible even between people who have the same native tongue, from the same town, and even within the same family. And people can misunderstand each other just fine without the help of AI. However, is it better to understand nothing at all than to not understand every nuance?

dvcoolarun 9 hours ago||
I believe it’s a bubble. Every app interface is becoming similar to ChatGPT, claiming they’ll “help you automate,” while drifting away from the app’s original purpose.

Most of this feels like people trying to get rich off VC money — and VCs trying to get rich off someone else’s money.

crazygringo 9 hours ago||
> There is a vast chasm between what we, the users, and them, the investors, are “sold” in AI. We are told that AI will do our tasks faster and better than we can — that there is no future of work without AI. And that is a huge sell, one I’ve spent the majority of this post deconstructing from my, albeit limited, perspective. But they — the people who commit billions toward AI — are sold something entirely different. They are sold AGI, the idea of a transformative artificial intelligence, an idea so big that it can accommodate any hope or fear a billionaire might have.

> Again, I think that AI is probably just a normal technology, riding a normal hype wave. And here’s where I nurse a particular conspiracy theory: I think the makers of AI know that.

I think those committing billions towards AI know it too. It's not a conspiracy theory. All the talk about AGI is marketing fluff that makes for good quotes. All the investment in data centers and GPUs is for regular AI. It doesn't need AGI to justify it.

I don't know if there's a bubble. Nobody knows. But what if it turns out that normal AI (not AGI) will ultimately provide so much value over the next couple decades that all the data centers being built will be used to max capacity and we need to build even more? A lot of people think the current level of investment is entirely economically rational, without any requirement for AGI at all. Maybe it's overshooting, maybe it's undershooting, but that's just regular resource usage modeling. It's not dependent on "coding consciousness" as the author describes.
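As a toy illustration of what that kind of "regular resource usage modeling" might look like, here is a minimal sketch; the capacity and demand figures and the growth rates are assumptions made up for this example, not data from the comment or the article. It just asks how long an assumed build-out takes to fill under different demand-growth rates.

  # Hypothetical toy model: how long until demand absorbs the planned capacity?
  capacity_planned = 300.0     # arbitrary compute units after the build-out
  demand_today = 80.0          # current demand in the same units

  for growth in (0.10, 0.40):  # assumed annual demand growth: modest vs aggressive
      demand, years = demand_today, 0
      while demand < capacity_planned and years < 50:
          demand *= 1 + growth
          years += 1
      print(f"{growth:.0%} annual growth: planned capacity absorbed in ~{years} years")

Under the aggressive assumption the build-out is absorbed in about four years (undershooting); under the modest one it takes well over a decade (overshooting). The conclusion hinges on ordinary demand forecasting, not on AGI.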

ninetyninenine 1 hour ago|
AI is not overhyped. It's like saying going to the moon is overhyped.

First of all, this AI stuff is next level. It's as great as, if not greater than, going to space or going to the moon.

Second, the rate at which it is improving makes the hype relevant and realistic.

I think two things are throwing people off. First, people are just overexposed to AI, and that overexposure is causing people to feel AI is boring and useless slop. Investment in AI is heavy, but the people who throw that money around are a minority; overall, the general public is actually UNDER-hyping AI. Look at everyone on this thread. Everyone, and I mean everyone, isn't overly optimistic about AI. Instead, the irony is that everyone, and I mean everyone again, strangely thinks the world is overhyping AI, and they are wrong. This thread and practically every thread on HN is a microcosm of the world, and the sentiment is decidedly against AI. Think about it like this: if Elon Musk invented a car that cost $1 and could travel at FTL speeds to anywhere in the universe, then interstellar travel would be routine and boring within a year. People would call it overhyped.

Second, the investment and money spent on AI is definitely overhyped. Right? Think about it. If we quantify the utility and achievement of what AI can currently do and what it's projected to achieve, the math works out. If you quantify the profitability of AI, the math suddenly doesn't work out.

spaqin 55 minutes ago|
Seems like an apt comparison; it was a massive money sink, and a regular person gained absolutely nothing from the moon landing. It's just the big organizations (NASA, the US government) that got the bragging rights.