Posted by delaugust 11/19/2025
> I think that what is really behind the AI bubble is the same thing behind
> most money, power, and influence: land and resources. The AI future that is
> promised, whether to you and me or to the billionaires, requires the same
> thing: lots of energy, lots of land, and lots of water. Datacenters that
> outburn cities to keep the data churning are big, expensive, and have to be
> built somewhere. [...] When the list of people who own this property is as
> short as it is, you have a very peculiar imbalance of power that almost
> creates an independent nation within a nation. Globalism eroded borders by
> crossing them, this new thing — this Privatism — erodes them from within.
In my opinion, this is an irrationally optimistic take. Yes, of course, building private cities is a threat to democratic conceptions of a shared political sphere, and power imbalances harm the institutions that we require to protect our common interests. But it should be noted that this "privatism" is nothing new - people have always complained about the ultra-wealthy having an undue influence on politics, and looking at the USA in particular, the current situation - where the number of the ultra-wealthy is very small and their influence is very large - has existed before, during the Gilded Age. Robber barons are not a novel innovation of the 21st century. That problem has been studied before, and if it were truly just about robber barons, the old solutions - grassroots organization, economic reform and, if necessary, guillotines - would still be applicable.
The reason these solutions work is that even though Mark Zuckerberg may, on paper, own and control a large amount of land and industrial resources, in practice he relies on societal consent to keep that control. To subdue an angry mob in front of the Meta headquarters, you need actual people (such as police) to do it for you - and those people will only do it for as long as they still believe either that you are doing something good for society, or at least in the (democratic) social contract itself. Power, in the traditional sense, always requires legitimization; without the belief that the ultra-powerful deserve to be where they are, institutions will crumble and finally fail, and then there's nobody there to prevent a bunch of smelly new-age Silicon Valley hippies from moving into that AI datacenter, because of its great vibrations and dude, have you seen those pretty racks, I'm going to put an Amiga in there, and so on.
However, again, I believe this to be irrationally optimistic. Because this new consolidation of power is not merely over land and resources by means of legitimized violence; it's also about control over emerging technologies which could fundamentally change how violence itself is exercised. Palantir is only the first example that comes to mind of companies developing mass surveillance tools that could enable totalitarian control on an unprecedented scale. Fundamentally, all the "adtech" companies are in the business of constructing surveillance machines that could be used not only to predict whether you're in the market for a new iPhone, but also to assess your fidelity to party principles and your overall danger to dear leader. Once predictive policing has identified a threat, of course, "self-driving", embodied autonomous systems could be automatically dispatched to detain, question or neutralize it.
So why hasn't that happened yet? After all, Google has had similar capabilities for decades now; why are we not yet on our knees before weaponized DJI drones, swearing allegiance to Larry Page? The problem, again, is one of "alignment" - for the same reason that police officers will not shoot protesters when the state itself has become illegitimate, "Googlers" will refuse to build software that influences election results, judges moral character or threatens bodily harm. What's worse, even if tech billionaires could find a small group of motivated fascist engineers to build those systems for them, they could never go for it, as the risk of being found out is way too severe: remember, their power (over land and resources) relies on legitimacy, and that legitimacy would instantly be shaken by a plausible leak of plans to turn America into a dystopian surveillance state.
What you would really need to build that dystopian surveillance state, then, is agents that can build software according to your precise specifications, whose alignment you control, that will follow your every order in the most sycophantic manner, and that are incapable of leaking what you are doing to third parties, even when they can see that what they're doing is morally questionable.
First of all, this AI stuff is next level. It's as great as, if not greater than, going to space or going to the moon.
Second, the rate at which it is improving makes the hype relevant and realistic.
I think two things are throwing people off. First, people are just overexposed to AI, and that overexposure is making AI feel like boring, useless slop. Investment in AI is heavy, but the people who throw that money around are a minority; overall, the general public is actually UNDER-hyping AI. Look at everyone on this thread. Nobody, and I mean nobody, is overly optimistic about AI - instead, the irony is that everyone, and I mean everyone, strangely thinks the world is overhyped about AI, and they are wrong. This thread, and practically every thread on HN, is a microcosm of the world, and the sentiment is decidedly against AI. Think about it like this: if Elon Musk invented a car that cost $1 and could travel at FTL speeds to anywhere in the universe, then interstellar travel would be routine and boring within a year. People would still call it overhyped.
Second, the investment and money spent on AI is definitely overhyped. Right? Think about it. If we quantify the utility and achievement of what AI can currently do and what it's projected to achieve, the math works out. If you quantify the profitability of AI, the math suddenly doesn't work out.
I could be wrong, this could be nonsense. I just can't make sense of it.
Unless AI can change the laws of physics, extremely unlikely.
Like the way we can daydream about a galaxy, then snap back to work. It's the same mechanism, but with enhanced focus you progress from just visualising > feeling > embodying > grounding in the new location.
We do it all the time. However, because we require belief that it's possible in order to maintain our location, whenever we question where we are, we're pulled back into the reality that questions things (it's a very Earth-centric way of seeing reality).
> Where you change your entire system's resonant frequency, to match what exists in the distant galaxy.
This collection of words does not describe a physical reality.
If you tell me, though, that "we installed AI in a place that wasn't designed around it and it didn't work," you're essentially complaining that your horse-drawn cart broke when you hooked it up to your HEMI. Of course it didn't work. The value proposition built around long dev cycles, huge teams, and multiple-9s reliability deliverables is not what this stuff excels at.
I have churned out perfectly functional MVPs for tens of projects in a matter of weeks. I've created robust frameworks with >90% test coverage for fringe projects that would never otherwise have gotten the time budget allotted to them. The boundaries of what can be done aren't being pushed up higher or down deeper; they're being pushed out laterally. This is very good in a distributed sense, but not so great for business as usual - we've had megacorps consolidating and building vertically forever, and we've forgotten what it was like to have a robust hacker culture with loads of scrappy teams forging unbeaten paths.
Ironically, VCs have completely missed the point by all trying to build pickaxes - there's a ton of mining to do in this new space (but the risk profile makes the finance-pilled queasy). We need both.
AI is already very good at some things, they just don't look like the things people were expecting.