Posted by mossTechnician 1 day ago
I think a lot of the hardware in these "AI" servers will instead get re-purposed for more "ordinary" cloud applications. So I don't think your scenario will happen.
Consumer PCs and hardware are going to be expensive in 2026 and AI is primarily to blame. You can find examples of CEOs talking about buying up hardware for AI without having a datacenter to run it in. This run on hardware will ultimately drive hardware prices up everywhere.
The knock-on effect is that hardware manufacturers are likely going to spend less money doing R&D for consumer-level hardware. Why make a CPU for a laptop when you can spend the same research dollars making a 700-core beast for AI workloads in a datacenter? And you can get a nice premium for that product, because every AI company is fighting to get any hardware right now.
You might be right, but I suspect not. While hardware companies are willing to do without laptop sales, data centers need the power efficiency as well.
Facebook has (well, had - this was ~10 years ago when I heard it) a team of engineers dedicated to making their core code faster, because in some places a 0.1% speed improvement across all their servers saves hundreds of thousands of dollars per month on the power bill (sources won't give real numbers, but reading between the lines this seems about right). Hardware that can do more with less power thus pays for itself very fast in the data center.
Also, cooling the chip internally is often the limit on speed, so if you can make your chip just a little more efficient, it can do more. Many CPUs will disable parts of the die not in use just to save that heat; if you can keep more of the CPU active, that translates to more work done, which in turn makes you better than the competition.
Of course the work must be done, so data centers will sometimes have to settle for whatever they can get. Still, they are always looking for faster chips that use less power, because that shows up on the bottom line very fast.
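The power-bill argument above is easy to sanity-check with a back-of-envelope calculation. All figures below (fleet size, wattage, electricity price) are hypothetical illustrations, not real Facebook numbers; the point is just that the savings scale linearly with fleet size and power price.

```python
# Back-of-envelope: monthly power-bill savings from a fleet-wide
# efficiency gain. All input figures are hypothetical.
def monthly_savings(servers, avg_watts, improvement, usd_per_kwh, hours=730):
    """Dollars saved per month if `improvement` (e.g. 0.001 for 0.1%)
    of the fleet's power draw is eliminated."""
    fleet_kw = servers * avg_watts / 1000       # total fleet draw in kW
    saved_kwh = fleet_kw * improvement * hours  # kWh saved over the month
    return saved_kwh * usd_per_kwh

# Hypothetical fleet: 1,000,000 servers averaging 500 W, at $0.08/kWh.
print(round(monthly_savings(1_000_000, 500, 0.001, 0.08)))  # dollars/month
```

With these made-up inputs a 0.1% gain is worth tens of thousands of dollars a month, so whether the real figure reaches "hundreds of thousands" depends on the actual fleet size and electricity price, but the direction of the argument holds either way.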
Do consumers understand that OEM device price increases are due to an AI-induced memory price spike of over 100%?
Now, those who actually want to do AI locally are not going to look for "AI PCs". They are going to look for specific hardware: lots of RAM, big GPUs, etc... And it is not a very common use case anyway.
I have an "AI laptop", and even I, someone who runs a local model from time to time and bought that PC with my own money, don't know what the label means: probably some matrix-multiplication hardware that I have no idea how to take advantage of. It was a good deal for the specs it had, and that's the only thing I cared about; the "AI" part was just noise.
At least a "gaming PC" means something. I expect high power, a good GPU, a CPU with good single-core performance, usually 16 to 32 GB of RAM, high refresh rate monitor, RGB lighting. But "AI PC", no idea.
On Linux it does nothing, on Windows it tells me I need an Office 365 plan to use it.
Like... What the hell... They literally placed a paywalled, Windows-only physical button on my laptop.
What next, an always-on screen for ads next to the trackpad?
But we've been here before. Computers are going to get faster and cheaper, and LLMs are going to get more optimized, because right now they do a ton of useless calculations for sure.
There's a market, just not right now.
Dell, Dell Pro, Dell Premium, Dell _Pro_ Premium, Dell Max, Dell _Pro_ Max... They went and added capacitive keys on the XPS? Why would you do this...
A lot of decisions that do not make sense to me.
Sure, the original numbering system did make sense, but you had to Google what the system meant. Now, it's kind of intuitive, even though it's just a different permutation of the same words?
I've shied away from Dell for a bit because I had two XPS 15s with swelling batteries. But the new machines look pretty sweet!
--------------
What we're seeing here is that "AI" lacks appeal as a marketing buzzword. This probably shouldn't be surprising. It's a term that's been in the public consciousness for a very long time thanks to fiction, but more frequently with negative connotations. To most, AI is Skynet, not the thing that helps you write a cover letter.
If a buzzword carries no weight, then drop it. People don't care whether a computer has an NPU for AI any more than they care whether a microwave has a low-loss waveguide. They just care that it will do the things they want it to do. For typical users, AI is just another algorithm under the hood and out of mind.
What Dell is doing is focusing on what their computers can do for people rather than the latest "under the hood" thing that lets them do it. This is probably going to work out well for them.
I actually do care, on a narrow point. I have no use for an NPU and if I see that a machine includes one, I immediately think that machine is overpriced for my needs.
Local speech recognition is genuinely useful and much more private than server-based options.
But I use the 3Gb all day every day.
I built a personal voice agent