Posted by geox 12/28/2025

As AI gobbles up chips, prices for devices may rise(www.npr.org)
321 points | 513 comments
dizlexic 12/29/2025|
Alarmist silliness. If it's a bubble, prices will drop. If it isn't, production will adapt.
greensh 12/30/2025||
> In the long run we are all dead. Economists set themselves too easy, too useless a task if in tempestuous seasons they can only tell us that when the storm is past the ocean is flat again.

J. Keynes

He was specifically criticising this "the market will regulate itself after a while" attitude. Nice for you to have a big enough safety net, but not everybody does.

polski-g 12/29/2025||
People love to panic about something that will be fixed in 24 months.
Ekaros 12/29/2025||
Outside of, say, video and image editing and maybe lossless audio, why is this much RAM even needed in most use cases? And I mean actually thinking about the use. Computer code, unless you are actually building the whole Linux kernel, is just text, so a lot of projects would probably fit in cache. Maybe software companies should be billed for users' resources too...
system2 12/29/2025||
I have multiple apps using 300 GB+ PostgreSQL databases. For some queries, high RAM is required. I enable symmetrical NVMe swaps, too. Average Joe with gaming needs wouldn't need more than 64 GB for a long time. But for the database, as the data grows, RAM requirements also grow. I doubt my situation is relatable to many.
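
As a rough, back-of-the-envelope illustration of why that budget scales with data size and concurrency, here is a sketch with invented PostgreSQL settings (not the actual configuration above):

    # Hypothetical sizing sketch: estimate a worst-case memory budget from a few
    # illustrative PostgreSQL settings (values are invented, not a recommendation).
    shared_buffers_gb = 64        # shared page cache kept in RAM
    work_mem_mb = 256             # per sort/hash node, per running query
    max_connections = 200
    ops_per_query = 4             # sorts/hashes a complex query might run at once

    # Worst case: every connection runs a query using several work_mem chunks.
    worst_case_gb = shared_buffers_gb + (work_mem_mb * ops_per_query * max_connections) / 1024
    print(f"worst-case budget: ~{worst_case_gb:.0f} GB")
    # And as a 300 GB+ dataset grows, you want yet more RAM so the OS page cache
    # can keep the hot part of it resident instead of falling back to NVMe swap.
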
Ekaros 12/29/2025||
I understand servers. But why does the average user actually need more than 2 or 4 GB? What actual data is in memory at one time?
parrellel 12/29/2025||
Where have you seen 4 GB cut it in the last decade? 2 GB was already enough to make Vista chug in 2007.

I've got old Linux boxes that feel fine with a couple gigs of DDR3, but I can't think of a place where that would be acceptable outside of that.

Ekaros 12/29/2025||
My entire question is: why can't whatever users do on computers actually work in 2 GB of RAM? What is the true reason we are in a state where that is, for some reason, not possible?

2 GB is a huge amount of information. So surely it should be enough for almost all normal users, but for some reason it is not.

vee-kay 12/29/2025|||
Quick: list your favorite software and tell us how many GB of disk space each program uses after installation and how many GB of RAM it consumes when running.

You will find most of your fave programs struggle badly with 2-4 GB of RAM, even on Linux.

Over the years most software programs (even on mobile) have become bloated and slow due to "new features" (even if most people don't need them) and also because of a nexus with the hardware manufacturers. Who would buy an expensive CPU, more RAM, larger-capacity SSDs, bigger displays, etc., if there were no software programs needing all that extra oomph of performance, bandwidth, and fidelity?

bloppe 12/29/2025||||
One potential reason: now that CPU clock speed is plateauing, parallelism is the main way to juice performance. Many apps try to take advantage of it by running N processes for N cores. For instance, my 22-core machine will use all 22 cores in parallel by default for builds with modern build systems. That's compiling ~22 files at once, using ~5x as much RAM as the 4-core machines of 15 years ago, all else being equal. As parallelism increases further, expect your builds to use even more memory.
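
As a back-of-the-envelope sketch of that trade-off (the 2 GB-per-job figure is a made-up assumption, and reading /proc/meminfo is Linux-specific), one way to cap parallel build jobs by available RAM:

    # Hypothetical sketch: pick a parallel-build job count that fits in memory.
    import os

    def available_ram_gb() -> float:
        with open("/proc/meminfo") as f:       # Linux-only source of free memory
            for line in f:
                if line.startswith("MemAvailable:"):
                    return int(line.split()[1]) / 1024 / 1024  # kB -> GB
        return 0.0

    RAM_PER_JOB_GB = 2.0  # illustrative guess; real compile jobs vary wildly
    jobs = max(1, min(os.cpu_count() or 1, int(available_ram_gb() / RAM_PER_JOB_GB)))
    print(f"make -j{jobs}")  # e.g. 22 cores but only 32 GB free -> make -j16
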
parrellel 12/29/2025|||
Ah! Yes, I agree.
TrackerFF 12/29/2025|||
Electron apps hog memory. The vast, vast majority of computer users are Windows users. Using 8 GB of memory without really "using" it is trivial: Chrome plus a few Microsoft Office apps will eat that much.
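
If you want to check where that memory actually goes on your own machine, here is a minimal sketch assuming the third-party psutil package; it sums resident memory by process name (shared pages get double-counted, so treat the totals as an upper bound):

    # Sum resident set size per process name and print the top consumers.
    from collections import defaultdict
    import psutil

    totals = defaultdict(int)
    for p in psutil.process_iter(["name", "memory_info"]):
        info = p.info
        if info["name"] and info["memory_info"]:
            totals[info["name"]] += info["memory_info"].rss

    for name, rss in sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:10]:
        print(f"{name:30s} {rss / 2**30:5.1f} GB")
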
vee-kay 12/29/2025|||
You might be surprised to learn that most personal/SOHO PC users use Windows as the default OS.

In fact, Microsoft and Intel built a cutthroat monopoly over the PC market through their long-term WinTel nexus (MS Windows optimized to run better on Intel CPUs, Intel-based PCs sold with MS Windows by default), until AMD upped the ante and stole the race by being first on the block with an x86-64 (64/32-bit) processor, so Microsoft chose to ditch Intel for AMD to usher in the new era of 64-bit Windows OSes.

AMD still dominates the server market and the GPU market (where it has been innovating harder and giving better VFM than nVidia and Intel), yet it is still struggling to dominate the PC market (PC assemblers/stores get more lucrative deals from Intel to sell Intel-based PCs, which is why we find fewer AMD-based PCs for sale in shops/stores).

And that doesn't bode well for PC users/customers. Because that WinTel+nVidia nexus will choose MS Windows over Linux any day.

As for why more RAM is needed, you might again be surprised to learn that most people play video games on PCs and mobiles rather than on expensive consoles.

But even casual gaming needs adequate RAM and some VRAM. Even heavy-duty office work (e.g., opening/editing big Excel files or complex PDFs) is a problem on a low-end PC. Engineering students and workers need to do complex CAD/CAM work on their PCs. Artists (including musicians) need powerful software tools for design and art work. All these needs mandate more RAM (16 GB at the minimum), because most of these tools need MS Windows (or, alternatively, expensive Macs, assuming macOS has alternative apps to suit such needs).

After failing to beat AMD's versatility and VFM performance in the CPU & GPU market, nVidia and Intel have instead pivoted to AI to regain their stranglehold on the market. Their AI NPUs are dominating the PC market this year, but those new PCs are bad for the specific needs listed above.

This is also why Microsoft and its allies ensured that most video games are not ported to Linux (and Mac), until Valve finally started to change that status quo by focusing on Linux gaming (but out of self-interest, as its money-maker Steam store had become too heavily dependent on Microsoft for gaming).

So yeah, more RAM and better CPUs & GPUs please!

motbus3 12/29/2025||
For me, there is a concerning flag in all of this.

I know this is not always true, but in this case the Crucial folks say the margins on the end-user side are too low and they have demand from AI.

I suppose they do not intend to spin up a new AI-focused unit because it is not worth it, or because they believe the hype might be gone before they are done. But what intrigues me is why they would allow other competitors to step up in a segment they dominate. They could raise the prices for consumers if they are not worried about competition...

There is a whole "not-exactly-AI" industry labeled as AI that has received a ton of capital. Is that what they are going for?

Imustaskforhelp 12/29/2025||
So my understanding of the situation is that the Crucial folks had downsized their factory production (see my other comments for reference, perhaps), but then AI started demanding chips just as their factory output was at its lows.

These AI companies have tons of money to burn, so they are willing to pay a lot more. Crucial now has only a limited supply of RAM, and the thing is, there isn't much difference between an AI chip and a consumer chip, but the margins on AI chips are far higher than on consumer chips.

So earlier they would sell both consumer chips and AI chips, but the AI companies kept demanding even more, and the profits from selling to them were insane, so what they (at least Crucial) did is stop selling consumer chips in order to sell AI chips for profit.

> I suppose they do not intend to spin up a new AI-focused unit because it is not worth it, or because they believe the hype might be gone before they are done. But what intrigues me is why they would allow other competitors to step up in a segment they dominate. They could raise the prices for consumers if they are not worried about competition...

Well, as a consumer, I certainly hope so, but I think these companies did this because there have been times their stock was down 55%, this is just a money-making device at this point, and there is a monopoly of fabs with just three key players.

So the answer to your question is "money" and "more money" in the short term. Their stock prices are already up, I think, and a company really loves a short-term rising stock price.

> They could raise the prices for consumers if they are not worried about competition

Well, would you increase the prices 3-4x? Because supposedly that's roughly what AI chips go for, from what I've heard... And because of this, the second-hand market is selling these at a close-enough mark.

I don't know, but I hope that new players come into the market. I didn't know that this RAM industry was so monopolistic, with only 3 key players, and how that became a chokehold for the whole world economy in a way.

motbus3 12/29/2025||
> Well, would you increase the prices 3-4x? (Text below is quite long, let me say it here, you made great points in your answer!)

It seems they could. They single-handedly caused prices to double or more without even trying :/ Not sure if it would trigger other sorts of regulatory issues, though.

It is my impression that there is fabricated scarcity across all goods. That's a common practice among clothing retailers. In the 90s they fought for brand name and market share. Then they noticed it was silly, because they could sell half the volume for double the price, and since that means less logistics, it also meant higher margins. It is not a free-lunch approach: selling less means you delegate at least the bottom portion of your clients to the rest of the market, and if there are options, they might just be gone. That's exactly what happened with Chevrolet, Ford, etc.: they stopped investing, and when a new competitor appeared, even if it was more marketing than product, they lost rivers of money and can barely keep up the fight (except for maybe making a puppet tell others that there is no such thing as climate change, but that's something else).

The technology space right now looks like that. We already see major brands stagnating, allegedly because they have done all that is possible and it will take some time until some novel approach appears.

As a consumer, I want to believe this won't take long to settle but I'm afraid money is going elsewhere

HexPhantom 12/29/2025||
Building a dedicated AI-focused consumer line is risky: long development cycles, uncertain demand, and the chance that today's hype cools before the product ships
johnea 12/28/2025||
"May rise"?

Prices are already through the roof...

https://www.tomsguide.com/news/live/ram-price-crisis-updates

piskov 12/29/2025||
Big companies secure long-term pricing (multi-year), so iPhones probably won’t feel this in 2026 (or even 2027).

2028 is another story, depending on whether this frenzy continues and whether new fabs get built (I don't know whether they are as hard to build as CPU fabs).

Imustaskforhelp 12/28/2025||
Asus is ramping up production of RAM...

So let's see if they might "save us"

jazzyjackson 12/29/2025|||
Asus doesn't operate fabs and has denied the rumor

https://www.tomshardware.com/pc-components/dram/no-asus-isnt...

Imustaskforhelp 12/29/2025||
Hey sorry, I didn't know that. I had watched the short-form content (https://www.youtube.com/shorts/eSnlgBlgMp8) [Asus is going to save gaming] and I didn't know that it was a rumour.

My bad

CamperBob2 12/28/2025||||
Asus doesn't make RAM. That's the whole problem: there are plenty of RAM retail brands, but they are all just selling products that originate from only a couple of actual fabs.
nrp 12/29/2025||
Three major ones: Micron, Samsung, SK Hynix

And a couple of smaller ones: CXMT (if you’re not afraid of the sanctions), Nanya, and a few others with older technology

whaleofatw2022 12/29/2025||
Is this glofo's time to shine?
bee_rider 12/29/2025||
Do they make DRAM? I thought they made compute chips mostly.

If I recall correctly, RAM is even more niche and specialized than the (already quite specialized) general chip manufacturing. The structure is super-duper regular, just a big grid of cells, so it is super-duper optimized.

FastFT 12/29/2025||
They (GF) do not make DRAM. They might have an eDRAM process inherited from IBM, but it would not be competitive.

You’re correct that DRAM is a very specialized process. The bit cell capacitors are a trench type that is uncommon in the general industry, so the major logic fabs would have a fairly uphill battle to become competitive (they also have no desire to enter the DRAM market in general).

shevy-java 12/29/2025|||
So far all I am seeing is an increase in prices, so any company claiming it will "ramp up production" here is, in my opinion, just lying for tactical reasons.

Governments need to intervene here. This is a mafia scheme now.

I purchased about three semi-cheap computers in the last ~5 years or so. Looking at the RAM prices, the very same units I bought (!) now cost 2.5x as much as before (here I refer to my latest computer model, from 2 years ago). This is a mafia now. I also think these AI companies should be extra taxed because they cause us economic harm here.

trinsic2 12/29/2025||
Taxing them extra is a good idea. But they bought our current administration, so we all know that's not going to happen unless something big happens, like Trump getting impeached and all the criminals in Congress going to prison. I'm wondering how likely that is. People need to get more directly involved in putting pressure on senators.
elthor89 12/29/2025||
If all manufacturers jump into serving the AI market segment, can this not be an opportunity for new entrants to start serving the other market segments?

How hard is it to start up and manufacture memory for embedded systems in cars, or for PCs?

lexicality 12/29/2025||
If it were easy there would be more memory manufacturers, rather than 2-3 wholesalers who sell to the people who put badges & RGB on it.
nice_byte 12/29/2025|||
No, because if you have the capacity to make e.g. RAM chips, it makes more economic sense to sell them for AI bucks. Serving the other market segment is an opportunity cost unless you're selling to it at the same prices. In the long run, though, if enough players emerge, the price will eventually come down just due to oversupply.
SunlitCat 12/29/2025||
Not quite. Making specialized DRAM chips for AI hardware requires high-tech components. Making low(er)-end DRAM chips for consumer needs might be easier to get started with.

I am pretty sure that in the next year we will see a wave of low-end RAM components coming out of China.

Imustaskforhelp 12/29/2025|||
Yeah, I think the same. China is notorious for price dumping, but this might be good for the end consumer.
baobabKoodaa 12/29/2025|||
Best of luck with your folksy mom & pop DRAM factory.
Imustaskforhelp 12/29/2025|||
One might think so, but the AI companies actually lose a lot, and I mean a lot, of money in these deals.

Even if they sell their inference and everything, they still wouldn't be that profitable.

So the key point is that a RAM company can supply OpenAI with RAM and make some really quick, high bucks, which would be even more than if it were to build its own datacenters, run open-source models in them, and provide inference on, say, OpenRouter.

Now you might ask: Is openAI or these AI companies mad for burning so much money?

And I think you might know the answer to that.

vee-kay 12/29/2025||
Good luck fabricating new microchips. It is a very expensive and difficult proposition.
loudandskittish 12/29/2025||
Love all the variations of "8GB of RAM should be enough for anybody" in here.
walterbell 12/29/2025|
AI PacMan eats memory, then promises to eat/write software so we need less memory.
memoriuaysj 12/28/2025||
the first stages of the world being turned into computronium.

next stage is paving everything with solar panels.

kylehotchkiss 12/29/2025||
Solar freaking roadways reborn!
netbioserror 12/28/2025||
Positive downstream effect: The way software is built will need to be rethought and improved to utilize efficiencies for stagnating hardware compute. Think of how staggering the step from the start of a console generation to the end used to be. Native-compiled languages have made bounding leaps that might be worth pursuing again.
yooogurt 12/29/2025||
Alternatively, we'll see a drop in deployment diversity, with more and more functionality shifted to centralised providers that have economies of scale and the resources to optimise.

E.g. IDEs could continue to demand lots of CPU/RAM, and cloud providers are able to deliver that cheaper than a mostly idle desktop.

If that happens, more and more of its functionality will come to rely on having low datacenter latencies, making use on desktops less viable.

Who will realistically be optimising build times for use cases that don't have sub-ms access to build caches? And when those build caches are available, what will stop the median program from having an even larger dependency graph?

linguae 12/29/2025||
I’d feel better about the RAM price spikes if they were caused by a natural disaster and not by Sam Altman buying up 40% of the raw wafer supply, other Big Tech companies buying up RAM, and the RAM oligopoly situation restricting supply.

This will only serve to increase the power of big players who can afford higher component prices (and who, thanks to their oligopoly status, can effectively set the market price for everyone else), while individuals and smaller institutions are forced to either spend more or work with less computing resources.

The optimistic take is that this will force software vendors into shipping more efficient software, but I also agree with this pessimistic take, that companies that can afford inflated prices will take advantage of the situation to pull ahead of competitors who can’t afford tech at inflated prices.

I don’t know what we can do as normal people other than making do with the hardware we have and boycotting Big Tech, though I don’t know how effective the latter is.

yooogurt 12/30/2025||
> companies that can afford inflated prices will take advantage of the situation to pull ahead of competitors who can't afford tech at inflated prices

These big companies are competing with each other, and they're willing and able to spend much more for compute/RAM than we are.

> I don’t know what we can do as normal people other than making do with the hardware we have and boycotting Big Tech, though I don’t know how effective the latter is.

A few ideas:

* Use/develop/optimise local tooling

* Pool resources with friends/communities towards shared compute.

I hope prices drop before dev tools all move to the cloud.

It's not all bad news: as tooling/builds move to the cloud, they'll become available to those who have thus far been unable or unwilling to pay for a fast computer that sits mostly idle.

This is a loss of autonomy for those who were able to afford such machines though.

piskov 12/29/2025|||
Some Soviet humor will help you understand the true course of events:

A dad comes home and tells his kid, “Hey, vodka’s more expensive now.” “So you’re gonna drink less?” “Nope. You’re gonna eat less.”

ip26 12/29/2025||
I have some hope for transpiling to become more commonplace. What would happen if you could write in Python, but trivially transpile to C++ and back?
netbioserror 12/30/2025||
You've described Nim, Chicken, and Jank. That's partially what I meant by "leaps made by native-compiled languages".
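
For a concrete flavor of what those tools target, here is a toy sketch: plain, type-annotated Python that runs unchanged under CPython, and which ahead-of-time compilers such as mypyc (or Cython in pure-Python mode) can translate to native code:

    # Ordinary annotated Python; the annotations are what a compiler can exploit.
    def dot(a: list[float], b: list[float]) -> float:
        total: float = 0.0
        for x, y in zip(a, b):
            total += x * y
        return total

    if __name__ == "__main__":
        print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0
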
deadbabe 12/29/2025||
Are we finally going to be forced to use something like CollapseOS, when the supply chains can no longer deliver chips to the masses?
shmerl 12/29/2025|
> She said the next new factory expected to come online is being built by Micron in Idaho. The company says it will be operational in 2027

Isn't Micron stopping all consumer RAM production? So their factories won't help anyway.

terribleperson 12/29/2025|
Micron is exiting direct to consumer sales. That doesn't mean their chips couldn't end up in sticks or devices sold to consumers, just that the no-middleman Crucial brand is dead.

Also, even if no Micron RAM ever ended up in consumer hands, it would still reduce prices for consumers by increasing the supply to other segments of the market.

walterbell 12/29/2025||
> no-middleman Crucial brand is dead

It could be restarted in the future by Micron.

Crucial SSDs offer good firmware (e.g. nvme sanitize for secure erase) and hardware (e.g. power loss capacitors).
