
Posted by geox 11 hours ago

As AI gobbles up chips, prices for devices may rise (www.npr.org)
162 points | 203 comments
torginus 2 hours ago|
It's somewhat alarming to see that companies (owned by a very small slice of society) producing these AI thingies (whose current economic value is questionable and whose actual future potential is up for hot debate) can easily price the rest of humanity out of computing goods.
palmotea 2 hours ago||
> It's somewhat alarming to see that companies (owned by a very small slice of society) ... can easily price the rest of humanity out of computing goods.

If AI lives up to the hype, it's a portent of how things will feel to the common man. Not only will unemployment be a problem, but prices of any resources desired by the AI companies or their founders will rise to unaffordability.

torginus 2 hours ago|||
I think living up to the hype needs to be defined.

A lot of AI 'influencers' love wild speculation, but let's ignore the most fantastical claims of techno-singularity and focus on what I would consider a very optimistic scenario for AI companies: that AI capable of replacing knowledge workers can be developed using the current batch of hardware, in the span of a year or two.

Even in this scenario, the capital gains on the lump sum invested in AI far outpace the money that would be spent on the salaries of these workers, and if we look at the scenario with investor goggles, due to the exponential nature of investment gains, the gap will only grow wider.
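To make the compounding argument concrete, here is a minimal sketch with numbers I picked purely for illustration (the principal, return rate, and payroll figures are my own assumptions, not anything from the article):

```python
# Compare the compound growth of a lump sum invested in AI capacity
# against the cumulative salaries it would replace. All figures are
# assumed for illustration only.

def invested_value(principal: float, annual_return: float, years: int) -> float:
    """Lump sum compounding at a fixed annual return."""
    return principal * (1 + annual_return) ** years

def cumulative_salaries(annual_payroll: float, years: int) -> float:
    """Total salary spend over the same period (no raises, for simplicity)."""
    return annual_payroll * years

principal = 100e9        # $100B invested in AI capacity (assumed)
annual_return = 0.15     # 15%/yr return on that capital (assumed)
annual_payroll = 20e9    # $20B/yr of replaced salaries (assumed)

for years in (5, 10, 20):
    gain = invested_value(principal, annual_return, years) - principal
    spend = cumulative_salaries(annual_payroll, years)
    print(f"{years:2d} yrs: gain/salary ratio = {gain / spend:.2f}")
```

With these made-up numbers the ratio roughly quadruples between year 5 and year 20, which is the "gap only grows wider" point: linear salary spend cannot keep pace with exponential capital gains.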

Additionally, AI does not seem to be a monopoly, either wrt companies, or geopolitics, so monopoly logic does not apply.

m_mueller 1 hour ago||||
And if you're unlucky enough to live close to a datacenter, this could include energy and water? I sure hope regulators are waking up, as free markets don't really seem equipped to deal with this kind of concentration of power.
BoredPositron 24 minutes ago||||
AI probably will end up living up to the hype. It won't be on the generation of hardware they are now mass deploying. We need another tock before we can even start to talk about AGI.
walterbell 2 hours ago|||
> rise to unaffordability

Or require non-price mechanisms of payment and social contract.

testbjjl 1 hour ago|||
> It's somewhat alarming to see that companies (owned by a very small slice of society) producing these AI thingies (whose current economic value is questionable and whose actual future potential is up for hot debate)

Some might conclude the same for funds (hedge funds/private equity) and housing.

lpapez 1 hour ago||
Stop right there you terrorist antifa leftie commie scum! You are being arrested for thought crime!
jstanley 2 hours ago|||
You don't think that as prices go up, the supply might also go up, and the equilibrium price will be maintained?

And possibly even a lower equilibrium will be reached due to greater economies of scale.

marcyb5st 1 hour ago|||
That assumes that production capacity can scale up instantly. Fabs for high-end chips can't, and usually take years from the foundations being laid to the first chip coming off the production line.

In the interim, yeah, they will force prices up.

Additionally, those fabs cost billions. Given the lead time I mentioned, a lot of companies won't start building them right away, since the risk of demand going away is high and the ROI in those cases might become unreachable.

gehatare 1 hour ago||||
Is there any reason to believe that this will happen? Prices of graphics cards only went down after the crypto boom died down.
UltraSane 1 hour ago|||
Supply should increase as a response to higher prices, thus bringing prices down.
InsideOutSanta 24 seconds ago|||
As far as I can tell, none of the companies producing memory chips are increasing production because they don't know if the current demand is sustainable.

Increasing memory production capacity is a multi-year project, but in a few years, the LLM companies creating the current demand might all have run out of money. If demand craters just as supply increases, prices will drastically decrease, which none of these companies want.

arnaudsm 44 minutes ago||||
That's economic theory, but the real world is often non-linear.

Crucial is dead. There's a finite amount of rare earths. Wars and floods can bankrupt industries, and supply chains are tight.

tirant 10 minutes ago|||
Those are all temporary events and circumstances.

If the market is big enough, competitors will appear. And if the margins are high enough, competitors can always price-compete down to capture market-share.

lotsofpulp 33 minutes ago|||
The business that owns Crucial is producing more chips than ever.

Rare earth metals are in the dirt around the world.

Supply and demand curves shift, so prices increasing (and decreasing) is an expected part of life, given our inability to see the future.

mschuster91 8 minutes ago||
> Rare earth metals are in the dirt around the world.

They are. The problem is, the machinery to extract and refine them, and especially to make them into chips, takes years to build. We're looking at a time horizon of almost a decade if you include planning, permits and R&D.

And given that almost everyone but the AI bros expects the AI bubble to burst sooner rather than later (given that the interweb of funding and deals resembles the Habsburg family tree more than anything healthy), and that the semiconductor industry is infamous for pretty toxic supply/demand boom-bust cycles, they all prefer to err on the side of caution - particularly as we're not talking about single-digit billions of dollars any more. TSMC Arizona is projected to cost 165 billion dollars [1] - other than the US government and cash-flush Apple, I don't know anyone willing to finance such a project under the current conditions.

[1] https://www.tsmc.com/static/abouttsmcaz/index.htm

abenga 48 minutes ago||||
How does this square with some companies just stopping sales to consumers altogether?
baq 44 minutes ago||
This is exactly it: supply of high margin products is increasing at the cost of low margin products. Expect the low end margin to catch up to the high end as long as manufacturing capacity is constrained (at least 1 year).
atq2119 18 minutes ago||||
As usual, the problem is: how fast does this happen?
dandanua 1 hour ago|||
in a fairy-tale world
YetAnotherNick 1 hour ago||
No, just stop being cynical. The reason almost every electronic item is cheaper now than two decades back is simply that demand (and thus supply) is higher.
01HNNWZ0MV43FF 2 hours ago||
Without regulation, money begets money and monopolies will form.

If the American voter base doesn't pull its shit together and revive democracy, we're going to have a bad century. Yesterday I met a man who doesn't vote and I wanted to go ape-shit on him. "My vote doesn't matter". Vote for mayor. Vote for city council. Vote for our House members. Vote for State Senate. Vote for our two Senators.

"Voting doesn't matter, capitalism is doomed anyway" is a self-fulfilling prophecy and a fed psy-op from the right. I'm so fucking sick of that attitude from my allies.

lanyard-textile 33 minutes ago|||
Jovially -- you simultaneously believe that they're a victim of a psy-op *and* that their attitude is self-formed?

;) And you wanted to go ape shit on him... For falling for a psy-op?

My friend, morale is very very low. There is no vigor to fight for a better tomorrow in many people's hearts. Many are occupied with the problems of today. It doesn't take a psy-op to reach this level of hopelessness.

Be sick of it all you want, it doesn't change their minds. Perhaps you will find something more persuasive.

Yizahi 30 minutes ago||||
It is very likely that his vote for the legislature literally and legally doesn't matter, depending on the party allegiance of the candidates and the state he is in. All because of the non-democratic, ancient first-past-the-post system. Though in his place I would go to the polling station and at least deface a ballot as a sign of contempt.
tirant 12 minutes ago||||
What regulation are you expecting to be passed and why do you believe monopolies are bad?

If a monopoly appears due to superior offerings, better pricing and quicker innovation, I fail to see why it needs to be a bad thing. They can be competed against and historically that has always been the case.

On the other hand, monopolies appearing due to regulations, permissions, patents, or any governmental support, are indeed terrible, as they cannot be competed against.

llmslave2 2 hours ago||||
This is a common sentiment but it doesn't make any sense. Voting for the wrong politician is worse than not voting at all, so why is it seen as some moral necessity for everyone to vote? If someone doesn't have enough political knowledge to vote correctly, perhaps they shouldn't vote.
maeln 52 minutes ago||
Someone, I can't remember who, explained it better than I can, but the gist of it is that by not voting, you are effectively checking yourself out of politicians' consideration.

If we see a politician as just a machine whose only job is to get elected, they have to get as many votes as possible. Pandering to the individual is unrealistic, so you usually target groups of people who share some common interest. As your aim is to get as many votes as possible, you will want to target the “bigger” (in number of potential votes) groups. Then it is a game of trying to win over the biggest groups that don't have conflicting interests. While this is theory and a simplification of reality, all decent political parties absolutely do look at statistics and surveys to form a strategy for the election.

If you are part of a group that, even though it might be big in population, doesn't vote, politicians have no reason to try to pander to you. As a concrete example, in a lot of “western” countries right now, many elected politicians are almost completely ignoring the youth. Why? Because in those same countries the youth is the age group that votes the least.

So by not voting, you are making absolutely sure that your interests won't be defended. You can argue that once elected, you have no guarantee that the politician will actually defend your interests, or that they won't even do the opposite (as an example, soybean farmers and Trump in the U.S.). But then you won't be satisfied and will possibly not vote for the same guy / party next election (which is what a lot of swing voters do).

But yeah, in an ideal world, everyone would vote, see through communication tactics, and actually study the party, its program, and the candidate before voting.

layer8 52 minutes ago||||
The thing is that while voting matters collectively, it’s insignificant individually: https://en.wikipedia.org/wiki/Paradox_of_voting

Nonvoters aren’t being irrational.

irjustin 2 hours ago|||
Just so we're clear the current voter base says this is exactly how it should be.
i80and 2 hours ago||
Just so we're clear, the voter base of over a year ago asked for this because they were actively lied to, and were foolish enough to believe said lies.

Current polling, however, says the current voter base is quite unhappy with how this is going.

nemomarx 1 hour ago||
People spend a lot more effort and money lying to the voter base during election years than during the rest of the time.
vee-kay 8 hours ago||
For the last 2 years, I've noticed a worrying trend: typical budget PCs (especially laptops) are being sold at higher prices with less RAM (just 8GB), lower-end CPUs, and no dedicated GPUs.

The industry baseline should have become 16GB RAM for PCs and 8GB for mobiles years ago, but instead it is as if the computing/IT industry is regressing.

New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6 and UFS 2.2). Meanwhile, features that used to be offered in budget phones, e.g., wireless charging, NFC, and UFS 3.1, have silently been moved to the premium segment.

Meanwhile, the OSes and software are becoming ever more complex, bloated, unstable (bugs), and insecure (security loopholes ready for exploits).

It is as if the industry has decided to focus on AI and nothing else.

And this will be a huge setback for humanity, especially the students and scientific communities.

SXX 5 hours ago||
No dedicated GPU is certainly unrelated to whatever has been happening for the last two years.

It's just that in the last 5 years integrated GPUs have become good enough even for mid-tier gaming, let alone for running a browser and hardware acceleration in a few work apps.

And even before 5 years ago, the majority of dedicated GPUs in relatively cheap laptops were garbage, barely better than the integrated one. Manufacturers mostly put them in for the marketing value of having, e.g., an Nvidia dGPU.

amiga-workbench 5 hours ago|||
A dedicated GPU is a red flag for me in a laptop. I do not want the extra power draw or the hybrid graphics silliness. The Radeon Vega in my ThinkPad is surprisingly capable.
iancmceachern 2 hours ago|||
For me it's a necessity to run the software I need to do my work (CAD design)
silon42 54 minutes ago||||
Same here... I do not wish for a laptop with >65W USB-C power requirements.
trinsic2 3 hours ago|||
Yeah, I agree it's not worth it to have both an iGPU and a dedicated GPU, if I'm correct in what you are talking about. There are always issues with that setup in laptops. But I'd stay away from all laptops at this point until we get an administration that enforces antitrust. All manufacturers have been cutting so many corners, you're likely to have hardware problems within a year unless it's a MacBook or a business-class laptop.
Fabricio20 3 hours ago||||
I'm gonna be honest, that's not my experience at all. I got a laptop with a modern Ryzen 5 CPU four years ago that had an iGPU because "it's good enough for even mid-tier gaming!" and it was so bad that I couldn't play 1440p on YouTube without it skipping frames. I tried Parsec to my desktop PC and it was failing that as well. I returned it and bought a laptop with an Nvidia dGPU (low end still, I think it was like a 1050-refresh-refresh equivalent) and haven't had any of those problems. That AMD Vega GPU just couldn't do it.
copx 50 minutes ago|||
No Ryzen 5 system should have any trouble playing YouTube videos, there must have been something wrong with your system.
Iulioh 3 hours ago|||
What processor are we talking about?

Your experience is extremely weird

trinsic2 3 hours ago|||
Yea mid-tier is a stretch. Maybe low-end gaming
SXX 1 hour ago||
Low-end gaming is 2D indie titles, and they now run on toasters.

All the popular mass-market games work on an iGPU: Fortnite, Roblox, MMOs, arena shooters, battle royales. A good chunk of cross-platform console titles also work just fine.

You can play Cyberpunk or BG3 on a damn Steam Deck. I won't call this low end.

The number of games that don't run to some extent without a dGPU is limited to heavy AAA titles and niche PC-only genres.

koito17 1 hour ago|||
This is what I find a bit alarming, too. My M3 Max MacBook Pro takes 2 full seconds to boot Slack, a program that used to literally be an IRC client. Many people still believe client-side compute is cheap and worrying about it is premature optimization.

Of course, some performance-focused software (e.g. Zed) does start near-instantly on my MacBook, and it makes other software feel sluggish in comparison. But this is the exception, not the rule.

Even as specs regress, I don't think most people in software will care about performance. In my experience, product managers never act on the occasional "[X part of an app] feels clunky" feedback from clients. I don't expect that to change in the near future.

999900000999 4 hours ago|||
Or, we can expect better from software. Maybe someone can fork Firefox and make it run better, with a hard cap on how much a browser window can use.

The pattern of lazy, almost nonexistent optimization, combined with blaming consumers for having weak hardware, needs to stop.

On my 16GB ram lunar lake budget laptop CachyOS( Arch) runs so much smoother than Windows.

This is very unscientific, but using htop while running Chrome/YouTube playing music, 2 browser games, and VS Code with GitHub Copilot reviewing a small project, I was only using 6GB of RAM.

For the most part I suspect I could do normal consumer stuff (filing paperwork and watching cat videos) on an 8GB laptop just fine, assuming I'm using Linux.

All this Windows 11 bloat makes computers slower than they should be. A part of me hopes this pushes Microsoft to at least create a low-RAM mode that runs just the OS and display manager, then lets me use my computer as I see fit instead of constantly doing a million other weird things.

We don't *need* more ram. We need better software.
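For anyone who wants to repeat the measurement above without htop: on a Linux box (assuming /proc/meminfo is available), the "in use" figure is roughly MemTotal minus MemAvailable. This is a quick sketch, just as unscientific as my htop reading:

```shell
# Show the two relevant fields from /proc/meminfo in MiB.
awk '/^MemTotal|^MemAvailable/ {print $1, int($2/1024), "MiB"}' /proc/meminfo

# Approximate "in use" memory: total minus available, in MiB.
used_mib=$(awk '/^MemTotal/ {t=$2} /^MemAvailable/ {a=$2} END {print int((t-a)/1024)}' /proc/meminfo)
echo "approx in use: ${used_mib} MiB"
```

MemAvailable accounts for reclaimable caches, so this tracks what htop shows more closely than total-minus-free would.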

walterbell 4 hours ago|||
> hopes this pushes Microsoft to at least create a low ram mode

Windows OS and Surface (CoPilot AI-optimized) hardware have been combined in the "Windows + Devices" division.

> We don't *need* more ram

RAM and SSDs both use memory wafers and are equally affected by wafer hoarding, strategic supply reductions and market price manipulation.

Nvidia is re-inventing Optane for AI storage with higher IOPS, and paid $20B for Groq LPUs using SRAM for high memory bandwidth.

The architectural road ahead has tiers of memory, storage and high-speed networking, which could benefit AI & many other workloads. How will industry use the "peace dividend" of the AI wars? https://www.forbes.com/sites/robtoews/2020/08/30/the-peace-d...

  The rapid growth of the mobile market in the late 2000s and early 2010s led to a burst of technological progress..  core technologies like GPS, cameras, microprocessors, batteries, sensors and memory became dramatically cheaper, smaller and better-performing.. This wave of innovation has had tremendous second-order impacts on the economy. Over the past decade, these technologies have spilled over from the smartphone market to transform industries from satellites to wearables, from drones to electric vehicles.
PunchyHamster 2 hours ago||
> RAM and SSDs both use NAND flash and are equally affected by wafer hoarding, strategic supply reductions and market price manipulation.

Why on earth do you think RAM uses NAND flash?

walterbell 2 hours ago||
Sorry, still editing long comment, s/NAND flash/memory wafers/.
viccis 4 hours ago||||
Yeah I'm sure that will happen, just like prices will go back down when the stupid tariffs are gone.
dottjt 4 hours ago||||
That's not happening though, hence why we need more ram.
ssl-3 3 hours ago||
Eh? As I see it, we've got options.

Option A: We do a better job at optimizing software so that good performance requires less RAM than might otherwise be required

Option B: We wish that things were different, such that additional RAM were a viable option like it has been at many times in the past.

Option C: We use our time-benders to hop to a different timeline where this is all sorted more favorably (hopefully one where the Ballchinians are friendly)

---

To evaluate these in no particular order:

Option B doesn't sound very fruitful. I mean: It can be fun to wish, but magical thinking doesn't usually get very far.

Option C sounds fun, but my time-bender got roached after the last jump and the version of Costco we have here doesn't sell them. (Maybe someone else has a working one, but they seem to be pretty rare here.)

That leaves option A: Optimize the software once, and duplicate that optimized software to whomever it is useful using that "Internet" thing that the cool kids were talking about back in the 1980s.

MangoToupe 2 hours ago|||
> I was only using 6GBs of ram.

Insane that this is seen as "better software". I could get basically the same functionality in 2000 with 512MB. I assume this is because everything runs through Chrome with dozens more layers of abstraction.

kasabali 1 hour ago||
More like 128MB.

512MB in 2000 was like HEDT level (though I'm not sure that acronym existed back then)

anthk 37 minutes ago||
512MB wasn't that odd for multimedia from 2002, barely a couple of years later. By 2002, 256MB of RAM was the standard, almost the new low-end PC.

64MB = W98SE OK; XP will swap a lot under high load; nixlikes really fast with fvwm/wmaker and the like. KDE3 needs 128MB to run well, so dial back a bit. No issues with old XFCE releases. Mozilla will crawl; other browsers will run fine.

128MB = W98SE really well; XP will run fine, SP2-3 will lag. Nixlikes will fly with wmaker/icewm/fvwm/blackbox and the like. Good enough for Mozilla.

192MB = Really decent for a full KDE3 desktop or for Windows XP with real life speeds.

256MB = Like having 8GB today for Windows 10, GNOME 3, or Plasma 6. Yes, you can run them with 2GB and ZRAM, but realistically, with today's bloated tools, 8GB for a 1080p desktop is mandatory, even with uBlock Origin in the browser. Ditto back in the day. With 256MB, XP and KDE3 flew and ran much faster than even Win98 with 192MB of RAM.

GuB-42 1 hour ago|||
I think it is the result of specs like RAM and CPU no longer being the selling point it once was, except for gaming PCs. Instead people want thin laptops, good battery life, nice screens, premium materials, etc... We have got to the point where RAM and CPU are no longer a limiting factor for most tasks, or at least until software become bloated enough to matter again.

If you want a powerful laptop for cheap, get a gaming laptop. The build quality and battery life probably won't be great, but you can't be cheap without making compromises.

Same idea for budget mobiles. A Snapdragon Gen 6 (or something by Mediatek) with UFS2.2 is more than what most people need.

zahlman 7 hours ago|||
I'm reading this thread on an 11-year-old desktop with 8GB of RAM and not feeling any particular reason to upgrade, although I've priced it out a few times just to see.

Mint 22.x doesn't appear to be demanding any more of my machine than Mint 20.x. Neither is Firefox or most websites, although YouTube chat still leaks memory horrendously. (Of course, download sizes have increased.)

Groxx 5 hours ago|||
I've been enjoying running Mint on my terrible spec chromebook - it only has 3GB of RAM, but it rarely exceeds 2GB used with random additions and heavy firefox use. The battery life is obscenely good too, I easily break 20 hours on it as long as I'm not doing something obviously taxing.

Modern software is fine for the most part. People look at browsers using tens of gigabytes on systems with 32GB+ and complain about waste rather than being thrilled that it's doing a fantastic job caching stuff to run quickly.

callc 6 hours ago|||
Mint is probably around 0.05% of desktop/laptop users.

I think root comment is looking at the overall picture of what all customers can get for their money, and see it getting worse.

This wasn’t mentioned, but it’s a new thing for everyone to experience, since the general trend has been for computer hardware to get cheaper and more powerful over time. Maybe not exponentially any more, but at least linearly cheaper and more powerful.

OGEnthusiast 4 hours ago||
> I think root comment is looking at the overall picture of what all customers can get for their money, and see it getting worse.

A $999 MacBook Air today is vastly better than the same $999 MacBook Air 5 years ago (and even more so once you count inflation).

chii 4 hours ago||
The OP is not looking at a static point (the price of that item) but at the trend, i.e. the derivative of price vs. quality. It was on a steep upward incline, and now it's flattening.
chii 4 hours ago|||
> Industry mandate should have become 16GB RAM for PCs

It was less than 10 years ago that a high-end PC would have this much RAM. I think the last decade of cheap RAM and increasing core counts (and clock speeds) has spoiled a lot of people.

We are just returning to trend. Maybe software will be written better now that you cannot expect the average low-budget PC to have 32GB of RAM and 8 cores.

linguae 8 hours ago|||
I wonder what we can do to preserve personal computing, where users, not vendors, control their computers? I’m tired of the control Microsoft, Apple, Google, OpenAI, and some other big players have over the entire industry. The software has increasingly become enshittified, and now we’re about to be priced out of hardware upgrades.

The problem is coming up with a viable business model for providing hardware and software that respect users’ ability to shape their environments as they choose. I love free, open-source software, but how do developers make a living, especially if they don’t want to be funded by Big Tech?

Saris 8 hours ago||
Run a lightweight Linux distro on older hardware maybe?
ta9000 8 hours ago|||
This is it. Buy used Dell and HP hardware with 32 GB of RAM and swap the pcie ssd for 4 TB.
thatguy0900 5 hours ago|||
Exclusively using an ever-dwindling stock of old hardware is not really a practical solution for preserving hardware rights in the long term.
deadbabe 5 hours ago||
I think it will be a good thing actually. Engineers, no longer having the luxury of assuming that users have high end system specs, will be forced to actually write fast and efficient software. No more bloated programs eating up RAM for no reason.
rurp 5 hours ago|||
The problem is that higher performing devices will still exist. Those engineers will probably keep using performant devices and their managers will certainly keep buying them.

We'll probably end up in an even more bifurcated world where the well off have access to lot of great products and services that most of humanity is increasingly unable to access.

anigbrowl 4 hours ago|||
Have the laws of supply and demand been suspended? Capital is gonna pour into memory fabrication over the next year or two, and there will probably be a glut 2-3 years from now, followed by retrenchment and wails that innovation has come to a halt because demand has stalled.
kasabali 1 hour ago|||
We're not talking about growing tomatoes in your backyard.
re-thc 3 hours ago|||
> Have the laws of supply and demand been suspended?

There is the law of uncertainty overriding it, e.g. trade wars, tariffs, etc.

No one is going all in on new capacity.

charcircuit 4 hours ago||||
If "performant" devices are not widespread, then telemetry will reveal that the app is performing poorly for most users. If a new feature uses more memory and significantly increases the crash rate, it will be disabled.

Apps are optimized for the install base, not for the engineer's own hardware.

DeepSeaTortoise 3 hours ago||
What is the point of telemetry if your IDE launching in under 10s is considered the pinnacle of optimization?

That's like 100B+ instructions on a single core of your average superscalar CPU.

I can't wait for maps loading times being measured in percentage of trip time.

charcircuit 1 hour ago||
Because you don't want any of the substeps of such a loading process to regress and turn it back into 10+ seconds of loading.
hansvm 5 hours ago||||
Can confirm, I'm currently requesting as much RAM as can fit in the chassis and permission to install an OS not too divorced from what we run in prod.

On the bright side, I'm not responsible for the UI abominations people seem to complain about WRT laptop specs.

incompatible 5 hours ago|||
How many "great products and services" even need a lot of RAM, assuming that we can live without graphics-intensive games?
rhdunn 2 hours ago|||
Image, video, and music editing. Developing, running, and debugging large applications.
Ekaros 21 minutes ago||
The last three sound to me like self-inflicted issues. If applications weren't so large, wouldn't fewer resources be needed?
TheDong 4 hours ago|||
Some open source projects use Slack to communicate, which is a real ram hog. Github, especially for viewing large PR discussions, takes a huge amount of memory.

If someone with a low-memory laptop wants to get into coding, modern software-development-related services are incredible memory hogs.

Insanity 5 hours ago||||
I'm not optimistic that this would be the outcome. You'll likely just have poorly running software instead. After all, a significant part of the world is already running lower-powered devices on terrible connection speeds (such as many parts of Africa).
chii 3 hours ago||
> a significant part of the world is already running lower powered devices

but you cannot consider this in isolation.

The developed markets have vastly higher spending consumers, which means companies cater to those higher spending customers proportionately more (as profits demand it). Therefore, the implication is that lower spending markets gets less investment and less catered to; after all, R&D spending is still a limiting factor.

If the entirety of the market is running on lower powered devices, then it would get catered for - because there'd be no (or not enough) customers with high powered devices to profit off.

TheDong 5 hours ago||||
At the same time, AI has made it easier than ever to produce inefficient code, so I expect to rather see an explosion of less efficient software.
trinsic2 3 hours ago||||
I don't think it's going to happen in this day and age. Some smart people will, but most barely know how to write their own code, let alone write efficient code.
laterium 4 hours ago||||
Why are you celebrating compute becoming more expensive? Do you actually think it will be good?
thatguy0900 5 hours ago||||
I think the actual outcome is they will expect you to rent servers to conduct all your computing on and your phone and pc will be a dumb terminal.
jghn 4 hours ago|||
Good luck with that.
goku12 48 minutes ago||
I understand the issue with all the devices. But what about the rest of the things that depend on these electronics, especially DRAMs? Automotive, aircraft, marine vessels, ATC, shipping coordination, traffic signalling, rail signalling, industrial control systems, public utility (power, water, sewage, etc.) control systems, transmission grid control systems, HVAC and environment control systems, weather monitoring networks, disaster alerting and management systems, ticketing systems, e-commerce backbones, scheduling and rostering systems, network backbones, entertainment media distribution systems, defense systems, and I don't know what else. Don't they all require DRAMs? What will happen to all of them?
synack 34 minutes ago||
Industrial microcontrollers and power electronics use older process nodes, mostly >=45nm. These customers aren’t competing for wafers from the same fabs as bleeding edge memory and TPUs.

The world ran just fine on DDR3 for a long time.

goku12 29 minutes ago||
Okay, but what about the rest? The ones that aren't embedded in some way and use industrial-grade PCs/control stations? Or ones with large buffers, like network routers? I'm also wondering about the supply of the alternate nodes and older technologies. Will the manufacturers keep those lines running? Wasn't it Micron that abandoned the entire retail market in favor of supplying the hyperscalers?
lysace 37 minutes ago||
A $100k EV has roughly the same amount of DRAM as a $1k phone.

The EV is therefore, on the whole, a lot less sensitive to DRAM price increases.
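A back-of-envelope version of the point (the DRAM cost figure is my own rough assumption, not something from the thread): if both devices carry roughly the same DRAM bill of materials, a price spike hits the cheaper device far harder as a fraction of its price.

```python
# DRAM cost as a share of retail price for two devices carrying the
# same amount of DRAM. The $40 DRAM cost is an assumed figure.

def dram_share(device_price: float, dram_cost: float) -> float:
    """DRAM cost as a fraction of the device's retail price."""
    return dram_cost / device_price

dram_cost = 40.0  # assumed pre-spike cost of the DRAM in both devices

print(f"phone: {dram_share(1_000, dram_cost):.1%} of retail price")
print(f"EV:    {dram_share(100_000, dram_cost):.3%} of retail price")

# A 3x DRAM price increase adds the same ~$80 to both bills of
# materials, but that's ~8% of the phone's price and ~0.08% of the EV's.
```

The absolute dollar hit is identical; the sensitivity difference is entirely in the denominator.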

goku12 26 minutes ago||
Okay, accepted. But are you sure that supply won't be a problem as well? I mean, even if these products use different process nodes than the hyperscalers, will the DRAM manufacturers even keep those nodes running in favor of these industries?
elthor89 2 hours ago||
If all manufacturers jump into serving the AI market segment, can this not be an opportunity for new entrants to start serving the other market segments?

How hard is it to start manufacturing memory for embedded systems in cars, or for PCs?

lexicality 1 hour ago||
If it were easy there would be more memory manufacturers, rather than 2-3 wholesalers who sell to the people who put badges & RGB on it.
nice_byte 1 hour ago||
No, because if you have the capacity to make, e.g., RAM chips, it makes more economic sense to sell them for AI bucks. Serving the other market segments is an opportunity cost unless you're selling to them at the same prices. In the long run, though, if enough players emerge, the price will eventually come down just due to oversupply.
SunlitCat 12 minutes ago||
Not quite. Making specialized DRAM chips for AI hardware requires high-tech components. Making low(er)-end DRAM chips for consumer needs might be easier to get started with.

I am pretty sure that in the next year we will see a wave of low-end RAM components coming out of China.

jazzyjackson 9 hours ago||
Question: are SoCs with on die memory be effected by this?

Looks like the frame.work desktop with Ryzen 128GB is shipping now at the same price it was at release, and Apple is offering 512GB Mac Studios

Are snapdragon chips the same way?

nrp 8 hours ago||
We’ve been able to hold the same price we had at launch because we had buffered enough component inventory before prices reached their latest highs. We will need to increase pricing to cover supplier cost increases though, as we recently did on DDR5 modules.

Note that the memory is on the board for Ryzen AI Max, not on the package (as it is for Intel’s Lunar Lake and Apple’s M-series processors) or on die (which would be SRAM). As noted in another comment, whether the memory is on the board, on a module, or on the processor package, they are all still coming from the same extremely constrained three memory die suppliers, so costs are going up for all of them.

mips_avatar 4 hours ago||
How do suppliers communicate these changes? Are they just like, "yep, now it's 3x higher"? I'm surprised you don't have longer contracts
appellations 4 hours ago|||
Longer contracts are riskier. The benefit of having cheaper RAM when prices spike is not strong enough to outweigh the downside of paying too much for RAM when prices drop or stay the same. If you’re paying a perpetual premium on the spot price to hedge, then your competitors will have pricing power over you and will slowly drive you out of the market. The payoff when the market turns in your favor just won’t be big enough and you might not survive as a business long enough to see it. There’s also counterparty risk, if you hit a big enough jackpot your upside is capped by what would make the supplier insolvent.

All your competitors are in the same boat, so consumers won't have options. It's much better to minimize the risk of blowing up by sticking as closely to spot as possible. That's the whole idea of lean. Consumers and governments were mad about supply chains during the pandemic, but companies survived because they were lean.

In a sense this is the opposite risk profile of futures contracts in trading/portfolio management, even though they share some superficial similarities. Manufacturing businesses are fundamentally different from trading.

They certainly have contracts in place that cover goods already sold. They do a ton of preorders which is great since they get paid before they have to pay their suppliers. Just like airlines trade energy futures because they’ve sold the tickets long before they have to buy the jet fuel.

baq 37 minutes ago||||
If you're Apple, maybe that works. But in this case we're seeing 400% increases in price; instead of your RAM, you'll be delivered a note to pay up, or you'll get your money back with interest and termination fees, and the supplier is still net positive.
chii 3 hours ago|||
> longer contracts

the risk is that such longer contracts would then lock you into a higher-cost component for longer if the price drops. Longer contracts only look good in hindsight if RAM prices increased (unexpectedly).

addaon 9 hours ago|||
> Question: are SoCs with on die memory be effected by this?

SoCs with on-die memory (which is, these days, exclusively SRAM, since I don't think IBM's eDRAM process for mixing DRAM with logic is still in production) will not be effected. SiPs with on-package DRAM, including Apple's A and M series SiPs and Qualcomm's Snapdragon, will be effected -- they use the same DRAM dice as everyone else.

layer8 28 minutes ago|||
*affected
pixelpoet 9 hours ago||||
The aforementioned Ryzen AI chip is exactly what you describe, with 128 GB on-package LPDDR5X. I have two of them.

To answer the original question: the Framework Desktop is indeed still at the (pretty inflated) price, but for example the Bosgame mini PC with the same chip has gone up in price.

plagiarist 5 hours ago||
Are your two chips in Framework Desktops, or some other package? I'm interested in a unified memory setup and curious about the options.
zahlman 7 hours ago|||
https://en.wiktionary.org/wiki/die#Noun

"dice" is the plural for the object used as a source of randomness, but "dies" is the plural for other noun uses of "die".

piskov 9 hours ago|||
Apple secured at least a year's worth of memory supply (not in actual chips, but in prices).

The bigger the company, the longer the contract.

However, it will eventually catch up even to Apple.

It is not prices alone due to demand, but also the redirection of manufacturing from something like LPDDR in iPhones to HBM and what have you for servers and GPUs

mirsadm 1 hour ago|||
Apple charges so much for RAM upgrades that they could probably keep prices the same and still be fine. They won't, but they probably could.
layer8 26 minutes ago||
At the cost of reduced margins, which shareholders may not like.
greesil 4 hours ago||||
Apparently Google fucked up

https://www.google.com/amp/s/www.indiatoday.in/amp/technolog...

magicalhippo 2 hours ago||
Non-AMP link:

https://www.indiatoday.in/technology/news/story/ram-shortage...

SunlitCat 9 minutes ago||
To be honest, it starts to look more and more like a single company (we all know which one) is just buying up all DRAM capacity to keep others out of the (AI) game.
trollbridge 8 hours ago|||
I have a feeling every single supplier of DRAM is going to be far more interested in long-term contracts with Apple than with (for example) OpenAI, since there's basically zero possibility Apple goes kaput and reneges on their contracts to buy RAM.
saagarjha 8 hours ago||
Yes, but OpenAI wants $200 billion in RAM and Apple wants $10.
dehrmann 5 hours ago||
I would think so, because fab capacity is constrained, and if you make an SoC with less on-die memory, it uses fewer transistors, so you can fit more dies on a wafer.
hvb2 34 minutes ago||
But don't bigger chips mean lower yields, because there's just more room for defects?
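Both effects can be sketched in a back-of-envelope calculation: more small dies fit per wafer, and the classic Poisson yield model predicts a higher fraction of them survive. All the numbers here (die areas, defect density) are made up for illustration:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Crude gross die count, ignoring edge loss and scribe lines."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area / die_area_mm2)

def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
    """Classic Poisson yield model: Y = exp(-A * D)."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

for area in (100, 400):  # hypothetical small vs. large die, in mm^2
    good = dies_per_wafer(area) * poisson_yield(area)
    print(f"{area} mm^2: yield {poisson_yield(area):.1%}, "
          f"~{good:.0f} good dies, ~{good * area:.0f} mm^2 of good silicon")
```

With these made-up numbers the 4x-larger die yields roughly 67% versus 90%, so the larger design delivers noticeably less good silicon per wafer — which is why both commenters are right at once.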
zdc1 1 hour ago||
Thankfully we're at a stage where a 4-year-old second-hand iPhone is perfectly usable, as are any M-series Macs or most Linux laptops. It sucks for anyone needing something particularly beefy for work, but I feel that a lot of purchases can be delayed for at least a year or two while this plays out.
Ekaros 2 hours ago||
Outside of, say, video and image editing and maybe lossless audio, why is this much RAM even needed in most use cases? And I mean actually thinking about how it's used. Computer code, unless you are actually building the whole Linux kernel, is just text, so a lot of projects would probably fit in cache. Maybe software companies should be billed for users' resources too...
system2 46 minutes ago|
I have multiple apps using 300 GB+ PostgreSQL databases. For some queries, high RAM is required. I enable symmetrical NVMe swaps, too. Average Joe with gaming needs wouldn't need more than 64 GB for a long time. But for the database, as the data grows, RAM requirements also grow. I doubt my situation is relatable to many.
Ekaros 26 minutes ago||
I understand servers. But why does the average user actually need more than 2 or 4GB? What actual data is in memory at one time?
l9o 2 hours ago||
It feels like a weird tension: we worry about AI alignment but also want everyone to have unrestricted local AI hardware. Local compute means no guardrails; you can fine-tune for whatever you want.

Maybe the market pricing people out is accidentally doing what regulation couldn't? Concentrating AI where there's at least some oversight and accountability. Not sure if that's good or bad to be honest.

walterbell 2 hours ago|
> market pricing people out

For now. Chinese supply chains include DRAM from CXMT (sanctioned) and NAND from YMTC (not sanctioned, holds patents on 3D stacking that have been licensed by Korean memory manufacturers).

loudandskittish 2 hours ago||
Love all the variations of "8GB of RAM should be enough for anybody" in here.
walterbell 2 hours ago|
AI PacMan eats memory, then promises to eat/write software so we need less memory.
compounding_it 5 hours ago|
Software has gotten bad over the last decade. Electron apps were the start, but these days everything seems to be so bloated, from operating systems to browsers.

There was a time when Apple was hesitant to add more RAM to its iPhones, and app developers would have to work hard to make apps efficient. The last few years have shown Apple going from 6GB to 12GB so easily for their 'AI', while I consistently see the quality of apps on the App Store deteriorating. iOS 26 and macOS 26 are so aggressive about memory swapping that loading Settings can take time on devices with 6GB of RAM (absurd). I wonder what else they have added that apps need purging so frequently. The 6GB iPhone and 8GB M1 felt incredibly fast for a couple of years. Now apparently they are slow, as if they were really old.

Windows 11 and Chrome are a completely different story. Windows 10 ran just fine on my 8th-gen PC for years. Windows 11 is very slow, and Chrome is a bad experience. Firefox doesn't make it better.

I also find that GNOME and COSMIC are not exactly great with memory. A bare-minimum desktop still takes up 1.5-1.6GB of RAM on a 1080p display, and with some tabs open, a terminal, and VS Code (again Electron) I easily hit 8GB. Sway is better in this regard. I find Alacritty, Sway, and Firefox together make for a good experience.

I wonder where we are heading with personal computer software. Processors have gotten really fast, and storage and memory even more so, but the software still feels slow and glitchy. If this is the industry's idea of justifying new hardware each year, we are probably investing in the wrong people.

heavyset_go 4 hours ago||
Firefox is set to allocate memory until a certain absolute limit or memory pressure is reached. It will eat memory whether you have 4GB of RAM or 40GB.

Set this to something you find reasonable: `browser.low_commit_space_threshold_percent`

And make sure tab unloading is enabled.

Also, you can achieve the same thing with cgroups by giving Firefox a slice of memory it can grow into.
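For reference, a sketch of both approaches on Linux with systemd. The threshold value is an arbitrary choice, and the tab-unloading pref name is an assumption about which toggle applies on your platform:

```shell
# user.js in the Firefox profile directory -- start releasing memory
# when free commit space falls below 20% (20 is a made-up choice):
#   user_pref("browser.low_commit_space_threshold_percent", 20);
#   user_pref("browser.tabs.unloadOnLowMemory", true);  # assumed pref name

# Or cap Firefox from outside with a cgroup via a transient systemd scope:
# MemoryHigh is a soft limit (kernel starts reclaiming/throttling),
# MemoryMax is the hard limit.
systemd-run --user --scope -p MemoryHigh=4G -p MemoryMax=6G firefox
```

The cgroup route has the advantage of working regardless of what Firefox's own heuristics decide to do.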

anigbrowl 4 hours ago||
Fair. I installed a MIDI composition app recently that was 1.2 GB! Now, it does have some internal synthesis that uses samples, but only a limited selection of sounds, so I think 95% of the bulk is from Electron.
More comments...