Posted by speckx 1/15/2026
https://newsroom.intel.com/client-computing/ces-2026-intel-c...
So it's not available yet then?
Or they could buy out Intel and sell off its CPU design division
I wonder what will happen in the future when we get closer to the physical "wall". Will it allow other fabs to catch up, or will the opposite happen, with even small improvements being valued by customers?
At this point it would be corporate suicide if they were not outlining a strategy to own their own fab(s).
Apple has less cash available than TSMC plans to burn this year [0]. TSMC is not spending 50 billion dollars just because it's fun to do so. This is how much it takes just to keep the wheels on the already existing bus. Starting from zero is a non-starter. It just cannot happen anymore. So, no one in their right mind would sell Apple their leading-edge foundry at a discount either.
There was a time when companies like Apple could have done this. That time was 15+ years ago. It's way too late now.
[0]: https://www.wsj.com/business/earnings/tsmc-ends-2025-with-a-...
You're committing a huge part of your future revenue stream to ongoing chip-fab capex and research engineering. And that's a huge gamble, since getting all of this set up is not guaranteed to succeed.
As would almost innumerable others.
I really don't care about most new phone features and for my laptop the M1 Max is still a really decent chip.
I do want to run local LLM agents though and I think a Mac Studio with an M5 Ultra (when it comes out) is probably how I'm going to do that. I need more RAM.
I bet I'm not the only one looking at that kind of setup now who was previously happy with what they had.
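To see why "I need more RAM" drives the choice of machine, here's a rough memory-footprint estimate for hosting an open-weight model locally. The parameter counts, quantization widths, and 20% overhead factor are illustrative assumptions, not specs of any particular model:

```python
# Back-of-envelope RAM needed to hold an LLM's weights locally.
# Sizes and quantization widths below are illustrative assumptions.

def model_memory_gb(params_billion: float, bits_per_weight: float,
                    overhead: float = 1.2) -> float:
    """Approximate RAM: weight bytes plus ~20% for KV cache and activations."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

for params in (70, 120, 400):      # model sizes in billions of parameters
    for bits in (16, 8, 4):        # fp16, int8, int4 quantization
        print(f"{params}B @ {bits}-bit ~ {model_memory_gb(params, bits):.0f} GB")
```

Even at 4-bit, a 400B-parameter model needs on the order of 240 GB, which is why high-unified-memory Macs look attractive for this.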
The only reason that Thunderbolt exists is to expose DMA over an artificial PCIe channel. I'd hope they've made progress on it; Thunderbolt has only been around for fourteen years, after all.
I’m not saying this is bad or anything, it’s just another iteration of the centralized vs decentralized pendulum swing that has been happening in tech since the beginning (mainframes with dumb terminals, desktops, the cloud, mobile) etc.
Apple might experience a slowdown in hardware sales because of it. Nvidia might experience a sales boom because of it. The future could very well bring a swing back. Imagine you could run a stack of Mac minis that replaced your monthly Claude Code bill. It might pay for itself in six months (this doesn’t exist yet, but it theoretically could happen).
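The "pays for itself" claim above is just simple payback arithmetic. All prices here are hypothetical assumptions for illustration, not real Mac mini or Claude pricing:

```python
# Rough payback math for the "stack of Mac minis vs. monthly bill" idea.
# Hardware cost and monthly bill below are hypothetical assumptions.

def payback_months(hardware_cost: float, monthly_bill: float,
                   monthly_power_cost: float = 0.0) -> float:
    """Months until hardware cost is recovered from avoided subscription fees."""
    savings = monthly_bill - monthly_power_cost
    return hardware_cost / savings

# e.g. four machines at ~$2,000 each vs. a ~$1,400/month usage bill
print(f"{payback_months(4 * 2000, 1400):.1f} months")  # prints "5.7 months"
```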
You don't have to imagine. You can, today, with a few (major) caveats: you'll only match Claude from roughly ~6 months ago (open-weight models roughly lag behind the frontier by ~half a year), and you'd need to buy a couple of RTX 6000 Pros (each one is ~$10k).
Technically you could also do this with Macs (due to their unified RAM), but the speed won't be great, so in practice it'd be unusable.
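The speed gap comes down to memory bandwidth: single-stream token generation is roughly bandwidth-bound, because every generated token has to stream the (active) weights through memory. The bandwidth figures and model size below are rough ballpark assumptions used only for illustration:

```python
# Crude upper bound on decode speed for a bandwidth-bound workload:
# tokens/sec <= memory bandwidth / bytes read per token.
# Bandwidth numbers and model size are rough assumptions, not exact specs.

def tokens_per_sec_bound(bandwidth_gb_s: float, model_gb: float) -> float:
    """Bandwidth-limited ceiling on single-stream token generation."""
    return bandwidth_gb_s / model_gb

MODEL_GB = 60  # e.g. a ~120B model quantized to 4-bit (assumption)
for name, bw in [("high-end discrete GPU, ~1500 GB/s", 1500),
                 ("Apple-silicon Ultra, ~800 GB/s", 800)]:
    print(f"{name}: <= {tokens_per_sec_bound(bw, MODEL_GB):.0f} tok/s")
```

The ratio of the two bandwidths is the ratio of the ceilings, which is why the unified-RAM route trades capacity for speed.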
I wish I were in that situation, but I find myself able to use lots more compute than I have. And it seems like many others feel the same.
Data is saying demand >>>>> supply.