Posted by speckx 23 hours ago
Nvidia is the high-frequency trader hammering the newest node until the arb closes. Stability usually trades at a discount during a boom, but Wei knows the smartphone replacement cycle is the only predictable cash flow. Apple is smart. If the AI capex cycle flattens in late '27 as models hit diminishing returns, does Apple regain pricing power simply by being the only customer that can guarantee wafer commits five years out?
However, everyone knows that good-faith reciprocity at that scale is not rewarded. Apple is ruthless. There are probably thousands of untold stories of how hard Apple has hammered its suppliers over the years.
While Apple has good consumer brand loyalty, they arguably treat their suppliers relatively poorly compared to the gold standard, Costco.
Changing fabs is non-trivial. If TSMC pushed Apple to the point where it had to find an alternative (which is another story) and Apple did switch, TSMC would have to work extra hard to win them back in the future. Apple wouldn't want to invest twice in changing back and forth.
On the other hand, TSMC knows that changing fabs is not really an option and Apple doesn't want to do it anyway, so they have leverage to squeeze.
At this level, everyone knows it's just business and it comes down to optimizing long-term risk/reward for each party.
There are rumours that Intel may have won some business from them starting in two years. I could totally see Apple turning to Intel for the Mac chips, since they're much lower volume. I know it sounds crazy, we just got rid of Intel, but I'm talking about using Intel as a fab, not going back to x86. Those days are done.
They do have the expertise to build it, after all. What would happen if TSMC now built an M1 clone? I doubt this is a route anyone wants to take, but it seems to me like an implied threat that gets calculated in.
> "I will spend my last dying breath if I need to, and I will spend every penny of Apple’s $40 billion in the bank, to right this wrong. I’m going to destroy Android, because it’s a stolen product. I’m willing to go thermonuclear war on this."
The falling out with Samsung was related, but more about the physical look of the phone
This is funny coming from Jobs.
- steve jobs
Samsung still makes the displays and the cameras for most iPhones. They continued to do business even while engaged in legal action. That they are still competitors won't stop them doing business when it suits them. Business doesn't care about pride or loyalty; only money.
TSMC already makes them in their labs. They could tweak a few things, claim it is novel, and just sell to the competition. (Apple would of course fight back with all they have, and TSMC's reputation would take damage.)
China already has plenty of engineers who can make a chip, and experience with making CPUs. ARM licenses a lot of useful things for making a CPU (I don't know what). They would be better off in the long run making the chips they already understand better, which is something they are doing. It takes longer and costs more, but because they understand it they can also customize the next chip for something they think is good; if they are right they can be ahead of everyone else.
What China is lacking is the fabs to make a CPU. They have made good progress in building them, but a lot of the technology needed to make a chip isn't in the chip itself.
What do you mean by cloning? An exact copy of Apple SOC? What would that be useful for?
There are already other ARM SOCs that are as performant as Apple's, according to benchmarks.
This is false. Samsung competes with Apple on smartphones. Apple even filed a lawsuit against Samsung over smartphones.
Apple moved to TSMC for exactly this reason: how can you trust a competitor to make chips for you containing your phone's core IP?
>I could totally see Apple turning to Intel for the Mac chips
I could totally see Apple being wary of turning their core IP over to Intel.
> Common manufacturer: Samsung [2]
https://en.wikipedia.org/wiki/TSMC
> The Apple A6 is fabricated with Samsung's 32 nm HKMG (high-k dielectric, metal gate) CMOS process.
In the long run, competition (whether via Intel, Samsung, or geopolitical diversification) is the only path that benefits anyone other than TSMC.
Fabless players' IPs are their entire business.
It'll be hard to trust Intel given Intel's past behavior, especially against AMD.
Anyone making a claim that trust will be 0% based on a single thing is obviously oversimplifying the situation. Trust is built on behavior, reputation, time, repeatability, etc.
Trust is subjective and relative. If Alice doesn’t trust Eve, that doesn’t automatically mean that Bob doesn’t trust Eve. That usually requires both Alice and Bob to have had similar experiences, or Bob must have a trust relationship with Alice.
There are other factors than trust as well: the US government really wants Intel fabs to take off, and they may be applying pressure that we are not aware of. It could well be that Apple is willing to risk Intel because the US government will buy a lot of Macs/iPhones, but only if the CPU is made in the US. (This would be a smart thing for the US to do for geopolitical reasons.)
Why do they keep using Samsung for their customized screens despite LG and Chinese competitors being competitive?
Is there anyone who can match TSMC at this point for the top of the line M or A chips? Even if Intel was ready and Apple wanted to would they be able to supply even 10% of what Apple needs for the yearly iPhone supply?
> Not all of Apple‘s chips need to be fabbed at the smallest size, those could certainly go elsewhere.
When I saw that TSMC continues to run old fabs, I immediately thought about this idea. I am sure when Apple is designing various chips for their products, they design for a specific node based on available capacity. Not all chips need to be the smallest node size.

Another thing: I am seeing a bunch of comments here alluding to Apple changing fabs. While I am not an expert, it is surely much harder than people understand. The precise process of how transistors are made is different in each fab. I highly doubt it is trivial to change fabs.
In the old days the leverage was that without Apple, no one was willing to pay for leading-edge foundry development, at least not with money on anything like Apple's scale. Now it is different. The demand for AI means plenty of money to go around, and Nvidia is the one to beat, not Apple any more. The good thing for Apple is that as long as Nvidia continues to grow, the orders can be split between them. No more relying on a single vendor to push the leading edge.
And anyway, consumers don't really need beefy devices nowadays. Running a local LLM on a smartphone is a terrible idea given battery life and the lack of a discrete GPU; AI is going to be running on servers for quite some time, if not forever.
It's almost as if there is a constant war to suppress engineer wages... That's the only variable being affected here which could benefit from increased competition.
If the tech sector is so anti-competitive, the government should just seize it and nationalize it. It's not capitalism when these megacorps put on all this superficial pressure but end up making deals all the time. We need more competition, no deals! If they don't have competition, we might as well have communism.
Also, how is nationalizing something pro-competition? Nationalized companies have a history of using their government connections to squash competition.
Are you talking about TSMC? Because that is a single, albeit primary, node in a supply chain, and the rest of the chain is what you also have to replicate. ASML is another vital node.
So many people with "it's just a factory, how hard can it be". The answer is "VERY", as a few endeavours have found out already - and they will probably find out even at TSMC Arizona.
I shall illustrate with Adrian Thompson's 1996 FPGA experiment at the University of Sussex.
Thompson used a genetic algorithm to evolve a circuit on an FPGA. The task was simple: get it to distinguish between a 1kHz tone and a 10kHz tone using only 100 logic gates and no system clock.
After about 4,000 generations of evolution, the chip could reliably do it but the final program did not work reliably when it was loaded onto other FPGAs of the same type.
When Thompson looked inside at what evolved, he found something baffling:
The plucky chip was utilizing only thirty-seven of its one hundred logic gates, and most of them were arranged in a curious collection of feedback loops. Five individual logic cells were functionally disconnected from the rest - with no pathways that would allow them to influence the output - yet when he disabled any one of them the chip lost its ability to discriminate the tones.
Welcome to building semiconductors.
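If you've never seen the technique, here's a minimal genetic-algorithm skeleton of the kind Thompson used (illustrative Python; the genome size, rates, and stand-in fitness function are my assumptions, since his real genome was an FPGA configuration bitstream and fitness was scored on the physical chip):

```python
import random

# Toy GA in the spirit of Thompson's experiment. All parameters are
# illustrative; the real fitness function loaded the bits onto a live
# FPGA and measured how well the output separated 1 kHz from 10 kHz.

GENOME_BITS = 1800
POP_SIZE = 50
MUTATION_RATE = 1.0 / GENOME_BITS

def fitness(genome):
    # Stand-in objective so the script runs; replace with a hardware
    # (or simulator) evaluation to reproduce the actual experiment.
    return sum(genome)

def mutate(genome):
    return [bit ^ (random.random() < MUTATION_RATE) for bit in genome]

def crossover(a, b):
    cut = random.randrange(len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_BITS)]
              for _ in range(POP_SIZE)]

for generation in range(200):  # Thompson ran ~4,000 generations
    population.sort(key=fitness, reverse=True)
    elite = population[:10]  # survivors breed the next generation
    population = elite + [
        mutate(crossover(random.choice(elite), random.choice(elite)))
        for _ in range(POP_SIZE - len(elite))
    ]

print("best fitness:", fitness(max(population, key=fitness)))
```

The unsettling part of his result is that nothing in this loop knows or cares whether the solution uses digital logic as intended; evolution happily exploited analog quirks of that one physical chip.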
There was a great video recently on the company + techniques used for cutting-edge lithography.
I was expecting an Asianometry video from your link
https://www.youtube.com/@Asianometry
Growing pure silicon crystals for wafers is another very specialist business you can't just decide to enter; even your local gravity will probably have an effect you need to tune for.
Government jobs should only be an option if there are enough social benefits.
I've met many software engineers who call themselves communists. I can kind of understand. This kind of communist-like bureaucracy doesn't work well in a capitalist environment.
It's painful to work in tech. It's like our hands are tied and we are forced to do things in a way we know is inefficient. Companies use 'security' as an excuse to restrict options (tools and platforms) and treat engineers as replaceable cogs as an alternative to trusting them to do their job properly... And the companies reap what they sow. They get reliable cogs, well versed in compliance and groupthink, and also, coincidentally, full-blown communists; they're the only engineers remaining who actually enjoy the insane bureaucracy and the social climbing opportunities it represents given the lack of talent.
I'm going through a computer engineering degree at the moment, but I am thinking about pursuing Law later on.
Looking at other paths: medicine requires expensive schooling and isn't really an option after a certain age. Law, on the other hand, opened its doors too widely and now has a large underclass of people with third-tier law degrees.
Perhaps you can try to accept the realities of the system while trying to live the best life that you can?
Psyching yourself all the way, trying to find some sort of escape towards a good life with freedom later on...
The only way you don’t need to be versed in compliance or groupthink at a US firm as an employee is to either be
1) independently wealthy, so your job is a hobby you can walk away from
2) have some leverage on a currently in demand skill, but the second that leverage evaporates they will demand the compliance
Also, I realize I undersold it: they aren’t just run as dictatorships/oligarchies, they are usually run as command economies as well.
The whole capitalist-competition style of behavior only happens in inter-firm interactions, not internal ones.
I spent most of my career working in companies with <50 employees, and only hit a couple of unpleasant founders. The few large companies that I worked in were always bureaucratic nightmares by comparison.
Small companies won't pay FAANG salaries, but they also won't make you feel like a meaningless cog in a vast, unsympathetic, unproductive machine.
I’ve worked for 3 companies like that. It was really great if your views aligned with the founder. If they didn’t, you got fucked.
I really enjoyed when a bunch of juniors were fired the day before Christmas because the founder heard them discussing the latest movies they watched and decided that they had bad opinions and shouldn’t work at his company since he’d be embarrassed if his peers heard their tastes. Not hyperbole, direct statements. We referred to it as the Red Christmas at the time.
I believe you got lucky, I don’t find your advice actionable.
I'm sorry you don't find it actionable. Please continue doing whatever you're doing now that is working for you.
Lol.
It doesn't work out because I don't have leverage, and tried to stand up for what I believe in. I also don't believe it would work for you unless you had views that aligned with the current oligarchical leadership that the entire US industry is operating under.
If you only have a good time when you've found the "right" founder, because they can and will harm your career or income when you disagree with them, and the law does effectively nothing to protect you from their ego-driven tantrums, then you are a serf at best.
I'd agree with you if it were relatively common for employees who have differences of opinion with company founders not to be forced out, but that is not my experience.
I do not find contentment out of accepting that some assholes are my Betters because they have more money than me.
Labor is the next option above slavery and indenture, and now that slavery and indenture are frowned upon, labor has absorbed that space as well.
If you want to have some control of your environment and destiny, you must be an independent agent, a contractor, entrepreneur, or consultant. A tradesman. You have special skills and expertise, your own tools, and a portfolio of masterpieces at the least.
There is nothing new in this space of human endeavour, it is as it has been, and I suspect will continue to be, for better or for worse. Sacrificing your agency for subservience is going to make you feel at the mercy of your “betters”. If you don’t want that, don’t do that. Labor law and other conventions have made it a little better, but the fundamental relationship is still master and servant.
If we go down this path, what can I say that doesn’t get my account banned and my speech suppressed for what I would suggest doing to people with your opinion?
It’s not the way I think it -should be- but it is the way that it is. The incentive alignment keeps it at that local minimum, and every attempt to move it to a new one so far has introduced so many perverse incentives that it ultimately causes the regression or even complete failure of the economies it is implemented in.
I don’t know what the answer is that maximises human happiness and minimises human misery, but I suspect it lies well outside of the paradigm of conventional market economics.
Within the dominant paradigm, it’s all a matter of risk management. With employment, you are paying your employer with your surplus value to handle the risks that you feel powerless to manage. Market risks, capital risks.
In exchange, you accept risks that your opinions and comfort won’t be prioritised, and in some cases even your physical well being.
In effect, you are betting against yourself being able to balance those risks against the risks posed by pursuing profitability.
The ability to manage risks is intersectional with your ability to manage discomfort and privation. When you run out of money, the house wins by default.
That’s why the foundational step for anyone should be to do whatever they must to obtain a safe fallback position. A place to be. A safety net. This is what enables risk accommodation. Without taking risk, there will be no advancement. If you don’t have a fallback plan, a safe spawn point, do everything in your power to create one, at least for your children.
Trump is using his DOJ to probe Jerome Powell with a bogus lawsuit because the Fed won't lower rates on demand.
An independent Fed is the most important body for the USA. Lowering rates should be based on facts, not dictated by some bankrupt casino CEO. And now you want our government to nationalize the tech sector?
They're Apple. If TSMC fucks around too much, they might just start working towards building their own fab.
That said, they did that for a sapphire glass supplier for the Apple Watch and when their machines had QC problems they dropped them like a rock and went back to Corning.
But is that really any different from any other supplier? And who tf do you think they’re going to drop TSMC for right now? They are the cock of the walk.
Don't look now: https://www.macrumors.com/2025/11/28/intel-rumored-to-supply...
The modern Cortex and Neoverse designs are so pathetic that RISC-V might mature by the time ARM is the industry standard. And the smaller ARM IP hasn't been profitable since China mass-produced the clones. Courting Intel into buying an architectural license with a free IP bonus is a legitimately smart move for ARM's longevity, from Apple's POV.
Your underlying statement implies that whoever is replacing Apple is a better buyer, which I don't think is necessarily true.
The only complete package integrator that manages to make a relationship work with Nvidia is Nintendo.
I thought that was mainly due to bad thermals. I always got the impression that (like Intel) Nvidia only cared about performance, and damn the power consumption.
https://blog.greggant.com/posts/2021/10/13/apple-vs-nvidia-w...
And that's probably because Nintendo isn't adding any pressure on either TSMC's or Nvidia's capacity; iirc Nintendo uses something like Maxwell or Pascal on really mature processes for Switch chips/SoCs.
I shot a video at CNET, probably in 2011, of a single touchscreen display (I think it was the APX 2500 prototype, iirc?) and it had precisely the dimensions of the Switch 1.
Nintendo was reluctantly a hardware company... they're a game company who can make hardware, but they know they're best when they own the stack.
> According to Han, Nvidia has been difficult to work with for some time now. Like all other GPU board partners, EVGA is only told the price of new products when they're revealed to everyone on stage, making planning difficult when launches occur soon after. Nvidia also has tight control over the pricing of GPUs, limiting what partners can do to differentiate themselves in a competitive market.
https://www.gamespot.com/articles/evga-terminates-relationsh...
https://www.semiaccurate.com/2010/07/11/investigation-confir...
That means deprioritizing your largest customer.
Also, there's the devil you know and the devil you don't know.
Companies have to be fairly large to be Costco suppliers. What suppliers lose in margin they more than make up for in scale. It's better to sell 10 million at 5% margin than 1 million at 10% margin.
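Worked through with a made-up $100 unit price for illustration: 10 million units × $5 of margin is $50M of gross profit, while 1 million units × $10 is $10M. Five times the absolute profit at half the percentage margin.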
And they don't require a % of supplier's business revenue as that would be illegal in the U.S. Most of the products found at Costco are generally found at other retailers, just in smaller packages or as different SKUs.
It definitely implies it, though. I’m hopeful that competition is back.
Private companies can be nice to their suppliers. Owners can choose to stay loyal to suppliers they went to high school with, even if it isn't the most cost-efficient.
I’m not saying you’re wrong, but your previous paragraph sounded like you were wondering whether it was the case, whereas here you’re saying it’s known. Is this all true? Do they have a reputation for hammering their suppliers?
The penalties for not delivering on timelines and production goals, and the scale being requested can mean substantial changes to your business. I remember a friend whose company was in talks with Apple telling me that there was some sense of relief when the deal fell through, just because of how much stress and risk and change the deal would entail.
However, a missing component could put tens of billions of dollars of revenue on the line for Apple. It is easier to say that any supplier Apple picks has to then quickly grow to the scale and process needed - and failing to do that successfully could very well be a fatal slip for the supplier.
Even in the iPod days, Apple often would invest in building out the additional capacity (factories) to meet their projected demand, and have a period of exclusivity as well. This meant that as MP3 player demand scaled up, they also wound up locking up production of the micro HDDs and flash memory that competitors would need.
Apple has responded and has started moving a lot of manufacturing out of China. It just makes sense for risk management.
> China will remain the country of origin for the vast majority of total products sold outside the US, he added.
And international sales are a solid majority of Apple's revenue.
> Meanwhile, Vietnam will be the chief manufacturing hub "for almost all iPad, Mac, Apple Watch and AirPods product sold in the US".
> "We do expect the majority of iPhones sold in the US will have India as their country of origin," Mr Cook said.
Still not made in the US and no plan to change that. They will be selling products made in India/Vietnam domestically and products made in China internationally.
The tariffs are not bringing these jobs home.
You can buy modern CPUs made in Iowa, at about $60,000 each. You can buy one from an Intel fab (I'm not sure where they are) for under $1,000 that is likely better. The Iowa-made CPU would be a one-off made under license from Intel. The companies that do this make just enough to prove they can, in case Intel's fabs are bombed. (I assume this means that you can't actually buy such a CPU if you tried, but they do make them, and that is about what they would have to charge to break even.)
Nvidia's willingness to pay exorbitant prices for early 2nm wafers subsidizes the R&D and the brutal yield-learning curve for the entire node. But you can't run a sustainable gigafab solely on GPUs... the defect density math is too punishing. You need a high-volume, smaller-die customer (Apple) to come in 18 months later, soak up the remaining 90% of capacity, and amortize that depreciation schedule over a decade.
Apple OTOH operates at consumer electronics price points. They need mature yields (>90%) to make the unit economics of an iPhone work. There's also the binning factor I am curious about. Nvidia can disable 10% of the cores on a defective GPU and sell it as a lower SKU. Does Apple have that same flexibility with a mobile SoC where the thermal or power envelope is so tightly coupled to the battery size?
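For intuition on why the defect-density math punishes big dies, here's the textbook first-order Poisson yield model (a standard approximation, not TSMC's actual model; the defect density and die areas below are invented for illustration):

```python
import math

def poisson_yield(defects_per_cm2, die_area_cm2):
    # First-order yield model: Y = exp(-D * A). Real fabs use
    # negative-binomial variants, but the shape is the same.
    return math.exp(-defects_per_cm2 * die_area_cm2)

D = 0.2  # hypothetical early-node defect density, defects/cm^2

# Hypothetical die areas: a phone SoC (~1 cm^2) vs a near-reticle GPU (~8 cm^2)
for name, area in [("phone SoC", 1.0), ("big GPU", 8.0)]:
    print(f"{name}: {poisson_yield(D, area):.0%} of dies defect-free")
```

Same wafer, same defect density: the small die comes out around 82% defect-free while the big die is around 20%, which is exactly why large-die customers lean so hard on binning and why a high-volume small-die customer is what actually matures a node.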
Sauce on the number?
iPhones are luxury goods with margins nowhere near typical for consumer electronics. Apple can easily stomach some short term price hikes / yield drops.
> They need mature yields (>90%) to make the unit economics of an iPhone work.
Can you share how you know this information? >90% seems very specific.

For example, the regular M4 can have 4 P-cores / 6 E-cores / 10 GPU cores, or 3/6/10 cores, or 4/4/8 cores, depending on the device.
They even do it on the smaller A-series chips - the A15 could be 2/4/5, 2/4/4, or 2/3/5.
For Apple, they have binning flexibility, with Pro/Max/Ultra, all the way down to iPads - and that’s after the node yields have been improved via the gazillion iPhone SoC dies.
NVIDIA's flexibility came from using some of those binned dies for GeForce cards, but the VRAM situation is clearly making that less important, as they're cutting some of those SKUs for being too VRAM-heavy relative to MSRP.
The Pro and Max chips are different dies, and the Ultra currently isn't even the same generation as the Max. And the iPads have never used any of those larger dies.
> NVIDIA's flexibility came from using some of those binned dies for GeForce cards
NVIDIA's datacenter chips don't even have display outputs, and have little to no fixed-function graphics hardware (raster and raytracing units), and entirely different memory PHYs (none of NVIDIA's consumer cards have ever used HBM).
Not binning an M4 Max for an iPhone, but an M4 Pro with a few GPU or CPU cores disabled is clearly a thing.
Same for NVIDIA. The 4080 is a 4090 die with some SMs disabled.
The desktop 4090 uses the AD102 die, the laptop 4090 and desktop 4080 use the AD103 die, and the laptop 4080 uses the AD104 die. I'm not at all denying that binning is a thing, but you and other commenters are exaggerating the extent of it and underestimating how many separate dies are designed to span a wide product line like GPUs or Apple's computers/tablets/phones.
Otherwise, yes, if a chip doesn't make M4 Max, it can make M4 Pro. If not, M4. If not, A18 Pro. If not that, A18.
And even all of the above mentioned marketing names come in different core configurations. M4 Max can be 14 CPU Cores / 32 GPU cores, and it can also be 16 CPU cores and 40 GPU cores.
So yeah, I'd agree that Apple has _extreme_ binning flexibility. It's likely also the reason why we got A19 / A19 Pro / M5 first, and we still don't have M5 Pro or M5 Max yet. Yields not high enough for M5 Max yet.
Unfortunately I don't think they bin down even lower (say, to S chips used in Apple Watches), but maybe in the future they will.
In retrospect, Apple ditching Intel was truly a game-changing move. They didn't even have to troll everyone by putting an Intel i9 into a chassis that couldn't even cool an i7 to boost the comparison figures, but I guess they had to hedge their bet.
No, that's entirely wrong. All of those are different dies. The larger chips wouldn't even fit in phones, or most iPad motherboards, and I'm not sure a M4 Max or M4 Pro SoC package could even fit in a MacBook Air.
As a general rule, if you think a company might ever be selling a piece of silicon with more than half of it disabled, you're probably wrong and need to re-check your facts and assumptions.
There are two levels of Max chip, but think of a Max as two Pros on a die (this is a simplification; you can likewise think of a Pro as two regular chips tied together), so a bad Max can't be binned into a Pro. But a high-spec Max can be binned into a low-spec Max.
As you cut SMs from a die you move from the 3090 down the stack, for instance. That’s yield management right there.
If you want binning in action, the RTX cards other than the top ones are it. Look at the A30 too, which I was surprised had no successor. Either they had better yields on Hopper or they didn't get enough out of the A30...
Like smartphones, AI chips also have a replacement cycle. AI chips depreciate quickly -- not because the old ones go bad, but because the new ones are so much better in performance and efficiency than the previous generation. While smartphones aren't making huge leaps every year like they used to, AI chips still are -- meaning there's a stronger incentive to upgrade every cycle for these chips than smartphone processors.
I've heard that it's exactly that, reports of them burning out every 2-3 years. Haven't seen any hard numbers though.
people are holding onto their phones for longer: https://www.cnbc.com/2025/11/23/how-device-hoarding-by-ameri...
Heck, if Apple wanted to be super cheeky, they could probably still pivot the reserved capacity to something useful (e.g. a revised older design for whatever node they reserved, where they can get more chips per wafer for cheaper models).
NVDA on the other hand is burning a lot of good-will in their consumer space, and if a competitor somehow is able to outdo them it could be catastrophic.
Graphical fidelity is at the point that unless some new technology comes out to take advantage of GPUs, I don’t see myself ever upgrading the part. Only replacing it whenever it dies.
And that 1080 ti isn’t dead either, I passed the rig onto someone who wanted to get into PC gaming and it’s still going strong. I mostly upgraded because I needed more ram and my motherboards empty slots were full of drywall dust.
The phone I’m more liable to upgrade solely due to battery life degradation.
The 5090 I replaced it with has not been entirely worth it. Expensive GPUs for gaming have had more diminishing returns on improving the gaming experience than ever before, at least in my lifetime.
They really, absolutely, are not.
It's not about "will there be a new hardware", it's about "is their order quantity predictable"
I can certainly see Apple taking a large stake and board position in fabricators, but I can't see them being able to justify the ongoing investment in a closed fab.
But also: Apple is one of the very few companies at their size that seems to have the political environment to make, and more importantly succeed at, decade-long investments. The iPhone wasn't an obvious success for 5 or 6 years. They started designing their own iPhone chips around the iPhone 4, iirc, and pundits remarked that it wasn't a good idea; today, the M5 in the iPad Pro outperforms every chip made by EVERYONE else in the world, by 25%, at a tenth the power draw and no active cooling (e.g. the 9950X3D). Apple Maps (enough said). We're seeing similar investments today, things we could call "failures" that in 10 years we'll think were obviously going to be successful (cough, Vision Pro).
Definitely! But I'd reckon they would want to bootstrap that part of their supply chain as soon as possible. Say China does invade Taiwan: suddenly their main supplier is gone and the Intel capacity mostly goes to military and other high-margin segments. If they instead own Intel, they not only control the narrative but also capitalize on the increase in Intel's value.
No, it does not. The core inside the M5 is faster than every other core design in single-threaded burst performance. That is common for small machines with a low core count and no hyperthreading.
The chip itself does not outperform every other chip in the world, nor is it 10x more efficient than the 9950X3D. That's not even napkin math at that point, you're making up numbers with no relation to relevant magnitude.
The comparison point was for single core performance, which certainly makes the TDP comparison unfair if interpreted together. The numbers are ballpark-correct.
No one else is remotely close to Apple. Apple could stop developing chips for four years, and it’s very likely they would still ship the most efficient core architecture, and sit in the top five in performance. If you’re quibbling over the semantics of this particular comparison, you are not mentally ready for what M5 Ultra is going to do to these comparisons in a few months.
I doubt it, particularly not four years.
Then again, Microsoft should have bought Intel: MS has roughly $102 billion in cash (+ short-term investments). Intel’s market value is approximately $176 billion. Considering Azure, Microsoft has heaps of incentive to buy Intel.
I would guess Google are more likely to greenfield develop their own foundry rather than try and buy Intel.
If Nvidia pays more, Apple has to match.
Not a system that necessarily works all that well if one player has a short-term ability to vastly outspend all the rest.
You can't let all your other customers die just because Nvidia is flush with cash this quarter...
Is the argument that Apple will go out of business? AAPL?
Wait,
> one player has a short-term ability to vastly outspend all the rest.
I assure you, Apple has the long-term and short-term ability to spend like a drunken sailor all day and all night, indefinitely, and still not go out of business. Of course they’d prefer not to. But there is no ‘ability to pay’ gap here between these multi-trillion-dollar companies.
Apple will be forced to match or beat the offer coming from whoever is paying more. It will cost them a little bit of their hilariously-high margins. If they don’t, they’ll have to build less advanced chips or something. But their survival is not in doubt and TSMC knows that.
TSMC isn't running a charity, it sells capacity to the highest bidder.
Of course, customers as big as Apple will have a relationship and such insane volumes that they will be guaranteed significant quotas regardless.
If it takes 4 years to build a new fab and Apple is willing to commit to paying the price of an entire fab for chips to be delivered in 4 years' time, why not take the order and build the capacity?
But Nvidia has also spent billions/year in TSMC for more than a decade and this just keeps increasing.
Well yeah, people were identifying that back when Apple bought out the entirety of the 5nm node for iPhones and e-tchotchkes. It was sorta implicitly assumed that any business that builds better hardware than Apple would boot them out overnight.
It's not "build better hardware" though, it's "continue to ship said hardware for X number of years". If someone buys out the entire fab capacity and then goes under next year, TSMC is left holding the bag
It really is about making better hardware. Apple would be out-bidding Nvidia right now, but only if the iPhone had equivalent value-add to Nvidia hardware. Alas, iPhones are overpriced and underpowered, most people will agree.
I'd argue this from almost the opposite direction - there is no value-add for Apple because high-end smartphones exceeded the performance requirements of their user-base generations ago.
Nvidia has a pretty much infinite performance sink here (at least as long as training new LLMs remains a priority for the industry as a whole). On the smartphone side, there just isn't the demand for drastic performance increases - and in practice, many of us would like power and cost reduction to be prioritised instead.
The flat line prediction is now 2 years old...
I consider the start of this wave of AI to be approximately the 2017 Google transformer paper, and yet transformers didn't really have enough datapoints to look exponential until ChatGPT in 2022.
The following is purely speculation for fun and sparking light-hearted conversation:
My gut feeling is that this generation of models transitioned out of the part of the sigmoid that looks roughly exponential after the introduction of reasoning models.
My prediction is that transformer-based models will start to enter the phase that asymptotes to a flatline in 1-2 years.
I leave open the possibility for a different form of model to emerge that is exponential but I don't believe transformers to be right now.
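To make the "the early part of a sigmoid looks exponential" point concrete, here's a toy comparison (pure illustration, not fitted to any real capability data; L_cap, k, and t0 are arbitrary):

```python
import math

L_cap, k, t0 = 100.0, 1.0, 10.0  # arbitrary ceiling, growth rate, inflection

def logistic(t):
    # Sigmoid: L / (1 + exp(-k*(t - t0)))
    return L_cap / (1.0 + math.exp(-k * (t - t0)))

def pure_exponential(t):
    # The logistic's own small-t approximation: L * exp(k*(t - t0))
    return L_cap * math.exp(k * (t - t0))

for t in range(2, 15, 2):
    print(f"t={t:2d}  logistic={logistic(t):8.2f}  exponential={pure_exponential(t):10.2f}")
```

The two curves are nearly indistinguishable well before the inflection point and only diverge around t0, which is the frustrating part: you find out which curve you were on only after the flattening starts.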
Sure maybe they do better in some benchmarks, but to me the experience of using LLMs is and has been limited by their tendency to be confidently incorrect which betrays their illusion of intelligence as well as their usefulness. And I don't really see any clear path to getting past this hurdle, I think this may just be about as good as they're gonna get in that regard. Would be great if they prove me wrong.
New and better things are coming. They will just take time to implement, and I doubt they will cancel current training runs. So I guess it will take up to a year for the new things to come out.
Can the bubble burst in this time, because people lose patience? Of course. But we are far from the end.
That's the take I would pursue if I were Apple.
A quiet threat of: "We buy wafers on consumer demand curves. You’re selling them on venture capital and hype."
The reality is that TSMC has no competition capable of shipping an equivalent product. If AI fizzles out completely, the only way Apple can choose to not use TSMC is if they decide to ship an inferior product.
A world where TSMC drains all the venture capital out of all the AI startups, using NVidia as an intermediary, and then the bubble pops and they all go under is a perfectly happy place for TSMC. In these market conditions they are asking for cash upfront. The worst that can happen is that they overbuild capacity using other people's money that they don't have to pay back, leaving them in an even more dominant position in the crash that follows.
Business is a little more nuanced than this audience thinks, and it’s silly to think Apple has no leverage.
From TSMC's perspective, Apple is the one that needs financial assistance. If they wanted the wafers more than Nvidia, they'd be paying more. But they don't.
This is the "venture capital and hype" being referred to, not Nvidia themselves.
That line is purified cope.
I get why the numbers are presented the way they are, but it always gets weird when talking about companies of Apple’s size - percent increases that underwhelm Wall Street correspond to raw numbers that most companies would sacrifice their CEO to a volcano to attain, and sales flops in Apple’s portfolio mean they only sold enough product to supply double-digit percentages of the US population.
And ironically, Apple acts like a small contender the moment they feel some heat, after a decade of relatively easy wins seemingly everywhere.
So finally there is a company that gives Apple some much needed heat.
That’s why, in absolute terms, I side with NVIDIA, the smaller contender in this case.
PS: I had one key moment in my career when I was at Google and a speaker mentioned the unit “NBU”. It stands for next billion units.
This was ten years ago, and it started my mental journey into large-scale manufacturing and production, including all the processes involved.
The fascination never left. It was a mind bender for me and totally get why people miss everything that large.
At Google it was just a milestone expected to be hit - not one time but as the word next indicates multiple times.
Mind blowing and eye opening to me ever since. Fantastic inspiration thinking about software, development and marketing.
Apple hit 3 billion iphones in mid 2025.
Technically, there are billions of transistors in every tensor chip manufactured by Google
The giant conglomerates in Asia seem more able to do it.
Google has somewhat tried, but then famously kills almost everything, even things that could be successful as smaller businesses.
Every time a CEO or company board says "focus," an interesting product line loses its wings.
I think Asian companies are much less dependent on public markets and have strong private control (chaebols in South Korea, for example: Samsung, LG, Hyundai, etc.).
If you look at US companies that are under "family control" you might see a similar sprawl, like Cargill, Koch, I'd even put Berkshire in this class even though it's not "family controlled" in the literal sense, it's still associated with two men and not a professional CEO.
* Search/ads
* YouTube
* Android/Play, Chrome, Maps
* Google Cloud, Workspace
* Pixel, Nest, Fitbit
* Waymo, DeepMind
* Google Fiber
They're not a conglomerate like Alibaba but they're far from a one-trick pony, either :)
Shares are a short-term speculative gamble; you buy them in the hope that the price will rise and then you can sell them for a profit. Sometimes the gap between these two events is measured in milliseconds.
So the only thing that matters to Wall St is growth. If the company is growing then its price will probably rise. If it's not, it won't. Current size is unimportant. Current earnings are unimportant (unless they are used to fund growth). Nvidia is sexy, Apple is not, despite all the things you say (which are true).
Worryingly for Nvidia, Apple is producing products people want and that are demonstrably useful, so the vast majority of its value is solid, and thus the revenue stream for the fabs Apple uses is solid.
Nvidia, on the other hand, is producing tangible things of value, GPUs, but these are now largely used in unproven technologies (when stacked against lofty claims) that barely more than a few seem to want, so Nvidia's revenue stream seems flimsy at best in the AI boom.
The only proven revenue stream Nvidia has (had?) is GPUs for display and visualisation (gaming, graphics, and non-AI non-crypto compute, etc.)
It may still be profitable for TSMC to use NVidia to funnel all the juicy VC money to themselves, but the statement about proven vs unproven revenue streams is true. It'll be gone with the hype unless something truly market-changing comes along quickly, not the incremental change so far. People are not ready to pay the full costs of AI; it's that simple right now.
For a statistical word salad generator that is _generally_ coherent, sure it's proven.
But for other claims, such as replacing all customer service roles[1], to the lament of customers[2], and now that a number of companies are re-hiring staff they sacked because 'AI would make them redundant'[3] still make me strongly assert that Generative AI isn't the trillion dollar industry it is trying to market itself as.
Sure, it has a few tricks and helps in a number of cases, and is therefore useful in those cases, but it isn't the 'earth-shattering mass-human-redundancy' technology that colossally stupid amounts of circular investment are being poured into. Which, I argue, puts fabs that dedicate themselves mostly, if not solely, to AI in a precarious position when the AI bubble collapses.
[1] https://www.cxtoday.com/contact-center/openai-ceo-sam-altman...
[2] https://www.thestreet.com/technology/salesforce-ai-faces-bac...
[3] https://finance.yahoo.com/news/companies-quietly-rehiring-wo...
> 6 million Blackwell GPUs.. have left NVIDIA’s warehouses.. 15.6GW of power is required to make the last four quarters of NVIDIA GPUs sold turn on

So report the facts, but sentences like "What Wei probably didn’t tell Cook is that Apple may no longer be his largest client" make it personal; they make you take sides, feel sorry for somebody, feel schadenfreude... (as you can observe in the comments)
Okay, but this isn't a news article, it's an opinion piece on some guy's substack.
However this post and the comments really debunk that - here we have a clear example of the author turning these people into characters, archetypes of reality tv, and inviting the reader to have an emotional response to what is potentially interesting, but actually just the mundane business matter of dealing with demand spikes.
A normal conversation might take a step back, above the emotional baiting, and instead lament how TSMC weren't able to develop sufficient supply capacity in time to maximise yield across not just these clients but the many others who are looking to get on the AI hype train. Instead we're seeing something quite different, and quite uninformed. It reads like a gossip post from an Instagram thread.
I notice that HN is actually more vulnerable to these types of conversations. Maybe it's because HN likely skews towards an ASD audience, which has less experience in handling socially driven narratives. I definitely see here more of the "one-sided" conversation that is typical of ASD.
How do you think it got in the LLM training set in the first place?
AFAIK only Apple has been committing to wafers up to 3 years in the future. It would be a crazy bet for NVidia to do the same, as they don't know how big the business will be.
That would ruin TSMC and others' independence.
Nvidia already did buy Intel shares so it is a thing.
Nvidia did discuss with TSMC for more capacity many times. It's not about financing or minimum purchasing agreements. TSMC played along during COVID and got hit.
As far as I know there was never a demand dip at any point in there.
Which barely impacts TSMC. Most of their revenue and focus is on the advanced nodes, not the mature ones.
> As far as I know there was never a demand dip at any point in there.
When did I imply there was a demand dip? I said they built out too much capacity.
https://www.manufacturingdive.com/news/intel-layoffs-25-perc...
> Apple-TSMC: The Partnership That Built Modern Semiconductors
In 2013, TSMC made a $10 billion bet on a single customer. Morris Chang committed to building 20nm capacity with uncertain economics on the promise that Apple would fill those fabs. “I bet the company, but I didn’t think I would lose,” Chang later said. He was right. Apple’s A8 chip launched in 2014, and TSMC never looked back.
https://newsletter.semianalysis.com/p/apple-tsmc-the-partner...
I don’t know the hedge to position against this but I’m pretty sure China will make good on its promise.
The 2027 date was a guideline for their military to be "ready", which they may not be either. That is a far cry from the decision to actually make a move. They will only do that if they're certain it will work out for them, and as things stand, it is very risky for Xi.
The most advanced ASML machines also cost something like $300-400M each and I am willing to bet if configured wrong can heavily damage themselves and the building they are in.
What's worse, China having a monopoly on production, or the entire world being set back by X years?
In this scenario X is a two-digit number, right?
On top of that, TSMC has fabs and personnel with expertise in other places. After all, this threat isn't new.
Buy in-demand fab output today, even at a premium price and even if you can't install or power it all, expecting shortages tomorrow. Which is pretty much the way the tech economy is already working.
So no, no hedge. NVIDIA's customers already beat you to it.
mirroring, come to think of it, the movement to un-democratize modern governments...
(I would be happier if the news behind Nvidia's strength was sales of good, reasonably priced consumer GPU cards... but it's clearly not. I can walk down the street and buy anything from Tim Cook, but 9 times out of 10 I cannot buy a 5080/5090 FE card from Jensen Huang.)
And possibly other types of hardware also had prices bumped, or used outdated chips, because Apple had to build iPhone/Mac n+1.
That's why you see some folks actually mocking Apple about the situation. They were already affected.
If anything this might force a market-wide fix in the medium term.
But the point here is that a few companies are outbidding everyone else, hoarding shittons of compute and putting it into their data centers, to rent to people. This is effectively taking compute ownership away from consumers and centralizing compute i.e. un-democratising.
Apple outcompeting other companies to put their products into the hands of regular people is vastly different.
Between the $99/year sideloading, Liquid Glass and fighting fruitlessly against CUDA, I think Apple needs a break to reflect on why their software strategy is so unpopular with everyone. The hardware advances are doing them more harm than good at this point.
Intel seems to be very competitive again when it comes to laptop battery life. If MacBooks again get a reputation for being sluggish and overheating, that's not great for sales.