
Posted by todsacerdoti 1 day ago

Nvidia won, we all lost (blog.sebin-nyshkim.net)
844 points | 480 comments | page 4
fracus 1 day ago|
This was an efficient, well-written TKO.
anonymars 1 day ago|
Agreed. An excellent summary of a lot of missteps that have been building for a while. I had watched that piece on the power connector/shunt resistors and was dumbfounded at the seemingly rank-amateurish design. And although I don't have a 5000 series GPU, I have been astonished at how awful the drivers have been for the better part of a year.

As someone who fled the AMD/ATi ecosystem due to its quirky unreliability, Nvidia and Intel have really shit the bed these days (I also had the misfortune of "upgrading" to a 13th gen Intel processor just before we learned that they cook themselves)

I do think DLSS super-sampling is incredible, but Lord almighty is it annoying that frame generation is under the same umbrella, because that is nowhere near the same thing, and the water is awfully muddy since "DLSS" is often used without distinction

voxleone 1 day ago||
It’s reasonable to argue that NVIDIA has a de facto monopoly in the field of GPU-accelerated compute, especially due to CUDA (Compute Unified Device Architecture). While not a legal monopoly in the strict antitrust sense (yet), in practice, NVIDIA's control over the GPU compute ecosystem — particularly in AI, HPC, and increasingly in professional content creation — is extraordinarily dominant.
arcanus 1 day ago||
> NVIDIA's control over the GPU compute ecosystem — particularly in AI, HPC

The two largest supercomputers in the world are powered by AMD. I don't think it's accurate to say Nvidia has monopoly on HPC

Source: https://top500.org/lists/top500/2025/06/

infocollector 13 hours ago||
It’s misleading to cite two government-funded supercomputers as evidence that NVIDIA lacks monopoly power in HPC and AI:

- Government-funded outliers don’t disprove monopoly behavior. The two AMD-powered systems on the TOP500 list—both U.S. government funded—are exceptions driven by procurement constraints, not market dynamics. NVIDIA’s pricing is often prohibitive, and its dominance gives it the power to walk away from bids that don’t meet its margins. That’s not competition—it’s monopoly leverage.

- Market power isn't disproven by isolated wins. Monopoly status isn’t defined by having every win, but by the lack of viable alternatives in most of the market. In commercial AI, research, and enterprise HPC workloads, NVIDIA owns an overwhelming share—often >90%. That kind of dominance is monopoly-level control.

- AMD’s affordability is a symptom, not a sign of strength. AMD's lower pricing reflects its underdog status in a market it struggles to compete in—largely because NVIDIA has cornered not just the hardware but the entire CUDA software stack, developer ecosystem, and AI model compatibility. You don't need 100% market share to be a monopoly—you need control. NVIDIA has it.

In short: pointing to a couple of symbolic exceptions doesn’t change the fact that NVIDIA’s grip on the GPU compute stack—from software to hardware to developer mindshare—is monopolistic in practice.

yxhuvud 23 hours ago|||
Antitrust in the strict sense doesn't require an actual monopoly to trigger; it looks at whether you use your standing in the market to gain unjust advantages. That doesn't require a monopoly situation, just a strong position used wrongly (like abusing vertical integration). Standard Oil, to take a famous example, had seen its share of U.S. refining fall to roughly two-thirds by the time it was broken up.

Breaking up a monopoly can be a remedy for that, however. But having a large share of a market doesn't by itself trigger antitrust legislation.

hank808 22 hours ago||
Thanks ChatGPT!
mrkramer 16 hours ago||
Probably the next big thing will be Chinese GPUs that match NVIDIA's quality but cost at least 10-20% less, and we'll have to wait maybe 5-10 years for that.
zoobab 15 hours ago||
Not enough VRAM to load big LLMs, so they don't compete with their expensive high end. It's called market segmentation.
musebox35 1 day ago||
With the rise of LLM training, Nvidia's main revenue stream switched to datacenter GPUs (>10x gaming revenue). I wonder whether this has affected the quality of these consumer cards, in both their design and production processes:

https://stockanalysis.com/stocks/nvda/metrics/revenue-by-seg...

FeepingCreature 1 day ago||
Oh man, you haven't even gotten into their AI benchmark bullshittery. There are factors of 4x in their numbers that are basically invented out of whole cloth by switching units.
robbies 5 hours ago|
They “learned” this trick from their consumer days. Devs always had to reverse-engineer the hypothetical scaling from their fantasy numbers
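The unit-switching described above can be sketched numerically. The figures below are illustrative assumptions (not actual Nvidia spec-sheet values): quoting a lower precision and structured sparsity each double the headline number on the same silicon, so a marketing figure can show "4x" against the dense baseline without any new dense compute.

```python
# Sketch of how a spec-sheet number can inflate by switching units.
# All numbers here are hypothetical, chosen only to show the arithmetic.

dense_fp16_tflops = 100.0                    # baseline: dense FP16 throughput

# Halving the precision roughly doubles peak throughput on the same hardware.
dense_fp8_tflops = dense_fp16_tflops * 2     # quoted in FP8 instead of FP16

# Quoting 2:4 structured sparsity doubles the headline number again.
sparse_fp8_tflops = dense_fp8_tflops * 2     # quoted with sparsity enabled

inflation = sparse_fp8_tflops / dense_fp16_tflops
print(inflation)  # 4.0 -- a "4x" gain purely from changing what is counted
```

Reversing this is exactly the "reverse-engineering the hypothetical scaling" robbies describes: divide the marketing number back down by each unit change to recover a comparable baseline.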
Ancapistani 1 day ago||
I disagree with some of the article’s points - primarily, that nVidia’s drivers were ever “good” - but the gist I agree with.

I have a 4070 Ti right now. I use it for inference and VR gaming on a Pimax Crystal (2880x2880x2). In War Thunder I get ~60 FPS. I’d love to be able to upgrade to a card with at least 16GB of VRAM and better graphics performance… but as far as I can tell, such a card does not exist at any price.

PoshBreeze 1 day ago||
> The RTX 4090 was massive, a real heccin chonker. It was so huge in fact, that it kicked off the trend of needing support brackets to keep the GPU from sagging and straining the PCIe slot.

This isn't true. People were buying brackets with 10 series cards.

tonyhart7 1 day ago|
Consumer GPUs have felt like a "paper launch" for the past few years

it's like they're purposely not selling, because they've allocated 80% of their production to enterprise only

I just hope the new fabs come online as early as possible, because these prices are insane
