Posted by todsacerdoti 22 hours ago

Nvidia won, we all lost (blog.sebin-nyshkim.net)
765 points | 416 comments
liendolucas 9 hours ago|
I haven't read the whole article, but a few things are worth remarking on:

* The prices for Nvidia GPUs are insane. For that money you can have an extremely good PC with a good non-Nvidia GPU.

* The physical GPU sizes are massive; even letting the card rest on a horizontal motherboard looks scary.

* Nvidia still has issues with melting cables? I heard about those some years ago and thought it was a solved problem.

* Proprietary frameworks like CUDA are going to fall at some point; it's just a matter of time.

It looks as if Nvidia at the moment is only looking at the AI market (which, as a personal belief, has to burst at some point) and simply does not care about the non-AI GPU market at all.

I remember, many years ago when I was a teenager and 3dfx was the dominant graphics card manufacturer, John Carmack prophetically predicting in a gaming computer magazine (the article was about Quake I) that the future wasn't going to be 3dfx and Glide. Some years passed and, effectively, 3dfx was gone.

Perhaps this is just the beginning of the same story that happened with 3dfx. I think AMD and Intel have a huge opportunity to balance the market and bring Nvidia down, both in the AI and gaming spaces.

I have only heard excellent things about Intel's Arc GPUs in other HN threads, and if I need to build a new desktop PC from scratch there's no way I'm paying the prices Nvidia is pushing on the market; I'll definitely look at Intel or AMD.

porphyra 20 hours ago||
The article complains about issues with consumer GPUs, but those are nowadays relegated to being merely a side hobby project for Nvidia, whose core business is enterprise AI chips. Anyway, Nvidia still has no significant competition from AMD on either front, so they are still getting away with this.

Deceptive marketing aside, it's true that it's sad we can't get 4K 60 Hz with ray tracing on current hardware without some kind of AI denoising and upscaling, but ray tracing is really just _profoundly_ hard, so I can't really blame anyone for not having figured out how to put it in a consumer PC yet. There's a reason Pixar movies need huge render farms that take lots of time per frame. We would probably sooner get Gaussian splatting and real-time diffusion models in games than nice full-resolution ray tracing, tbh.
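
Some rough arithmetic makes the point (the sample and bounce counts below are my own assumptions, not figures from the article): a clean path-traced image without denoising needs on the order of a thousand samples per pixel, and at 4K 60 Hz that works out to trillions of rays per second.

    # Back-of-envelope ray budget for denoiser-free path tracing at 4K 60 Hz.
    # spp and bounces are illustrative assumptions, not measured figures.
    pixels = 3840 * 2160   # one 4K frame
    fps = 60
    spp = 1000             # samples per pixel for a clean, denoiser-free image
    bounces = 3            # ray segments traced per sample path

    rays_per_second = pixels * fps * spp * bounces
    print(f"{rays_per_second:.1e} rays/s")  # ~1.5e+12, far beyond what consumer
                                            # RT hardware sustains, hence upscaling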

Jabrov 20 hours ago|
I get ray tracing at 4K 60Hz with my 4090 just fine
trynumber9 18 hours ago||
Really? I can't even play Minecraft (DXR: ON) at 4K 60 Hz on an RTX 5090...

Maybe another regression in Blackwell.

Dylan16807 18 hours ago||
> The competing open standard is FreeSync, spearheaded by AMD. Since 2019, NVIDIA also supports FreeSync, but under their “G-Sync Compatible” branding. Personally, I wouldn’t bother with G-Sync when a competing, open standard exists and differences are negligible[4].

Open is good, but the open standard itself is not enough. You need some kind of testing/certification, which is built into the G-Sync process. AMD does now have a FreeSync certification program, which is good.

If you rely on just the standard, some manufacturers get really lazy. One of my screens technically supports FreeSync, but I turned it off on day one because it has a narrow range and flickers very badly.
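
For the curious: the usual rescue for frame rates below the panel's minimum is low framerate compensation (LFC), which shows each frame multiple times to stay inside the supported range, and a range as narrow as my screen's leaves no valid multiple. A minimal sketch of the idea (illustrative logic, not any vendor's actual algorithm):

    # Illustrative sketch of low framerate compensation (LFC);
    # not any vendor's actual algorithm.
    def effective_refresh(fps, vrr_min, vrr_max):
        # Pick a panel refresh rate for a game running at `fps`.
        if vrr_min <= fps <= vrr_max:
            return fps                 # panel tracks the game directly
        multiple = fps
        while multiple < vrr_min:      # show each frame 2x, 3x, ... to reach the range
            multiple += fps
        return multiple if multiple <= vrr_max else None  # None: VRR gives up

    # A narrow 48-60 Hz panel vs. a wide 30-144 Hz one, both at 40 fps:
    print(effective_refresh(40, 48, 60))   # None -> out of range, stutter/flicker
    print(effective_refresh(40, 30, 144))  # 40   -> works directly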

reichstein 12 hours ago||
Aks. "Every beef anyone has ever had with Nvidia in one outrage friendly article."

If you want to hate on Nvidia, there'll be something for you in there.

An entire section on 12VHPWR connectors, with no mention of 12V-2x6.

A lot of "OMG monopoly" and "why won't people buy AMD" without considering that maybe ... AMD cards are not considered by the general public to be as good _where it counts_ (like performance per watt, aka heat). Maybe it's all perception, but then AMD should work on that perception. If you want the cooler CPU/GPU, the perception is that that's Intel/Nvidia. That's reason enough for me, and many others.

Availability isn't great, I'll admit that, if you don't want to settle for a 5060.

yunyu 20 hours ago||
If you are a gamer, you are no longer NVIDIA's most important customer.
Rapzid 12 hours ago||
Sounds like an opening for AMD, then. But as long as Nvidia has the best tech, I'll keep buying it when it's time to upgrade.
bigyabai 20 hours ago|||
A revelation on-par with Mac users waking up to learn their computer was made by a phone company.
ravetcofx 19 hours ago||
Barely even a phone company; more like an app store and microtransaction services company.
theshackleford 12 hours ago|||
Yes, but why should I care, provided the product they have already sold me continues to work? How does this materially change my life because Nvidia doesn't want to go steady with me anymore?
dcchambers 20 hours ago||
Haven't been for a while. Not since crypto bros started buying up GPUs for coin mining.
fithisux 5 hours ago||
Nvidia won?

Not for me. I prefer Intel's offerings: open and Linux-friendly.

I even hope they release the next-gen RISC-V boards with Intel graphics.

camel-cdr 1 hour ago|
A RISC-V board with NVIDIA graphics is more likely: https://mp.weixin.qq.com/s/KiV13GqXGMZfZjopY0Xxpg

NVIDIA Keynote from the upcoming RISC-V Summit China: "Enabling RISC-V application processors in NVIDIA compute platforms"

frollogaston 20 hours ago||
Because they won't sell you an in-demand high-end GPU for cheap? Well, TS.
tiahura 18 hours ago|
Not to mention that they are currently in stock at my local Micro Center.
voxleone 18 hours ago||
It’s reasonable to argue that NVIDIA has a de facto monopoly in the field of GPU-accelerated compute, especially due to CUDA (Compute Unified Device Architecture). While not a legal monopoly in the strict antitrust sense (yet), in practice, NVIDIA's control over the GPU compute ecosystem — particularly in AI, HPC, and increasingly in professional content creation — is extraordinarily dominant.
arcanus 17 hours ago||
> NVIDIA's control over the GPU compute ecosystem — particularly in AI, HPC

The two largest supercomputers in the world are powered by AMD. I don't think it's accurate to say Nvidia has a monopoly on HPC.

Source: https://top500.org/lists/top500/2025/06/

infocollector 2 hours ago||
It’s misleading to cite two government-funded supercomputers as evidence that NVIDIA lacks monopoly power in HPC and AI:

- Government-funded outliers don’t disprove monopoly behavior. The two AMD-powered systems on the TOP500 list—both U.S. government funded—are exceptions driven by procurement constraints, not market dynamics. NVIDIA’s pricing is often prohibitive, and its dominance gives it the power to walk away from bids that don’t meet its margins. That’s not competition—it’s monopoly leverage.

- Market power isn't disproven by isolated wins. Monopoly status isn’t defined by having every win, but by the lack of viable alternatives in most of the market. In commercial AI, research, and enterprise HPC workloads, NVIDIA owns an overwhelming share—often >90%. That kind of dominance is monopoly-level control.

- AMD’s affordability is a symptom, not a sign of strength. AMD's lower pricing reflects its underdog status in a market it struggles to compete in—largely because NVIDIA has cornered not just the hardware but the entire CUDA software stack, developer ecosystem, and AI model compatibility. You don't need 100% market share to be a monopoly—you need control. NVIDIA has it.

In short: pointing to a couple of symbolic exceptions doesn’t change the fact that NVIDIA’s grip on the GPU compute stack—from software to hardware to developer mindshare—is monopolistic in practice.

yxhuvud 12 hours ago|||
The strict antitrust sense doesn't require an actual monopoly to trigger; it looks at whether you use your standing in the market to gain unjust advantages. That does not require a monopoly situation, just a strong position used wrongly (like abusing vertical integration). So Standard Oil, to take a famous example, never had more than a 30% market share.

Breaking up a monopoly can be a solution to that, however. But having a large share of a market by itself doesn't trigger antitrust legislation.

hank808 11 hours ago||
Thanks ChatGPT!
musebox35 13 hours ago||
With the rise of LLM training, Nvidia's main revenue stream has switched to datacenter GPUs (>10x gaming revenue). I wonder whether this has affected the quality of these consumer cards, in both their design and production processes:

https://stockanalysis.com/stocks/nvda/metrics/revenue-by-seg...

fracus 17 hours ago|
This was an efficient, well-written TKO.
anonymars 16 hours ago|
Agreed. An excellent summary of a lot of missteps that have been building for a while. I had seen that article on the power connector/shunt resistors and was dumbfounded at the seemingly rank-amateurish design. And although I don't have a 5000-series GPU, I have been astonished at how awful the drivers have been for the better part of a year.
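
For context on why that design looked so amateurish, some rough numbers (the wattage and pin ratings below are commonly cited figures, used here as assumptions): the connector has very little per-pin headroom, and a card that senses only the summed current can't notice one pin running far over spec.

    # Rough 12VHPWR current math; figures are illustrative assumptions.
    watts, volts, pin_pairs = 600, 12.0, 6
    total_amps = watts / volts            # 50 A total
    per_pin = total_amps / pin_pairs      # ~8.3 A per pin if perfectly balanced
    print(total_amps, round(per_pin, 1))
    # With pins rated around 9-10 A, even a modest imbalance in contact
    # resistance pushes a single pin past its rating, and summing the current
    # through one shunt hides exactly that failure mode.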

As someone who fled the AMD/ATi ecosystem due to its quirky unreliability, Nvidia and Intel have really shit the bed these days. (I also had the misfortune of "upgrading" to a 13th-gen Intel processor just before we learned that they cook themselves.)

I do think DLSS supersampling is incredible, but Lord almighty is it annoying that frame generation is under the same umbrella, because that is nowhere near the same thing, and the water is awfully muddy since "DLSS" is often used without distinction.
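
To make the distinction concrete, a toy sketch (the function names are made up for illustration, not NVIDIA's actual API): super resolution reconstructs frames the game really rendered, while frame generation synthesizes frames in between, which is why it adds latency and reflects no new input.

    # Toy illustration of the two techniques sold under the "DLSS" umbrella;
    # function names are hypothetical, not NVIDIA's API.
    def super_resolution(frame):
        # Upscale a frame the game actually rendered.
        return f"upscaled({frame})"

    def frame_generation(prev, nxt):
        # Synthesize an in-between frame: the next real frame must be buffered
        # first, so this adds latency and reflects no fresh player input.
        return f"generated({prev},{nxt})"

    rendered = ["f0", "f1", "f2"]
    upscaled = [super_resolution(f) for f in rendered]
    output = []
    for a, b in zip(upscaled, upscaled[1:]):
        output += [a, frame_generation(a, b)]
    output.append(upscaled[-1])
    print(output)  # real and synthesized frames interleaved, doubling frame count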
