Posted by todsacerdoti 7/4/2025
4K HDR gaming is not the future; it has been the standard for many years now, for good reason.
* The prices for Nvidia GPUs are insane. For that money you can build an extremely good PC with a solid non-Nvidia GPU.
* The physical GPU sizes are massive; even letting the card rest on a horizontal motherboard looks scary.
* Nvidia still has issues with melting cables? I heard about those a few years ago and thought it was a solved problem.
* Proprietary frameworks like CUDA and others are going to fall at some point; it's just a matter of time.
It looks as if Nvidia at the moment is only looking at the AI market (which, as a personal belief, has to burst at some point) and simply does not care about the non-AI GPU market at all.
I remember, many years ago when I was a teenager and 3dfx was the dominant graphics card manufacturer, that John Carmack prophetically predicted in a gaming computer magazine (the article was about Quake I) that the future wasn't going to be 3dfx and Glide. A few years passed, and sure enough, 3dfx was gone.
Perhaps this is just the beginning of the same story that happened with 3dfx. I think AMD and Intel have a huge opportunity to balance the market and bring Nvidia down, in both the AI and gaming spaces.
I have only heard excellent things about Intel's Arc GPUs in other HN threads. If I need to build a new desktop PC from scratch, there's no way I'm paying the prices Nvidia is pushing on the market; I'll definitely look at Intel or AMD.
CUDA outlived several attempts to offer an open alternative by now, starting with OpenCL.
It's really ironic for a hardware company that its moat, such as it is, is largely about software. And it's not even that the software around CUDA is that great. But for some reason AMD is seemingly incapable of hitting even that low bar, even though they've had literally decades to catch up.
Intel, on the other hand, is seriously lagging behind on the hardware end.
As someone who fled the AMD/ATi ecosystem due to its quirky unreliability, Nvidia and Intel have really shit the bed these days. (I also had the misfortune of "upgrading" to a 13th-gen Intel processor just before we learned that they cook themselves.)
I do think DLSS supersampling is incredible, but Lord almighty is it annoying that frame generation is under the same umbrella, because that is nowhere near the same thing, and the water is awfully muddy since "DLSS" is often used without distinction.
The two largest supercomputers in the world are powered by AMD. I don't think it's accurate to say Nvidia has a monopoly on HPC.
- Government-funded outliers don’t disprove monopoly behavior. The two AMD-powered systems on the TOP500 list—both U.S. government funded—are exceptions driven by procurement constraints, not market dynamics. NVIDIA’s pricing is often prohibitive, and its dominance gives it the power to walk away from bids that don’t meet its margins. That’s not competition—it’s monopoly leverage.
- Market power isn't disproven by isolated wins. Monopoly status isn’t defined by having every win, but by the lack of viable alternatives in most of the market. In commercial AI, research, and enterprise HPC workloads, NVIDIA owns an overwhelming share—often >90%. That kind of dominance is monopoly-level control.
- AMD’s affordability is a symptom, not a sign of strength. AMD's lower pricing reflects its underdog status in a market it struggles to compete in—largely because NVIDIA has cornered not just the hardware but the entire CUDA software stack, developer ecosystem, and AI model compatibility. You don't need 100% market share to be a monopoly—you need control. NVIDIA has it.
In short: pointing to a couple of symbolic exceptions doesn’t change the fact that NVIDIA’s grip on the GPU compute stack—from software to hardware to developer mindshare—is monopolistic in practice.
Breaking a monopoly can be a solution to that, however. But having a large share of a market doesn't by itself trigger antitrust legislation.
Each year those performance margins seem to narrow. I paid over $1,000 for my RTX 4080 Super. That's ridiculous; no video card should cost over $1,000. So the next time I "upgrade," it won't be NVIDIA. I'll probably go back to AMD or Intel.
I would love to see Intel continue to develop video cards that are high performance and affordable. There is a huge market for those unicorns. AMD's model seems to be slightly less performance for slightly less money. Intel, on the other hand, is offering performance on par with AMD, and sometimes NVIDIA, for far less money: a winning formula.
NVIDIA got too greedy. They overplayed their hand. Time for Intel to focus on development and fill the gaping void in price-to-performance.
Here's another Nvidia/Mellanox BS problem: many MLX NIC cards are finalized or post-assembled by, say, HP. So if you have an HP "Mellanox" NIC, Nvidia washes their hands of anything detailed. It's not ours; HP could have done anything to it, what do we know? So one phones HP... and they have no clue either, because it's really not their IP or their drivers.
It's a total cluster-bleep, and more and more it's why corporate America sucks.
I'll have to take your word on that.
And if I take your word: ergo the Connect-X support sucks.
So that's a "sucks" on the table yet again... for what, the third time? Nvidia sucks.
Every line of the article convinces me I'm reading bad rage bait; every comment in the thread confirms it's working.
The article provides a nice list of grievances from the "optimized YouTube channel tech expert" sphere ("doink" face and arrow in the thumbnail or GTFO), and none of them really stick. Except for the part where nVidia is clearly leaving money on the table... From the 5080 up, no one can compete, with or without "fake frames", at any price. I'd love to take the dividends on the sale of the top 3 cards, but that money is going to scalpers.
If nvidia is winning, it's because competitors and regulators are letting them.