Posted by todsacerdoti 1 day ago

Nvidia won, we all lost (blog.sebin-nyshkim.net)
888 points | 531 comments
spoaceman7777 1 day ago|
The real issue here is actually harebrained youtubers stirring up drama for views. That's 80% of the problem. And their viewers (and readers, for what makes it into print) eat it up.

Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly, posting to social media, and youtubers jumping on the trend for likes.

These are 99% user error issues drummed up by non-professionals (and, in some cases, people paid by 3rd party vendors to protect those vendors' reputation).

And the complaints about transient performance issues with drivers, drummed up into apocalyptic scenarios, again, by youtubers who are putting this stuff under a microscope for views, are universal across every single hardware and software product. Everything.

Claiming "DLSS is snakeoil", and similar things are just an expression of the complete lack of understanding of the people involved in these pot-stirring contests. Like... the technique obviously couldn't magically multiply the ability of hardware to generate frames using the primary method. It is exactly as advertised. It uses machine learning to approximate it. And it's some fantastic technology, that is now ubiquitous across the industry. Support and quality will increase over time, just like every _quality_ hardware product does during its early lifespan.

It's all so stupid and rooted in greed by those seeking ad-money, and those lacking in basic sense or experience in what they're talking about and doing. Embarrassing for the author to so publicly admit to eating up social media whinging.

grg0 1 day ago||
If you've ever watched a GN or LTT video, they never claimed that DLSS is snake oil. They specifically call out the pros of the technology, but also point out that Nvidia lies, very literally, about its performance claims in marketing material. Both statements are true and not mutually exclusive. I think people like the author of this post get worked up about the false marketing and (understandably) develop a negative view of the technology as a whole.

> Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly

This is not true. Even GN reproduced the melting of the first-party cable.

Also, why shouldn't you be able to use third-party cables? Fuck DRM too.

spoaceman7777 1 day ago||
I'm referring to the section header in this article. Youtubers are not a truly hegemonic group, but there is a set of ideas and narratives that pervades the group as a whole, which different subsets buy into and push, and this is one that exists in the overall sphere of people who discuss the use of hardware for gaming.
grg0 1 day ago||
Well, I can't speak for all youtubers, but I do watch most GN and LTT videos, and their complaints are legitimate; they're not random jabronis yolo'ing hardware installations.
spoaceman7777 1 day ago||
As far as I know, neither of them have had a card unintentionally light on fire.

The whole thing started with Derbauer going to bat for a cable from some 3rd party vendor that he'd admitted he'd already plugged in and out of various cards something like 50 times.

The actual instances that youtubers report on are all reddit posters and other random social media users who would clearly be better off getting a professional installation. The huge popularity for enthusiast consumer hardware, due to the social media hype cycle, has brought a huge number of naive enthusiasts into the arena. And they're getting burned by doing hardware projects on their own. It's entirely unsurprising, given what happens in all other realms of amateur hardware projects.

Most of those who are whinging about their issues are false-positive user errors. The actual failure rates (and there are device failures) are far lower, and that's what warranties are for.

grg0 1 day ago||
I'm sure the failure rates are blown out of proportion, I agree with that.

But the fact of the matter is that Nvidia has shifted from a consumer business to b2b, and they don't even give a shit about pretending they care anymore. People have beef with that, understandably, and when you couple that with the false marketing, the lack of inventory, the occasional hardware failure, missing ROPs, insane prices that nobody can afford and all the other shit that's wrong with these GPUs, then this is the end result.

Rapzid 1 day ago||
GN were the OG "fake framers", going back to their constant shade-casting on DLSS, ignoring it in their reviews, and also crapping on RT.

AI upscaling, AI denoising, and RT were clearly the future even 6 years ago. CDPR and the rest of the industry knew it, but outlets like GN pushed a narrative (borderline conspiracy) that the developers were somehow out of touch and didn't know what they were talking about.

There is a contingent of gamers who play competitive FPS, most of whom are, like in all casual competitive hobbies, not very good. But they ate up the 240Hz, rasterization-above-all meat GN was feeding them. Then they think they are the majority and speak for all gamers (as every loud minority on the internet does).

Fast forward 6 years and NVidia is crushing the Steam top 10 GPU list, AI rendering techniques are becoming ubiquitous, and RT is slowly edging out rasterization.

Now that the data is clear, the narrative is that most consumers are "suckers" for purchasing NVidia, Nintendo, etc. And the content creator economy will be there to tell them they are right.

Edit: I believe some of these outlets also had chips on their shoulders regarding NVidia going way back. So AMD's poor RT performance and lack of any competitive answer to the DLSS suite for YEARS had them lying to themselves about where the industry was headed. Essentially they were running interference for AMD. Now that FSR4 is finally here, it's like AI upscaling is finally ok.

PoshBreeze 1 day ago||
> The RTX 4090 was massive, a real heccin chonker. It was so huge in fact, that it kicked off the trend of needing support brackets to keep the GPU from sagging and straining the PCIe slot.

This isn't true. People were buying brackets with 10 series cards.

zoobab 22 hours ago||
Not enough VRAM to load big LLMs, so they don't compete with their expensive high end. Market segmentation, it's called.
tonyhart7 1 day ago||
Consumer GPUs have felt like a "paper launch" for the past few years

it's like they're purposely not selling them because they allocated 80% of their production to enterprise only

I just hope the new fabs come online as early as possible, because these prices are insane

avipars 19 hours ago||
If only NVIDIA could use their enterprise solution on consumer hardware.
jes5199 1 day ago||
with Intel also shitting the bed, it seems like AMD is poised to pick up “traditional computing” while everybody else runs off to chase the new gold rush. Presumably there’s still some money in desktops and gaming rigs?
dofubej 1 day ago||
> With over 90% of the PC market running on NVIDIA tech, they’re the clear winner of the GPU race. The losers are every single one of us.

Of course the fact that we overwhelmingly chose the better option means that… we are worse off or something?

atq2119 1 day ago||
That bit does seem a bit whiny. AMD's latest offerings are quite good, certainly better value for money. Why not buy that? The only shame is that they don't sell anything as massive as Nvidia's high end.
johnklos 21 hours ago|||
Many of you chose Windows, so, well, yes.
ohdeargodno 1 day ago||
Choosing the vendor-locked-in, standards-hating brand does tend to mean you inevitably get screwed when they decide to massively inflate their prices and there's nothing you can do about it, so yes, it does tend to make you worse off.

Not that AMD was anywhere near being in a good state 10 years ago. Nvidia still fucked you over.

alganet 1 day ago||
Right now, all silicon talk is bullshit. It has been for a while.

It became obvious when old e-waste Xeons were turned into viable, usable machines, years ago.

Something is obviously wrong with this entire industry, and I cannot wait for it to pop. THIS will be the excitement everyone is looking for.

bigyabai 1 day ago||
A lot of those Xeon e-waste machines were downright awful, especially for the "cheap gaming PC" niche they were popular in. Low single-core clock speeds, low memory bandwidth for desktop-style configurations and super expensive motherboards that ran at a higher wattage than the consumer alternatives.

> THIS will be the excitement everyone is looking for.

Or TSMC could become geopolitically jeopardized somehow, drastically increasing the secondhand value of modern GPUs even beyond what they're priced at now. It's all a system of scarcity, things could go either way.

alganet 1 day ago||
They were awful compared to newer models, but for the price of nothing, pretty good deal.

If no good use is found for high-end GPUs, secondhand models will be like AOL CDs.

bigyabai 1 day ago||
Sure, eventually. Then in 2032, you can enjoy the raster performance that slightly-affluent people in 2025 had for years.

By your logic, people should be snatching up 900 and 1000-series cards by the truckload if the demand were so huge. But a GTX 980 is like $60 these days, and honestly not very competitive in many departments. Neither it nor the 1000-series has driver support nowadays, so most users will reach for a more recent card.

alganet 1 day ago||
There's no zero-cost e-waste like that anymore; it was a one-time thing.

Also, it's not "a logic", and it's not a consumer recommendation. It was a fluke in the industry that, to me, represents a symptom.

gizajob 1 day ago||
Do you have a timeframe for the pop? I need some excitement.
alganet 1 day ago|||
More a sequence of potential events than a timeframe.

High-end GPUs are already useless for gaming (a low-end GPU is enough), their traditional source of demand. They're floating on artificial demand for a while now.

There are two markets that currently could use them: LLMs and Augmented Reality. Both of these are currently useless, and getting more useless by the day.

CPUs are just piggybacking on all of this.

So, lots of things hanging on unrealized promises. It will pop when there is no next use for super high-end GPUs.

War is a potential user of such devices, and I predict it could be the next thing after LLMs and AR. But then, if war breaks out at such a scale as to drive silicon prices up, lots of things are going to pop, and food and fuel will boom to such a magnitude that it will make silicon look silly.

I think it will pop before it comes to the point of war driving it, and it will happen within our lifetimes (so, not a Nostradamus-style prediction that will only be realized long after I'm dead).

int_19h 13 hours ago|||
> High-end GPUs are already useless for gaming (a low-end GPU is enough), their traditional source of demand. They're floating on artificial demand for a while now.

This is not the case if you want things like ray tracing or 4K.

selfhoster11 1 day ago||||
Local LLMs are becoming more popular and easier to run, and Chinese corporations are releasing extremely good models of all sizes, in many cases under MIT or similar terms. The amount of VRAM is the main limiter, and more of it would help with gaming too.
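
A rough back-of-envelope sketch of why VRAM is the limiter (Python; the model sizes, quantization levels and overhead factor here are illustrative assumptions, not measurements):

    # Estimate VRAM needed to run an LLM locally, assuming the weights
    # dominate memory use; KV cache and activations are folded into a
    # loose ~20% overhead factor.
    def estimate_vram_gb(params_billion: float, bytes_per_param: float,
                         overhead: float = 0.2) -> float:
        weights_gb = params_billion * 1e9 * bytes_per_param / (1024 ** 3)
        return weights_gb * (1 + overhead)

    # A 7B model at FP16 (2 bytes/param) vs. 4-bit quantization (~0.5 bytes/param):
    print(round(estimate_vram_gb(7, 2.0), 1))  # ~15.6 GB: out of reach for most consumer cards
    print(round(estimate_vram_gb(7, 0.5), 1))  # ~3.9 GB: fits comfortably on an 8 GB card
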
alganet 1 day ago||
Gaming needs no additional VRAM.

From a market perspective, LLMs sell GPUs. Doesn't even matter if they work or not.

From the geopolitical tensions perspective, they're the perfect excuse to create infrastructure for a global analogue of the Great Firewall (something that the Chinese are pioneers of, and catching up to the plan).

From the software engineering perspective, LLMs are a nuisance, a distraction. They harm everyone.

selfhoster11 1 day ago||
> Gaming needs no additional VRAM.

Really? What about textures? Any ML that the new wave of games might use? For instance, while current LLMs powering NPC interactions would be pretty horrible, what about in 2 years time? You could have arbitrary dialogue trees AND dynamically voiced NPCs or PCs. This is categorically impossible without more VRAM.

> the perfect excuse to create infrastructure for a global analogue of the Great Firewall

Yes, let's have more censorship and kill the dream of the Internet even deader than it already is.

> From the software engineering perspective, LLMs are a nuisance, a distraction. They harm everyone.

You should be aware that reasonable minds can differ on this issue. I won't defend companies forcing the use of LLMs (it would be like forcing the use of vim or any other tech you dislike), but I disagree about them being a nuisance, a distraction, or a universal harm. It's all down to choices and fit for the use case.

alganet 1 day ago||
How is any of that related to actual silicon sales strategies?

Do not mistake adjacent topics for the main thing I'm discussing. It only proves my point that right now, all silicon talk is bullshit.

rightbyte 1 day ago||||
I don't see how GPU factories could keep running in the event of a war "at such a scale as to drive silicon prices up". Unless you mean that supply will be low and people will be scavenging TI calculators for processors to make boxes that play Tetris and Space Invaders.
alganet 1 day ago||
Why not?

This is the exact model in which WWII operated. Car and plane supply chains were practically nationalized to support the military industry.

If drones, surveillance, satellites become the main war tech, they'll all use silicon, and things will be fully nationalized.

We already have all sorts of hints of this. It doesn't take a genius to predict that this could be what happens to these industries.

The balance with food and fuel is more delicate though. A war with drones, satellites and surveillance is not like WWII, there's a commercial aspect to it. If you put it on paper, food and fuel project more power and thus, can move more money. Any public crisis can make people forget about GPUs and jeopardize the process of nationalization that is currently being implemented, which still depends on relatively peaceful international trade.

rightbyte 1 day ago|||
> Why not?

Bombs that fly between continents or are launched from submarines for any "big scale" war.

alganet 1 day ago||
I don't see how this is connected to what you said before.
rightbyte 1 day ago||
My point is that GPU factories are big static targets with sensitive supply chains, and thus have no strategic value since they are so easy to disrupt.
alganet 23 hours ago||
So are airplane and car factories. I already explained all of this, what keeps the supply chain together, and what their strategic value is.
rightbyte 15 hours ago||
I have no clue if we agree with each other or not.
newsclues 1 day ago|||
CPU and GPU compute will be needed for military use, processing the vast amounts of data from all sorts of sensors. Think about data centres crunching satellite imagery for trenches, fortifications and vehicles.
alganet 1 day ago||
> satellite imagery for trenches, fortifications and vehicles

Dude, you're describing the 80s. We're in 2025.

GPUs will be used for automated surveillance, espionage, brainwashing and market manipulation. At least that's what the current batch of technologies implies.

The only thing stopping this from becoming a full dystopia is that delicate balance with food and fuel I mentioned earlier.

It has become pretty obvious that entire wealthy nations can starve if they make the wrong move. Turns out GPUs cannot produce calories, and there's a limit to how much of a market you can manipulate to produce calories for you.

grg0 1 day ago|||
Hell, yeah. I'm in for some shared excitement too if y'all want to get some popcorn.
shmerl 1 day ago|
> ... NVENC are pretty much indispensable

What's so special about NVENC that Vulkan video or VAAPI can't provide?

> AMD also has accelerated video transcoding tech but for some reason nobody seems to be willing to implement it into their products

OBS works with VAAPI fine. Looking forward to them adding Vulkan video as an option.
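
For what it's worth, the same transcode can be driven through either vendor's encoder; a rough sketch (hypothetical file names and render-node path, using the standard ffmpeg flags for each backend):

    # Rough comparison: the same H.264 hardware encode via NVENC vs. VAAPI,
    # invoked through ffmpeg from Python.
    import subprocess

    # NVIDIA path: NVENC encoder.
    subprocess.run([
        "ffmpeg", "-i", "input.mkv",
        "-c:v", "h264_nvenc",
        "out_nvenc.mp4",
    ], check=True)

    # AMD/Intel path: VAAPI encoder (the same backend OBS exposes on Linux).
    subprocess.run([
        "ffmpeg", "-vaapi_device", "/dev/dri/renderD128",
        "-i", "input.mkv",
        "-vf", "format=nv12,hwupload",
        "-c:v", "h264_vaapi",
        "out_vaapi.mp4",
    ], check=True)
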

Either way, as a Linux gamer I haven't touched Nvidia in years. AMD is a way better experience.
