Posted by todsacerdoti 1 day ago
Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly, posting to social media, and youtubers jumping on the trend for likes.
These are 99% user error issues drummed up by non-professionals (and, in some cases, people paid by 3rd party vendors to protect those vendors' reputation).
And the complaints about transient performance issues with drivers, drummed up into apocalyptic scenarios, again, by youtubers who are putting this stuff under a microscope for views, are universal across every single hardware and software product. Everything.
Claiming "DLSS is snakeoil", and similar things are just an expression of the complete lack of understanding of the people involved in these pot-stirring contests. Like... the technique obviously couldn't magically multiply the ability of hardware to generate frames using the primary method. It is exactly as advertised. It uses machine learning to approximate it. And it's some fantastic technology, that is now ubiquitous across the industry. Support and quality will increase over time, just like every _quality_ hardware product does during its early lifespan.
It's all so stupid, rooted in greed from those seeking ad money, and in a lack of basic sense or experience in what they're talking about and doing. Embarrassing for the author to so publicly admit to eating up social media whinging.
> Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly
This is not true. Even GN reproduced the melting of the first-party cable.
Also, why shouldn't you be able to use third-party cables? Fuck DRM too.
The whole thing started with Derbauer going to bat for a cable from some 3rd party vendor that he'd admitted he'd already plugged in and out of various cards something like 50 times.
The actual instances that youtubers report on are all reddit posters and other random social media users who would clearly be better off getting a professional installation. The huge popularity of enthusiast consumer hardware, driven by the social media hype cycle, has brought a huge number of naive enthusiasts into the arena. And they're getting burned by doing hardware projects on their own. It's entirely unsurprising, given what happens in every other realm of amateur hardware projects.
Most of those whinging about their issues are false positives caused by user error. The actual failure rates (and there are device failures) are far lower, and that's what warranties are for.
But the fact of the matter is that Nvidia has shifted from a consumer business to b2b, and they don't even give a shit about pretending they care anymore. People take issue with that, understandably, and when you couple it with the false marketing, the lack of inventory, the occasional hardware failure, missing ROPs, insane prices that nobody can afford, and all the other shit that's wrong with these GPUs, then this is the end result.
AI upscaling, AI denoising, and RT were clearly the future even 6 years ago. CDPR and the rest of the industry knew it, but outlets like GN pushed a narrative (borderline conspiracy) that the developers were somehow out of touch and didn't know what they were talking about.
There is a contingent of gamers who play competitive FPS. Most of them are, as in all casual competitive hobbies, not very good. But they ate up the 240Hz, rasterization-is-everything meat GN was feeding them. Then they think they are the majority and speak for all gamers (as every loud minority on the internet does).
Fast forward 6 years and NVidia is crushing the Steam top 10 GPU list, AI rendering techniques are becoming ubiquitous, and RT is slowly edging out rasterization.
Now that the data is clear, the narrative is that most consumers are "suckers" for purchasing NVidia, Nintendo, etc. And the content creator economy will be there to tell them they are right.
Edit: I also believe some of these outlets had a chip on their shoulder regarding NVidia going way back. So AMD's poor RT performance and lack of any competitive answer to the DLSS suite for YEARS had them lying to themselves about where the industry was headed. Essentially they were running interference for AMD. Now that FSR4 is finally here, it's like AI upscaling is suddenly ok.
This isn't true. People were buying brackets with 10 series cards.
It's like they're purposely not selling to consumers because they allocated 80% of their production to enterprise only.
I just hope the new fabs come online as early as possible, because these prices are insane.
Of course the fact that we overwhelmingly chose the better option means that… we are worse off or something?
Not that AMD was anywhere near being in a good state 10 years ago. Nvidia still fucked you over.
It became obvious years ago, when old e-waste Xeons were turned into viable, usable machines.
Something is obviously wrong with this entire industry, and I cannot wait for it to pop. THIS will be the excitement everyone is looking for.
> THIS will be the excitement everyone is looking for.
Or TSMC could become geopolitically jeopardized somehow, drastically increasing the secondhand value of modern GPUs even beyond what they're priced at now. It's all a system of scarcity, things could go either way.
If no good use is found for high-end GPUs, secondhand models will be like AOL CDs.
By your logic people should be snatching up 900- and 1000-series cards by the truckload if the demand were so huge. But a GTX 980 is like $60 these days, and honestly not very competitive in many departments. Neither it nor the 1000-series has driver support nowadays, so most users will reach for a more recent card.
Also, it's not "a logic", it's not a cosumer recomendation. It was a fluke in the industry that to me, represents a symptom.
High-end GPUs are already useless for gaming (a low-end GPU is enough), their traditional source of demand. They're floating on artificial demand for a while now.
There are two markets that currently could use them: LLMs and Augmented Reality. Both of these are currently useless, and getting more useless by the day.
CPUs are just piggybacking on all of this.
So, lots of things hanging on unrealized promises. It will pop when there is no next use for super high-end GPUs.
War is a potential user of such devices, and I predict it could be the next thing after LLMs and AR. But if war breaks out on such a scale as to drive silicon prices up, lots of things are going to pop, and food and fuel will boom to such a magnitude that it will make silicon look silly.
I think it will pop before it comes to the point of war driving it, and it will happen within our lifetimes (so, not a Nostradamus-style prediction that will only be realized long after I'm dead).
This is not the case if you want things like ray tracing or 4K.
From a market perspective, LLMs sell GPUs. Doesn't even matter if they work or not.
From the geopolitical tensions perspective, they're the perfect excuse to create infrastructure for a global analogue of the Great Firewall (something that the Chinese are pioneers of, and that everyone else is now catching up to).
From the software engineering perspective, LLMs are a nuisance, a distraction. They harm everyone.
Really? What about textures? Or any ML that the new wave of games might use? For instance, while current LLMs powering NPC interactions would be pretty horrible, what about in 2 years' time? You could have arbitrary dialogue trees AND dynamically voiced NPCs or PCs. This is categorically impossible without more VRAM.
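Rough back-of-the-envelope math for that VRAM point, with illustrative assumptions (a 7B-parameter model, 4-bit weights, common 7B-style layer shapes), not measurements of any shipping game or model:

```python
# Back-of-the-envelope VRAM estimate for running a local NPC dialogue model
# alongside a game. All numbers below are illustrative assumptions.

def model_vram_gb(params_b, bits_per_weight, layers, hidden, context, batch=1):
    """Rough VRAM estimate: quantized weights plus an fp16 KV cache."""
    weights = params_b * 1e9 * bits_per_weight / 8            # bytes
    kv_cache = 2 * layers * hidden * context * batch * 2      # K and V, 2 bytes each
    return (weights + kv_cache) / 1e9

# Assumed: 7B parameters, 4-bit quantization, 32 layers, hidden size 4096,
# 4k tokens of dialogue context.
print(f"{model_vram_gb(7, 4, layers=32, hidden=4096, context=4096):.1f} GB")
# ~5.6 GB, before counting the several GB the game itself needs for
# textures and render targets, which is the commenter's point.
```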
> the perfect excuse to create infrastructure for a global analogue of the Great Firewall
Yes, let's have more censorship and kill the dream of the Internet even deader than it already is.
> From the software engineering perspective, LLMs are a nuisance, a distraction. They harm everyone.
You should be aware that reasonable minds can differ on this issue. I won't defend companies forcing the use of LLMs (it would be like forcing use of vim or any other tech you dislike), but I disagree about them being a nuisance, a distraction, or a universal harm. It's all down to choices and fit for the use case.
Do not mistake adjacent topics for the main thing I'm discussing. It only proves my point that right now, all silicon talk is bullshit.
This is the exact model in which WWII operated. Car and plane supply chains were practically nationalized to support the military industry.
If drones, surveillance, satellites become the main war tech, they'll all use silicon, and things will be fully nationalized.
We already have all sorts of hints of this. It doesn't take a genius to predict that this could be what happens to these industries.
The balance with food and fuel is more delicate though. A war with drones, satellites and surveillance is not like WWII, there's a commercial aspect to it. If you put it on paper, food and fuel project more power and thus, can move more money. Any public crisis can make people forget about GPUs and jeopardize the process of nationalization that is currently being implemented, which still depends on relatively peaceful international trade.
Bombs that fly between continents or are launched from submarines for any "big scale" war.
Dude, you're describing the 80s. We're in 2025.
GPUs will be used for automated surveillance, espionage, brainwashing and market manipulation. At least that's what the current batch of technologies implies.
The only thing stopping this from becoming a full dystopia is that delicate balance with food and fuel I mentioned earlier.
It has become pretty obvious that entire wealthy nations can starve if they make the wrong move. Turns out GPUs cannot produce calories, and there's a limit to how much of a market you can manipulate to produce calories for you.
What's so special about NVENC that Vulkan video or VAAPI can't provide?
> AMD also has accelerated video transcoding tech but for some reason nobody seems to be willing to implement it into their products
OBS works with VAAPI fine. Looking forward to them adding Vulkan video as an option.
Either way, as a Linux gamer I haven't touched Nvidia in years. AMD is a way better experience.
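For reference, a minimal sketch of the two hardware encode paths driven from Python. Assumptions: an ffmpeg build with VAAPI and NVENC enabled, a render node at /dev/dri/renderD128, and hypothetical input/output filenames.

```python
import subprocess

# Hardware H.264 encode via VAAPI (AMD/Intel): upload frames to the GPU,
# then hand them to the h264_vaapi encoder.
vaapi = [
    "ffmpeg", "-y",
    "-vaapi_device", "/dev/dri/renderD128",   # render node; path may differ
    "-i", "input.mp4",                        # hypothetical input file
    "-vf", "format=nv12,hwupload",            # convert + upload frames to the GPU
    "-c:v", "h264_vaapi",
    "out_vaapi.mp4",
]

# Equivalent encode via NVENC on an Nvidia card.
nvenc = [
    "ffmpeg", "-y",
    "-i", "input.mp4",
    "-c:v", "h264_nvenc",
    "out_nvenc.mp4",
]

subprocess.run(vaapi, check=True)   # swap in `nvenc` on Nvidia hardware
```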