Posted by todsacerdoti 7/4/2025

Nvidia won, we all lost (blog.sebin-nyshkim.net)
995 points | 571 comments (page 4)
bigyabai 7/4/2025|
> Pretty much all upscalers force TAA for anti-aliasing and it makes the entire image on the screen look blurry as fuck the lower the resolution is.

I feel like this is a misunderstanding, though I admit I'm splitting hairs here. DLSS is a form of TAA, and so are FSR and most other modern upscalers. You generally don't need an extra antialiasing pipeline if you're getting an artificially supersampled image.

We've seen this technique developed in various forms across the lifespan of realtime raster graphics: first with checkerboard rendering, then TAA, and now DLSS/frame generation. It has upsides and downsides, and some TAA implementations were actually really good for the time.

kbolino 7/4/2025||
Every kind of TAA that I've seen creates artifacts around fast-moving objects. This may sound like a niche problem only found in fast-twitch games but it's cropped up in turn-based RPGs and factory/city builders. I personally turn it off as soon as I notice it. Unfortunately, some games have removed traditional MSAA as an option, and some are even making it difficult to turn off AA when TAA and FXAA are the only options (though you can usually override these restrictions with driver settings).
user____name 7/5/2025|||
The sad truth is that with rasterization every renderer needs to be designed around a specific set of antialiasing solutions. Antialiasing is like a big wall in your rendering pipeline: there's the stuff you can do before resolving and the stuff you can do afterwards. The problem with MSAA is that it is tightly coupled with all your architectural rendering decisions. TAA, by contrast, is simply the easiest to implement and it kills a lot of proverbial birds with one stone. And it can all be implemented as essentially a post-processing effect; it has much less of that tight coupling.

MSAA only helps with geometric edges; shader aliasing can be combated with prefiltering, but even then it's difficult to get rid of it completely. MSAA also needs beefy multisample intermediate buffers, which makes it pretty much a non-starter on heavily deferred rendering pipelines, which throw away coverage information to fit their framebuffer budget. On top of that, the industry moved to stochastic effects for rendering all kinds of things that were too expensive before, the latest being actual realtime path tracing. I know people moan about TAA and DLSS, but doing realtime path tracing at 4k is sort of nuts really. I still consider it a bit of a miracle we can do it at all.
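
To make the "essentially a post-processing effect" point concrete, here is a minimal sketch of a TAA resolve step. The reprojected history buffer and the blend weight are illustrative assumptions; real implementations add jitter, motion-vector dilation, clipping in a luma-chroma space, and so on.

    import numpy as np

    def taa_resolve(current, history_reprojected, alpha=0.1):
        # current, history_reprojected: (H, W, 3) float color buffers.
        # alpha: weight of the new frame; lower = smoother but more ghosting.
        h, w, _ = current.shape
        # Clamp history to the 3x3 neighborhood of the current frame to
        # reduce ghosting around moving objects.
        padded = np.pad(current, ((1, 1), (1, 1), (0, 0)), mode="edge")
        shifted = np.stack([padded[dy:dy + h, dx:dx + w]
                            for dy in range(3) for dx in range(3)])
        lo, hi = shifted.min(axis=0), shifted.max(axis=0)
        history_clamped = np.clip(history_reprojected, lo, hi)
        # The resolve itself is just an exponential blend over time.
        return alpha * current + (1.0 - alpha) * history_clamped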

Personally, I wish there was more research by big players into things like texture space lighting, which makes shading aliasing mostly go away, plays nice with alpha blending and would make MSAA viable again. The issue there is with shading only the stuff you see and not wasting texels.

kbolino 7/5/2025||
There's another path, which is to raise the pixel densities so high we don't need AA (as much) anymore, but I'm going to guess it's a) even more expensive and b) not going to fix all the problems anyway.
MindSpunk 7/5/2025||
That's just called super sampling. Render at 4k+ and down sample to your target display. It's as expensive as it sounds.
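
A minimal sketch of that "render big, average down" idea, with the factor of 2 purely as an example:

    import numpy as np

    def ssaa_downsample(img, k=2):
        # img: (k*H, k*W, 3) frame rendered at k times the target resolution.
        # Box filter: average each k x k block down to one output pixel.
        kh, kw, c = img.shape
        return img.reshape(kh // k, k, kw // k, k, c).mean(axis=(1, 3))
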
kbolino 7/5/2025||
No, I mean high pixel densities all the way to the display.

SSAA is an even older technique than MSAA but the results are not visually the same as just having a really high-DPI screen with no AA.

int_19h 7/6/2025||
Up to a point. I would argue that 8K downsampled to 4K is practically indistinguishable from native 8K.
ohdeargodno 7/4/2025|||
It's not that it's difficult to turn off TAA: it's that so many modern techniques do not work without temporal accumulation and anti-aliasing.

Ray tracing? Temporal accumulation and denoising. Irradiance cache? Temporal accumulation and denoising. Most modern light rendering techniques cannot be done in time in a single frame. Add to that the fact that deferred or hybrid rendering makes implementing MSAA anywhere between "miserable" and "impossible", and you have the situation we're in today.
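
As a rough sketch of the temporal accumulation these techniques lean on (the reprojected history and per-pixel validity mask are assumed to come from the renderer; the names here are made up for illustration):

    import numpy as np

    def accumulate(noisy, history, n, valid):
        # noisy, history: (H, W, 3) radiance; n: (H, W) frames accumulated;
        # valid: (H, W) bool, False where reprojection failed (disocclusion).
        n = np.where(valid, n + 1, 1)
        history = np.where(valid[..., None], history, noisy)
        # Running mean across frames; resets to the fresh sample when invalid.
        return history + (noisy - history) / n[..., None], n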

kbolino 7/4/2025||
A lot of this is going to come down to taste so de gustibus and all that, but this feels like building on a foundation of sand. If the artifacts can be removed (or at least mitigated), then by all means let's keep going with cool new stuff as long as it doesn't detract from other aspects of a game. But if they can't be fixed, then either these techniques ought to be relegated to special uses (like cutscenes or the background, kinda like the pre-rendered backdrops of FF7) or abandoned/rethought as pretty but impractical.
ohdeargodno 7/5/2025||
So, there is a way to make TAA and various temporal techniques look basically flawless: they need a _lot_ of information and pixels.

You need a 4k rendering resolution, at least. Modern effects look stunning at that res.

Unfortunately, nothing runs well at 4k with all the effects on.

yalok 7/5/2025||
A friend of mine is a SW developer at Nvidia, working on their drivers. He was complaining lately that he's required to fix a few bugs in the driver code for the new card (RTX?) while not being provided with the actual hardware. His pleas to be sent this HW were ignored, but the demand to fix the bugs by a deadline kept being pushed.

He actually ended up buying older but somewhat similar used hardware with his personal money, to be able to do his work.

Not even sure if he was eventually able to expense it, but I wouldn't be surprised if not, knowing how big-company bureaucracy works...

Nifty3929 7/5/2025||
I just don't think NVidia cares all that much about its gaming cards, except to the extent that they don't want to cede too much ground to AMD and want to preserve their image in that market for now. Basically they don't want to lose the legions of gaming fans that got them started, and who still carry the torch. But they'll produce the minimum number of gaming cards needed to accomplish that.

Otherwise the money is in the datacenter (AI/HPC) cards.

FeepingCreature 7/4/2025||
Oh man, you haven't even gotten into their AI benchmark bullshittery. There are factors of 4x in their numbers that are basically invented out of whole cloth by switching units.
robbies 7/6/2025|
They “learned” this trick from their consumer days. Devs always had to reverse-engineer the hypothetical scaling from their fantasy numbers.
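
As a purely hypothetical illustration of the unit-switching point (these are not Nvidia's actual figures): quote a lower precision and a "with sparsity" number, and the headline quadruples while the silicon stays the same.

    fp16_dense = 250                  # made-up baseline, TFLOPS
    fp8_dense = fp16_dense * 2        # halve the precision: 2x on paper
    fp8_sparse = fp8_dense * 2        # quote 2:4 structured sparsity: 2x more
    print(fp8_sparse / fp16_dense)    # 4.0x "speedup", same chip
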
musebox35 7/5/2025||
With the rise of LLM training, Nvidia’s main revenue stream switched to datacenter GPUs (>10x gaming revenue). I wonder whether this has affected the quality of these consumer cards, including both their design and production processes:

https://stockanalysis.com/stocks/nvda/metrics/revenue-by-seg...

kldg 7/6/2025||
the big reason I upgrade GPUs these days is for more VRAM for LLMs and diffusion models. I don't care (or need to care, really) as much about gaming -- along with great Proton support, running things from a midrange Linux-based gaming PC I have shoved in my home server rack works great via Steam's Remote Play (NoMachine also pretty good), but I play strategy/spreadsheet games, not twitchy FPS games.

my most recent upgrade was for a 4090, but that gives me only 24GB VRAM, and it's too expensive to justify buying two of them. I also have an antique Kepler datacenter GPU, but Nvidia cut driver support a long while ago, making the software quite a pain to get sorted. there's a nonzero chance I will wind up importing a Moore Threads GPU for my next purchase; Nvidia's just way too expensive, and I don't need blazing fast speeds given that most of my workloads run well inside the time I'm sleeping, but I can't be running at CPU speed; I need everything to fit into VRAM. I'd alternatively be stoked for Intel to cater to me. $1500, 48GB+ VRAM, good pytorch support; make it happen, somebody.
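
A rough back-of-the-envelope sketch of the "everything has to fit into VRAM" constraint; the overhead factor is a guess, not a rule:

    def vram_gb(params_billion, bits_per_weight, overhead=1.2):
        # Weights only, times a fudge factor for KV cache and activations.
        return params_billion * (bits_per_weight / 8) * overhead

    print(vram_gb(70, 4))   # ~42 GB: a 4-bit 70B model won't fit in 24 GB
    print(vram_gb(32, 4))   # ~19 GB: a 4-bit 32B model fits on a 24 GB card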

Ancapistani 7/5/2025||
I disagree with some of the article’s points (primarily, that nVidia’s drivers were ever “good”), but I agree with the gist.

I have a 4070 Ti right now. I use it for inference and VR gaming on a Pimax Crystal (2880x2880x2). In War Thunder I get ~60 FPS. I’d love to be able to upgrade to a card with at least 16GB of VRAM and better graphics performance… but as far as I can tell, such a card does not exist at any price.

spoaceman7777 7/5/2025||
The real issue here is actually harebrained youtubers stirring up drama for views. That's 80% of the problem. And their viewers (and readers, for whatever makes it into print) eat it up.

Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly, posting to social media, and youtubers jumping on the trend for likes.

These are 99% user error issues drummed up by non-professionals (and, in some cases, people paid by 3rd party vendors to protect those vendors' reputation).

And the complaints about transient performance issues with drivers, drummed up into apocalyptic scenarios, again, by youtubers who are putting this stuff under a microscope for views, are universal across every single hardware and software product. Everything.

Claiming "DLSS is snake oil", and similar things, is just an expression of the complete lack of understanding of the people involved in these pot-stirring contests. Like... the technique obviously couldn't magically multiply the hardware's ability to generate frames using the primary method. It is exactly as advertised: it uses machine learning to approximate it. And it's fantastic technology that is now ubiquitous across the industry. Support and quality will increase over time, just as they do for every _quality_ hardware product during its early lifespan.

It's all so stupid and rooted in greed by those seeking ad-money, and those lacking in basic sense or experience in what they're talking about and doing. Embarrassing for the author to so publicly admit to eating up social media whinging.

grg0 7/5/2025||
If you've ever watched a GN or LTT video, they never claimed that DLSS is snake oil. They specifically call out the pros of the technology, but also point out that Nvidia lies, very literally, about its performance claims in marketing material. Both statements are true and not mutually exclusive. I think people like the author of this post get worked up about the false marketing and, understandably, develop a negative view of the technology as a whole.

> Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly

This is not true. Even GN reproduced the melting of the first-party cable.

Also, why shouldn't you be able to use third-party cables? Fuck DRM too.

spoaceman7777 7/5/2025||
I'm referring to the section header in this article. Youtubers are not a truly hegemonic group, but there is a set of ideas and narratives that pervades the group as a whole, which different subsets buy into and push, and this is one that exists in the overall sphere of people who discuss the use of hardware for gaming.
grg0 7/5/2025||
Well, I can't speak for all youtubers, but I do watch most GN and LTT videos, and their complaints are legitimate; nor are they random jabronis yolo'ing hardware installations.
spoaceman7777 7/5/2025||
As far as I know, neither of them has had a card unintentionally light on fire.

The whole thing started with Derbauer going to bat for a cable from some 3rd party vendor that he'd admitted he'd already plugged in and out of various cards something like 50 times.

The actual instances that youtubers report on are all reddit posters and other random social media users who would clearly be better off getting a professional installation. The huge popularity of enthusiast consumer hardware, due to the social media hype cycle, has brought a huge number of naive enthusiasts into the arena. And they're getting burned by doing hardware projects on their own. It's entirely unsurprising, given what happens in all other realms of amateur hardware projects.

Most of the issues being whinged about are false positives: user error. The actual failure rates (and there are device failures) are far lower, and that's what warranties are for.

grg0 7/5/2025||
I'm sure the failure rates are blown out of proportion, I agree with that.

But the fact of the matter is that Nvidia has shifted from a consumer business to b2b, and they don't even give a shit about pretending they care anymore. People have beef with that, understandably, and when you couple that with the false marketing, the lack of inventory, the occasional hardware failure, missing ROPs, insane prices that nobody can afford, and all the other shit that's wrong with these GPUs, then this is the end result.

Rapzid 7/5/2025||
GN were the OG "fake framers", going back to their constantly casting shade on DLSS, ignoring it in their reviews, and crapping on RT.

AI upscaling, AI denoising, and RT were clearly the future even 6 years ago. CDPR and the rest of the industry knew it, but outlets like GN pushed a narrative (borderline conspiracy) that the developers were somehow out of touch and didn't know what they were talking about.

There is a contingent of gamers who play competitive FPS, most of whom are, like in all casual competitive hobbies, not very good. But they ate up the 240Hz, rasterization-above-all meat GN was feeding them. Then they think they are the majority and speak for all gamers (as every loud minority on the internet does).

Fast forward 6 years and NVidia is crushing the Steam top 10 GPU list, AI rendering techniques are becoming ubiquitous, and RT is slowly edging out rasterization.

Now that the data is clear, the narrative is that most consumers are "suckers" for purchasing NVidia, Nintendo, etc. And the content creator economy will be there to tell them they are right.

Edit: I believe, too, that some of these outlets had chips on their shoulders regarding NVidia going way back. So AMD's poor RT performance and lack of any competitive answer to the DLSS suite for YEARS had them lying to themselves about where the industry was headed. Essentially they were running interference for AMD. Now that FSR4 is finally here, it's like AI upscaling is finally ok.

TimParker1727 7/5/2025||
Here’s my take on video cards in general. I love NVIDIA cards for all-out performance. You simply can’t beat them. And until someone does, they will not change. I have owned AMD and Intel cards as well and played mainly FPS games like Doom, Quake, Crysis, Medal of Honor, COD, etc. All of them perform better on NVIDIA. But I have noticed a change.

Each year those performance margins seem to narrow. I paid $1,000+ for my RTX 4080 Super. That’s ridiculous. No video card should cost over $1,000. So the next time I “upgrade,” it won’t be NVIDIA. I’ll probably go back to AMD or Intel.

I would love to see Intel continue to develop video cards that are high performance and affordable. There is a huge market for those unicorns. AMD’s model seems to be slightly less performance for slightly less money. Intel, on the other hand, is offering performance on par with AMD and sometimes NVIDIA for far less money - a winning formula.

NVIDIA got too greedy. They overplayed their hand. Time for Intel to focus on development and fill the gaping void of price for performance metrics.

tom_m 7/6/2025|
Know what really kicks me in the nuts? Stupid kid me didn't buy Nvidia when I told my father to, back in like 2002, for $16 or something. He did, and he holds it to this day. Fortunately that means taking care of him is easier, haha, but dang, I should have gotten some too.