Posted by todsacerdoti 7/4/2025
I feel like this is a misunderstanding, though I admit I'm splitting hairs here. DLSS is a form of TAA, as are FSR and most other modern upscalers. You generally don't need an extra antialiasing pipeline if you're already getting an artificially supersampled image.
We've seen this technique evolve across the lifespan of realtime raster graphics: first checkerboard rendering, then TAA, and now DLSS/frame generation. It has upsides and downsides, and some TAA implementations were actually really good for their time.
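The core of any TAA-family technique is accumulating jittered samples over time. Here's a toy illustration (stdlib Python, static scene, no motion vectors or reprojection — a real implementation needs both, and the scene/jitter values are made up): an exponential moving average of sub-pixel-jittered samples converges to the supersampled answer.

```python
import random

def shade(x):
    # Toy "scene": a hard edge at x = 0.5, sampled pointwise (aliased)
    return 1.0 if x >= 0.5 else 0.0

# TAA-style accumulation: history = lerp(history, current, alpha)
alpha = 0.05
pixel_width = 0.01
pixel_center = 0.5            # a pixel straddling the edge
random.seed(0)

history = shade(pixel_center)
for _ in range(1000):
    jitter = random.uniform(-0.5, 0.5) * pixel_width   # sub-pixel jitter
    current = shade(pixel_center + jitter)
    history += alpha * (current - history)

# history converges toward the analytic pixel coverage (~0.5) --
# the same answer supersampling would give, amortized over many frames.
```

The same blend is why TAA ghosts: when the camera moves, "history" belongs to a different point in the scene unless you reproject it first.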
MSAA only helps with geometric edges. Shader aliasing can be combated with prefiltering, but even then it's difficult to get rid of completely. MSAA also needs beefy multisample intermediate buffers, which makes it pretty much a non-starter on heavily deferred rendering pipelines, which throw away coverage information to fit their framebuffer budget. On top of that, the industry moved to stochastic effects for rendering all kinds of things that were too expensive before, the latest being actual realtime path tracing. I know people moan about TAA and DLSS, but doing realtime path tracing at 4K is sort of nuts, really. I still consider it a bit of a miracle we can do it at all.
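A quick back-of-envelope shows why multisampled G-buffers are a non-starter. The attachment layout below is a hypothetical (but typical) deferred G-buffer; exact formats vary per engine:

```python
# Multisampled G-buffer cost at 4K with 8x MSAA
width, height, samples = 3840, 2160, 8

gbuffer_bytes_per_sample = {
    "albedo   (RGBA8)": 4,
    "normal   (RG16F)": 4,
    "material (RGBA8)": 4,
    "depth    (D32F)":  4,
}

per_sample = sum(gbuffer_bytes_per_sample.values())   # 16 bytes
total = width * height * samples * per_sample
print(f"{total / 2**30:.2f} GiB")                     # ~0.99 GiB
```

Roughly a gigabyte just for the G-buffer, before shadow maps, history buffers, or anything else — and every deferred lighting pass would then have to shade per-sample to use it.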
Personally, I wish the big players put more research into things like texture-space lighting, which makes shading aliasing mostly go away, plays nice with alpha blending, and would make MSAA viable again. The hard part is shading only the texels you actually see and not wasting work on ones you don't.
SSAA is an even older technique than MSAA, but the results are not visually the same as simply having a really high-DPI screen with no AA.
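The supersampling idea itself fits in a few lines: render at N× resolution, then box-filter down. A toy 1D sketch (edge position and resolutions are arbitrary choices for illustration):

```python
import numpy as np

def covered(x):
    # Toy 1D scene: everything right of x = 0.55 is "object"
    return (x >= 0.55).astype(float)

n, factor = 8, 4                                   # 8 pixels, 4x SSAA
naive = covered((np.arange(n) + 0.5) / n)          # one sample per pixel
hi = covered((np.arange(n * factor) + 0.5) / (n * factor))
ssaa = hi.reshape(n, factor).mean(axis=1)          # box-filter downsample

# naive: [0. 0. 0. 0. 1. 1. 1. 1.]  -- hard, aliased step
# ssaa:  the edge pixel becomes 0.5 -- a partial-coverage gray
```

That fractional edge value is exactly what a high-DPI screen with no AA doesn't give you: the high-DPI screen just shows a smaller hard step, while SSAA bakes coverage into each pixel.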
Ray tracing? Temporal accumulation and denoising. Irradiance cache? Temporal accumulation and denoising. Most modern light-rendering techniques simply cannot be computed within a single frame. Add to that the fact that deferred or hybrid rendering makes implementing MSAA anywhere from "miserable" to "impossible", and you have the situation we're in today.
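The reason temporal accumulation shows up everywhere: one frame's sample budget is far too noisy, but averaging across frames shrinks the error roughly like 1/sqrt(N). A toy Monte Carlo sketch (the noise level and "radiance" value are made up):

```python
import random

random.seed(1)
true_radiance = 0.7

def noisy_frame():
    # Hypothetical 1-sample-per-pixel estimate: unbiased, but very noisy
    return random.gauss(true_radiance, 0.5)

accumulated, errors = 0.0, []
for n in range(1, 65):                                # accumulate 64 frames
    accumulated += (noisy_frame() - accumulated) / n  # running mean
    errors.append(abs(accumulated - true_radiance))

# errors[0] is one frame's error; after 64 frames it's ~8x smaller
# on average (1/sqrt(64)) -- which is why denoisers lean on frame history.
```

The catch, of course, is that the running mean assumes the scene holds still; real accumulators clamp or reject history when it doesn't, which is where the smearing artifacts come from.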
You need a 4k rendering resolution, at least. Modern effects look stunning at that res.
Unfortunately, nothing runs well at 4k with all the effects on.
He actually ended up buying older but somewhat similar used hardware with his personal money, to be able to do his work.
Not even sure if he was eventually able to expense it, but I wouldn't be surprised if not, knowing how big-company bureaucracy works...
Otherwise the money is in the datacenter (AI/HPC) cards.
https://stockanalysis.com/stocks/nvda/metrics/revenue-by-seg...
My most recent upgrade was a 4090, but that only gives me 24GB of VRAM, and it's too expensive to justify buying two. I also have an antique Kepler datacenter GPU, but Nvidia cut driver support a long while ago, making the software a pain to get sorted. There's a nonzero chance my next purchase will be an imported Moore Threads GPU: Nvidia is just way too expensive, and I don't need blazing speed given that most of my workloads finish comfortably while I'm sleeping. But I can't be running at the speed of a CPU; I need everything to fit into VRAM. I'd be equally stoked for Intel to cater to me. $1500, 48GB+ VRAM, good PyTorch support; make it happen, somebody.
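For rough sizing, the "does it fit in VRAM" math is just parameter count times bytes per weight. The model sizes and quantization widths below are illustrative assumptions, and real usage adds KV cache, activations, and framework overhead on top:

```python
def weights_gib(params_billion, bytes_per_weight):
    # Weights only; KV cache, activations, and runtime overhead are extra
    return params_billion * 1e9 * bytes_per_weight / 2**30

for name, b in [("7B", 7), ("13B", 13), ("70B", 70)]:
    print(f"{name}: ~{weights_gib(b, 2):.0f} GiB fp16, "
          f"~{weights_gib(b, 0.5):.0f} GiB 4-bit")

# A 24GB card comfortably fits 7B in fp16 or roughly 40B at 4-bit;
# 70B needs ~130 GiB in fp16 -- hence the hunt for more VRAM per dollar.
```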
I have a 4070 Ti right now. I use it for inference and VR gaming on a Pimax Crystal (2880x2880x2). In War Thunder I get ~60 FPS. I’d love to be able to upgrade to a card with at least 16GB of VRAM and better graphics performance… but as far as I can tell, such a card does not exist at any price.
Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly, posting to social media, and youtubers jumping on the trend for likes.
These are 99% user error issues drummed up by non-professionals (and, in some cases, people paid by 3rd party vendors to protect those vendors' reputation).
And the complaints about transient performance issues with drivers, drummed up into apocalyptic scenarios, again, by youtubers who put this stuff under a microscope for views, are universal across every single hardware and software product. Everything.
Claims like "DLSS is snake oil" are just an expression of the complete lack of understanding of the people involved in these pot-stirring contests. Like... the technique obviously couldn't magically multiply the hardware's ability to generate frames using the primary method. It is exactly as advertised: it uses machine learning to approximate those frames. And it's fantastic technology that is now ubiquitous across the industry. Support and quality will improve over time, just as they do for every _quality_ hardware product early in its lifespan.
It's all so stupid and rooted in greed by those seeking ad-money, and those lacking in basic sense or experience in what they're talking about and doing. Embarrassing for the author to so publicly admit to eating up social media whinging.
> Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly
This is not true. Even GN reproduced the melting of the first-party cable.
Also, why shouldn't you be able to use third-party cables? Fuck DRM too.
The whole thing started with Derbauer going to bat for a cable from some 3rd-party vendor that, by his own admission, he'd already plugged in and out of various cards something like 50 times.
The actual instances that youtubers report on are all reddit posters and other random social media users who would clearly be better off getting a professional installation. The huge popularity for enthusiast consumer hardware, due to the social media hype cycle, has brought a huge number of naive enthusiasts into the arena. And they're getting burned by doing hardware projects on their own. It's entirely unsurprising, given what happens in all other realms of amateur hardware projects.
Most of those who are whinging about their issues are false-positive user errors. The actual failure rates (and there are device failures) are far lower, and that's what warranties are for.
But the fact of the matter is that Nvidia has shifted from a consumer business to b2b, and they don't even give a shit about pretending they care anymore. People take issue with that, understandably, and when you couple it with the false marketing, the lack of inventory, the occasional hardware failure, missing ROPs, insane prices that nobody can afford, and all the other shit that's wrong with these GPUs, this is the end result.
AI upscaling, AI denoising, and RT were clearly the future even 6 years ago. CDPR and the rest of the industry knew it, but outlets like GN pushed a narrative (borderline conspiracy) that the developers were somehow out of touch and didn't know what they were talking about.
There is a contingent of gamers who play competitive FPS, most of whom are, as in all casual competitive hobbies, not very good. But they ate up the 240Hz rasterization-above-all meat GN was feeding them. Then they think they're the majority and speak for all gamers (as every loud minority on the internet does).
Fast forward 6 years and NVidia is crushing the Steam top 10 GPU list, AI rendering techniques are becoming ubiquitous, and RT is slowly edging out rasterization.
Now that the data is clear, the narrative is that most consumers are "suckers" for purchasing NVidia, Nintendo, etc. And the content-creator economy will be there to tell them they're right.
Edit: I also believe some of these outlets had chips on their shoulders about NVidia going way back. So AMD's poor RT performance and lack of any competitive answer to the DLSS suite for YEARS had them lying to themselves about where the industry was headed. Essentially they were running interference for AMD. Now that FSR4 is finally here, suddenly AI upscaling is ok.
Each year those performance margins seem to narrow. I paid over $1,000 for my RTX 4080 Super. That's ridiculous; no video card should cost over $1,000. So the next time I "upgrade," it won't be NVIDIA. I'll probably go back to AMD or Intel.
I would love to see Intel continue to develop video cards that are high-performance and affordable. There is a huge market for those unicorns. AMD's model seems to be slightly less performance for slightly less money. Intel, on the other hand, is offering performance on par with AMD, and sometimes NVIDIA, for far less money - a winning formula.
NVIDIA got too greedy and overplayed their hand. Time for Intel to focus on development and fill the gaping void in price-to-performance.