
Posted by todsacerdoti 1 day ago

Nvidia won, we all lost (blog.sebin-nyshkim.net)
911 points | 543 comments | page 6
benreesman 1 day ago|
The thing is, company culture is a real thing. And some cultures are invasive/contagious like kudzu both internally to the company and into adjacent companies that they get comped against. The people get to thinking a certain way, they move around between adjacent companies at far higher rates than to more distant parts of their field, the executives start sitting on one another's boards, before you know it a whole segment is enshittified, and customers feel like captives in an exploitation machine instead of parties to a mutually beneficial transaction in which trade increases the wealth of all.

And you can build mythologies around falsehoods to further reinforce it: "I have a legal obligation to maximize shareholder value." No buddy, you have some very specific restrictions on your ability to sell the company to your cousin (ha!) for a handful of glass beads. You have a legal obligation to bin your wafers the way it says on your own box, but that doesn't seem to bother you.

These days I get a machine like the excellent ASUS ProArt P16 (grab one of those before they're all gone if you can) with a little 4060 or 4070 in it that can boot up PyTorch and make sure the model will run forwards and backwards at a contrived size, and then go rent a GB200 or whatever from Latitude or someone (seriously check out Latitude, they're great), or maybe one of those wildly competitive L40 series fly machines (fly whips the llama's ass like nothing since Winamp, check them out too). The GMKtec EVO-X1 is a pretty capable little ROCm inference machine for under $1000, and its big brother is nipping at the heels of a DGX Spark under $2k. There is good stuff out there, but it's all from non-incumbent angles.
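The forward/backward smoke test described above can be sketched roughly like this (the model and sizes are hypothetical placeholders, not anything from the comment; the point is just to confirm both passes run before renting big iron):

```python
# Hypothetical smoke test: verify a model runs forwards AND backwards
# at a contrived (tiny) size on a small local GPU before renting a GB200.
import torch
import torch.nn as nn

# Placeholder model; swap in a tiny config of whatever you plan to train.
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10))

x = torch.randn(8, 64)            # contrived tiny batch
y = torch.randint(0, 10, (8,))    # fake labels

out = model(x)                                    # forward pass
loss = nn.functional.cross_entropy(out, y)
loss.backward()                                   # backward pass

# If we got here without an exception, shapes and autograd are wired up.
print(f"ok, loss={loss.item():.3f}")
```

If this runs clean locally (optionally with `.cuda()` on model and tensors), the big rented box is much less likely to surprise you on the first step.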

I don't game anymore but if I did I would be paying a lot of attention to ARC, I've heard great things.

Fuck the cloud and their ancient Xeon SKUs costing more than Latitude charges for 5 GHz EPYC. Fuck the NVIDIA gaming retail rat race; it's an electrical as well as a moral hazard in 2025.

It's a shame we all have to be tricky to get what used to be a halfway fair deal 5-10 years ago (and 20 years ago they passed a HUGE part of the scaling bonanza down to the consumer), but it's possible to compute well in 2025.

glitchc 1 day ago||
Nice advertorial. I hope you got paid for all of those plugs.
benreesman 1 day ago||
I wish! People don't care what I think enough to monetize it.

But I do spend a lot of effort finding good deals on modern ass compute. This is the shit I use to get a lot of performance on a budget.

Will people pay you to post on HN? How do I sign up?

827a 1 day ago||
> Fuck the cloud and their ancient Xeon SKUs

Dude, no one talks about this and it drives me up the wall. The only way to guarantee modern CPUs from any cloud provider is to explicitly provision really new instance types. If you use any higher-level abstracted services (Fargate, Cloud Run, Lambda, whatever) you get Salvation Army second-hand CPUs from 15 years ago, you're billed by the second so the slower, older CPUs screw you over there, and you pay a 30%+ premium over the lower-level instances because it's a "managed service". It's insane and extremely sad that so many customers put up with it.

benreesman 1 day ago||
Bare metal is priced like it always was but is mad convenient now. latitude.sh is my favorite, but there are a bunch of providers that are maybe a little less polished.

It's also way faster to deploy and easier to operate now. And mad global, I've needed to do it all over the world (a lot of places the shit works flawlessly and you can get Ryzen SKUs for nothing).

Protip: burn a partition with Ubuntu 24.04 LTS, which is the default on everything, and use that as "premium IPMI", even if you already run Ubuntu. You can always boot into a known-perfect environment with all the tools to tweak whatever. If I ever have to restart one I just image it; it's faster than launching a VM on EC2.

oilkillsbirds 1 day ago||
Nobody’s going to read this, but this article and sentiment is utter anti-corporate bullshit, and the vastly congruent responses show that none of you have watched the historical development of GPGPU, or do any serious work on GPUs, or keep up with the open work of nvidia researchers.

The spoiled gamer mentality is getting old for those of us that actually work daily in GPGPU across industries, develop with RTX kit, do AI research, etc.

Yes they’ve had some marketing and technical flubs, as any giant publicly traded company will have, but their balance of research-driven development alongside corporate profit necessities is unmatched.

oilkillsbirds 1 day ago||
And no I don’t work for nvidia. I’ve just been in the industry long enough to watch the immense contribution nvidia has made to every. single. field. The work of their researchers is astounding, it’s clear to anyone that’s honestly worked in this field long enough. It’s insane to hate on them.
grg0 1 day ago||
Their contribution to various fields and the fact that they treat the average consumer like shit nowadays are not mutually exclusive.

Also, nobody ever said they hate their researchers.

Rapzid 1 day ago||
Maybe the average consumer doesn't agree they are being treated like shit? The Steam top 10 GPU list is almost all NVIDIA. Happy customers or duped suckers? I've seen the latter sentiment a lot over the years, and discounting consumers' preferences never seems to lead to correct predictions of outcomes.
detaro 1 day ago||
Or maybe the average consumer bought them while still being unhappy about the overall situation?
gdbsjjdn 1 day ago||
It pains me to be on the side of "gamers" but I would rather support spoiled gamers than modern LLM bros.
amatecha 1 day ago||
Uhh, these 12VHPWR connectors seem like a serious fire risk. How are they not being recalled? I just got a 5060 Ti, and now I'm wishing I went AMD instead.. what the hell :(

Whoa, the stuff covered in the rest of the post is just as egregious. Wow! Maybe time to figure out which AMD model compares performance-wise and sell this thing, jeez.

Havoc 1 day ago||
They’re not full of shit - they’re just doing what a for profit co in a dominant position does.

In other news I hope intel pulls their thumb out of their ass cause AMD is crushing it and that’s gonna end the same way

fithisux 1 day ago||
NVidia won?

Not for me. I prefer Intel offerings. Open and Linux friendly.

I even hope they would release the next gen Risc-V boards with Intel Graphics.

camel-cdr 1 day ago|
A RISC-V board with NVIDIA graphics is more likely: https://mp.weixin.qq.com/s/KiV13GqXGMZfZjopY0Xxpg

NVIDIA Keynote from the upcoming RISC-V Summit China: "Enabling RISC-V application processors in NVIDIA compute platforms"

bigyabai 1 day ago||
> Pretty much all upscalers force TAA for anti-aliasing and it makes the entire image on the screen look blurry as fuck the lower the resolution is.

I feel like this is a misunderstanding, though I admit I'm splitting hairs here. DLSS is a form of TAA, and so is FSR and most other modern upscalers. You generally don't need an extra antialiasing pipeline if you're getting an artificially supersampled image.

We've seen this technique variably developed across the lifespan of realtime raster graphics; first with checkerboard rendering, then TAA, then now DLSS/frame generation. It has upsides and downsides, and some TAA implementations were actually really good for the time.

kbolino 1 day ago||
Every kind of TAA that I've seen creates artifacts around fast-moving objects. This may sound like a niche problem only found in fast-twitch games but it's cropped up in turn-based RPGs and factory/city builders. I personally turn it off as soon as I notice it. Unfortunately, some games have removed traditional MSAA as an option, and some are even making it difficult to turn off AA when TAA and FXAA are the only options (though you can usually override these restrictions with driver settings).
user____name 1 day ago|||
The sad truth is that with rasterization every renderer needs to be designed around a specific set of antialiasing solutions. Antialiasing is like a big wall in your rendering pipeline, there's the stuff you can do before resolving and the stuff you can do afterwards. The problem with MSAA is that it is pretty much tightly coupled with all your architectural rendering decisions. To that end, TAA is simply the easiest to implement and it kills a lot of proverbial birds with one stone. And it can all be implemented as essentially a post processing effect, it has much less of the tight coupling.

MSAA only helps with geometric edges, shader aliasing can be combatted with prefiltering but even then it's difficult to get rid of it completely. MSAA also needs beefy multisample intermediate buffers, this makes it pretty much a non-starter on heavily deferred rendering pipelines, which throw away coverage information to fit their framebuffer budget. On top of that the industry moved to stochastic effects for rendering all kinds of things that were too expensive before, the latest being actual realtime path tracing. I know people moan about TAA and DLSS but to do realtime path tracing at 4k is sort of nuts really. I still consider it a bit of a miracle we can do it at all.

Personally, I wish there was more research by big players into things like texture space lighting, which makes shading aliasing mostly go away, plays nice with alpha blending and would make MSAA viable again. The issue there is with shading only the stuff you see and not wasting texels.

kbolino 1 day ago||
There's another path, which is to raise the pixel densities so high we don't need AA (as much) anymore, but I'm going to guess it's a) even more expensive and b) not going to fix all the problems anyway.
MindSpunk 1 day ago||
That's just called super sampling. Render at 4k+ and down sample to your target display. It's as expensive as it sounds.
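For scale (simple arithmetic, not from the thread): a 4K render target has exactly four times the pixels of a 1080p display, so shading cost roughly quadruples before you even get to the downsample.

```python
# Pixel-count cost of supersampling: render at 4K, downsample to 1080p.
native_1080p = 1920 * 1080   # 2,073,600 pixels shaded
render_4k = 3840 * 2160      # 8,294,400 pixels shaded

# Each target dimension doubles, so the shaded pixel count is 4x.
print(render_4k / native_1080p)  # 4.0
```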
kbolino 1 day ago||
No, I mean high pixel densities all the way to the display.

SSAA is an even older technique than MSAA but the results are not visually the same as just having a really high-DPI screen with no AA.

int_19h 19 hours ago||
Up to a point. I would argue that 8K downsampled to 4K is practically indistinguishable from native 8K.
ohdeargodno 1 day ago|||
It's not that it's difficult to turn off TAA: it's that so many modern techniques do not work without temporal accumulation and anti-aliasing.

Ray tracing? Temporal accumulation and denoising. Irradiance cache? Temporal accumulation and denoising. Most modern light rendering techniques cannot be done in time in a single frame. Add to that the fact that deferred or hybrid rendering makes implementing MSAA anywhere between "miserable" and "impossible", and you have the situation we're in today.

kbolino 1 day ago||
A lot of this is going to come down to taste so de gustibus and all that, but this feels like building on a foundation of sand. If the artifacts can be removed (or at least mitigated), then by all means let's keep going with cool new stuff as long as it doesn't detract from other aspects of a game. But if they can't be fixed, then either these techniques ought to be relegated to special uses (like cutscenes or the background, kinda like the pre-rendered backdrops of FF7) or abandoned/rethought as pretty but impractical.
ohdeargodno 1 day ago||
So, there is a way to make it so that TAA and various temporal techniques look basically flawless. They need a _lot_ of information and pixels.

You need a 4k rendering resolution, at least. Modern effects look stunning at that res.

Unfortunately, nothing runs well at 4k with all the effects on.

d00mB0t 1 day ago||
Sounds about right :D
andrewstuart 1 day ago||
All symptoms of being number one.

Customers don’t matter, the company matters.

Competition sorts out such attitudes quick smart, but AMD never misses a chance to copy Nvidia's strategy in any way, and Intel is well behind.

So for now, you’ll eat what Jensen feeds you.

sonicvrooom 1 day ago|
it would be "just" capitalist to call these fuckers out for real, on the smallest level.

you are safe.
