Posted by todsacerdoti 7/4/2025

Nvidia won, we all lost (blog.sebin-nyshkim.net)
995 points | 571 comments | page 2
rkagerer 7/5/2025|
I am a volunteer firefighter and hold a degree in electrical engineering. The shenanigans with their shunt resistors, and the ensuing melting cables, are in my view criminal. Any engineer worth their salt would recognize that pushing 600W through a bunch of small cables, with no contingency if some of them fail, is just asking for trouble. These assholes are going to set someone's house on fire.

I hope they get hit with a class action lawsuit and are forced to recall and properly fix these products before anyone dies as a result of their shoddy engineering.

rkagerer 7/5/2025||
Apparently somebody did sue a couple years back. Anyone know what happened with the [plaintiff] vs. nVidia lawsuit?

EDIT: Plaintiff dismissed it. Guessing they settled. Here are the court documents (alternatively, shakna's links below include unredacted copies):

https://www.classaction.org/media/plaintiff-v-nvidia-corpora...

https://www.classaction.org/media/plaintiff-v-nvidia-corpora...

A GamersNexus article investigating the matter: https://gamersnexus.net/gpus/12vhpwr-dumpster-fire-investiga...

And a video referenced in the original post, describing how the design changed from one that proactively managed current balancing, to simply bundling all the connections together and hoping for the best: https://youtu.be/kb5YzMoVQyw

shakna 7/5/2025|||
> NOTICE of Voluntary Dismissal With Prejudice by [name redacted] (Filed on 3/10/2023) (Entered: 03/10/2023)

Sounds like it was settled out of court.

[0] https://www.docketalarm.com/cases/California_Northern_Distri...

middle-aged-man 7/5/2025||||
Do those mention failing to follow Underwriters Laboratories requirements?

I’m curious whether the 5090 package was not following UL requirements.

Would that make them even more liable?

Part of me believes that the blame here is probably on the manufacturers and that this isn’t a problem with Nvidia corporate.

autobodie 7/5/2025|||
GamersNexus ftw as always
lukeschlather 7/5/2025|||
Also, like, I kind of want to play with these things, but also I'm not sure I want a computer that uses 500W+ in my house, let alone just a GPU.

I might actually be happy to buy one of these things, at the inflated price, and run it at half voltage or something... but I can't tell if that is going to fix these concerns or they're just bad cards.

wasabinator 7/5/2025|||
It's not the voltage, it's the current you'd want to halve. The wire gauge required to carry power depends on the current load. That's why, when I first saw these new connectors and the loads they were being tasked with, it was a WTF moment for me. Better to just avoid them in the first place, though.
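To make the arithmetic concrete, here is a minimal sketch of the per-wire load, assuming 600 W drawn at 12 V across six current-carrying supply pins (illustrative numbers, not spec readings):

    # Back-of-the-envelope per-wire current for a 12VHPWR-style connector.
    # Assumed: 600 W board power, 12 V rail, 6 supply pins (plus 6 returns).
    POWER_W = 600.0
    VOLTAGE_V = 12.0
    SUPPLY_WIRES = 6

    total_current = POWER_W / VOLTAGE_V       # I = P / V = 50 A
    per_wire = total_current / SUPPLY_WIRES   # ideal, perfectly balanced

    print(f"Total current: {total_current:.1f} A")    # 50.0 A
    print(f"Per wire (balanced): {per_wire:.1f} A")   # 8.3 A

    # Halving the current (e.g., a 24 V rail at the same power) halves the
    # per-wire load; halving the voltage at fixed power would double it.

Note that these figures assume perfect balance across the pins; much of the thread below is about what happens when that assumption fails.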
dietr1ch 7/5/2025||
It's crazy: you don't even need to know anything about electricity once you've seen a thermal camera pointed at one of these at full load. I'm surprised they can be sold to the general public; the reports of cables melting plus the high temps should be enough to force a recall.
izacus 7/5/2025|||
With the 5080 using 300W, talking about 500W is a bit of an exaggeration, isn't it?
lukeschlather 7/5/2025||
I'm talking about the 5090 which is 575W.
izacus 7/5/2025||
But why are you talking about it? It's hugely niche hardware, a tiny % of nVidia cards out there. It's deliberately outsized and you wouldn't put it in 99% of gaming PCs.

And yet you speak of it like it's a representative model. Do you also use a Hummer EV to measure all EVs?

lukeschlather 7/5/2025||
I am interested in buying hardware that can run the full DeepSeek R1 locally. I don't think it's a particularly good idea, but I've contemplated an array of 5090s.

If I were interested in using an EV to haul particularly heavy loads, I might be interested in the Hummer EV and have similar questions that might sound ridiculous.

ryao 7/5/2025|||
Has anyone made 12VHPWR cables that replace the 12 little wires with 2 large gauge wires yet? That would prevent the wires from becoming unbalanced, which should preempt the melting connector problem.

As a bonus, if the gauge is large enough, the cable would actually cool the connectors, although that should not be necessary since the failure appears to be caused by overloaded wires dumping heat into the connector as they overheat.
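As a rough sanity check on the two-big-wires idea, a sketch comparing total copper cross-section; the 16 AWG per-strand figure matches common 12VHPWR cabling, but treat both wire choices here as assumptions:

    # Copper cross-section: six 16 AWG supply wires vs one 8 AWG wire.
    # AWG areas are standard values; the wire choices are assumptions.
    AWG_AREA_MM2 = {16: 1.31, 8: 8.37}

    bundle = 6 * AWG_AREA_MM2[16]   # six small supply wires
    single = AWG_AREA_MM2[8]        # one large supply wire

    print(f"6 x 16 AWG: {bundle:.2f} mm^2")   # 7.86 mm^2
    print(f"1 x 8 AWG:  {single:.2f} mm^2")   # 8.37 mm^2

    # Comparable copper either way, but a single conductor is a single
    # current path: no individual strand can end up overloaded, and the
    # common bus can sink heat away from a bad pin.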

alright2565 7/5/2025|||
Might help a little bit, by heatsinking the contacts better, but the problem is the contact resistance, not the wire resistance. The connector itself dangerously heats up.

Or at least I think so? Was that a different 12VHPWR scandal?

bobmcnamara 7/5/2025|||
Contact resistance is a problem.

Another problem is when the connector is angled, several of the pins may not make contact, shoving all the power through as few as one wire. A common bus would help this but the contact resistance in this case is still bad.

ryao 7/5/2025||
A common bus that is not also overheating would cool the overheating contact(s).
alright2565 7/5/2025||
It would help, but my intuition is that the thin steel of the contact would not move the heat fast enough to make a significant difference. Only way to really know is to test it.
ryao 7/5/2025||||
I thought that the contact resistance caused the unbalanced wires, which then overheat alongside the connector, giving the connector’s heat nowhere to go.
chris11 7/5/2025||||
I think it's both contact and wire resistance.

It is technically possible to solder a new connector on. LTT did that in a video. https://www.youtube.com/watch?v=WzwrLLg1RR4

ryao 7/5/2025||
Uneven, abnormal contact resistance is what causes the wires to become unbalanced: the remaining wires whose contacts have low resistance have huge currents pushed through them, causing them to overheat due to wire resistance. I am not sure it is possible to guarantee well-matched contact resistance in all systems.
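A small sketch of the current-division effect being described, with made-up resistances for six parallel wire-plus-contact paths, one of them degraded:

    # Parallel paths share current in inverse proportion to resistance.
    # Resistances (ohms) are illustrative: five good paths, one bad contact.
    resistances = [0.010] * 5 + [0.100]
    total_current = 50.0  # A, e.g. 600 W at 12 V

    conductances = [1 / r for r in resistances]
    g_total = sum(conductances)
    currents = [total_current * g / g_total for g in conductances]

    for i, amps in enumerate(currents):
        print(f"path {i}: {amps:.2f} A")
    # Five paths at ~9.80 A, the degraded one at ~0.98 A: the "missing"
    # current is shoved onto the remaining wires, pushing them past spec.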
bobmcnamara 7/5/2025||||
Or 12 strands in a single sheath so it's not overly rigid.
AzN1337c0d3r 7/5/2025|||
They don't specify 12 smaller cables for nothing when 2 larger ones would do: there are concerns here with mechanical compatibility (12 wires have a smaller allowable bend radius than 2 larger ones with the same ampacity).
kuschku 7/5/2025||
One option is to use two very wide, thin insulated copper sheets as cable. Still has a good bend radius in one dimension, but is able to sink a lot of power.
dreamcompiler 7/5/2025||
To emphasize this point, go outside at noon in the summer and mark off a square meter on the sidewalk. That square of concrete is receiving about 1000 W from the sun.

Now imagine a magnifying glass that big (or, more practically, a Fresnel lens) concentrating all that light into one square inch. That's a lot of power. When copper connections don't work perfectly they have nonzero resistance, and the current running through them turns into heat as I²R.
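To put numbers on the I²R point, a tiny worked example; the 50-milliohm degraded-contact resistance is a hypothetical figure, not a measurement:

    # Heat dissipated at a single degraded connector pin: P = I^2 * R.
    current_a = 9.8    # per-wire current from an unbalanced bundle (A)
    contact_r = 0.050  # assumed degraded contact resistance (ohms)

    heat_w = current_a ** 2 * contact_r
    print(f"{heat_w:.1f} W dissipated at one pin")  # ~4.8 W

    # ~5 W concentrated in a few cubic millimeters of brass and nylon is
    # enough to soften a housing; a healthy contact at 5 milliohms would
    # dissipate under 0.5 W at the same current.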

ionwake 7/4/2025||
I don't want to jump on Nvidia, but I found it super weird when they clearly remote-controlled a Disney bot onto the stage and claimed it was all real-time AI. That seemed impossible given the zero latency and, weirdly, the bot verifying its stage position relative to the presenter. It was obviously the Disney bot just being controlled by someone off stage.

I found it super alarming: why would they fake something on stage to the extent of just lying? I know Steve Jobs had backup phones, but claiming a robot is autonomous when it isn't just feels scammy to me.

It reminded me of when Tesla had remote-controlled Optimus bots. I mean, I think that's awesome, like super cool, but clearly the users thought the robots were autonomous during that dinner party.

I have no idea why I seem to be the only person bothered by "stage lies" to this level. Tbh, even the Tesla bots weren't claimed to be autonomous, so I should never have mentioned them, but it explains the "not real" vibe.

Not meaning to disparage just explaining my perception as a European maybe it’s just me though!

EDIT > I'm kinda surprised by the weak arguments in the replies. I love both companies; I am just offering POSITIVE feedback, that it's important (in my eyes) to be careful not to pretend in certain specific ways, or it makes the viewer question the foundation (which we all know is SOLID and good).

EDIT 2 > There actually is a good rebuttal in the replies. Although apparently I have "reading comprehension skill deficiencies", it's just my POV that they were insinuating the robot was aware of its surroundings, which is fair enough.

elil17 7/4/2025||
As I understand it the Disney bots do actually use AI in a novel way: https://la.disneyresearch.com/publication/design-and-control...

So there’s at least a bit more “there” there than the Tesla bots.

ionwake 7/4/2025||
I believe it's RL-trained only.

See this snippet: "Operator Commands Are Merged: The control system blends expressive animation commands (e.g., wave, look left) with balance-maintaining RL motions"

I will print a full retraction if someone can confirm my gut feeling is correct

dwattttt 7/4/2025|||
Having worked on control systems a long time ago, that's a 'nothing' statement: the whole job of the control system is to keep the robot stable/ambulating, regardless of whatever disturbances occur. It's meant to reject the forces induced due to waving exactly as much as bumping into something unexpected.

It's easier to stabilise from an operator initiated wave, really; it knows it's happening before it does the wave, and would have a model of the forces it'll induce.

ionwake 7/5/2025||
I tried to understand the point of your reply but I'm not sure what your point was; I only seemed to glean "it's easier to balance if the operator is moving it".

Please elaborate, unless I'm being thick.

EDIT > I upvoted your comment in any case as I'm sure it's helping

rcxdude 7/5/2025|||
'Control system' in this case is not implying remote control; it's referring to the feedback system that adjusts the actuators in response to the sensed information. If the motion is controlled automatically, then the control loop can in principle anticipate the motion in a way that it could not if it were remote controlled. I.e., the opposite: it's easier to control the motions (in terms of maintaining balance and avoiding overstressing the actuators) if the operator is not live-puppeteering it.
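A toy illustration of that distinction, with assumed gains; the point is only that a planned motion contributes a feedforward term the loop knows in advance, while a live-puppeteered motion shows up purely as error to react to:

    # One tick of a 1-D balance controller: feedback on error plus
    # feedforward on motion the robot planned itself. Gains are assumed.
    def control_step(setpoint, measured, planned_accel, kp=2.0, kff=1.0):
        error = setpoint - measured
        feedback = kp * error               # reacts after error appears
        feedforward = kff * planned_accel   # zero when live-puppeteered
        return feedback + feedforward

    # Planned wave: compensation starts before balance is disturbed.
    print(control_step(setpoint=0.0, measured=0.0, planned_accel=1.5))   # 1.5
    # Operator-driven wave: the loop only sees the error once it exists.
    print(control_step(setpoint=0.0, measured=-0.1, planned_accel=0.0))  # 0.2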
dwattttt 7/5/2025|||
Apologies, yes, "control system" is somewhat niche jargon. "Balance system" is probably more appropriate.
dboreham 7/5/2025|||
Well "control system" is a proper term understood by anyone with a decent STEM education since 150 years ago.
dwattttt 7/6/2025||
To be fair, lots of fields have a notion of a "control" system. Control Theory doesn't have a monopoly on the term, for all that the field revolves around 'control systems'.
tekla 7/5/2025|||
> "control system" is somewhat niche jargon

Oh my god. What the hell is happening to STEM education? Control systems engineering is standard parlance. This is what Comp Sci people are like?

ionwake 7/5/2025|||
Thank you for the explanation
dwattttt 7/5/2025|||
It's that there's nothing special about blending "operator initiated animation commands" with the RL balancing system. The balance system has to balance anyway; if there was no connection between an operator's wave command and balance, it would have exactly the same job to do.

At best the advantage of connecting those systems is that the operator command can inform the balance system, but there's nothing novel about that.

numpad0 7/5/2025||||
"RL is not AI" "Disney bots were remote controlled" are major AI hypebro delulu moment lol

Your understanding of AI and robotics are more cucumber than pear shaped. You're making very little technical sense here. Challenges and progress in robotics aren't where you think they are. It's all propagandish contents you're basing your understandings on.

If you're getting information from TikTok or YouTube Shorts style content, especially around Tesla bros - get the hell out of it at Ludicrous Speed. Or consume way more of it so thoroughly that you cannot be deceived anymore despite blatant lies everywhere. Then come back. They're all plain wrong and it's not good for you.

elil17 7/5/2025|||
Only as opposed to what? VLAM/something else more trendy?
CoastalCoder 7/4/2025|||
Not just you.

I hate being lied to, especially if it's so the liar can reap some economic advantage from having the lie believed.

AnimalMuppet 7/4/2025||
Yeah. I have a general rule that I don't do business with people who lie to me.
MichaelZuo 7/5/2025||
I can’t even imagine what kind of person would not follow that rule.

Do business with people that are known liars? And just get repeatedly deceived?

…Though upon reflection that would explain why the depression rate is so high.

frollogaston 7/4/2025|||
There's also a very thick coat of hype in https://www.nvidia.com/en-us/glossary/ai-factory/ and related material, even though the underlying product (an ML training cluster) is real.
ionwake 7/5/2025|||
Not sure why my comment got so upvoted, all my comments are my personal opinion based solely on the publicly streamed video, and as I said, I’ll happily correct or retract my impression.
AtariATMHacker 7/4/2025|||
[dead]
abxyz 7/5/2025|||
Disney are open about their droids being operator controlled. Unless nvidia took a Disney droid and built it to be autonomous (which seems unlikely) it would follow that it is also operator controlled. The presentation was demonstrating what Disney had achieved using nvidia’s technology. You can see an explainer of how these droids use machine learning here: https://youtube.com/shorts/uWObkOV71ZI

If you think the droid was autonomous then I guess that is evidence that nvidia were misrepresenting (if not lying).

Having seen these droids outside of the nvidia presentation and watching the nvidia presentation, I think it’s obvious it was human operated and that nvidia were misleading people.

ionwake 7/5/2025||||
I think it's cool you disagree with me; it would be nice to hear a counterargument, though.
AtariATMHacker 7/5/2025||
[dead]
Larrikin 7/5/2025||||
I assume any green accounts that are just asking questions with no research are usually lying. Actual new users will just comment and say their thoughts to join the community.
timschmidt 7/4/2025|||
It seems to me like both cases raised by OP - the Disney droids and Optimus - are cases of people making assumptions and then getting upset that their assumptions were wrong and making accusations.

Neither company was very forthcoming about the robots being piloted, but neither seems to be denying it either. And both seem to use RL / ML techniques to maintain balance, locomotion, etc. Not unlike Boston Dynamics' bots, which are also very carefully orchestrated by humans in multiple ways.

Haters gonna hate (downvotes just prove it - ha!)

ionwake 7/5/2025|||
If you look at the video, he says "this is real time simulation... can you believe it", basically: https://www.youtube.com/shorts/jD5y1eQ3Y_o

Yet he lists all the RL stuff that we know is used in the robot. He isn't staying silent, or saying "this robot is aided by AI", or better yet, not commenting on the specifics (which would have been totally OK); instead he is saying "this is real life simulation", which it isn't.

EDIT > apparently I am wrong - thank you for the correction everyone!

timschmidt 7/5/2025||
I have written motion control firmware for 20+ years, and "this is real time simulation" has a very domain-specific meaning to me. "Real time" means the code is responding to events as they happen, like with interrupts, and not via preemptible processing which could get out of sync with events. "Simulation" is used by most control systems, from simple PID loops to advanced balancing and motion planning.

It is clearly - to me at least - doing both of those things.

I think you're reading things into what he said that aren't there.

ionwake 7/5/2025||
ok thanks
NewsaHackO 7/5/2025|||
Yea, this seems like the initial poster has reading comprehension skill deficiencies and is blaming NVIDIA for lying about a point they never made. NVIDIA is even releasing some of the code they used to power the robot, which further shows they never claimed the robot wasn't operator controlled, just that it was using AI to make its movement look more fluid.
ionwake 7/5/2025||
fair enough, upvoted.
topato 7/5/2025||
I seem to remember multiple posts on large tech websites reaching the exact same opinion/conclusion/insinuation as the one you originally had, so it's not necessarily a comprehension problem on your part. My opinion: Nvidia's CEO has a problem communicating in good faith. He absolutely knew what he was doing during that little stage show, and it was absolutely designed to mislead people toward the most "AI HYPE, PLEASE BUY GPUs, MY ROBOT NEEDS GPUS TO LIVE" conclusion.
abletonlive 7/4/2025|||
[flagged]
omega3 7/4/2025||
Ableton Live is from Europe :)
gizajob 7/4/2025|||
You win the award for instant karma
ionwake 7/4/2025||||
oof!
abletonlive 7/4/2025|||
And it has fallen vastly behind other DAWs
gizajob 7/5/2025|||
Crazy talk. All the others have been playing catchup and still aren’t there with some things.
NetOpWibby 7/4/2025||||
I just want Acid Pro on Mac
windowshopping 7/4/2025|||
How so?
hn_throwaway_99 7/5/2025||
> I don't want to jump on Nvidia, but I found it super weird when they clearly remote-controlled a Disney bot onto the stage and claimed it was all real-time AI. That seemed impossible given the zero latency and, weirdly, the bot verifying its stage position relative to the presenter. It was obviously the Disney bot just being controlled by someone off stage.

I don't know what you're referring to, but I'd just say that I don't believe what you are describing could have possibly happened.

Nvidia is a huge corporation, with more than a few lawyers on staff and on retainer, and what you are describing is criminal fraud that any plaintiff's lawyer would have a field day with. So, given that, and since I don't think people who work at Nvidia are complete idiots, I think whatever you are describing didn't happen the way you are describing it. Now, it's certainly possible there was some small print disclaimer, or there was some "weasel wording" that described something with ambiguity, but when you accuse someone of criminal fraud you want to have more than "hey this is just my opinion" to back it up.

kalleboo 7/5/2025|||
Tefal literally sells a rice cooker that boasts "AI Smart Cooking Technology" while not even containing a microcontroller and just being controlled by the time-honored technology of "a magnet that gets hot". They also have lawyers.

AI doesn't mean anything. You can claim anything uses "AI" and just define what that means yourself. They could have some basic anti-collision technology and claim it's "AI".

numpad0 7/5/2025||||
They're soaked eyebrow-deep in TikTok-style hype juice, believing that the latest breakthrough in robotics is that AGIs just casually started walking and talking on their own, and that therefore anything code-controlled is now considered proof of ineptitude and fakery.

It's complete cult crazy talk. Not even cargo cult; it's proper cultism.

moogly 7/5/2025|||
> what you are describing is criminal fraud that any plaintiff's lawyer would have a field day with

"Corporate puffery"

Nextgrid 7/4/2025||
I wonder if the 12VHPWR connector is intentionally defective to prevent large-scale use of those consumer cards in server/datacenter contexts?

The failure rate is just barely acceptable in a consumer use-case with a single card, but with multiple cards the probability of failure (which takes down the whole machine, as there's no way to hot-swap the card) makes it unusable.

I can't otherwise see why they'd persevere on that stupid connector when better alternatives exist.
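The multi-card argument is just independent-failure arithmetic; here is a minimal sketch, with the 1% per-card probability assumed purely for illustration:

    # A machine with n cards goes down if any one connector fails:
    # P(machine) = 1 - (1 - p)^n, assuming independent failures.
    def machine_failure_prob(p_card: float, n_cards: int) -> float:
        return 1 - (1 - p_card) ** n_cards

    p = 0.01  # assumed per-card failure probability over some period
    for n in (1, 4, 8):
        print(f"{n} card(s): {machine_failure_prob(p, n):.1%}")
    # 1 card(s): 1.0%   4 card(s): 3.9%   8 card(s): 7.7%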

transcriptase 7/4/2025||
It boggles my mind that an army of the most talented electrical engineers on earth somehow fumble a power connector and then don’t catch it before shipping.
mjevans 7/4/2025|||
Sunk cost fallacy and a burning (literal) desire to have small, artistic things. That's probably also the reason the connector was densified so much and, clearly, released with so very little tolerance for error, human and otherwise.
ls612 7/5/2025|||
They use the 12VHPWR on some datacenter cards too.
KerrAvon 7/4/2025||
IANAL, but knowingly leaving a serious defect in your product at scale for that purpose would be very bad behavior, and juries tend not to like that sort of thing.
thimabi 7/5/2025||
However, as we’ve learned from the Epic vs Apple case, corporations don’t really care about bad behavior — as long as their ulterior motives don’t get caught.
shmerl 7/4/2025||
> ... NVENC are pretty much indispensable

What's so special about NVENC that Vulkan video or VAAPI can't provide?

> AMD also has accelerated video transcoding tech but for some reason nobody seems to be willing to implement it into their products

OBS works with VAAPI fine. Looking forward to them adding Vulkan video as an option.

Either way, as a Linux gamer I haven't touched Nvidia in years. AMD is a way better experience.

ryao 7/4/2025||
> The RTX 50 series are the second generation of NVIDIA cards to use the 12VHPWR connector.

This is wrong. The 50 series uses 12V-2x6, not 12VHPWR. The 30 series was the first to use 12VHPWR. The 40 series was the second to use 12VHPWR and the first to use 12V-2x6. The 50 series was the second to use 12V-2x6. The female connectors are what changed in 12V-2x6; the male connectors are identical between 12V-2x6 and 12VHPWR.

ohdeargodno 7/4/2025|
Nitpicking doesn't change the fact that the 12V-2x6 connector _also_ burns down.
ryao 7/4/2025|||
The guy accuses Nvidia of not doing anything about that problem, but ignores that they did with the 12V-2x6 connector, which, as far as I can tell, has had far fewer issues.
Gracana 7/4/2025|||
It still has no fusing, sensing, or load balancing for the individual wires. It is a fire waiting to happen.
ryao 7/5/2025||
It is a connector. None of the connectors inside a PC have those. They could add them to the circuitry on the PCB side of the connector, but that is entirely separate from the connector.

That said, the industry seems to be moving to adding detection into the PSU, given Seasonic's announcement:

https://www.tomshardware.com/pc-components/power-supplies/se...

Finally, I think there is a simpler solution, which is to change the cable to use two large gauge wires instead of 12 individual ones to carry current. That would eliminate the need for balancing the wires in the first place.

Gracana 7/5/2025||
Previous well-designed video cards used the technologies I described. Eliminating the sense circuits and fusing is a recent development.

I do like the idea of just using big wires. It’d be so much cleaner and simpler. Also using 24 or 48V would be nice, but that’d be an even bigger departure from current designs.

ryao 7/5/2025||
> Previous well-designed video cards used the technologies I described. Eliminating the sense circuits and fusing is a recent development.

My point is that the PCB is where such features would be present, not the connector. There are connectors that have fusing. The UK’s AC power plugs are examples of them. The connectors inside PCs are not.

Gracana 7/5/2025||
Oh, sure, I’m not proposing that the connector itself should have those features, rather that it shouldn’t be used without them present on the device.
MindSpunk 7/5/2025|||
The 50 series connectors burned up too. The issue was not fixed.
ryao 7/5/2025||
It seems incredibly wrong to assume that there was only one issue with 12VHPWR. 12V-2x6 was an improvement that eliminated some potential issues, not all of them. If you want to eliminate all of them, replace the 12 current-carrying wires with 2 large gauge wires. Then the wires cannot become unbalanced. Of course, the connector would need to split the two into 12 very short wires to be compatible, but those would be recombined on the GPU's PCB.
numpad0 7/5/2025|||
(context: 12VHPWR and 12V-2x6 are the exact same thing. The latter is supposed to be improved and totally fixed, complete with the underspecced load-bearing "supposed to be" clause.)
AzN1337c0d3r 7/5/2025||
They are not the exact same thing.

https://www.corsair.com/us/en/explorer/diy-builder/power-sup...

DeepYogurt 7/5/2025||
> And I hate that they’re getting away with it, time and time again, for over seven years.

Nvidia's been at this way longer than 7 years. They were cheating at benchmarks to control a narrative back in 2003. https://tech.slashdot.org/story/03/05/23/1516220/futuremark-...

mcdeltat 7/5/2025||
Anyone else getting a bit disillusioned with the whole tech hardware improvements thing? Seems like every year we get less improvement for higher cost and the use cases become less useful. Like the whole industry is becoming a rent seeking exercise with diminishing returns. I used to follow hardware improvements and now largely don't because I realised I (and probably most of us) don't need it.

It's staggering that we are throwing so many resources at marginal improvements for things like gaming, and I say that as someone whose main hobby used to be gaming. Ray tracing, path tracing, DLSS, etc. at a price point of $3000 just for the GPU: who cares, when a 2010 cel-shaded game running on an upmarket toaster gave me the utmost joy? And the AI use cases don't impress me either; it seems like all we do each generation is burn more power to shove more data through and pray for an improvement (collecting sweet $$$ in the meantime).

Another commenter here said it well, there's just so much more you can do with your life than follow along with this drama.

philistine 7/5/2025||
Your disillusionment is warranted, but I'll say that on the Mac side the grass has never been greener. The M chips are screamers year after year, the GPUs are getting ok, the ML cores are incredible and actually useful.
mcdeltat 7/5/2025|||
Good point, we should commend genuinely novel efforts towards making baseline computation more efficient, like Apple has done as you say. Particularly in light of recent x86 development which seems to be "shove as many cores as possible on a die and heat your apartment while your power supply combusts" (meanwhile the software gets less efficient by the day, but that's another thing altogether...). ANY DAY of the week I will take a compute platform that's no-bs no-bells-and-whistles simply more efficient without the manufacturer trying to blow smoke up our asses.
hot_gril 7/6/2025|||
Yeah, going from Intel to M1 was a huge improvement, but not in every way. So now they're closing all the other gaps, and it's getting even better.
bamboozled 7/5/2025|||
I remember when it was a serious difference, like PS1-PS3 was absolutely miraculous and exciting to watch.

It's also fun that no matter how fast the hardware seems to get, we seem to fill it up with shitty bloated software.

mcdeltat 7/5/2025||
IMO at some point in the history of software we lost track of hardware capabilities versus software end outcomes. Hardware improved many orders of magnitude but overall software quality/usefulness/efficiency did not (yes this is a hill I will die on). We've ended up with mostly garbage and an occasional legitimately brilliant use of transistors.
keyringlight 7/5/2025|||
What stands out to me is that it's not just the hardware side: software production to make use of it and realize the benefits on offer doesn't seem to be running smoothly either, at least for gaming. I'm not sure Nvidia really cares too much, though, as there's no market pressure on them where it's a weakness; if consumer GPUs disappeared tomorrow they'd be fine.

A few months ago Jensen Huang said he sees quantum computing as the next big thing he wants Nvidia to be a part of over the next 10-15 years (a similar timeline to GPU compute), so I don't think consumer GPUs are a priority for anyone. Gaming used to be the main objective, with byproducts for professional usage; for the past few years that's reversed, with gaming piggybacking on aspects shared with compute.

seydor 7/5/2025||
Our stock investments are going up, so... what can we do other than shrug?
porphyra 7/4/2025||
The article complains about issues with consumer GPUs but those are nowadays relegated to being merely a side hobby project of Nvidia, whose core business is enterprise AI chips. Anyway Nvidia still has no significant competition from AMD on either front so they are still getting away with this.

Deceptive marketing aside, it's true that it's sad we can't get 4K 60 Hz with ray tracing on current hardware without some kind of AI denoising and upscaling, but ray tracing is really just _profoundly_ hard, so I can't really blame anyone for not having figured out how to put it in a consumer PC yet. There's a reason Pixar movies need huge render farms that take lots of time per frame. We would probably sooner get Gaussian splatting and real-time diffusion models in games than nice full-resolution ray tracing, tbh.

Jabrov 7/4/2025|
I get ray tracing at 4K 60Hz with my 4090 just fine
marcellus23 7/5/2025|||
What game? And with no upscaling or anything?
trynumber9 7/5/2025|||
Really? I can't even play Minecraft (DXR: ON) at 4K 60Hz on a RTX 5090...

Maybe another regression in Blackwell.

reichstein 7/5/2025||
Aks. "Every beef anyone has ever had with Nvidia in one outrage friendly article."

If you want to hate on Nvidia, there'll be something for you in there.

An entire section on 12vhpwr connectors, with no mention of 12V-2x6.

A lot of "OMG Monopoly" and "why won't people buy AMD" without considering that maybe ... AMD cards are not considered by the general public to be as good _where it counts_. (Like benefit per Watt, aka heat.) Maybe it's all perception, but then AMD should work on that perception. If you want the cooler CPU/GPU, perception is that that's Intel/Nvidia. That's reason enough for me, and many others.

Availability isn't great, I'll admit that, if you don't want to settle for a 5060.

Dylan16807 7/5/2025|
> The competing open standard is FreeSync, spearheaded by AMD. Since 2019, NVIDIA also supports FreeSync, but under their “G-Sync Compatible” branding. Personally, I wouldn’t bother with G-Sync when a competing, open standard exists and differences are negligible[4].

Open is good, but the open standard itself is not enough. You need some kind of testing/certification, which is built in to the G-Sync process. AMD does have a FreeSync certification program now which is good.

If you rely on just the standard, some manufacturers get really lazy. One of my screens technically supports FreeSync but I turned it off day one because it has a narrow range and flickers very badly.
