Posted by todsacerdoti 7/4/2025
I hope they get hit with a class action lawsuit and are forced to recall and properly fix these products before anyone dies as a result of their shoddy engineering.
EDIT: Plaintiff dismissed it. Guessing they settled. Here are the court documents (alternatively, shakna's links below include unredacted copies):
https://www.classaction.org/media/plaintiff-v-nvidia-corpora...
https://www.classaction.org/media/plaintiff-v-nvidia-corpora...
A GamersNexus article investigating the matter: https://gamersnexus.net/gpus/12vhpwr-dumpster-fire-investiga...
And a video referenced in the original post, describing how the design changed from one that proactively managed current balancing, to simply bundling all the connections together and hoping for the best: https://youtu.be/kb5YzMoVQyw
Sounds like it was settled out of court.
[0] https://www.docketalarm.com/cases/California_Northern_Distri...
I’m curious whether the 5090 package was not following UL requirements.
Would that make them even more liable?
Part of me believes that the blame here is probably on the manufacturers and that this isn’t a problem with Nvidia corporate.
I might actually be happy to buy one of these things, at the inflated price, and run it at half voltage or something... but I can't tell whether that would fix these concerns or whether they're just bad cards.
And yet you speak of it like it's a representative model. Do you also use a Hummer EV to measure all EVs?
If I were interested in using an EV to haul particularly heavy loads, I might be interested in the Hummer EV and have similar questions that might sound ridiculous.
As a bonus, if the gauge is large enough, the cable would actually cool the connectors, although that should not be necessary since the failure appears to be caused by overloaded wires dumping heat into the connector as they overheat.
Or at least I think so? Was that a different 12VHPWR scandal?
Another problem is that when the connector is angled, several of the pins may not make contact, shoving all the power through as few as one wire. A common bus would help with this, but the contact resistance in that case is still bad.
It is technically possible to solder a new connector on. LTT did that in a video. https://www.youtube.com/watch?v=WzwrLLg1RR4
Now imagine a magnifying glass that big (or, more practically, a Fresnel lens) concentrating all that light into one square inch. That's a lot of power. When copper connections don't work perfectly they have nonzero resistance, and the current running through them turns into heat according to P = I²R.
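To put rough numbers on that I²R point, here's a quick Python sketch. The 50 A total (roughly 600 W at 12 V) and the 5 mΩ per-contact resistance are illustrative assumptions, not measurements from any particular card or cable:

    # Heat dissipated in a single connector contact: P = I^2 * R.
    # The current and resistance values are assumptions for illustration only.
    def contact_heat_watts(current_amps, resistance_ohms):
        return current_amps ** 2 * resistance_ohms

    total_current = 50.0        # roughly 600 W at 12 V across the +12 V pins
    contact_resistance = 0.005  # assumed 5 milliohm per mated contact

    per_pin = total_current / 6  # balanced across six power pins
    print(contact_heat_watts(per_pin, contact_resistance))        # ~0.35 W per contact
    print(contact_heat_watts(total_current, contact_resistance))  # ~12.5 W if one pin carries it all

Because the heat scales with the square of the current, pushing everything through one contact produces roughly 36x the heat in that contact, which is why the lack of per-pin balancing matters so much.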
I found it super alarming, because why would they fake something on stage to the extent of just lying? I know Steve Jobs had backup phones, but just claiming a robot is autonomous when it isn't... I just feel it was scammy.
It reminded me of when Tesla had remote controlled Optimus bots. I mean I think that’s awesome like super cool but clearly the users thought the robots were autonomous during that dinner party.
I have no idea why I seem to be the only person bothered by “stage lies” to this level. TBH, even the Tesla bots weren't claimed to be autonomous, so I actually should never have mentioned them, but it explains the “not real” vibe.
Not meaning to disparage, just explaining my perception as a European. Maybe it's just me, though!
EDIT > I'm kinda surprised by the weak arguments in the replies. I love both companies; I am just offering POSITIVE feedback, that it's important (in my eyes) to be careful not to pretend in certain specific ways, or it makes the viewer question the foundation (which we all know is SOLID and good).
EDIT 2 > There actually is a good rebuttal in the replies. Although apparently I have "reading comprehension skill deficiencies", it's just my POV that they were insinuating the robot was aware of its surroundings, which is fair enough.
So there’s at least a bit more “there” there than the Tesla bots.
See this snippet: "Operator Commands Are Merged: The control system blends expressive animation commands (e.g., wave, look left) with balance-maintaining RL motions"
I will print a full retraction if someone can confirm my gut feeling is correct
It's easier to stabilise from an operator-initiated wave, really; it knows it's happening before it does the wave, and would have a model of the forces it'll induce.
Please elaborate, unless I'm being thick.
EDIT > I upvoted your comment in any case, as I'm sure it's helping
Oh my god. What the hell is happening to STEM education? Control systems engineering is standard parlance. Is this what comp sci people are like?
At best the advantage of connecting those systems is that the operator command can inform the balance system, but there's nothing novel about that.
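As a toy illustration of what "the operator command can inform the balance system" could look like, here's a one-dimensional Python sketch. The balance_torque function, the gains, and the fake disturbance term are all made up for this example; this is not meant to represent any actual robot's control stack:

    # Toy 1-D balance controller: PD feedback on body tilt, plus an optional
    # feedforward term for the disturbance a scripted motion (e.g. a wave) is
    # expected to cause. Entirely illustrative; not any real robot's controller.
    def balance_torque(tilt, tilt_rate, expected_disturbance=0.0):
        kp, kd = 40.0, 8.0
        feedback = -kp * tilt - kd * tilt_rate
        feedforward = -expected_disturbance  # known in advance for scripted moves
        return feedback + feedforward

    print(balance_torque(0.05, 0.2))       # purely reactive
    print(balance_torque(0.05, 0.2, 1.5))  # pre-compensating for the wave

Feedforward from a known, scripted command is textbook control engineering; it doesn't by itself say anything about autonomy.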
Your understanding of AI and robotics is more cucumber than pear-shaped. You're making very little technical sense here. Challenges and progress in robotics aren't where you think they are. It's all propagandistic content you're basing your understanding on.
If you're getting information from TikTok or YouTube Shorts style content, especially around Tesla bros - get the hell out of it at Ludicrous Speed. Or consume way more of it so thoroughly that you cannot be deceived anymore despite blatant lies everywhere. Then come back. They're all plain wrong and it's not good for you.
I hate being lied to, especially if it's so the liar can reap some economic advantage from having the lie believed.
Do business with people that are known liars? And just get repeatedly deceived?
…Though upon reflection that would explain why the depression rate is so high.
If you think the droid was autonomous then I guess that is evidence that nvidia were misrepresenting (if not lying).
Having seen these droids outside of the nvidia presentation and watching the nvidia presentation, I think it’s obvious it was human operated and that nvidia were misleading people.
Neither company was very forthcoming about the robots being piloted, but neither seems to be denying it either. And both seem to use RL / ML techniques to maintain balance, locomotion, etc. Not unlike Boston Dynamics' bots, which are also very carefully orchestrated by humans in multiple ways.
Haters gonna hate (downvotes just prove it - ha!)
Yet he lists all the RL stuff that we know is used in the robot. He isn't staying silent, or saying "this robot is aided by AI", or, better yet, declining to comment on the specifics (which would have been totally OK); instead he is saying "This is real life simulation", which it isn't.
EDIT > apparently I am wrong - thank you for the correction everyone!
It is clearly - to me at least - doing both of those things.
I think you're reading things into what he said that aren't there.
I don't know what you're referring to, but I'd just say that I don't believe what you are describing could have possibly happened.
Nvidia is a huge corporation, with more than a few lawyers on staff and on retainer, and what you are describing is criminal fraud that any plaintiff's lawyer would have a field day with. So, given that, and since I don't think people who work at Nvidia are complete idiots, I think whatever you are describing didn't happen the way you are describing it. Now, it's certainly possible there was some small print disclaimer, or there was some "weasel wording" that described something with ambiguity, but when you accuse someone of criminal fraud you want to have more than "hey this is just my opinion" to back it up.
AI doesn't mean anything. You can claim anything uses "AI" and just define what that means yourself. They could have some basic anti-collision technology and claim it's "AI".
It's complete cult crazy talk. Not even cargo cult, it's proper cultism.
"Corporate puffery"
The failure rate is just barely acceptable in a consumer use-case with a single card, but with multiple cards the probability of failure (which takes down the whole machine, as there's no way to hot-swap the card) makes it unusable.
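A rough sketch of that scaling, assuming independent failures and a made-up 1% per-card failure probability (a placeholder, not a real statistic):

    # If any one card failing takes down the whole machine, the machine's
    # failure probability grows with card count: 1 - (1 - p)^n.
    p_card = 0.01  # hypothetical per-card failure probability over some period

    for n in (1, 2, 4, 8):
        p_machine = 1 - (1 - p_card) ** n
        print(n, round(p_machine, 4))  # 0.01, 0.0199, 0.0394, 0.0773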
I can't otherwise see why they'd persevere on that stupid connector when better alternatives exist.
What's so special about NVENC that Vulkan video or VAAPI can't provide?
> AMD also has accelerated video transcoding tech but for some reason nobody seems to be willing to implement it into their products
OBS works with VAAPI fine. Looking forward to them adding Vulkan video as an option.
Either way, as a Linux gamer I haven't touched Nvidia in years. AMD is a way better experience.
This is wrong. The 50 series uses 12V-2x6, not 12VHPWR. The 30 series was the first to use 12VHPWR. The 40 series was the second to use 12VHPWR and the first to use 12V-2x6. The 50 series was the second to use 12V-2x6. The female connectors are what changed in 12V-2x6; the male connectors are identical between 12V-2x6 and 12VHPWR.
That said, the industry seems to be moving to adding detection into the PSU, given Seasonic's announcement:
https://www.tomshardware.com/pc-components/power-supplies/se...
Finally, I think there is a simpler solution, which is to change the cable to use two large gauge wires instead of 12 individual ones to carry current. That would eliminate the need for balancing the wires in the first place.
I do like the idea of just using big wires. It’d be so much cleaner and simpler. Also using 24 or 48V would be nice, but that’d be an even bigger departure from current designs.
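A rough back-of-the-envelope sketch of both ideas, assuming the existing cable uses 16 AWG conductors, a 600 W load, and half a metre of cable (all illustrative figures; the AWG resistances are standard approximate copper values):

    # Resistive loss in the cable for a few layouts, counting both the supply
    # and return conductors. Resistances are approximate copper values (ohm/m).
    r_per_m = {"16awg": 0.0132, "8awg": 0.0021}
    length_m = 0.5  # assumed cable length

    def cable_loss_watts(power_w, volts, n_pairs, gauge):
        current_per_wire = power_w / volts / n_pairs
        r = r_per_m[gauge] * length_m * 2  # out and back
        return n_pairs * current_per_wire ** 2 * r

    print(cable_loss_watts(600, 12, 6, "16awg"))  # six 16 AWG pairs at 12 V, ~5.5 W
    print(cable_loss_watts(600, 12, 1, "8awg"))   # one fat 8 AWG pair at 12 V, ~5.2 W
    print(cable_loss_watts(600, 48, 1, "8awg"))   # one fat pair at 48 V, ~0.3 W

On those assumed numbers, the single fat pair mostly wins by removing the balancing problem (one conductor, nothing to share unevenly), while the higher voltage is what actually slashes the current and therefore the heat.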
My point is that the PCB is where such features would be present, not the connector. There are connectors that have fusing; the UK's AC power plugs are an example. The connectors inside PCs are not.
https://www.corsair.com/us/en/explorer/diy-builder/power-sup...
Nvidia's been at this way longer than 7 years. They were cheating at benchmarks to control a narrative back in 2003. https://tech.slashdot.org/story/03/05/23/1516220/futuremark-...
It's staggering that we are throwing so many resources at marginal improvements for things like gaming, and I say that as someone whose main hobby used to be gaming. Ray tracing, path tracing, DLSS, etc. at a price point of $3000 just for the GPU - who cares, when a 2010 cel-shaded game running on an upmarket toaster gave me the utmost joy? And the AI use cases don't impress me either - it seems like all we do each generation is burn more power to shove more data through and pray for an improvement (collecting sweet $$$ in the meantime).
Another commenter here said it well, there's just so much more you can do with your life than follow along with this drama.
It's also fun that no matter how fast the hardware seems to get, we seem to fill it up with shitty bloated software.
A few months ago Jensen Huang said he sees quantum computing as the next big thing he wants Nvidia to be a part of over the next 10-15 years (which seems like a similar timeline as GPU compute), so I don't think consumer GPUs are a priority for anyone. Gaming used to be the main objective, with byproducts for professional usage; for the past few years that's reversed, where gaming piggybacks on common aspects of compute.
Deceptive marketing aside, it's true that it's sad that we can't get 4K 60 Hz with ray tracing on current hardware without some kind of AI denoising and upscaling, but ray tracing is really just _profoundly_ hard, so I can't really blame anyone for not having figured out how to put it in a consumer PC yet. There's a reason why Pixar movies need huge render farms that take lots of time per frame. We would probably sooner get Gaussian splatting and real-time diffusion models in games than nice full-resolution ray tracing, tbh.
Maybe another regression in Blackwell.
If you want to hate on Nvidia, there'll be something for you in there.
An entire section on 12vhpwr connectors, with no mention of 12V-2x6.
A lot of "OMG Monopoly" and "why won't people buy AMD" without considering that maybe ... AMD cards are not considered by the general public to be as good _where it counts_. (Like benefit per Watt, aka heat.) Maybe it's all perception, but then AMD should work on that perception. If you want the cooler CPU/GPU, perception is that that's Intel/Nvidia. That's reason enough for me, and many others.
Availability isn't great, I'll admit that, if you don't want to settle for a 5060.
Open is good, but the open standard itself is not enough. You need some kind of testing/certification, which is built in to the G-Sync process. AMD does have a FreeSync certification program now which is good.
If you rely on just the standard, some manufacturers get really lazy. One of my screens technically supports FreeSync but I turned it off day one because it has a narrow range and flickers very badly.