
Posted by todsacerdoti 17 hours ago

Nvidia is full of shit (blog.sebin-nyshkim.net)
673 points | 345 comments
leakycap 17 hours ago|
This article goes much deeper than I expected, and is a nice recap of the last few years of "green" gpu drama.

Liars or not, the performance has not been there for me in any of my use cases, from personal to professional.

A system from 2017/2018 with an 8700K and an 8GB 2080 performs so closely to the top end, expensive systems today that it makes almost no sense to upgrade at MSRP+markup unless your system is older than this.

Unless you need specific features only on more recent cards, there are very few use cases I can think of needing more than a 30 series card right now.

theshackleford 8 hours ago||
> A system from 2017/2018 with an 8700K and an 8GB 2080 performs so closely to the top end, expensive systems today

This is in no way true and is quite an absurd claim. Unless you meant for some specific isolated purpose restricted purely to yourself and your performance needs.

> there are very few use cases I can think of needing more than a 30 series card right now.

How about I like high refresh and high resolutions? I'll throw in VR to boot. Which are my real use cases. I use a high refresh 4K display and VR; both have benefited hugely from my 2080 Ti > 4090 shift.

pixl97 16 hours ago||
I mean, most people probably won't directly upgrade. Their old card will die, or eventually nvidia will stop making drivers for it. Unless you're looking around for used cards, something low end like a 3060 isn't that much cheaper for the length of support you're going to get.

Unless nvidia's money printing machine breaks soon, expect the same to continue for the next 3+ years. Crappy expensive cards with a premium on memory with almost no actual video rendering performance increase.

leakycap 16 hours ago||
> Unless you're looking around for used cards, something low end like a 3060 isn't that much cheaper for the length of support you're going to get.

This does not somehow give purchasers more budget room now, but they can buy 30-series cards in spades and, as a little bonus, not have to worry about the same heat and power delivery issues.

ionwake 16 hours ago||
I don’t want to jump on nvidia but I found it super weird when they clearly remote controlled a Disney bot onto the stage and claimed it was all using real time AI which was clearly impossible due to no latency and weirdly the bot verifying correct stage position in relation to the presenter. It was obviously the Disney bot just being controlled by someone off stage.

I found it super alarming, because why would they fake something on stage to the extent of just lying? I know Steve Jobs had backup phones, but just claiming a robot is autonomous when it isn't, I just feel it was scammy.

It reminded me of when Tesla had remote controlled Optimus bots. I mean I think that’s awesome like super cool but clearly the users thought the robots were autonomous during that dinner party.

I have no idea why I seem to be the only person bothered by “stage lies” to this level. Tbh even the Tesla bots weren’t claimed to be autonomous so actually I should never have mentioned them but it explains the “not real” vibe.

Not meaning to disparage, just explaining my perception as a European. Maybe it's just me though!

EDIT > I'm kinda surprised by the weak arguments in the replies. I love both companies, I am just offering POSITIVE feedback: that it's important (in my eyes) to be careful not to pretend in certain specific ways, or it makes the viewer question the foundation (which we all know is SOLID and good).

EDIT 2 > There actually is a good rebuttal in the replies, although apparently I have "reading comprehension skill deficiencies". It's just my POV that they were insinuating the robot was aware of its surroundings, which is fair enough.

elil17 16 hours ago||
As I understand it the Disney bots do actually use AI in a novel way: https://la.disneyresearch.com/publication/design-and-control...

So there’s at least a bit more “there” there than the Tesla bots.

ionwake 16 hours ago||
I believe it's RL-trained only.

See this snippet: "Operator Commands Are Merged: The control system blends expressive animation commands (e.g., wave, look left) with balance-maintaining RL motions"

I will print a full retraction if someone can confirm my gut feeling is correct

dwattttt 15 hours ago|||
Having worked on control systems a long time ago, that's a 'nothing' statement: the whole job of the control system is to keep the robot stable/ambulating, regardless of whatever disturbances occur. It's meant to reject the forces induced due to waving exactly as much as bumping into something unexpected.

It's easier to stabilise from an operator initiated wave, really; it knows it's happening before it does the wave, and would have a model of the forces it'll induce.

ionwake 15 hours ago||
I tried to understand the point of your reply but I'm not sure what your point was - I only seemed to glean "it's easier to balance if the operator is moving it".

Please elaborate unless I'm being thick.

EDIT > I upvoted your comment in any case as I'm sure it's helping

rcxdude 15 hours ago|||
'control system' in this case is not implying remote control; it's referring to the feedback system that adjusts the actuators in response to the sensed information. If the motion is controlled automatically, then the control loop can in principle anticipate the motion in a way that it could not if it was remote controlled. I.e. the opposite: it's easier to control the motions (in terms of maintaining balance and avoiding overstressing the actuators) if the operator is not live puppeteering it.
dwattttt 15 hours ago|||
Apologies, yes, "control system" is somewhat niche jargon. "Balance system" is probably more appropriate.
dboreham 15 hours ago||
Well "control system" is a proper term understood by anyone with a decent STEM education since 150 years ago.
ionwake 15 hours ago|||
Thank you for the explanation
dwattttt 15 hours ago|||
It's that there's nothing special about blending "operator initiated animation commands" with the RL balancing system. The balance system has to balance anyway; if there was no connection between an operator's wave command and balance, it would have exactly the same job to do.

At best the advantage of connecting those systems is that the operator command can inform the balance system, but there's nothing novel about that.
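
To make that concrete, here's a toy sketch of what "merging operator commands with a balance controller" can look like. It's purely illustrative; the structure, names and gains are assumptions, not Disney's or NVIDIA's actual code.

    # Toy example: blend an operator-initiated animation command with a
    # feedback balance controller. All names, gains and the "predicted
    # torque" model are invented for illustration.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class AnimationCommand:
        name: str
        predicted_torque: float  # disturbance the scripted motion is expected to induce (N*m)

    def balance_step(lean_angle: float, lean_rate: float,
                     command: Optional[AnimationCommand],
                     kp: float = 40.0, kd: float = 5.0) -> float:
        """Corrective hip torque for one control tick."""
        # Feedback: react to the measured lean, whatever caused it
        # (a wave, a shove, uneven ground). This alone keeps it upright.
        feedback = -kp * lean_angle - kd * lean_rate
        # Feedforward: a scheduled wave is known before it happens, so its
        # effect can be cancelled preemptively -- the "easier" case above.
        feedforward = -command.predicted_torque if command else 0.0
        return feedback + feedforward

    torque = balance_step(0.02, -0.1, AnimationCommand("wave_right", 1.5))

The feedback term would do the same job with or without the command wired in; the merge just gives it advance notice.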

numpad0 12 hours ago||||
"RL is not AI" "Disney bots were remote controlled" are major AI hypebro delulu moment lol

Your understanding of AI and robotics is more cucumber than pear shaped. You're making very little technical sense here. Challenges and progress in robotics aren't where you think they are. It's all propagandish content you're basing your understanding on.

If you're getting information from TikTok or YouTube Shorts style content, especially around Tesla bros - get the hell out of it at Ludicrous Speed. Or consume way more of it so thoroughly that you cannot be deceived anymore despite blatant lies everywhere. Then come back. They're all plain wrong and it's not good for you.

elil17 14 hours ago|||
Only as opposed to what? VLAM/something else more trendy?
CoastalCoder 16 hours ago|||
Not just you.

I hate being lied to, especially if it's so the liar can reap some economic advantage from having the lie believed.

AnimalMuppet 15 hours ago||
Yeah. I have a general rule that I don't do business with people who lie to me.
MichaelZuo 14 hours ago||
I can’t even imagine what kind of person would not follow that rule.

Do business with people that are known liars? And just get repeatedly deceived?

…Though upon reflection that would explain why the depression rate is so high.

ionwake 2 hours ago|||
Not sure why my comment got so upvoted, all my comments are my personal opinion based solely on the publicly streamed video, and as I said, I’ll happily correct or retract my impression.
frollogaston 15 hours ago|||
There's also a very thick coat of hype in https://www.nvidia.com/en-us/glossary/ai-factory/ and related material, even though the underlying product (an ML training cluster) is real.
hn_throwaway_99 13 hours ago|||
> I don’t want to jump on nvidia but I found it super weird when they clearly remote controlled a Disney bot onto the stage and claimed it was all using real time AI which was clearly impossible due to no latency and weirdly the bot verifying correct stage position in relation to the presenter. It was obviously the Disney bot just being controlled by someone off stage.

I don't know what you're referring to, but I'd just say that I don't believe what you are describing could have possibly happened.

Nvidia is a huge corporation, with more than a few lawyers on staff and on retainer, and what you are describing is criminal fraud that any plaintiff's lawyer would have a field day with. So, given that, and since I don't think people who work at Nvidia are complete idiots, I think whatever you are describing didn't happen the way you are describing it. Now, it's certainly possible there was some small print disclaimer, or there was some "weasel wording" that described something with ambiguity, but when you accuse someone of criminal fraud you want to have more than "hey this is just my opinion" to back it up.

kalleboo 11 hours ago|||
Tefal literally sells a rice cooker that boasts "AI Smart Cooking Technology" while not even containing a microcontroller and just being controlled by the time-honored technology of "a magnet that gets hot". They also have lawyers.

AI doesn't mean anything. You can claim anything uses "AI" and just define what that means yourself. They could have some basic anti-collision technology and claim it's "AI".

moogly 2 hours ago||||
> what you are describing is criminal fraud that any plaintiff's lawyer would have a field day with

"Corporate puffery"

numpad0 12 hours ago|||
They're soaked eyebrows-deep in TikTok-style hype juice, believing the latest breakthrough in robotics is that AGIs just casually started walking and talking on their own, and that therefore anything code-controlled by now is considered proof of ineptitude and fake.

It's complete cult crazy talk. Not even cargo cult, it's proper cultism.

AtariATMHacker 16 hours ago|||
[dead]
abxyz 15 hours ago|||
Disney are open about their droids being operator controlled. Unless nvidia took a Disney droid and built it to be autonomous (which seems unlikely) it would follow that it is also operator controlled. The presentation was demonstrating what Disney had achieved using nvidia’s technology. You can see an explainer of how these droids use machine learning here: https://youtube.com/shorts/uWObkOV71ZI

If you think the droid was autonomous then I guess that is evidence that nvidia were misrepresenting (if not lying).

Having seen these droids outside of the nvidia presentation and watching the nvidia presentation, I think it’s obvious it was human operated and that nvidia were misleading people.

ionwake 15 hours ago||||
I think it's cool you disagree with me, it would be nice to hear a counter argument though.
AtariATMHacker 6 hours ago||
[dead]
Larrikin 15 hours ago||||
I assume any green accounts that are just asking questions with no research are usually lying. Actual new users will just comment and say their thoughts to join the community.
timschmidt 15 hours ago|||
It seems to me like both cases raised by OP - the Disney droids and Optimus - are cases of people making assumptions and then getting upset that their assumptions were wrong and making accusations.

Neither company was very forthcoming about the robots being piloted, but neither seems to be denying it either. And both seem to use RL / ML techniques to maintain balance, locomotion, etc. Not unlike Boston Dynamics' bots, which are also very carefully orchestrated by humans in multiple ways.

Haters gonna hate (downvotes just prove it - ha!)

ionwake 15 hours ago|||
If you look at the video, he basically says "this is real time simulation... can you believe it": https://www.youtube.com/shorts/jD5y1eQ3Y_o

Yet he lists all the RL stuff that we know is used in the robot. He isn't being silent and saying "this robot is aided by AI", or better yet, not commenting on the specifics (which would have been totally ok); instead he is saying "this is real time simulation", which it isn't.

EDIT > apparently I am wrong - thank you for the correction everyone!

timschmidt 15 hours ago||
I have written motion control firmware for 20+ years, and "this is real time simulation" has a very domain-specific meaning to me. "Real time" means the code is responding to events as they happen, like with interrupts, and not via preemptible processing which could get out of sync with events. "Simulation" is used by most control systems, from simple PID loops to advanced balancing and motion planning.

It is clearly - to me at least - doing both of those things.

I think you're reading things into what he said that aren't there.

ionwake 15 hours ago||
ok thanks
NewsaHackO 15 hours ago|||
Yea, this seems like the initial poster has reading comprehension skill deficiencies and is blaming NVIDIA for lying about a point they never made. NVIDIA is even releasing some of the code they used to power the robot, which further proves that they in no way said the robot was not being operator controlled, just that it was using AI to make its movement look more fluid.
ionwake 14 hours ago||
fair enough, upvoted.
topato 12 hours ago||
I seem to remember multiple posts on large tech websites having the exact same opinion/conclusion/insinuation as the one you originally had, so it's not necessarily a comprehension problem on your part. My opinion: Nvidia's CEO has a problem communicating in good faith. He absolutely knew what he was doing during that little stage show, and it was absolutely designed to mislead people toward the most "AI HYPE, PLEASE BUY GPUs, MY ROBOT NEEDS GPUS TO LIVE" conclusion.
abletonlive 16 hours ago||
[flagged]
omega3 16 hours ago||
Ableton Live is from Europe :)
gizajob 16 hours ago|||
You win the award for instant karma
ionwake 16 hours ago||||
oof!
abletonlive 15 hours ago|||
And it has fallen vastly behind other DAWs
gizajob 9 hours ago|||
Crazy talk. All the others have been playing catchup and still aren’t there with some things.
NetOpWibby 15 hours ago||||
I just want Acid Pro on Mac
windowshopping 15 hours ago|||
How so?
Nextgrid 16 hours ago||
I wonder if the 12VHPWR connector is intentionally defective to prevent large-scale use of those consumer cards in server/datacenter contexts?

The failure rate is just barely acceptable in a consumer use-case with a single card, but with multiple cards the probability of failure (which takes down the whole machine, as there's no way to hot-swap the card) makes it unusable.

I can't otherwise see why they'd persevere on that stupid connector when better alternatives exist.
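
Back-of-the-envelope on the scaling, with a made-up per-connector failure rate purely for illustration:

    # Hypothetical numbers only; 0.5% is an assumed per-connector failure
    # probability over its service life, not a measured figure.
    per_connector = 0.005

    for gpus in (1, 4, 8):
        p_machine = 1 - (1 - per_connector) ** gpus
        print(f"{gpus} GPU(s): {p_machine:.2%} chance of at least one bad connector")

    # 1 GPU(s): 0.50%   4 GPU(s): 1.99%   8 GPU(s): 3.93%

Whatever the real per-connector rate is, an 8-GPU box multiplies it by roughly eight, and without hot-swap each of those is a chance to take the whole machine down.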

transcriptase 15 hours ago||
It boggles my mind that an army of the most talented electrical engineers on earth somehow fumble a power connector and then don’t catch it before shipping.
mjevans 16 hours ago|||
Sunk cost fallacy and a burning (literal) desire to have small artistic things. That's probably also the reason the connector was densified so much and, clearly, released with so very little tolerance for error, human and otherwise.
ls612 11 hours ago|||
They use the 12VHPWR on some datacenter cards too.
KerrAvon 15 hours ago||
IANAL, but knowingly leaving a serious defect in your product at scale for that purpose would be very bad behavior, and juries tend not to like that sort of thing.
thimabi 13 hours ago||
However, as we’ve learned from the Epic vs Apple case, corporations don’t really care about bad behavior — as long as their ulterior motives don’t get caught.
mcdeltat 11 hours ago||
Anyone else getting a bit disillusioned with the whole tech hardware improvements thing? Seems like every year we get less improvement for higher cost and the use cases become less useful. Like the whole industry is becoming a rent seeking exercise with diminishing returns. I used to follow hardware improvements and now largely don't because I realised I (and probably most of us) don't need it.

It's staggering that we are throwing so many resources at marginal improvements for things like gaming, and I say that as someone whose main hobby used to be gaming. Ray tracing, path tracing, DLSS, etc at a price point of $3000 just for the GPU - who cares when a 2010 cell shaded game running on an upmarket toaster gave me the utmost joy? And the AI use cases don't impress me either - seems like all we do each generation is burn more power to shove more data through and pray for an improvement (collecting sweet $$$ in the meantime).

Another commenter here said it well, there's just so much more you can do with your life than follow along with this drama.

philistine 10 hours ago||
Your disillusionment is warranted, but I'll say that on the Mac side the grass has never been greener. The M chips are screamers year after year, the GPUs are getting ok, the ML cores are incredible and actually useful.
mcdeltat 4 hours ago||
Good point, we should commend genuinely novel efforts towards making baseline computation more efficient, like Apple has done as you say. Particularly in light of recent x86 development which seems to be "shove as many cores as possible on a die and heat your apartment while your power supply combusts" (meanwhile the software gets less efficient by the day, but that's another thing altogether...). ANY DAY of the week I will take a compute platform that's no-bs no-bells-and-whistles simply more efficient without the manufacturer trying to blow smoke up our asses.
keyringlight 5 hours ago|||
What stands out to me is that it's not just the hardware side: the software needed to make use of it and realize the benefits on offer doesn't seem to be coming along smoothly either, at least for gaming. I'm not sure nvidia really cares too much though, as there's no market pressure on them where it's a weakness; if consumer GPUs disappeared tomorrow they'd be fine.

A few months ago Jensen Huang said he sees quantum computing as the next big thing he wants nvidia to be a part of over the next 10-15 years (which seems like a similar timeline to GPU compute), so I don't think consumer GPUs are a priority for anyone. Gaming used to be the main objective, with byproducts for professional usage; for the past few years that's reversed, and gaming piggybacks on aspects shared with compute.

bamboozled 10 hours ago|||
I remember when it was a serious difference, like PS1-PS3 was absolutely miraculous and exciting to watch.

It's also fun that no matter how fast the hardware seems to get, we seem to fill it up with shitty bloated software.

mcdeltat 1 hour ago||
IMO at some point in the history of software we lost track of hardware capabilities versus software end outcomes. Hardware improved many orders of magnitude but overall software quality/usefulness/efficiency did not (yes this is a hill I will die on). We've ended up with mostly garbage and an occasional legitimately brilliant use of transistors.
seydor 9 hours ago||
Our stock investments are going up so ...... What can we do other than shrug
ryao 16 hours ago||
> The RTX 50 series are the second generation of NVIDIA cards to use the 12VHPWR connector.

This is wrong. The 50 series uses 12V-2x6, not 12VHPWR. The 30 series was the first to use 12VHPWR. The 40 series was the second to use 12VHPWR and the first to use 12V-2x6. The 50 series was the second to use 12V-2x6. The female connectors are what changed in 12V-2x6. The male connectors are identical between 12V-2x6 and 12VHPWR.

ohdeargodno 16 hours ago|
Nitpicking it doesn't change the fact that the 12v2x6 connector _also_ burns down.
ryao 16 hours ago|||
The guy accuses Nvidia of not doing anything about that problem, but ignores that they did with the 12V-2x6 connector, which, as far as I can tell, has had far fewer issues.
Gracana 16 hours ago|||
It still has no fusing, sensing, or load balancing for the individual wires. It is a fire waiting to happen.
ryao 13 hours ago||
It is a connector. None of the connectors inside a PC have those. They could add them to the circuitry on the PCB side of the connector, but that is entirely separate from the connector.

That said, the industry seems to be moving to adding detection into the PSU, given Seasonic's announcement:

https://www.tomshardware.com/pc-components/power-supplies/se...

Finally, I think there is a simpler solution, which is to change the cable to use two large gauge wires instead of 12 individual ones to carry current. That would eliminate the need for balancing the wires in the first place.
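
Rough numbers on why balancing matters with 12 small conductors in the first place (the resistances below are assumptions for illustration, not measurements of any real cable):

    # Illustrative only: contact resistances are assumed, not measured.
    power_w, volts = 575.0, 12.0        # ballpark top-end board power
    total_a = power_w / volts           # ~48 A through the connector
    pins = 6                            # six +12 V current-carrying pins

    print(f"balanced: {total_a / pins:.1f} A per pin")   # ~8.0 A

    # If two contacts degrade badly, current redistributes to the rest
    # (parallel branches share inversely to resistance):
    r = [0.005] * 4 + [0.5] * 2         # ohms, assumed
    g = [1 / x for x in r]
    currents = [total_a * gi / sum(g) for gi in g]
    print(f"worst healthy pin: {max(currents):.1f} A")   # ~11.9 A

With nothing on the card sensing or limiting individual wires, the healthy pins quietly run past the 9-10 A or so each pin is typically rated for. Two heavy-gauge conductors can't become unbalanced like that, which is the appeal of the idea.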

Gracana 10 hours ago||
Previous well-designed video cards used the technologies I described. Eliminating the sense circuits and fusing is a recent development.

I do like the idea of just using big wires. It’d be so much cleaner and simpler. Also using 24 or 48V would be nice, but that’d be an even bigger departure from current designs.

ryao 1 hour ago||
> Previous well-designed video cards used the technologies I described. Eliminating the sense circuits and fusing is a recent development.

My point is that the PCB is where such features would be present, not the connector. There are connectors that have fusing. The UK’s AC power plugs are examples of them. The connectors inside PCs are not.

Gracana 7 minutes ago||
Oh, sure, I’m not proposing that the connector itself should have those features, rather that it shouldn’t be used without them present on the device.
MindSpunk 13 hours ago|||
The 50 series connectors burned up too. The issue was not fixed.
ryao 13 hours ago||
It seems incredibly wrong to assume that there was only 1 issue with 12VHPWR. 12V-2x6 was an improvement that eliminated some potential issues, not all of them. If you want to eliminate all of them, replace the 12 current carrying wires with 2 large gauge wires. Then the wires cannot become unbalanced. Of course, the connector would need to split the two into 12 very short wires to be compatible, but those would be recombined on the GPU’s PCB into a single wire.
numpad0 12 hours ago|||
(context: 12VHPWR and 12V-2x6 are the exact same thing. The latter is supposed to be improved and totally fixed, complete with the underspecced load-bearing "supposed to be" clause.)
AzN1337c0d3r 9 hours ago||
They are not the exact same thing.

https://www.corsair.com/us/en/explorer/diy-builder/power-sup...

DeepYogurt 11 hours ago||
> And I hate that they’re getting away with it, time and time again, for over seven years.

Nvidia's been at this way longer than 7 years. They were cheating at benchmarks to control a narrative back in 2003. https://tech.slashdot.org/story/03/05/23/1516220/futuremark-...

Dylan16807 13 hours ago||
> The competing open standard is FreeSync, spearheaded by AMD. Since 2019, NVIDIA also supports FreeSync, but under their “G-Sync Compatible” branding. Personally, I wouldn’t bother with G-Sync when a competing, open standard exists and differences are negligible[4].

Open is good, but the open standard itself is not enough. You need some kind of testing/certification, which is built in to the G-Sync process. AMD does have a FreeSync certification program now which is good.

If you rely on just the standard, some manufacturers get really lazy. One of my screens technically supports FreeSync but I turned it off day one because it has a narrow range and flickers very badly.

porphyra 16 hours ago||
The article complains about issues with consumer GPUs but those are nowadays relegated to being merely a side hobby project of Nvidia, whose core business is enterprise AI chips. Anyway Nvidia still has no significant competition from AMD on either front so they are still getting away with this.

Deceptive marketing aside, it's true that it's sad that we can't get 4K 60 Hz with ray tracing on current hardware without some kind of AI denoising and upscaling, but ray tracing is really just _profoundly_ hard, so I can't really blame anyone for not having figured out how to put it in a consumer PC yet. There's a reason why Pixar movies need huge render farms that take lots of time per frame. We would probably sooner get Gaussian splatting and real time diffusion models in games than nice full resolution ray tracing tbh.

Jabrov 15 hours ago|
I get ray tracing at 4K 60Hz with my 4090 just fine
trynumber9 13 hours ago||
Really? I can't even play Minecraft (DXR: ON) at 4K 60Hz on a RTX 5090...

Maybe another regression in Blackwell.

yunyu 16 hours ago||
If you are a gamer, you are no longer NVIDIA's most important customer.
Rapzid 7 hours ago||
Sounds like an opening for AMD then. But as long as NVidia has the best tech I'll keep buying it when it's time to upgrade.
bigyabai 16 hours ago|||
A revelation on-par with Mac users waking up to learn their computer was made by a phone company.
ravetcofx 15 hours ago||
Barely even a phone company, more like an app store and microtransaction services company
theshackleford 8 hours ago|||
Yes, but why should I care, provided the product they have already sold me continues to work? How does this materially change my life because Nvidia doesn't want to go steady with me anymore?
dcchambers 15 hours ago||
Haven't been for a while. Not since crypto bros started buying up GPUs for coin mining.
musebox35 8 hours ago|
With the rise of LLM training, Nvidia’s main revenue stream switched to datacenter GPUs (>10x gaming revenue). I wonder whether this has affected the quality of these consumer cards, including both their design and production processes:

https://stockanalysis.com/stocks/nvda/metrics/revenue-by-seg...
