I have been rocking an AMD GPU ever since the drivers were upstreamed into the Linux kernel. No regrets.
I have also realized that there is a lot out there in the world besides video games, and getting all in a huff about it isn’t worth my time or energy. But consumer gotta consoooooom and then cry and outrage when they are exploited instead of just walking away and doing something else.
Same with Magic: The Gathering. The game went to shit and so many people got outraged and in a big huff, but they still spend thousands on the hobby. I just stopped playing MTG.
1. Nvidia cards
2. Hooked up to Linux boxes
It turns out that Nvidia tends to work pretty well on Linux too, despite the binary blob drivers.
Other than gaming and ML, I'm not sure what the value of spending much on a GPU is... AMD is just in a tough spot.
I'd really love to try AMD as a daily driver. For me CUDA is the showstopper. There's really nothing comparable in the AMD camp.
I guess there are games that you can only play on PC with Nvidia graphics. That begs the question why someone would create a game and ignore the large console market.
With their current generation of cards AMD has caught up on all of those things except CUDA, and Intel is in a similar spot now that they've had time to improve their drivers, so it's pretty easy now to buy a non-Nvidia card without feeling like you're giving anything up.
I suspect the thing you're referring to is ZLUDA[0]; it allows you to run CUDA code on a range of non-Nvidia hardware (for some value of "run").
Traditionally the NVIDIA drivers have been more stable on Windows than the AMD drivers. I chose an AMD card because I wanted a hassle-free experience on Linux (well, as much as you can get).
It's not that I can't live like this, I still have the same card, but if I were looking to do anything AI locally with a new card, for sure it wouldn't be an AMD one.
It's been great. Flawless, in fact.
Software. AMD has traditionally been really bad at their drivers. (They also missed the AI train and are trying to catch up).
I use Linux and have learned not to touch AMD GPUs (and to a lesser extent CPUs due to chipset quality/support) a long time ago. Even if they are better now, (I feel) Intel integrated (if no special GPU perf needed) or NVidia are less risky choices.
* Always buy AMD.
* Never buy Nvidia.
* Intel is also okay.
Because the AMD drivers are good and open-source, and AMD cares about bug reports. The Nvidia drivers can and will create issues because they're closed-source, and Nvidia avoided supporting Wayland for years. Now Nvidia has published source code but refuses to merge it into Linux and Mesa. facepalm

While Nvidia comes up with proprietary stuff, AMD brought us Vulkan and FreeSync, supported Wayland well early on with implicit sync (like Intel), and has used the regular video-acceleration APIs for a long time.
Meanwhile Nvidia:
https://registry.khronos.org/OpenGL/extensions/NV/NV_robustn...
It’s not a bug, it’s a feature!
Their bad drivers still don't handle simple actions like a VT switch or suspend/resume. If a developer doesn't know about that extension, the users suffer for years.

Okay, but that is probably only a short-term solution? It has been Nvidia's short-term solution since 2016!
NVIDIA's drivers also recently completely changed how they work. Hopefully that'll result in a lot of these long-term issues getting fixed. As I understand it, the change is this: the Nvidia drivers contain a huge amount of proprietary, closed-source code. This code used to be shipped as a closed-source binary blob which ran on your CPU, and that caused all sorts of problems, because it's Linux and you can't recompile their binary blob. Earlier this year, they moved all the secret, proprietary parts into a firmware image instead, which runs on a coprocessor within the GPU itself. That then allowed them to, at last, open-source (most? all?) of their remaining Linux driver code. And that means we can patch, change, and recompile that part of the driver, which should mean the Wayland and kernel teams can start fixing these issues.
In theory, users shouldn't notice any changes at all. But I suspect a lot of the Nvidia driver problems people have been running into lately have been fallout from this change.
Err, what? While you're right about Intel integrated GPUs being a safe choice, AMD has long since been the GPU of choice for Linux -- it just works. Whereas Nvidia on Linux has been flaky for as long as I can remember.
That said, I've been avoiding AMD in general for so long the ecosystem might have really improved in the meantime, as there was no incentive for me to try and switch.
Recently I've been dabbling in AI where AMD GPUs (well, sw ecosystem, really) are lagging behind. Just wasn't worth the hassle.
NVidia hw, once I set it up (which may be a bit involved), has been pretty stable for me.
Not OP, but I had the same experience in the past with AMD. I bought a new laptop, and within 6 months AMD decided my card was obsolete and no longer provided drivers, forcing me to be stuck with an older kernel/X11. So I switched to NVIDIA, and after 2 PC changes I still use NVIDIA, since the official drivers work great. I really hope AMD is now putting in the effort to keep older generations of cards working on the latest kernels/X11; maybe my next card will be AMD.
But this explains why some of us older Linux users have bad memories of AMD: we had good reason to switch over to NVIDIA and no good reason to switch back.
I'm with you - in principle. Capital-G "Gamers" who see themselves as the real discriminated group have fully earned the ridicule.
But I think where the criticism is valid is how NVIDIA's behavior is part of the wider enshittification trend in tech. Lock-in and overpricing in entertainment software might be acceptable, but it gets problematic when we have the exact same trends in actually critical tech like phones and cars.
My main hobby is videogames, but since I can consistently play most games on Linux (that has good AMD support), it doesn't really matter.
Efficiency: https://tpucdn.com/review/gigabyte-geforce-rtx-5050-gaming-o...
Vsync power draw: https://tpucdn.com/review/gigabyte-geforce-rtx-5050-gaming-o...
The variance within Nvidia's line-up is much larger than the variance between brands, anyway.
They will complain endlessly about the price of an RTX 5090 and still rush out to buy it. I know people who own these high-end cards as a flex, but whose lives are too busy to actually play games.
Now it is hard to draw a straight comparison. Gamers may spend a lot more time playing so $/h isn't a perfect metric. And some will frequently buy new games or worse things like microtransactions which quickly skyrocket the cost. But overall it doesn't seem like the most expensive hobby, especially if you are trying to spend less.
My favorite part about being a reformed gaming addict is the fact that my MacBook now covers ~100% of my computer use cases. The desktop is nice for Visual Studio but that's about it.
I'm still running a 5700XT in my desktop. I have absolutely zero desire to upgrade.
No mesh shader support though. I bet more games will start using that soon.
Just got a PRO 6000 96GB for model tuning/training/etc. The cheapest 'good enough' option for my needs.
Same boat. I have a 5700XT as well, and since 2023 I've mostly used my Mac for gaming.
...and even if you're all in on video games, there's a massive amount of really brilliant indie games on Steam that run just fine on a 1070 or 2070 (I still have my 2070 and haven't found a compelling reason to upgrade yet).
I think more and more people will realize games are a waste of time for them and go on to find other hobbies. As a game developer, it kinda worries me. As a gamer, I can't wait for gaming to be a niche thing again, haha.
And then there's the age group younger than me, for whom games are not only a hobby but also a "social place to be". I doubt they'll be dropping gaming entirely any time soon.
Nah. Games will always be around.
My experience with running non-game windows-only programs has been similar over the past ~5 years. It really is finally the Year of the Linux Desktop, only few people seem to have noticed.
I play a lot of HellDivers 2. Despite what a lot of Linux YouTubers say, it doesn't work very well on Linux. The recommendation I got from people was to change distro. I do other stuff on Linux. The game slows down right when you need it running smoothly, no matter what resolution/settings you set.
Anything with anti-cheat probably won't work very well if at all.
I also wanted to play the old Command & Conquer games. Getting the fan-made patchers (not the games themselves) to run properly, the ones that fix a bunch of bugs EA/Westwood never fixed and add mod support, was more difficult than I cared to bother with.
Make sure to change your Steam launch options to:
PULSE_LATENCY_MSEC=84 gamemoderun %command%
This will use gamemode to run it, give it priority, put the system in performance power mode, and fix any PulseAudio static you may be having. You can do this for any game you launch with Steam, any shortcut, etc.
It's missing probably 15fps on this card between Windows and Linux, and since it's above 100fps I really don't even notice.
It does seem to run a bit better under GNOME with Variable Refresh Rate than under KDE.
I did get it running nicely for about a day, and then an update was pushed and it ran like rubbish again. The game runs smoothly when a map first loads, and then there's a massive dip in frames for several seconds. This is usually when one of the bugs is jumping at you.
This game may work better on Fedora/Bazzite or <some other distro> but I find Debian to be super reliable and don't want to switch distro. I also don't like Fedora generally as I've found it unreliable in the past. I had a look at Bazzite and I honestly just wasn't interested. This is due to it having a bunch of technologies that I have no interest in using.
There are other issues that are tangential but related.
e.g.
I normally play on Super HellDive with other players in a Discord VC. Discord/PipeWire seems to reset my sound for no particular reason, and my Plantronics headset mic (a good headset, not some gamer nonsense) won't be found. This requires a restart of pipewire/wireplumber and Discord (in that order). It happens often enough that I have a shell script alias called "fix_discord".
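For reference, it's roughly something like the shell function below (just a sketch; the exact unit and process names are assumptions and depend on your setup, but on a systemd/PipeWire desktop they're usually pipewire, pipewire-pulse and wireplumber):

    # rough sketch of "fix_discord": restart the audio stack first, then Discord
    fix_discord() {
        systemctl --user restart pipewire pipewire-pulse wireplumber
        pkill -x Discord     # process name may differ for Flatpak/snap builds
        sleep 2
        discord &            # relaunch; adjust the command for your install
    }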
I have weird audio problems on HDMI (AMD card) thanks to a regression in the kernel (Kernel 6.1 with Debian worked fine).
I could mess about with this for ages and maybe get it working or just reboot into Windows which takes me all of a minute.
It is just easier to use Windows for gaming and Linux for work stuff.
Honestly? Fedora is really the premier Linux distro these days. It's where most of the development is happening, by far.
All of my hardware, some old, some brand new (AMD card), worked flawlessly out of the box.
There was a point when you couldn't get me to use an rpm-based distro if my life depended on it. That time is long gone.
It's the same reason I don't want to use Bazzite. It misses the point of using a Linux/Unix system altogether.
I also learned a long time ago Distro Hopping doesn't actually fix your issues. You just end up either with the same issues or different ones. If I switched from Debian to Fedora, I suspect I would have many of the same issues.
e.g. if an issue is in the Linux kernel itself, such as HDMI audio on AMD cards having random noise, I fail to see how changing from one distro to another would help. Fedora might have a custom patch to fix it, but I could also take that patch and build my own kernel image (which I've done in the past, btw).
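(For what it's worth, carrying a patch on Debian is not a big deal; roughly the flow below, where the patch filename is just a placeholder:)

    # rebuild a Debian kernel package with a local patch applied (rough outline)
    sudo apt install build-essential bc bison flex libssl-dev libelf-dev libncurses-dev
    apt source linux && cd linux-*            # needs deb-src entries enabled
    patch -p1 < ~/amd-hdmi-audio-fix.patch    # hypothetical patch file
    cp /boot/config-$(uname -r) .config && make olddefconfig
    make -j$(nproc) bindeb-pkg                # produces ../linux-image-*.deb
    sudo dpkg -i ../linux-image-*.deb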
The reality is that most people doing development for various project / packages that make the Linux desktop don't have the setup I have and some of the peculiarities I am running into. If I had a more standard setup, I wouldn't have an issue.
Moreover, I would be using FreeBSD/OpenBSD or some other more traditional Unix system and ditch Linux if I didn't require some Linux-specific applications. I am considering moving to something like Artix/Devuan in the future if I do decide to switch.
I also don't play any games that require a rootkit, so..
When I am in front of Windows, I know I can permit myself to relax, breathe easy, and let off some steam. When I am not, I know I am there to learn/earn a living/produce something, etc. Most people probably do not need this, but my brain does, or I would never switch off.
The vast majority of my gaming library runs fine on Linux. Older games might run better than on Windows, in fact.
And yes, I rarely play anything online multiplayer.
The last one I ever tried was https://www.protondb.com/app/813780 with comments like "works perfectly, except multiplayer is completely broken", and the workaround has changed 3 times so far; it also lags no matter what. Gave up after stealing 4 different DLLs from Windows. It doesn't even have anticheat; it's just because of some obscure math library.
I literally never had to do that. Most tweaking I needed to do was switching proton versions here and there (which is trivial to do).
Age of empires 2 used to work well, without needing any babying, so I'm not sure why it didn't for you. I will see about spinning it up.
I'm not saying they all got together and decided this together but their wonks are probably all saying the same thing. The market is shrinking and whether it's by design or incompetence, this creates a new opportunity to acquire it wholesale for pennies on the dollar and build a wall around it and charge for entry. It's a natural result of games requiring NVidia developers for driver tuning, bitcoin/ai and buying out capacity to prevent competitors.
The wildcard I can't fit into this puzzle is Valve. They have a huge opportunity here but they also might be convinced that they have already saturated the market and will read the writing on the wall.
From a supply/demand perspective, if all of your customers are still getting high on the 5 (or 20) year old supply, launching a new title in the same space isn't going to work. There are not an infinite # of gamers and the global dopamine budget is limited.
Launching a game like TF2 or Starcraft 2 in 2025 would be viewed as a business catastrophe by the metrics most AAA studios are currently operating under. Monthly ARPU for gamers years after purchasing the Orange Box was approximately $0.00. Giving gamers access to that strong of a drug would ruin the demand for other products.
What Microsoft is trying to do with Gamepass is a structural change. It may not work out the way that they plan but the truth is that sometimes these things do change the nature of the games you play.
I think Microsoft's strategy is going to come to the same result as Embracer Group. They've bought up lots of studios and they control a whole platform (by which I mean Xbox, not PC) but this doesn't give them that much power. Gaming does evolve and it often evolves to work around attempts like this, rather than in favor of them.
>> Microsoft's strategy is going to come to the same result as Embracer Group.
I hope you are right.
If I were trying to make a larger point, I guess it would be that big tech companies (Apple, MSFT, Amazon) don't want content creators to be too important in the ecosystem and tend to support initiatives that emphasize the platform.
100%. The platforms' ability to monetize in their favor is directly proportional to their relative power vs. the most powerful creatives.
Thus, in order to keep more money, they make strategic moves that disempower creatives.
Also mobile games that got priced at $0.99 meant that only the unicorn level games could actually make decent money so In-App Purchases were born.
But also I suspect it is just a problem where as consumers we spend a certain amount of money on certain kinds of entertainment and if as a content producer you can catch enough people’s attention you can get a slice of that pie. We saw this with streaming services where an average household spent about $100/month on cable so Netflix, Hulu, et al all decided to price themselves such that they could be a portion of that pie (and would have loved to be the whole pie but ironically studios not willing to license everything to everyone is what prevented that).
I don't think Nvidia wants a gaming collapse either. They might not prioritize it now, but they definitely know that it will persist in some form. They bet on AI (and crypto before it) because those are lucrative opportunities, but there's no guarantee they will last. So they squeeze as much as they can out of those while they can. They definitely want gaming as a backup. It might not be as profitable, and it's more finicky since it's a consumer market, but it's much more stable in the long run.
It also won’t work, and Microsoft has developed no way to compete on actual value. As much as I hate the acquisitions they’ve made, even if Microsoft as a whole were to croak tomorrow I think the game industry would be fine.
Yes games can be expensive to make, but they don't have to be, and millions will still want new games to play. It is actually a pretty low bar for entry to bring an indie game to market (relative to other ventures). A triple A studio collapse would probably be an amazing thing for gamers, lots of new and unique indie titles. Just not great for profit for big companies, a problem I am not concerned with.
The striking one for me is their Linux efforts; at least as far as I'm aware, they don't do a lot that isn't tied to the Steam Deck (or similar devices) or to running games available on Steam through Linux. Even the Deck APU is derived from the semi-custom work AMD did for the consoles, so they're benefiting from a later, second harvest of work that MS/Sony invested (hundreds of millions?) in many years earlier. I suppose a lot of it comes down to what Valve needs to support their customers (developers/publishers); they don't see the point in pioneering and establishing some new branch of tech with developers.
Nvidia isn't purposely killing anything; they are just following the pivot into the AI nonsense. They have no choice: if they are in a unique position to make 10x by pivoting, they will, even if it might be a dumpster fire of a house of cards. It's immoral to just abandon the industry that created you, but companies have always been immoral.
Valve has an opportunity to do what? Take over the video card hardware market? No. AMD and Intel are already competitors in that market and can't get any foothold (until, hopefully now, consumers have no choice but to shift to them).
> This in turn sparked rumors about NVIDIA purposefully keeping stock low to make it look like the cards are in high demand to drive prices. And sure enough, on secondary markets, the cards go way above MSRP
Nvidia doesn't earn more money when cards are sold above MSRP, but they get almost all the hate for it. Why would they set themselves up for that?
Scalpers are a retail wide problem. Acting like Nvidia has the insight or ability to prevent them is just silly. People may not believe this, but retailers hate it as well and spend millions of dollars trying to combat it. They would have sold the product either way, but scalping results in the retailer's customers being mad and becoming some other company's customers, which are both major negatives.
Either way, scalping is not a problem that persists for multiple years unless it's intentional corporate strategy. Either factories ramp up production capacity to ensure there is enough supply for launch, or MSRP rises much faster than inflation. Getting demand planning wrong year after year after year smells like incompetence leaving money on the table.
The argument that scalping is better for NVDA is coming from the fact that consumer GPUs no longer make a meaningful difference to the bottom line. Factory capacity is better reserved for even more profitable data center GPUs. The consumer GPU market exists not to increase NVDA profits directly, but as a marketing / "halo" effect that promotes decision makers sticking with NVDA data center chips. That results in a completely different strategy where out-of-stock is a feature, not a bug, and where product reputation is more important than actual product performance, hence the coercion on review media.
> I used to buy their cards because they had such a good warranty policy (in my experience)... :\
It's so wild to hear this, as in my country they were not considered anything special compared to any other third-party retailer; we have strong consumer protection laws, which means it's all much of a muchness.
> Nowadays, $650 might get you a mid-range RX 9070 XT if you miraculously find one near MSRP.
If yes, then it's an industry-wide phenomenon.
How would we know if they were?
Oh trust me, they can combat it. The easiest way, which is what Nintendo often does for the launch of its consoles, is to produce an enormous number of units before launch. A steady supply to retailers absolutely destroys folks' ability to scalp. Yes, a few units will be scalped, but most scalpers will be underwater if there is constant resupply. I know this because I used to scalp consoles during my teens and early twenties, and Nintendo's consoles were the least profitable and most problematic because they really try to supply the market. The same with iPhones: yeah, you might have to wait a month after launch to find one if you don't pre-order, but you can get one.
It's widely reported that most retailers had maybe tens of cards per store, or a few hundred nationally, for the 5090 launch. This immediately created a giant spike in demand and drove prices up, along with the incentive for scalpers. The manufacturing partners immediately saw what (some) people were willing to pay (to the scalpers) and jacked up prices so they could get their cut. It is still so bad in the case of the 5090 that MSRP prices from AIBs have skyrocketed 30-50%. PNY had cards at the original $1,999.99 MSRP, and now those same cards can't be found for less than $2,999.99.
By contrast, look at how AMD launched its 9000 series of GPUs: each Micro Center reportedly had hundreds on hand (and it sure looked like it from the pictures floating around). Folks were just walking in until noon on launch day and still able to get a GPU. Multiple restocks happened across many retailers immediately after launch. Are there still some inflated prices in the 9000 series? Yes, but we're not talking a 50% increase. Some high-priced AIB cards have always existed, but what Nvidia has done by intentionally under-supplying the market is awful.
I personally have been trying to buy a 5090 FE since launch. I have been awake attempting to add to cart for every drop on BB but haven't been successful. I refuse to pay the inflated MSRP for cards that haven't been that well reviewed. My 3090 is fine... At this point, I'm so frustrated by Nvidia I'll likely just piss off for this generation and hope AMD comes out with something that has 32GB+ of VRAM at a somewhat reasonable price.
Not to mention it's nowhere to be found on Steam Hardware Survey https://store.steampowered.com/hwsurvey/videocard/
As has been explained by others, they can't. Look at the tech used by the Switch 2 and then look at the tech in Nvidia's 50 series.
And Nintendo didn't destroy scalpers; in many markets they are still not meeting demand despite producing "an enormous number of units before launch".
If we're looking at the ultra high end, you can pay double that and get an RTX 6000 Pro with double the VRAM (96GB vs 48GB), double the memory bandwidth (1792 GB/s vs 864 GB/s) and much much better software support. Or you could get an RTX 5000 Pro with the same VRAM, better memory bandwidth (1344 GB/s vs 864 GB/s) at similar ~$4.5k USD from what I can see (only a little more expensive than AMD).
Why the hell would I ever buy AMD in this situation? They don't really give you anything extra over NVidia, while having similar prices (usually only marginally cheaper) and much, much worse software support. Their strategy was always "slightly worse experience than NVidia, but $50 cheaper and with much worse software support"; it's no wonder they only have less than 10% GPU market share.
If you believe their public statements, because they didn't want to build out additional capacity and then have a huge excess supply of cards when demand suddenly dried up.
In other words, the charge of "purposefully keeping stock low" is something NVidia admitted to; there was just no theory of how they'd benefit from it in the present.
5 or maybe 10 years ago, a high-end GPU was needed to run games at reasonably eye-candy settings. In 2025, $500 mid-range GPUs are more than enough. Folks all over can barely tell High from Ultra settings, DLSS from FSR, or DLSS FG from Lossless Scaling. There's just no point competing at the $500 price point any more, so Nvidia has largely given up there, ceding it to the AMD-built consoles and integrated graphics like AMD's APUs, which offer good value at the low end, mid-range, and even the high end.
Maybe the rumored Nvidia PC, or the Switch 2, can bring some resurgence.
[0]: https://www.semiconductor-digest.com/moores-law-indeed-stopp...
In their never-ending quest to find ways to suck more money out of people, one natural extension is to just turn the thing into a luxury good; that alone seems to justify the markup.
This is why new home construction is expensive - the layout of a home doesn’t change much but it’s trivial to throw on some fancy fixtures and slap the deluxe label on the listing.
Or take a Toyota, slap some leather seats on it, call it a Lexus and mark up the price 40% (I get that these days there are more meaningful differences but the point stands)
This and turning everything into subscriptions alone are responsible for 90% of the issues I have as a consumer
Graphics cards seem to be headed in this direction as well: breaking through that last ceiling for maximum fps is going to be like buying a Bentley (if it isn't already), whereas before it was just opting for the V8.
I suppose you could also blame the software side, for adopting compute-intensive ray tracing features or getting lazy with upscaling. But PC gaming has always been a luxury market, at least since "can it run Crysis/DOOM" was a refrain. The homogeneity of a console lineup hasn't ever really existed on PC.
> DLSS vs FSR, or DLSS FG and Lossless Scaling.
I've used all of these (at 4K, 120hz, set to "balanced") since they came out, and I just don't understand how people say this.
FSR is a vaseline-like mess to me, it has its own distinct blurriness. Not as bad as naive upscaling, and I'll use it if no DLSS is available and the game doesn't run well, but it's distracting.
Lossless is borderline unusable. I don't remember the algorithm's name, but it has a blur similar to FSR. It cannot handle text or UI elements without artifacting (because it's not integrated in the engine, those don't get rendered at native resolution). The frame generation causes almost everything to have a ghost or afterimage - UI elements and the reticle included. It can also reduce your framerate because it's not as optimized. On top of that, the way the program works interferes with HDR pipelines. It is a last resort.
DLSS (3) is, by a large margin, the best offering. It just works and I can't notice any cons. Older versions did have ghosting, but it's been fixed. And I can retroactively fix older games by just swapping the DLL (there's a tool for this on GitHub, actually). I have not tried DLSS 4.
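(If anyone wants the manual version of what that tool automates, it's literally just a file swap; a sketch with made-up paths below, and back up the original first since some anti-cheat titles will object:)

    # drop a newer DLSS DLL into a game's install directory (illustrative paths)
    cd ~/Games/SomeGame                      # wherever the game keeps nvngx_dlss.dll
    cp nvngx_dlss.dll nvngx_dlss.dll.bak     # keep the original around
    cp ~/Downloads/nvngx_dlss.dll .          # newer DLL taken from a more recent game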
Most people either can’t tell the difference, don’t care about the difference, or both. Similar discourse can be found about FSR, frame drop, and frame stutter. I have conceded that most people do not care.
AMD is truly making excellent cards, and with a bit of luck UDNA is even better. But they're in the same situation as Nvidia: they could sell 200 GPUs, ship drivers, maintain them, deal with returns and make $100k... Or just sell a single MI300X to a trusted partner that won't make any waves and still make $100k.
Wafer availability unfortunately rules all, and as it stands, we're lucky neither of them have abandoned their gaming segments for massively profitable AI things.
I went with the 5070 Ti since the 5080 didn't seem like a real step up, and the 5090 was just too expensive and wasn't in stock for ages.
If I had a bit more patience, I would have waited for the next node refresh, or for the 5090. I don't think any of the current 50-series cards besides the 5090 are worth it if you're coming from a 2080. And by worth it I mean will give you a big boost in performance.
So gamers have to pay much more and wait much longer than before, which they resent.
Some youtubers make content that profit from the resentment so they play fast and loose with the fundamental reasons in order to make gamers even more resentful. Nvidia has "crazy prices" they say.
But they're clearly not crazy. 2000 dollar gpus appear in quantities of 50+ from time to time at stores here but they sell out in minutes. Lowering the prices would be crazy.
It's hard to get too offended by them shirking the consumer market right now when they're printing money with their enterprise business.
Nvidia could have said "we're prioritizing enterprise", but instead they put on a big dog and pony show about their consumer GPUs.
I really like the Gamer's Nexus paper launch shirt. ;)
- outbid Apple on new nodes
- sign commitments with TSMC to get the capacity in the pipeline
- absolutely own the process nodes they made cards on that are still selling way above retail
NVIDIA has been posting net earnings in the 60-90 range over the last few years. If you think that's going to continue, you book the fab capacity, hell or high water. Apple doesn't make those margins (which is what, on paper, would determine who is in front for the next node).
This is the same question Apple fans ask when they want Apple to buy TSMC. The fact is it isn't so simple. And even if Nvidia were willing to pay for it, TSMC wouldn't do it just for Nvidia alone.
Big if, I get that.
BS! Nvidia isn't entitled. I'm not obligated. Customer always has final say.
The problem is a lot of customers can't or don't stand their ground. And the other side knows that.
Maybe you're a well trained "customer" by Nvidia just like Basil Fawlty was well trained by his wife ...
Stop excusing bs.
* The prices for Nvidia GPUs are insane. For that money you can have an extremely good PC with a good non-Nvidia GPU.
* The physical GPU sizes are massive; even letting the card rest on a horizontal motherboard looks scary.
* Nvidia has still issues with melting cables? I've heard about those some years ago and thought it was a solved problem.
* Proprietary frameworks like CUDA and others are going to fall at some point; it's just a matter of time.
It looks as if Nvidia at the moment is only looking at the AI market (which, as a personal belief, has to burst at some point) and simply does not care about the non-AI GPU market at all.
I remember many, many years ago, when I was a teenager and 3dfx was the dominant graphics card manufacturer, John Carmack prophetically predicted in a gaming computer magazine (the article was about Quake I) that the future wasn't going to be 3dfx and Glide. Some years passed and, sure enough, 3dfx was gone.
Perhaps this is just the beginning of the same story that happened with 3dfx. I think AMD and Intel have a huge opportunity to balance the market and bring Nvidia down, both in the AI and gaming spaces.
I have only heard excellent things about Intel's Arc GPUs in other HN threads, and if I need to build a new desktop PC from scratch, there's no way I'm paying the prices Nvidia is pushing on the market; I'll definitely look at Intel or AMD.
Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly, posting to social media, and youtubers jumping on the trend for likes.
These are 99% user error issues drummed up by non-professionals (and, in some cases, people paid by 3rd party vendors to protect those vendors' reputation).
And the complaints about transient performance issues with drivers, drummed up into apocalyptic scenarios, again, by youtubers who are putting this stuff under a microscope for views, are universal across every single hardware and software product. Everything.
Claiming "DLSS is snakeoil", and similar things are just an expression of the complete lack of understanding of the people involved in these pot-stirring contests. Like... the technique obviously couldn't magically multiply the ability of hardware to generate frames using the primary method. It is exactly as advertised. It uses machine learning to approximate it. And it's some fantastic technology, that is now ubiquitous across the industry. Support and quality will increase over time, just like every _quality_ hardware product does during its early lifespan.
It's all so stupid and rooted in greed by those seeking ad-money, and those lacking in basic sense or experience in what they're talking about and doing. Embarrassing for the author to so publicly admit to eating up social media whinging.
> Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly
This is not true. Even GN reproduced the melting of the first-party cable.
Also, why shouldn't you be able to use third-party cables? Fuck DRM too.
The whole thing started with Derbauer going to bat for a cable from some 3rd party vendor that he'd admitted he'd already plugged in and out of various cards something like 50 times.
The actual instances that youtubers report on are all reddit posters and other random social media users who would clearly be better off getting a professional installation. The huge popularity for enthusiast consumer hardware, due to the social media hype cycle, has brought a huge number of naive enthusiasts into the arena. And they're getting burned by doing hardware projects on their own. It's entirely unsurprising, given what happens in all other realms of amateur hardware projects.
Most of those who are whinging about their issues are false-positive user errors. The actual failure rates (and there are device failures) are far lower, and that's what warranties are for.
But the fact of the matter is that Nvidia has shifted from a consumer business to B2B, and they don't even give a shit about pretending they care anymore. People have beef with that, understandably, and when you couple it with the false marketing, the lack of inventory, the occasional hardware failure, missing ROPs, insane prices that nobody can afford, and all the other shit that's wrong with these GPUs, then this is the end result.
AI upscaling, AI denoising, and RT were clearly the future even 6 years ago. CDPR and the rest of the industry knew it, but outlets like GN pushed a narrative (borderline conspiracy) that the developers were somehow out of touch and didn't know what they were talking about.
There is a contingent of gamers who play competitive FPS. Most of which are, like in all casual competitive hobbies, not very good. But they ate up the 240hz rasterization be-all meat GN was feeding them. Then they think they are the majority and speak for all gamers(as every loud minority on the internet does).
Fast forward 6 years and NVidia is crushing the Steam top 10 GPU list, AI rendering techniques are becoming ubiquitous, and RT is slowly edging out rasterization.
Now that the data is clear, the narrative is that most consumers are "suckers" for purchasing NVidia, Nintendo, etc. And the content creator economy will be there to tell them they are right.
Edit: I believe some of these outlets also had chips on their shoulders regarding NVidia going way back. So AMD's poor RT performance and lack of any competitive answer to the DLSS suite for YEARS had them lying to themselves about where the industry was headed. Essentially they were running interference for AMD. Now that FSR4 is finally here, it's like AI upscaling is finally OK.
For as long as they have competition, I will support those companies instead. If they all fail, I guess I will start one. My spite for them knows no limits
The forum post you linked was an april fools joke.
"Kinda rather not do april 1st jokes like this as it does get cached and passed around after the fact without it being clear."
I hope they get hit with a class action lawsuit and are forced to recall and properly fix these products before anyone dies as a result of their shoddy engineering.
EDIT: Plaintiff dismissed it. Guessing they settled. Here are the court documents (alternatively, shakna's links below include unredacted copies):
https://www.classaction.org/media/plaintiff-v-nvidia-corpora...
https://www.classaction.org/media/plaintiff-v-nvidia-corpora...
A GamersNexus article investigating the matter: https://gamersnexus.net/gpus/12vhpwr-dumpster-fire-investiga...
And a video referenced in the original post, describing how the design changed from one that proactively managed current balancing, to simply bundling all the connections together and hoping for the best: https://youtu.be/kb5YzMoVQyw
Sounds like it was settled out of court.
[0] https://www.docketalarm.com/cases/California_Northern_Distri...
I’m curious whether the 5090 package was not following UL requirements.
Would that make them even more liable?
Part of me believes that the blame here is probably on the manufacturers and that this isn’t a problem with Nvidia corporate.
I might actually be happy to buy one of these things, at the inflated price, and run it at half voltage or something... but I can't tell if that is going to fix these concerns or they're just bad cards.
As a bonus, if the gauge is large enough, the cable would actually cool the connectors, although that should not be necessary since the failure appears to be caused by overloaded wires dumping heat into the connector as they overheat.
Or at least I think so? Was that a different 12VHPWR scandal?
Another problem is that when the connector is angled, several of the pins may not make contact, shoving all the power through as few as one wire. A common bus would help with this, but the contact resistance in that case is still bad.
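To put rough numbers on it (purely illustrative; the ~10 mΩ per contact is an assumption, not a measured value):

    P = I^2 * R
    Balanced: 600 W / 12 V = 50 A over 6 wires ≈ 8.3 A per pin → 8.3^2 × 0.010 Ω ≈ 0.7 W per contact
    Half the current through one pin: 25^2 × 0.010 Ω ≈ 6.3 W dissipated in a single contact

Under a watt per contact is manageable; several watts concentrated in one tiny contact is plenty to cook the plastic housing over time.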
It is technically possible to solder a new connector on. LTT did that in a video. https://www.youtube.com/watch?v=WzwrLLg1RR4