Posted by todsacerdoti 7/4/2025
I'm not saying they all got together and decided this, but their wonks are probably all saying the same thing. The market is shrinking, and whether it's by design or incompetence, this creates a new opportunity to acquire it wholesale for pennies on the dollar, build a wall around it, and charge for entry. It's a natural result of games requiring NVidia developers for driver tuning, of bitcoin/AI, and of buying out capacity to prevent competitors.
The wildcard I can't fit into this puzzle is Valve. They have a huge opportunity here but they also might be convinced that they have already saturated the market and will read the writing on the wall.
From a supply/demand perspective, if all of your customers are still getting high on the 5 (or 20) year old supply, launching a new title in the same space isn't going to work. There are not an infinite # of gamers and the global dopamine budget is limited.
Launching a game like TF2 or Starcraft 2 in 2025 would be viewed as a business catastrophe by the metrics most AAA studios are currently operating under. Monthly ARPU for gamers years after purchasing the Orange Box was approximately $0.00. Giving gamers access to that strong of a drug would ruin the demand for other products.
They can't even keep the lights on for SC2.
We [the community] have been designing our own balance patches for the past five years, and our own ladder maps since +/- day 1. All Blizzard had to do since 2020 was press the "deploy" button, and they f-ed it up several times anyway.
The news of the year so far is that someone has been exploiting a remote hole to upload some seriously disturbing stuff to the arcade (custom maps/mods) section. So of course rather than fixing the hole, Blizzard has cut off uploads.
So we can't test the balance changes.
Three weeks left until EWC, a $700,000 tournament, by the way.
Theoretically SC2 could become like Brood War, with balance changes happening purely through map design. Except we can't upload maps either.
I think the biggest factors involve willingness to operate with substantially smaller margins and org charts.
It genuinely seemed like "Is this fun?" was actually a bigger priority than profit prior to the Activision merger.
The strategy is simple: 1. there's always plenty of people who are ready to spend way more money in a game than you and I would consider sane - just let them spend it but 2. make it easy to gift in-game items to other players. You don't even need to keep adding that much content - the "whales" are always happy to keep giving away to new players all the time.
Assuming you've built up that goodwill, this is all you need to keep the cash flowing. But that's non-exploitative, so you'll be missing that extra 1%. /shrug
What Microsoft is trying to do with Gamepass is a structural change. It may not work out the way that they plan but the truth is that sometimes these things do change the nature of the games you play.
I think Microsoft's strategy is going to come to the same result as Embracer Group. They've bought up lots of studios and they control a whole platform (by which I mean Xbox, not PC) but this doesn't give them that much power. Gaming does evolve and it often evolves to work around attempts like this, rather than in favor of them.
>> Microsoft's strategy is going to come to the same result as Embracer Group.
I hope you are right.
If I were trying to make a larger point, I guess it would be that big tech companies (Apple, MSFT, Amazon) don't want content creators to be too important in the ecosystem and tend to support initiatives that emphasize the platform.
100%. The platforms' ability to monetize in their favor is directly proportional to their relative power vs. the most powerful creatives.
Thus, in order to keep more money, they make strategic moves that disempower creatives.
Also, mobile games priced at $0.99 meant that only unicorn-level games could actually make decent money, so In-App Purchases were born.
But also I suspect it is just a problem where as consumers we spend a certain amount of money on certain kinds of entertainment and if as a content producer you can catch enough people’s attention you can get a slice of that pie. We saw this with streaming services where an average household spent about $100/month on cable so Netflix, Hulu, et al all decided to price themselves such that they could be a portion of that pie (and would have loved to be the whole pie but ironically studios not willing to license everything to everyone is what prevented that).
https://www.beyondallreason.info/
But... while BAR is good, very good, it is also very hard to compete with, so I see it sort of killing any funding for good commercial RTSes for the next few years.
Some SC2 youtubers are now covering Mechabellum, Tempest Rising, BAR, AoE4, and some in-dev titles: Battle Aces, Immortal: Gates of Pyre, Zerospace, and of course Stormgate.
These are all on my list but I'm busy enough playing Warframe ^^'
Give 4 a try! Its multiplayer is excellent. Kind of a hybrid between Starcraft and AoE2 in terms of pacing and civ divergence. (Fewer, more diverse civs)
The archer kiting/dodging mechanic that dominates AoE2 is gone.
I play AoE2, not 4 because that's what my friends play, but 4 is the more interesting one from a strategy perspective. More opportunities to surprise the opponent, use novel strats, go off meta etc.
The striking one for me is their Linux efforts: as far as I'm aware, they don't do a lot that isn't tied to the Steam Deck (or similar devices) or to running games available on Steam through Linux. Even the Deck APU is derived from the semi-custom work AMD did for the consoles; they're benefiting from a second, later harvest that MS/Sony invested in (hundreds of millions?) many years earlier. I suppose a lot of it comes down to what Valve needs to support their customers (developers/publishers); they don't see the point in pioneering and establishing some new branch of tech with developers.
I don't think nVidia wants a gaming collapse either. They might not prioritize it now, but they definitely know that it will persist in some form. They bet on AI (and crypto before it) because those are lucrative opportunities, but there's no guarantee they will last. So they squeeze as much as they can out of those while they can. They definitely want gaming as a backup. It might not be as profitable, and it's more finicky as a consumer market, but it's much more stable in the long run.
It also won’t work, and Microsoft has developed no way to compete on actual value. As much as I hate the acquisitions they’ve made, even if Microsoft as a whole were to croak tomorrow I think the game industry would be fine.
Yes games can be expensive to make, but they don't have to be, and millions will still want new games to play. It is actually a pretty low bar for entry to bring an indie game to market (relative to other ventures). A triple A studio collapse would probably be an amazing thing for gamers, lots of new and unique indie titles. Just not great for profit for big companies, a problem I am not concerned with.
Nvidia isn't purposely killing anything, they are just following the pivot into the AI nonsense. They have no choice: if they are in a unique position to make 10x by a pivot, they will, even if it might be a dumpster fire of a house of cards. It's immoral to just abandon the industry that created you, but companies have always been immoral.
Valve has an opportunity to what? Take over the video card hardware market? No. AMD and Intel are already competitors in the market and can't get any foothold (though hopefully now consumers will have no choice but to shift to them).
Even more concerning is that, by editorializing the title of an article that is (in part) about how Nvidia uses their market dominance to pressure reviewers and control the narrative, we must question whether or not the mod team is complicit in this effort.
Is team green afraid that a title like "NVIDIA is full of shit" on the front page of HN is bad for their image or stock price? Was HN pressured to change the name?
Sometimes, editorialization is just a dumb and lazy mistake. But editorializing something like this is a lot more concerning. And it's made worse by the fact that the title was changed by the mods.
the attempt to steer direction is well hidden, but it is very much there
with https://hnrankings.info/ you can see the correction applied, in real time
the hidden bits applied to dissenting accounts? far less visible
Also, it's somewhat easy to tell who is a bot. Really new accounts are colored green. I'm sure there's also long-running bots, and I'm not sure how you would find those.
There are many reasons why the editorialized-title rule exists. One of the most important reasons is so that we can trust HN as an unbiased news aggregator. Given the content of the article, this particular instance of editorialization is pretty egregious and trust breaking.
And to be clear, those questions I asked are not outlandish to ask, even if we do trust HN enough to dismiss them.
The title should not have been changed.
The concentration of wealth and influence gives entities like Nvidia the structural power to pressure smaller players in the economic system. That’s not speculative -- it’s common sense, and it's supported by antitrust cases. Firms like Nvidia are incentivized to abuse their market power to protect their reputation and, ultimately, their dominance. Moreover, such entities can minimize legal and economic consequences in the rare instances that there are any.
So what exactly is the risk created by the moderation team allowing criticism of YC or YC companies? There aren’t many alternatives -- please fill me in if I'm missing something. In contrast, allowing sustained or high-profile criticism of giants like Nvidia could, even if unlikely, carry unpredictable risks.
So were you playing devil’s advocate, or do you genuinely think OP’s concern is more conspiratorial than it is a plausible worry about the chilling effect created by concentration of immense wealth?
On this topic, I'm curious what others think of the renaming of this post:
https://news.ycombinator.com/item?id=44435732
The original title I gave was: "Paul Graham: without billionaires, there will be no startups."
As it was a tweet, I was trying to summarize his conclusive point in the first part of the sentence:
Few of them realize it, but people who say "I don’t think that we should have billionaires" are also saying "I don't think there should be startups,"
Now, this part of the sentence to me was the far more interesting part because it was a much bolder claim than the second part of the sentence:
because successful startups inevitably produce billionaires.
This second part seems like a pretty obvious observation and is a completely uninteresting observation by itself.
The claim that successful startups have produced billionaires, therefore successful startups require billionaires, is a far more contentious and interesting claim.
The mods removed "Paul Graham" from the title and switched it to the uninteresting second part of the sentence, turning it into a completely banal and pointless title: Successful startups produce billionaires. Thereby removing any hint of the bold claim being made by the founder of one of the most successful VCs of the 21st century, and incidentally, also the creator of this website.
I can only conclude someone is loath to moderate a thread about whether billionaires are necessary for successful startups to exist.
ps. There is no explicit guideline for tweets as far as I can tell. You are forced to use an incomplete quote or to summarize the tweet in some fashion.
I think that if you want to understand why it might be helpful to change the title, consider how well "NVIDIA is full of shit" follows the HN comment guidelines.
I don't imagine you will agree with the title change no matter what, but I believe that's essentially the rationale. Note that the topic wasn't flagged, which if suppression of the author's ideas or protection of Nvidia were goals would have been more effective.
(FWIW I have plenty of issues with HN but how titles are handled isn't one of them.)
https://blog.sebin-nyshkim.net/posts/nvidia-is-full-of-shit/...
But, in my opinion, the public expectations are clearly exaggerated and sometimes even dangerous, as we run the risk of throwing the baby out with the bathwater when some ideas from marketing/VC people turn out not to be realizable in the concrete world.
Why, having this outlook, should I be banned from using the very useful word/concept of "hype"?
Meanwhile the AI companies continue to produce new SotA models yearly, sometimes quarterly, meaning the evidence that you're just completely wrong never stops increasing.
This is a single prediction of a problem that will occur. The tools not living up to the hype leads to disappointment, and people are likely to abandon them entirely because they got burned (throw the baby out with the bath water), even though the tools are still useful if you ignore the hype.
> This in turn sparked rumors about NVIDIA purposefully keeping stock low to make it look like the cards are in high demand to drive prices. And sure enough, on secondary markets, the cards go way above MSRP
Nvidia doesn't earn more money when cards are sold above MSRP, but they get almost all the hate for it. Why would they set themselves up for that?
Scalpers are a retail wide problem. Acting like Nvidia has the insight or ability to prevent them is just silly. People may not believe this, but retailers hate it as well and spend millions of dollars trying to combat it. They would have sold the product either way, but scalping results in the retailer's customers being mad and becoming some other company's customers, which are both major negatives.
> I used to buy their cards because they had such a good warranty policy (in my experience)... :\
It's so wild to hear this, as in my country they were not considered anything special over any other third-party retailer; we have strong consumer protection laws, which means it's all much of a muchness.
It's important to note that nVidia mostly doesn't sell or even make finished consumer-grade GPUs. They own and develop the IP cores, and they contract with TSMC and others to make the chips, and they do make limited runs of "Founders Edition" cards, but most cards that are available to consumers undergo final assembly and retail boxing according to the specs of the partner -- ASUS, GIGABYTE, MSI, formerly EVGA, etc.
MSRP-baiting is what happens when nVidia sets the MSRP without consulting any of its partners and then those partners go and assemble the graphics cards and have to charge more than that to make a reasonable profit. This has been going on for many GPU generations now, but it's not scalping. We can question why this "partnership" model even exists in the first place, since these middlemen offer very little unique value vs any of their competitors anymore, but again nVidia has the upper hand here and thus the lion's share of the blame.
Scalping is when somebody who's ostensibly outside of the industry buys up a bunch of GPUs at retail prices, causing a supply shortage, so that they can resell the cards at higher prices. While nVidia doesn't have direct control over this (though I wouldn't be too surprised if it came out that there was some insider involvement), they also never do very much to address it either. Getting all the hate for this without directly reaping the monetary benefit sounds irrational at first, but artificial scarcity and luxury goods mentality are real business tactics.
Then they tried to weaponize PR to beat nVidia into buying back the unsold cores they thought they'd massively profit off of at inflated crypto-hype prices.
But more to the point, there's still a trail of blame going back to nVidia here. If EVGA could buy the cores at an inflated price, then nVidia should have raised its advertised MSRP to match. The reason I call it MSRP-baiting is not because I care about EVGA or any of these other rent-seekers, it's because it's a calculated lie weaponized against the consumer.
As I kind of implied already, it's probably for the best if this "partner" arrangement ends. There's no good reason nVidia can't sell all of its desktop GPUs directly to the consumer. EVGA may have bet big and lost from their own folly, but everybody else was in on it too (except video gamers, who got shafted).
NVIDIA and Intel as companies are specialized in the design (and in the latter case, manufacturing) of chips. Board OEMs are specialized in making a consumer-ready product, maintaining worldwide sales and distribution channels, and consumer relations.
Of course, it wouldn’t be impossible for NVIDIA to start doing these things on their own (see Apple, who designs chips, designs computers around those chips, and operates retail stores where those computers are sold), but presumably NVIDIA prefers the current arrangement, where they can just focus on the chips and leave the rest to OEMs.
See also Intel under Gelsinger, who sold off the NUC and server lines (finished products) to focus on the core business (x86 chips).
As far as nVidia is concerned, they lost the privilege to be treated like a small fabless startup. They are regularly ranked as the highest valued company on the U.S. stock market. They clearly can make and sell the whole card themselves, so having GIGABYTE, ASUS, and co. hang around and take the heat for their business decisions feels pretty scummy. It's also clearly bad for the consumer, as Founders Edition cards actually do sell for MSRP. This partner crap is all an obsolete relic of a bygone era, being drawn out well past its prime.
They’re making an overwhelming share of their revenue on ‘data center’, so I doubt they’re desperate to shake up their gaming business.
Either way, scalping is not a problem that persists for multiple years unless it's intentional corporate strategy. Either factories ramp up production capacity to ensure there is enough supply for launch, or MSRP rises much faster than inflation. Getting demand planning wrong year after year after year smells like incompetence leaving money on the table.
The argument that scalping is better for NVDA is coming from the fact that consumer GPUs no longer make a meaningful difference to the bottom line. Factory capacity is better reserved for even more profitable data center GPUs. The consumer GPU market exists not to increase NVDA profits directly, but as a marketing / "halo" effect that promotes decision makers sticking with NVDA data center chips. That results in a completely different strategy where out-of-stock is a feature, not a bug, and where product reputation is more important than actual product performance, hence the coercion on review media.
Oh trust me, they can combat it. The easiest way, which is what Nintendo often does for the launch of its consoles, is to produce an enormous number of units before launch. The steady supply to retailers absolutely destroys folks' ability to scalp. Yes, a few units will be scalped, but most scalpers will be underwater if there is a constant resupply. I know this because I used to scalp consoles during my teens and early twenties, and Nintendo's consoles were the least profitable and most problematic because they really try to supply the market. The same with iPhones: yeah, you might have to wait a month after launch to find one if you don't pre-order, but you can get one.
It's widely reported that most retailers had maybe tens of cards per store, or a few hundred nationally, for the 5090 launch. This immediately created a giant spike in demand and drove prices up, along with the incentive for scalpers. The manufacturing partners immediately saw what (some) people were willing to pay (to the scalpers) and jacked up prices so they could get their cut. It is still so bad in the case of the 5090 that MSRP prices from AIBs skyrocketed 30%-50%. PNY had cards at the original $1,999.99 MSRP and now those same cards can't be found for less than $2,999.99.
By contrast, look at how AMD launched its 9000 series of GPUs: each MicroCenter reportedly had hundreds on hand (and it sure looked like it from pictures floating around). Folks were just walking in until noon and still able to get a GPU on launch day. Multiple restocks happened across many retailers immediately after launch. Are there still some inflated prices in the 9000 series GPUs? Yes, but we're not talking a 50% increase. Having some high-priced AIBs has always occurred, but what Nvidia has done by intentionally under-supplying the market is awful.
I personally have been trying to buy a 5090 FE since launch. I have been awake attempting to add to cart for every drop on BB but haven't been successful. I refuse to pay the inflated MSRP for cards that haven't been that well reviewed. My 3090 is fine... At this point, I'm so frustrated by NVidia I'll likely just piss off for this generation and hope AMD comes out with something that has 32GB+ of VRAM at a somewhat reasonable price.
As has been explained by others, they can't. Look at the tech used by the Switch 2 and then look at the tech in Nvidia's 50 series.
And Nintendo didn't destroy scalpers; in many markets they are still not meeting demand despite "produc[ing] an enormous amount of units before launch".
Nintendo has already shipped over 5 million of them. That’s an insane amount of supply for its first month.
Also, Nvidia could have released the 50-series after building up inventory. Instead, they did the opposite, trickling supply into the market to create scarcity and drive up prices. They have no real competition right now, especially in the high end. There was no reason to have a “paper launch” except to drive up prices for consumers and margins for their board partners. Process node had zero to do with what has transpired.
Not to mention it's nowhere to be found on Steam Hardware Survey https://store.steampowered.com/hwsurvey/videocard/
If we're looking at the ultra high end, you can pay double that and get an RTX 6000 Pro with double the VRAM (96GB vs 48GB), double the memory bandwidth (1792 GB/s vs 864 GB/s) and much much better software support. Or you could get an RTX 5000 Pro with the same VRAM, better memory bandwidth (1344 GB/s vs 864 GB/s) at similar ~$4.5k USD from what I can see (only a little more expensive than AMD).
Why the hell would I ever buy AMD in this situation? They don't really give you anything extra over NVidia, while having similar prices (usually only marginally cheaper) and much, much worse software support. Their strategy was always "slightly worse experience than NVidia, but $50 cheaper and with much worse software support"; it's no wonder they only have less than 10% GPU market share.
I've purchased one earlier this year for ~3200 USD. Brand new. Dunno why US prices are so high. Torch/llama work fine on this card, it's suitable for multiple compute tasks, the price is reasonable, but apparently not always/everywhere.
> you can pay double that
That's... double that.
UPD. Just checked current EU prices. The W7900 is 3200 EUR (it's been cheaper before), the cheapest Nvidia card is the RTX Pro 5000 at 5300 (much slower than the W7900), and the cheapest 96GB Nvidia card is 10K.
W7900 still provides best bang per dollar.
If you believe their public statements, because they didn't want to build out additional capacity and then have a huge excess supply of cards when demand suddenly dried up.
In other words, the charge of "purposefully keeping stock low" is something NVidia admitted to; there was just no theory of how they'd benefit from it in the present.
> Nowadays, $650 might get you a mid-range RX 9070 XT if you miraculously find one near MSRP.
If yes then it's industry wide phenomena.
How would we know if they were?
5 or maybe 10 years ago, a high-end GPU was needed to run games at reasonably eye-candy settings. In 2025, $500 mid-range GPUs are more than enough. Folks all over can barely tell between High and Ultra settings, DLSS vs FSR, or DLSS FG and Lossless Scaling. There's just no point competing at the $500 price point any more; Nvidia has largely given up there, ceding it to the AMD-built consoles and integrated graphics like AMD APUs, which offer good value at the low, medium, and high end.
Maybe the rumored Nvidia PC, or the Switch 2, can bring some resurgence.
[0]: https://www.semiconductor-digest.com/moores-law-indeed-stopp...
if your product can't be cheap - your product is luxury, not a day-to-day one
In their never ending quest to find ways to suck more money out of people, one natural extension is to just turn the thing into a luxury good and that alone seems to justify the markup
This is why new home construction is expensive - the layout of a home doesn’t change much but it’s trivial to throw on some fancy fixtures and slap the deluxe label on the listing.
Or take a Toyota, slap some leather seats on it, call it a Lexus and mark up the price 40% (I get that these days there are more meaningful differences but the point stands)
This and turning everything into subscriptions alone are responsible for 90% of the issues I have as a consumer
Graphics cards seem to be headed in this direction as well: breaking through that last ceiling for maximum fps is going to be like buying a Bentley (if it isn't already), whereas before it was just opting for the V8.
I suppose you could also blame the software side, for adopting compute-intensive ray tracing features or getting lazy with upscaling. But PC gaming has always been a luxury market, at least since "can it run Crysis/DOOM" was a refrain. The homogeneity of a console lineup hasn't ever really existed on PC.
> DLSS vs FSR, or DLSS FG and Lossless Scaling.
I've used all of these (at 4K, 120hz, set to "balanced") since they came out, and I just don't understand how people say this.
FSR is a vaseline-like mess to me, it has its own distinct blurriness. Not as bad as naive upscaling, and I'll use it if no DLSS is available and the game doesn't run well, but it's distracting.
Lossless is borderline unusable. I don't remember the algorithm's name, but it has a blur similar to FSR. It cannot handle text or UI elements without artifacting (because it's not integrated in the engine, those don't get rendered at native resolution). The frame generation causes almost everything to have a ghost or afterimage - UI elements and the reticle included. It can also reduce your framerate because it's not as optimized. On top of that, the way the program works interferes with HDR pipelines. It is a last resort.
DLSS (3) is, by a large margin, the best offering. It just works and I can't notice any cons. Older versions did have ghosting, but it's been fixed. And I can retroactively fix older games by just swapping the DLL (there's a tool for this on GitHub, actually). I have not tried DLSS 4.
Most people either can’t tell the difference, don’t care about the difference, or both. Similar discourse can be found about FSR, frame drop, and frame stutter. I have conceded that most people do not care.
5070 Ti
Which, performance-wise, is a 60 Ti-class card.
AMD is truly making excellent cards, and with a bit of luck UDNA is even better. But they're in the same situation as Nvidia: they could sell 200 GPUs, ship drivers, maintain them, deal with returns, and make $100k... or just sell a single MI300X to a trusted partner that won't make any waves and still make $100k.
Wafer availability unfortunately rules all, and as it stands, we're lucky neither of them have abandoned their gaming segments for massively profitable AI things.
I went with the 5070 Ti since the 5080 didn't seem like a real step up, and the 5090 was just too expensive and wasn't in stock for ages.
If I had a bit more patience, I would have waited till the next node refresh, or for the 5090. I don't think any of the current 50-series cards besides the 5090 are worth it if you're coming from a 2080. And by worth it I mean will give you a big boost in performance.
For cheaper guys like me, I'll just give my son indie and low graphic games which he enjoys
It's hard to get too offended by them shirking the consumer market right now when they're printing money with their enterprise business.
Nvidia could have said "we're prioritizing enterprise" but instead they put on a big dog and pony show about their consumer GPUs.
I really like the Gamer's Nexus paper launch shirt. ;)
- outbid Apple on new nodes
- sign commitments with TSMC to get the capacity in the pipeline
- absolutely own the process nodes they made cards on that are still selling way above retail
NVIDIA has been posting net earnings in the 60-90 range over the last few years. If you think that's going to continue, you book the fab capacity come hell or high water. Apple doesn't make those margins (which is what, on paper, would determine who is in front for the next node).
This is the same question as Apple fans asking Apple to buy TSMC. The fact is, it isn't so simple. And even if Nvidia were willing to pay for it, TSMC wouldn't do it just for Nvidia alone.
Big if, I get that.
BS! Nvidia isn't entitled. I'm not obligated. Customer always has final say.
The problem is a lot of customers can't or don't stand their ground. And the other side knows that.
Maybe you're a well trained "customer" by Nvidia just like Basil Fawlty was well trained by his wife ...
Stop excusing bs.
I have been rocking AMD GPU ever since the drivers were upstreamed into the linux kernel. No regrets.
I have also realized that there is a lot out there in the world besides video games, and getting all in a huff about it isn’t worth my time or energy. But consumer gotta consoooooom and then cry and outrage when they are exploited instead of just walking away and doing something else.
Same with magic the gathering, the game went to shit and so many people got outraged and in a big huff but they still spend thousands on the hobby. I just stopped playing mtg.
My main hobby is videogames, but since I can consistently play most games on Linux (that has good AMD support), it doesn't really matter.
After 3 years, I haven't missed Windows a single day.
Efficiency: https://tpucdn.com/review/gigabyte-geforce-rtx-5050-gaming-o...
Vsync power draw: https://tpucdn.com/review/gigabyte-geforce-rtx-5050-gaming-o...
The variance within Nvidia's line-up is much larger than the variance between brands, anyway.
AMD even admits that they don't want to compete in the high range. I have no loyalty to any company but there is just nothing out there that beats a 5080.
I guess there are games that you can only play on PC with Nvidia graphics. That begs the question of why someone would create a game and ignore the large console market.
With their current generation of cards AMD has caught up on all of those things except CUDA, and Intel is in a similar spot now that they've had time to improve their drivers, so it's pretty easy now to buy a non-Nvidia card without feeling like you're giving anything up.
AMD doesn't care about consumers anymore either. All the money in AI.
I mean, this also describes the quality of NVIDIA cards. And their drivers have been broken for the last two decades if you're not using windows.
I suspect the thing you're referring to is ZLUDA[0], it allows you to run CUDA code on a range of non NVidia hardware (for some value of "run").
Traditionally the NVIDIA drivers have been more stable on Windows than the AMD drivers. I chose an AMD card because I wanted a hassle-free experience on Linux (well, as much as you can get).
It's not that I can't live like this, I still have the same card, but if I were looking to do anything AI locally with a new card, for sure it wouldn't be an AMD one.
Wrt/ a1, it worked at one point (a year ago) after 2-3 hours of tinkering, then regressed to not working at all, not even from fresh installs on new, different Linuxes. I tried the main branch and the AMD specific fork as well.
Wrt/ Open WebUI, it works, but the thing uses my CPU.
And... they don't need to. Most of the most played video games on PC are all years old [0]. They're online multiplayer games that are optimized for average spec computers (and mobile) to capture as big a chunk of the potential market as possible.
It's flexing for clout, nothing else to it. And yet, I can't say it's anything new, people have been bragging, boasting and comparing their graphics cards for decades.
[0] https://activeplayer.io/top-15-most-popular-pc-games-of-2022...
The past few years (2018, with the introduction of RT and upscaling reconstruction, seems as good a milestone as any) feel like a transition period we're not out of yet, similar to the tail end of the DX9/PlayStation 3/Xbox 360 era when some studios were moving to 64-bit and DX11 as optional modes, almost as if PC was their prototyping platform until they completed the jump with the PS4/Xbox One and more mature PC implementations. It wouldn't surprise me if it takes more years, and titles built targeting the next generation of consoles, before it's all settled.
I understand the reason for moving to real time ray-tracing. It is much easier for development, and apparently the data for baked/pre-rendered lighting in these big open worlds was getting out of hand. Especially with multiple time-of-day passes.
But, it is only the "path tracing" that top end Nvidia GPUs can do that matches baked lighting detail.
The standard ray-tracing in the latest Doom for instance has a very limited number of entities that actually emit light in a scene. I guess there is the main global illumination source, but many of the extra lighting details in the scene don't emit light. This is a step backward compared to baked lighting.
Even shots from the plasma weapon don't cast any light into the scene with the standard ray-tracing, which Quake 3 was doing.
You don't get headlines and hype by being an affordable way to play games at a decent frame rate, you achieve it by setting New Fps Records.
Software. AMD has traditionally been really bad at their drivers. (They also missed the AI train and are trying to catch up).
I use Linux and have learned not to touch AMD GPUs (and to a lesser extent CPUs due to chipset quality/support) a long time ago. Even if they are better now, (I feel) Intel integrated (if no special GPU perf needed) or NVidia are less risky choices.
The situation completely changed with the introduction of the AMDGPU drivers integrated into the kernel. This was like 10 years ago.
Before then the AMD driver situation on Linux was atrocious. The open source drivers performed so bad you'd get better performance out of Intel integrated graphics than an expensive AMD GPU, and their closed source drivers were so poorly updated you'd have to downgrade the entire world for the rest of your software to be compatible. At that time Nvidia was clearly ahead, even though the driver needs to be updated separately and they invented their own versions of some stuff.
With the introduction of AMDGPU and the years after that everything changed. AMD GPUs now worked great without any effort, while Nvidia's tendency to invent their own things really started grating. Much of the world started moving to Wayland, but Nvidia refused to support some important common standards. Those that really wanted their stuff to work on Nvidia had to introduce entirely separate code paths for it, while other parts of the landscape refused to do so. This started improving again a few years ago, but I'm not aware of the current state because I now only use Intel and AMD hardware.
Then there is the (in)famous AMD reset bug that makes AMD a real headache to use with GPU passthrough. The card can't be properly reset when the VM shuts down, so you have to reboot the PC to start the VM a second time. There are workarounds, but they only work on some cards & scenarios [1] [2]. This problem goes back to around the 390 series cards, so they've had forever to properly implement reset according to the PCI spec but haven't. Nvidia handles this flawlessly.
[0] https://gitlab.freedesktop.org/drm/amd/-/issues/3911
Lesser OpenGL version, and I never managed to have hardware accelerated video until it died last year.
* Purchase always AMD.
* Purchase never Nvidia.
* Intel is also okay.
Because the AMD drivers are good and open-source. And AMD cares about bug reports. The Nvidia driver can and will create issues because it's closed-source and for years avoided supporting Wayland. Now Nvidia has published source code but refuses to merge it into Linux and Mesa. *facepalm* While Nvidia comes up with proprietary stuff, AMD brought us Vulkan and FreeSync, already supported Wayland well with implicit sync (like Intel), and has used the regular video-acceleration APIs for a long time.
Meanwhile Nvidia:
https://registry.khronos.org/OpenGL/extensions/NV/NV_robustn...
It’s not a bug, it’s a feature!
Their bad drivers still don’t handle simple actions like a VT switch or suspend/resume. If a developer doesn’t know about that extension, the users suffer for years.
Okay, but that is probably only a short-term solution? It has been Nvidia's short-term solution since 2016!
They do care about bug reports, and their drivers — when given time to stabilize — provide the best experience across all operating systems (easy updates, etc.), but IME mainline kernels should be treated as alpha-to-beta material.
NVIDIA's drivers also recently completely changed how they work. Hopefully that'll result in a lot of these long-term issues getting fixed. As I understand it, the change is this: the NVIDIA drivers contain a huge amount of proprietary, closed-source code. This code used to be shipped as a closed-source binary blob which needed to run on your CPU. And that caused all sorts of problems, because it's Linux and you can't recompile their binary blob. Earlier this year, they moved all the secret, proprietary parts into a firmware image instead, which runs on a coprocessor within the GPU itself. This then allowed them to, at last, open-source (most? all?) of their remaining Linux driver code. And that means we can patch and change and recompile that part of the driver. And that should mean the Wayland & kernel teams can start fixing these issues.
In theory, users shouldn't notice any changes at all. But I suspect all the nvidia driver problems people have been running into lately have been fallout from this change.
Sadly, a couple of years ago someone seriously misunderstood the news about "open sourcing" their drivers and spread that misunderstanding widely; many people now think their whole driver stack is open, when in reality it's like 1% of the code — the barest minimum they could get away with (I'm excluding GSP code here).
The real FOSS driver is Nova, and it's driven by the community with zero help from Nvidia, as always.
Yes, there are translation layers[1] which you have to know about and understand how to install correctly, which partially solve the problem by translating from VAAPI to NVDEC, but this is certainly not for the average user.
Hopefully, in the future browsers will add support for the new Vulkan Video standard, but for now, unfortunately, one has to hardcode the browser launch parameters in order to use the integrated graphics chip's driver (custom XDG application file for the AMD APU in my case: ~/.local/share/applications/Firefox-amdgpu.desktop): `Exec=env LIBVA_DRIVER_NAME=radeonsi DRI_PRIME=0 MOZ_ENABLE_WAYLAND=1 __NV_PRIME_RENDER_OFFLOAD=0 __GLX_VENDOR_LIBRARY_NAME=radeonsi /usr/bin/firefox-beta %u`.
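A rough sketch of trying the same thing from a terminal first, before committing it to a .desktop file; the variable values and the firefox-beta path are simply copied from the Exec line above and may need adjusting per setup:

    # Sketch: launch the browser once with the same environment as the
    # Exec line above, to check that VA-API picks the AMD iGPU driver.
    env LIBVA_DRIVER_NAME=radeonsi DRI_PRIME=0 MOZ_ENABLE_WAYLAND=1 \
        __NV_PRIME_RENDER_OFFLOAD=0 __GLX_VENDOR_LIBRARY_NAME=radeonsi \
        /usr/bin/firefox-beta

If decode still lands on the CPU, Firefox's about:support media section is one quick way to confirm which VA-API driver actually got loaded.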
On my Steam deck, I have to use vulkan. AV1 decoder is straight up buggy, have to disable it with config or extensions.
I have no NVIDIA hardware, but I understand that the drivers are even worse than AMD's.
Intel seems to be, at the moment, the least bad compromise between performance and stability.
In my experience an AMD card on linux is a great experience unless you want to do something AI related, in which case there will be random kernel panics (which, in all fairness, may one day go away - then I'll be back on AMD cards because their software support on Linux was otherwise much better than Nvidia's). There might be some kernel upgrades that should be skipped, but using an older kernel is no problem.
Err, what? While you're right about Intel integrated GPUs being a safe choice, AMD has long since been the GPU of choice for Linux -- it just works. Whereas Nvidia on Linux has been flaky for as long as I can remember.
Not OP, but I had the same experience in the past with AMD. I bought a new laptop, and within 6 months AMD decided that my card was obsolete and no longer provided drivers, forcing me to be stuck with an older kernel/X11. So I switched to NVIDIA, and after 2 PC changes I still use NVIDIA, since the official drivers work great. I really hope AMD this time is putting in the effort to keep older generations of cards working on the latest kernels/X11; maybe my next card will be AMD.
But this explains why some of us older Linux users have bad memories of AMD: we had good reason to switch over to NVIDIA and no good reason to switch back to AMD.
That said, I've been avoiding AMD in general for so long the ecosystem might have really improved in the meantime, as there was no incentive for me to try and switch.
Recently I've been dabbling in AI where AMD GPUs (well, sw ecosystem, really) are lagging behind. Just wasn't worth the hassle.
NVidia hw, once I set it up (which may be a bit involved), has been pretty stable for me.
I have no opinion on GPUs (I don't play anything released later than about 2008), but Intel CPUs have had more problems over the last five years than AMD, including disabling the already limited support for AVX-512 after release and simply burning themselves to the ground to get an easy win in initial benchmarks.
I fear your perception of their products is seriously out of date.
How's the chipset+linux story these days? That was the main reason for not choosing AMD CPU for me the last few times I was in the market.
Now wayland support is an important factor and AMD is a perfectly acceptable and indeed economical choice.
Basically 15 years inertia is hard to counter.
It's been great. Flawless, in fact.
Maybe there's a difference for the people who buy the absolute top-end cards, but I don't. I look for best value, and when I looked into it AMD looked better to me. I also got an AMD CPU, which has also been great.
My favorite part about being a reformed gaming addict is the fact that my MacBook now covers ~100% of my computer use cases. The desktop is nice for Visual Studio but that's about it.
I'm still running a 5700XT in my desktop. I have absolutely zero desire to upgrade.
No mesh shader support, though. I bet more games will start using that soon.
Just got PRO 6000 96GB for models tuning/training/etc. The cheapest 'good enough' for my needs option.
Same boat. I have 5700XT as well and since 2023, used mostly my Mac for gaming.
They will complain endlessly about the price of a RTX 5090 and still rush out to buy it. I know people that own these high end cards as a flex, but their lives are too busy to actually play games.
Now it is hard to draw a straight comparison. Gamers may spend a lot more time playing so $/h isn't a perfect metric. And some will frequently buy new games or worse things like microtransactions which quickly skyrocket the cost. But overall it doesn't seem like the most expensive hobby, especially if you are trying to spend less.
1. Nvidia cards
2. Hooked up to Linux boxes
It turns out that Nvidia tends to work pretty well on Linux too, despite the binary blob drivers.
Other than gaming and ML, I'm not sure what the value of spending much on a GPU is... AMD is just in a tough spot.
I'd really love to try AMD as a daily driver. For me CUDA is the showstopper. There's really nothing comparable in the AMD camp.
Is there "forwards compatibility" to the same code working on the next cards yet like PTX provided Nvidia?
Last time (4 years ago?) I looked into ROCM, you seemed to have to compile for each revision of each architecture.
The reason I'm not completely sure is because I'm just doing this as a hobby, and I only have a single card, and that single card has never seen a revision. I think that's generally the best way to be happy with ROCM. Accept that it's at the abstraction level of embedded programming, any change in the hardware will have to result in a change in the software.
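For anyone curious what that per-architecture targeting looks like, here is a minimal sketch assuming the HIP toolchain; the source file name and the gfx targets are just examples, and `rocminfo` is how you'd look up the ID of the card you actually have:

    # Minimal sketch, assuming ROCm/HIP is installed.
    # Find the gfx ID of the installed card, e.g. gfx1030:
    rocminfo | grep -m1 gfx
    # Build a fat binary covering the specific architectures you care about;
    # a card whose gfx ID isn't listed here simply isn't covered.
    hipcc --offload-arch=gfx1030 --offload-arch=gfx90a saxpy.hip -o saxpy

That baked-in target list is roughly the "embedded programming" feel described above: change the hardware and you rebuild the software.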
I think more and more people will realize games are a waste of time for them and go on to find other hobbies. As a game developer, it kinda worries me. As a gamer, I can't wait for gaming to be a niche thing again, haha.
You could still sink a ton of time into it if you wanted to, but you could also crank out a decent amount of fun in 5-15 minutes.
Recently games seem to have been optimized to maximize play time rather than for fun density.
These days, AAA games are optimized for "reduced friction", which in practice usually means dumbing down the mechanics and the overall gameplay to remove everything that might annoy or frustrate the player. I was playing Avowed recently and the sheer amount of convenience features (e.g. the entire rest / fast travel system) was boggling.
There was so so so much trial and error in games in the 90s, with some you basically had to press different inputs to even figure out what does what. No QoL features, really poor save systems that forced you to play the same section over and over, terrible voice acting, crappy B-movie plotlines (this hasn't changed that much tbf but there are some amazingly written games too at least to somewhat counterbalance that) etc.
And then there's the age group younger than me, for whom games are not only a hobby but also a "social place to be", I doubt they'll be dropping gaming entirely easily.
> more and more people will realize games are a waste of time for them and go on to find other hobbies
This is what I'm arguing against, more and more people will realize exactly what sort of games they like and home in on that is a much more likely scenario.
And just in case your point is that games used to be more engaging and fresh, well, Indie games exist. So many games are doing many new things, or fusing existing genres into something fresh. There's a lot more variety to be had in games than most other media.
Nah. Games will always be around.
I'm with you - in principle. Capital-G "Gamers" who turned gaming into an identity and see themselves as the real discriminated group have fully earned the ridicule.
But I think where the criticism is valid is how NVIDIA's behavior is part of the wider enshittification trend in tech. Lock-in and overpricing in entertainment software might be annoying but acceptable, but it gets problematic when we have the exact same trends in actually critical tech like phones and cars.
You don’t even have to walk away. You pretty much never need the latest GPUs to have a great gaming experience
...and even if you're all in on video games, there's a massive amount of really brilliant indie games on Steam that run just fine on a 1070 or 2070 (I still have my 2070 and haven't found a compelling reason to upgrade yet).
Systematic fixes are required because as we know advocating for abstinence isn’t an effective solution ;)
1. The number of sets per year increased too much, there are too many cards being printed to keep up
2. Cards from new sets are pushed to be very strong (FIRE design) which means that the new cards are frequently the best cards. Combine this with the high number of new sets means the pool of best cards is always churning and you have to constantly be buying new cards to keep up.
3. Artificial scarcity in print runs means that the best cards in the new sets are very expensive. We are talking about cardboard here, it isn’t hard to simply print more sheets of a set.
4. The erosion of the brand identity and universe. MTG used to have a really nicely curated fantasy universe and things meshed together well. Now we have spongebob, deadpool, and a bunch of others in the game. It like if you put spongebob in the star wars universe, it just ruins the texture of the game.
5. Print quality of cards went way down. Older cards actually have better card stock than the new stuff.
6. Canadian MTG players get shafted. When a new set is printed, stores get allocations of boxes (due to the artificial scarcity), and due to the lower player count in Canada, Canadian stores usually get much lower allocations than their USA counterparts. Additionally, MTG cards get double tariffs: they are printed outside of the USA, imported into the USA and tariffed, and then imported into Canada and tariffed again. I think the cost of MTG cards went up like 30-40% since the global trade war.
Overall it boils down to hasbro turning the screws on players to squeeze more money, and I am just not having it. I already spent obscene amounts of money on the game before this all happened.
My local shop has an entire wall of the last ~70 sets, everything from cyberpunk ninjas to gentlemen academic fighting professors to steampunk and everything in between. I think they are releasing ~10 sets per year on average? 4 main ones and then a bunch of effectively novelty ones. I hadn't been in a store in years (most of my stuff is 4th edition from the late 1990s) I did pull the trigger on the Final Fantasy novelty set recently though, for nostalgia's sake.
But yeah it's overwhelming, as a kid I was used to a new major set every year and a half or so with a handful of new cards. 10 sets a year makes it feel futile to participate.
Long answer: the introduction of non-magic sets like SpongeBob SquarePants, Deadpool, or Assassin's Creed are seen as tasteless money grabs that dilute the quality and theme of magic even further, but fans of those things will scoop them up.
The competitive scene has been pretty rough, but I haven't played constructed formats in a while so I'm not as keyed into this. I just know that there have been lots of cards released recently that have had to be banned for how powerful they were.
Personally, I love the game, but I hate the business model. It's ripe for abuse and people treat cards like stocks to invest in.
So far there hasn't been enough of a performance increase for me to upgrade either for gaming or ML. Maybe AMDs rumored 9090 will be enough to get me to open my wallet.
Running most inference models (quantized of course) via Vulkan. Playing games using Wine and/or Steam+Proton on Linux.
Sweet spot in price.
My experience with running non-game windows-only programs has been similar over the past ~5 years. It really is finally the Year of the Linux Desktop, only few people seem to have noticed.
I play a lot of HellDivers 2. Despite what a lot of Linux YouTubers say, it doesn't work very well on Linux. The recommendation I got from people was to change distro. I do other stuff on Linux. The game slows down when you need it to be running smoothly, no matter what resolution/settings you set.
Anything with anti-cheat probably won't work very well if at all.
I also wanted to play the old Command and Conquer games. Getting the fan made patchers (not the games itself) to run properly that fix a bunch of bugs that EA/Westwood never fixed and mod support is more difficult than I cared to bother with.
Make sure to change your Steam launch options to:
PULSE_LATENCY_MSEC=84 gamemoderun %command%
This will use gamemode to run it, give it priority, put the system in performance power mode, and will fix any pulse audio static you may be having. You can do this for any game you launch with steam, any shortcut, etc.
It's missing probably 15fps on this card between windows and Linux, and since it's above 100fps I really don't even notice.
It does seem to run a bit better under gnome with Variable Refresh Rate than KDE.
I did get it running nicely for about a day, and then an update was pushed and it ran like rubbish again. The game runs smoothly when initially running the map, and then there's a massive dip in frames for several seconds. This is usually when one of the bugs is jumping at you.
This game may work better on Fedora/Bazzite or <some other distro> but I find Debian to be super reliable and don't want to switch distro. I also don't like Fedora generally as I've found it unreliable in the past. I had a look at Bazzite and I honestly just wasn't interested. This is due to it having a bunch of technologies that I have no interest in using.
There are other issues that are tangential but related.
e.g.
I normally play on Super HellDive with other players in a Discord VC. Discord / Pipewire seems to reset my sound for no particular reason and my Plantronics Headset Mic (good headset, not some gamer nonsense) will be not found. This requires a restart of pipewire/wireplumber and Discord (in that order). This happens often enough I have a shell script alias called "fix_discord".
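For reference, a rough sketch of what a helper like that might look like; the service and process names are assumptions about a typical PipeWire + Discord setup rather than the actual script:

    #!/bin/sh
    # Rough sketch of a "fix_discord" helper (names are assumptions).
    # Restart the user audio stack first, then the Discord client,
    # matching the order described above.
    systemctl --user restart pipewire wireplumber
    pkill -x Discord
    (discord >/dev/null 2>&1 &)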
I have weird audio problems on HDMI (AMD card) thanks to a regression in the kernel (Kernel 6.1 with Debian worked fine).
I could mess about with this for ages and maybe get it working or just reboot into Windows which takes me all of a minute.
It is just easier to use Windows for Gaming. Then use Linux for work stuff.
Honestly? Fedora is really the premier Linux distro these days. It's where the most the development is happening, by far.
All of my hardware, some old, some brand new (AMD card), worked flawlessly out of the box.
There was a point when you couldn't get me to use an rpm-based distro if my life depended on it. That time is long gone.
It's the same reason I don't want to use Bazzite. It misses the point of using a Linux/Unix system altogether.
I also learned a long time ago Distro Hopping doesn't actually fix your issues. You just end up either with the same issues or different ones. If I switched from Debian to Fedora, I suspect I would have many of the same issues.
e.g. if an issue is in the Linux kernel itself, such as HDMI audio on AMD cards having random noise, I fail to see how changing from one distro to another would help. Fedora might have a custom patch that fixes it, but I could also take that patch and build my own kernel image (which I've done in the past, btw).
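For context, building a patched Debian kernel is roughly this flow (a sketch only; the patch file name is a placeholder and the exact steps vary by setup):

    # Sketch of patching and rebuilding a Debian kernel package.
    apt source linux                             # needs deb-src entries enabled
    cd linux-*/
    patch -p1 < ../fix-hdmi-audio-noise.patch    # placeholder patch name
    cp /boot/config-"$(uname -r)" .config        # start from the running config
    make olddefconfig
    make -j"$(nproc)" bindeb-pkg                 # produces installable .deb files
    sudo dpkg -i ../linux-image-*.deb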
The reality is that most people doing development for various project / packages that make the Linux desktop don't have the setup I have and some of the peculiarities I am running into. If I had a more standard setup, I wouldn't have an issue.
Moreover, I would be using FreeBSD/OpenBSD or some other more traditional Unix system and ditch Linux if I didn't require some Linux specific applications. I am considering moving to something like Artix / Devuan in the future if I did decide to switch.
I just switched over to it last night and my audio in Helldivers 2 in particular is awful and I'm having framerate dives.
If I got back to Gnome3, it's much more stable in fps and my audio problems go away.
This is with VRR on/off in both.
I also don't play any games that require a rootkit, so..
When I am in front of windows, I know I can permit myself to relax, breath easy and let off some steam. When I am not, I know I am there to learn/earn a living/produce something etc. Most probably do not need this, but my brain does, or I would never switch off.
The vast majority of my gaming library runs fine on Linux. Older games might run better than on Windows, in fact.
And yes, I rarely play anything online multiplayer.
areweanticheatyet.com
Last one I ever tried was https://www.protondb.com/app/813780 with comments like "works perfectly, except multiplayer is completely broken" and the workaround has changed 3 times so far, also it lags no matter what. Gave up after stealing 4 different DLLs from Windows. It doesn't even have anticheat, it's just cause of some obscure math library.
I literally never had to do that. Most tweaking I needed to do was switching proton versions here and there (which is trivial to do).
Age of empires 2 used to work well, without needing any babying, so I'm not sure why it didn't for you. I will see about spinning it up.
You're supposed to find a proton fork like "glorious eggroll" that has patches specifically for your game.
Microsoft fails consistently ... even when offered a lead on the plate... it fails, but these failures are eventually corrected for by the momentum of its massive business units.
Apple is just very very late... but this failure can be eventually corrected for by its unbeatable astroturfing units.
Perhaps AMD is too small to keep up everywhere it should. But compared to the rest, AMD is a fast follower. Why Intel is where it is is a mystery to me, but I'm quite happy about its demise and failures :D
Being angry about NVIDIA is not giving enough credit to NVIDIA for being on-time and even leading the charge in the first place.
Everyone should remember that NVIDIA also leads into the markets that it dominates.
My personal guess is something in the medical field, because surely all the AI search tools could help to detect common items in all the medical data. Maybe more Ozempic, maybe something for some other health issue. (Of course, who knows. Maybe it turns out that the next boom is going to be in figuring out ways to make things go boom. I hope not.)
You've been able to do that relatively cheaply for at least a decade. Nobody really does because the market for even minor surgeries that can essentially be replaced by having a pocket is pretty small.
Implanted neural interfaces have a lot of technical challenges that I think make them extremely unlikely as purely elective procedures in anything like the immediate future. AR glasses are way more plausible.
For as long as they have competition, I will support those companies instead. If they all fail, I guess I will start one. My spite for them knows no limits
The 1080ti can be analogized to Sandy Bridge/The 2600k: Insane performance per dollar, generous binning, and plenty of room left for overclocking. Held up with minimal concessions for a decade, still fine with mild compromises past that.
Every generation since? Gives less and less, all Nvidia can do is all they've ever done: press up against the aspect limit of the dies and lean into their massive scale. What's fun about this go round is that the lines are far blurrier between Consumer & Enterprise, a bunch of 5090s is not hamstrung from doing the things a B200 can in the way a 2600k or i9 is compared to a xeon processor. On top of this, there is an entire cottage industry dedicated to adding additional VRAM to old cards and harvesting GPUs from otherwise broken video cards and swapping them onto new boards.
The forum post you linked was an april fools joke.
"Kinda rather not do april 1st jokes like this as it does get cached and passed around after the fact without it being clear."
So gamers have to pay much more and wait much longer than before, which they resent.
Some YouTubers make content that profits from the resentment, so they play fast and loose with the fundamental reasons in order to make gamers even more resentful. Nvidia has "crazy prices," they say.
But they're clearly not crazy. 2000 dollar gpus appear in quantities of 50+ from time to time at stores here but they sell out in minutes. Lowering the prices would be crazy.
Also, in some sense there can be some fear 5090s could cannibalize the data center hardware in some aspects - my desktop has a 3060 and I have trained locally, run LLMs locally etc. It doesn't make business sense at this time for Nvidia to meet consumer demand.
Liars or not, the performance has not been there for me in any of my usecases, from personal to professional.
A system from 2017/2018 with an 8700K and an 8GB 2080 performs so closely to the top end, expensive systems today that it makes almost no sense to upgrade at MSRP+markup unless your system is older than this.
Unless you need specific features only on more recent cards, there are very few use cases I can think of needing more than a 30 series card right now.
This is in no way true and is quite an absurd claim. Unless you meant for some specific isolated purposed restricted purely to yourself and your performance needs.
> there are very few use cases I can think of needing more than a 30 series card right now.
How about I like high refresh and high resolutions? I'll throw in VR to boot. Which are my real use cases. I use a high refresh 4K display and VR, both have benefited hugely from my 2080Ti > 4090 Shift.
And despite CPUs stagnating it’s absolutely still possible to be held back on a stronger GPU with an older CPU especially in areas such as 1% lows, stuttering etc.
You provided no evidence to back up this very strong statement; should we just take your word for it?
> especially in areas such as 1% lows, stuttering etc.
Oh, if you're willing to spend $1k to improve your 1% lows, I guess your argument makes sense.
Where is your evidence? You were the one making grand claims entirely unsupported by reality. Should we just take your word for it? It seems to have been your expectation given you backed it with literally nothing.
My evidence would be literally any benchmark in existence, and the fact that I actually owned the 2080 Ti and now own a modern high-end GPU. They are not even remotely in the same class of performance in anything other than your head. But hey, if that isn't enough:
https://www.techpowerup.com/review/nvidia-geforce-rtx-5080-f...
Now go on, I eagerly await any evidence that supports your claim. Take all the time you need.
Oooh, got me hard with that ending. Now I really wanna spend my time engaging with you, great job on furthering the discussion and making a positive impact. Cheers on the world you make for yourself - you're the one who gets to experience it.
Unless nvidia's money printing machine breaks soon, expect the same to continue for the next 3+ years. Crappy expensive cards with a premium on memory with almost no actual video rendering performance increase.
This does not somehow give purchasers more budget room now, but they can buy 30-series cards in spades and, as a little bonus, not have to worry about the same heat and power delivery.