Posted by simlevesque 8 hours ago
1. Gaming cards are their R&D pipeline for data center cards. Lots of innovation came from gaming cards.
2. It's a market defense to keep other players down and stop them from growing their way into data centers.
3. It's profitable (probably the main reason, but boring)
4. Hedge against data center volatility (10 key customers vs millions)
5. Antitrust defense (which they used when they tried to buy ARM)
Instead, we will be streaming games from our locked-down tablets and paying a monthly subscription for the pleasure.
The push for 4K with raytracing hasn't been a good thing: it's pushed hardware costs way up and led to attempts to fake it with AI upscaling and 'fake frames'. And even before that, the increasing reliance on temporal antialiasing was becoming problematic.
The last decade or so of hardware/tech advances haven't really improved the games.
The biggest flop is UE5 and its Lumen/Nanite. Really, everything would be fine if not for that crap.
And yeah, our hardware is not capable of proper raytracing at the moment.
Somebody should tell that to the AAA game developers who think hitting 60fps with framegen should be their main framerate target.
Poor RT performance is more a developer skill issue than a problem with the tech. We've had games like Doom: The Dark Ages that flat-out require RT, but the RT lighting pass only accounts for ~13% of frame time while pushing much better results than any raster GI solution could achieve with the same budget.
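To put that ~13% in perspective, here's a quick back-of-the-envelope sketch (assuming a 60 fps target; the numbers are illustrative, not measurements from the game):

    # Rough frame-budget arithmetic (assumed 60 fps target, illustrative only)
    fps_target = 60
    frame_budget_ms = 1000 / fps_target   # ~16.7 ms per frame
    rt_share = 0.13                       # ~13% of frame time, per the figure above
    rt_cost_ms = frame_budget_ms * rt_share
    print(f"RT lighting: {rt_cost_ms:.1f} ms of a {frame_budget_ms:.1f} ms frame")
    # -> RT lighting: 2.2 ms of a 16.7 ms frame

A ~2 ms GI pass leaves the bulk of the frame for geometry, shading, and post, which is why the "skill issue" framing holds up.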
Do I, as a player, appreciate the extra visual detail in new games? Sure, most of the time.
But if you asked me what I enjoy playing more, 80% of the time I'd pull out a list of 10+ year old titles that I keep coming back to, plus more that I would rather play than what's on the market today, if only they still had an active playerbase (for multiplayer titles).
Honestly, I know I'm not alone in saying this: I'd rather we had more games focused on good mechanics and story instead of visually impressive works that pile on MTX to recoup insane production costs. Maybe this is just the catalyst we need to get studios to redirect budgets toward making games fun instead of sinking so much into visual quality.
Sadly, the MX500 is now difficult to find in Western Europe. Only the lower-grade BX500 is around; it's still quite reliable, but not as fast as the MX500 with its cache + DRAM.
Had quite a lot of controller issues (they become sluggish for periods of time) with the SanDisk/WD ones like the Green/Blue and SSD Plus.
They announced a month ago that their upstate NY fab was delayed by 2-3 years, so the writing was on the wall.
https://www.syracuse.com/micron/2025/11/micron-chip-factorie...
Anything the fab outputs will feed into Micron selling to data centers.
I don't want to have to start buying obscure keysmash Chinese brands for normal-looking, affordable hardware.
God dammit Micron.
The first-gen MX500 (fw M3CR023) was the second-best SATA SSD range, behind the kings: the Samsung 860 Evo and Pro. The P3 and P3+ were very good drives with great pricing for a while, though not comparable to the Samsung 970 Evo and Evo+.
Never had a failure across about 500 units of Crucial MX300/500/P1/P3/P3+/P5. Always kept their firmware updated, though.
By comparison, I had a lot of sluggish controllers on SanDisk/WD Green/Blue SATA SSDs, and on some BX500s. Still a lot better than any entry-level generic Phison S3111-based SSD, though.
Also very few failures with DDR3/4 DIMMs and SODIMMs. Fewer than with Kingston and Corsair modules, and about the same as Samsung OEM modules from HP/Dell.
Now let's just hope Samsung will not follow in their footsteps. I don't see WD-SanDisk going corporate-only, since they do not make DRAM modules.