Posted by simlevesque 8 hours ago

Micron Announces Exit from Crucial Consumer Business (investors.micron.com)
334 points | 151 comments
httpz 4 hours ago|
I'm half joking, but if this AI boom continues we're going to see Nvidia exit the consumer GPU business. But Jensen Huang will never do that to us... (I hope)
tomasphan 4 hours ago||
There are a couple of reasons why Jensen won't take off the gaming leather jacket just yet:

1. Gaming cards are their R&D pipeline for data center cards. A lot of innovation has come from gaming cards.

2. It's a market defense to keep other players down and keep them from growing their way into data centers.

3. It's profitable (probably the main reason, but boring)

4. Hedge against data center volatility (10 key customers vs millions)

5. Antitrust defense (which they used when they tried to buy ARM)

BizarroLand 3 hours ago||
6. Techies who use NVidia GPUs in their PCs are more likely to play with AI and ultimately contribute to the space as either a developer or a user
darth_avocado 3 hours ago||
7. Maybe just don't put all your eggs in one basket, especially when that basket is an industry that has yet to deliver on its promise.
venturecruelty 3 hours ago|||
Why would anyone sell a handful of GPUs to nobodies like us when they could sell a million GPUs for thousands apiece to a handful of big companies? We're speedrunning the absolute worst corpo cyberpunk timeline.
CTDOCodebases 3 hours ago|||
The way things are going no one will be able to afford a PC.

Instead we will be streaming games from our locked down tablets and paying a monthly subscription for the pleasure.

20after4 3 hours ago||
You will own nothing and be happy.
goda90 4 hours ago|||
They are already making moves that might suggest that future. They are going to stop packaging VRAM with their GPUs shipped to third-party graphics card makers, who will have to source their own, probably at higher cost.
bpye 4 hours ago|||
I think Nvidia realises that selling GPUs to individuals is useful, as it lets developers work locally with CUDA.
pizlonator 1 hour ago||
This is a huge reason.
bluescrn 3 hours ago|||
Might almost be a good thing, if it means abandoning overhyped/underperforming high-end game rendering tech, and taking things in a different direction.

The push for 4K with raytracing hasn't been a good thing: it's pushed hardware costs way up and led to attempts to fake it with AI upscaling and 'fake frames'. And even before that, the increasing reliance on temporal antialiasing was becoming problematic.

The last decade or so of hardware/tech advances haven't really improved the games.

whatevaa 2 hours ago|||
DLSS Transformer models are pretty good. Framegen can be useful but has niche applications due to the latency increase and artifacts. Global illumination can be amazing but is also pretty niche, as it's very expensive and comes with artifacts.

The biggest flop is UE5 and its Lumen/Nanite. Really, everything would be fine if not for that crap.

And yeah, our hardware is not capable of proper raytracing at the moment.

theoldgreybeard 34 minutes ago||
> Framegen can be useful but has niche applications

Somebody should tell that to the AAA game developers who think hitting 60fps with framegen should be their main framerate target.

swinglock 3 hours ago||||
The latest DLSS and FSR are actually good. Maybe XeSS too.
babypuncher 3 hours ago|||
The push for ray tracing comes from the fact that they've reached the practical limits of scaling more conventional rendering. RT performance is where we are seeing the most gen-on-gen performance improvement, across GPU vendors.

Poor RT performance is more a developer skill issue than a problem with the tech. We've had games like Doom: The Dark Ages that flat out require RT, yet the RT lighting pass only accounts for ~13% of frame time while delivering much better results than any raster GI solution could with the same budget.

snuxoll 2 hours ago||
The literal multi-million-dollar question that executives have never bothered to ask: when is it enough?

Do I, as a player, appreciate the extra visual detail in new games? Sure, most of the time.

But if you asked me what I enjoy playing more, 80% of the time I'd pull out a list of 10+ year old titles that I keep coming back to, plus more that I would rather play than what's on the market today, if only they had an active playerbase (for multiplayer titles).

Honestly, I know I'm not alone in saying this: I'd rather we had more games focused on good mechanics and story, instead of visually impressive works that pile on MTX to recoup insane production costs. Maybe this is just the catalyst we need to get studios to redirect budgets toward making games fun instead of sinking so much into visual quality.

whatevaa 2 hours ago|||
They will constrain supply before exiting. Outright exiting just isn't smart; you can stop developing and let it slow to a trickle, which also works as insurance in case AI flops.
leoc 2 hours ago||
In the words of Douglas Adams, there are those who say that this has already happened.
Brainlag 2 hours ago|||
Jensen is too paranoid to do it. But whoever comes after him will do it ASAP.
adrr 2 hours ago|||
Keep the retail investors happy so they keep pumping your stock.
officeplant 3 hours ago||
Honestly, I'd prefer it. It might get AMD and Intel off their asses on GPU development. I already stopped buying Nvidia GPUs ages ago, before they saw value in the Linux/Unix market, and I'm tired of them sucking up all the air in the room.
elzbardico 5 hours ago||
A short-term, MBA/Wall Street-driven decision.
RedShift1 4 hours ago||
Their MX500 series SSDs were simply the king of price, performance, and reliability. I even installed them in industrial PCs with intense vibration and large temperature cycles, and they're still chugging along like it's nothing.
Felger 2 hours ago||
Agreed. The first-gen MX500 with the M3CR023 firmware proved, IMHO, to be the second most reliable 2.5" SATA SSD, after the Samsung 860 range (860 Evo / Pro).

Sadly, the MX500 is now difficult to find in Western Europe. Only the lower-grade BX500 remains: still quite reliable, but not as fast as the MX500 with its cache + DRAM.

Had quite a lot of controller issues (they become sluggish for periods of time) with the SanDisk/WD ones like the Green/Blue and SSD Plus.

tkfoss 4 hours ago||
Just bought one last week; sadly there aren't many options left for 2.5" SSDs.
haunter 7 hours ago||
(Crucial the brand, not the adjective)

They announced a month ago that their upstate NY fab was delayed by 2-3 years, so the writing was on the wall.

https://archive.md/WSsLm

https://www.syracuse.com/micron/2025/11/micron-chip-factorie...

delfinom 6 hours ago|
They don't need the fab for Crucial the brand.

Anything the fab outputs will feed into Micron's sales to datacenters.

m4r1k 5 hours ago||
I have also bought Crucial for decades. Great quality and reliability for a fair price. Anybody doing anything semi-professional will be impacted by this questionable decision.
lambchoppers 5 hours ago|
If you are going to sell shovels for a gold rush, it's pretty silly to keep up rational market standards on things like prices, defect rates, packaging, and correct contents. Probably better to spare the brand (for a reboot?) and also not compete as much with all that junk when it shows up again on eBay.
trashface 3 hours ago||
Bummer. Crucial had a lifetime warranty on memory, so when my 2012 MacBook Pro cooked the aftermarket memory I had put in it, they replaced it for free. A few years later the same system cooked the new memory, and again they replaced it at no charge. Got six years out of it before the keyboard failed; it would have bricked itself far earlier if not for Crucial.
rietta 2 hours ago||
This sucks! I know more about software than hardware, and Crucial is the only RAM I have bought for decades now. As a practical matter, does this mean one needs to buy DDR5 RAM now from MicroCenter for a build planned for next year? I just put 128GB into my latest Linux workstation. I had been planning to build a new NAS to replace my aging TrueNAS (née FreeNAS) box, and I was just thinking about possibly building another dev box after being very happy with this AMD 9950X's performance.
zvolsky 2 hours ago||
In May this year, the 2x48GB Crucial SODIMM kit sold for £180. Today, the same kit is £275 in the Crucial Amazon store, and sales are limited to one unit per customer. The free market price seems to be over £500. Not a good time to be building a Mini PC or DIY laptop.
officeplant 3 hours ago||
Nearly every PC in my current collection has Crucial RAM. My main desktop has a Crucial NVMe drive, and I've got 2.5" SATA SSDs from Crucial in family computers.

I don't want to have to start buying obscure, keysmash Chinese brands for normal-looking affordable hardware.

God dammit Micron.

Felger 2 hours ago|
Very sad news. Crucial Micron is (soon "was") a great brand for computer assembly and upgrades. It is sad to see the brand rushing toward the "easy money" stream. This won't be forgotten when the current bubble eventually pops, and they might meet the same fate as the now-forgotten Elpida (who bought Qimonda, which also failed).

The first-gen MX500 (fw M3CR023) was the second-best SATA SSD range, behind the kings, the Samsung 860 Evo and Pro. The P3 and P3+ were very good drives with great pricing for a time, though not comparable to the Samsung 970 Evo and Evo+.

Never had a failure in about 500 units of Crucial MX300/500/P1/P3/P3+/P5. Always updated their firmware, though.

Comparatively, I had a lot of sluggish controllers on SanDisk/WD Green/Blue SATA SSDs, and some BX500s. But still a lot better than any entry-level generic Phison S3111-based SSD.

Also very few failures with DDR3/4 DIMMs and SODIMMs; fewer than with Kingston and Corsair modules. About the same as Samsung OEM modules from HP/Dell.

Now let's just hope Samsung won't follow in their footsteps. I don't see WD-SanDisk going corporate-only, since they don't make DRAM modules.
