Posted by secure 7 days ago
Aside from AMD CPUs of the early 2000s going up in smoke without working cooling, they all throttle before they become temporarily or permanently unstable. If they don't, they're defective.
I've never had a desktop part fail due to max temperatures, but I don't think I've owned one that advertises or allows itself to reach or remain at 100 °C or higher.
If someone sells a CPU that's specified to work at 100 or 110 °C and it doesn't, then it's either defective or fraudulent, no excuses.
Max Operating Temperature: 105 °C
14900k: https://www.intel.com/content/www/us/en/products/sku/236773/...
Max Operating Temperature: 100 °C
Different CPUs, different specs.
And any CPU from the last decade will just throttle down if it gets too hot. That's how the entire "Turbo" thing works: go as fast as possible until it gets too hot, then throttle down.
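A toy sketch of that boost-until-hot loop (all constants and the thermal model are invented for illustration, not real silicon behavior):

```python
# Toy model of "Turbo": run at the boost clock until the die gets too
# hot, then drop to the base clock until the temperature recovers.
# Every number here is made up for illustration.

T_MAX = 100.0      # throttle point in deg C, like a Tjunction max spec
BOOST_GHZ = 5.5
BASE_GHZ = 3.0

def next_clock(temp_c: float) -> float:
    """Pick the next clock speed based on the current die temperature."""
    return BOOST_GHZ if temp_c < T_MAX else BASE_GHZ

def simulate(steps: int = 20, ambient: float = 30.0) -> list[float]:
    temp, clocks = ambient, []
    for _ in range(steps):
        clk = next_clock(temp)
        clocks.append(clk)
        # Heat up proportionally to clock, cool toward ambient.
        temp += clk * 4.0 - (temp - ambient) * 0.2
    return clocks

clocks = simulate()
# The trace starts at full boost, then bounces between boost and base
# clock as the die crosses the thermal limit and recovers.
```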
>If you have an Intel Raptor Lake system and you're in the northern hemisphere, chances are that your machine is crashing more often because of the summer heat. I know because I can literally see which EU countries have been affected by heat waves by looking at the locales of Firefox crash reports coming from Raptor Lake systems.
13th- and 14th-gen Intel parts are also showing up in aggregated gaming crash data, though I'm not sure if that's heat-related:
https://www.youtube.com/watch?v=QzHcrbT5D_Y
>beyond which they default to auto-shutdown.
That doesn't preclude the possibility of temperature-related instability below 100 °C...
Noting that it's different generations - it's not the Core Ultras (15th gen) that are known to have these issues
>rotting people's brains
...
Does anything? It could order a Toyota Corolla to my house when it hits 69 °C 420 times if it were programmed to.
> Noting that it's different generations - it's not the Core Ultras (15th gen) that are known to have these issues
So it is a CPU specific issue then? Not a yuro heatwave issue? Not one he "should have" "mitigated" with a better cooling setup?
ah if only they had incremented that number by one… a new 286 even just in name would be sooo funny… not as funny as bringing back the number 8088 of course
Maybe RISC-V? It's right there in the name, but I haven't really looked at it. However, there are no RISC-V chips that have anywhere near the performance x86 or ARM has, so it remains to be seen if RISC-V can be competitive with x86 or ARM for these types of things.
RISC is one of those things that sounds nice and elegant in principle, but works out rather less well in practice.
RISC-V is specified as a RISC (and allows very space-/power-efficient lower-end designs with the classic RISC design), but designed with macro-op fusion in mind, which gets you closer to a CISC decoder and EUs.
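A toy sketch of what macro-op fusion in a decoder front-end looks like (the op names and tuple format are invented; real fusion happens in hardware on encoded instruction pairs):

```python
# Toy decoder front-end: fuse a compare immediately followed by a
# dependent branch into a single fused micro-op, the classic
# cmp+branch fusion pattern. Purely illustrative.

def decode(ops: list[tuple]) -> list[tuple]:
    fused, i = [], 0
    while i < len(ops):
        # Pattern: ("cmp", a, b) immediately followed by ("beq", target)
        if (i + 1 < len(ops)
                and ops[i][0] == "cmp" and ops[i + 1][0] == "beq"):
            fused.append(("cmp_beq", ops[i][1], ops[i][2], ops[i + 1][1]))
            i += 2  # two architectural instructions -> one micro-op
        else:
            fused.append(ops[i])
            i += 1
    return fused

stream = [("add", "x1", "x2"), ("cmp", "x1", "x3"), ("beq", "loop")]
uops = decode(stream)
# -> [("add", "x1", "x2"), ("cmp_beq", "x1", "x3", "loop")]
```

The point is that the ISA stays simple while the execution units see the denser, fused stream - which is what lets a RISC front-end approach what a CISC decoder feeds its EUs.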
It's a nice place to be in tooling-wise, but it seems too early to say definitively what extensions will need to be added to get 12900K/9950X/M4-tier performance-per-core.
In either case though, a bunch of the tricks that make modern CPUs fast are ISA-independent; stuff like branch prediction or [0] doesn't depend on the ISA, and can "work around" one side or the other needing more instructions for certain tasks.
A RISC architecture was originally one with simple control flow and a CISC architecture one with complex control flow, usually implemented via microcode. That distinction isn't applicable to CPUs past about 1996, because it no longer makes sense to speak of a CPU having global control flow.
https://www.extremetech.com/extreme/188396-the-final-isa-sho...
The CISC decoder is like a "decompressor" that saves memory bandwidth and cache usage.
Theoretically that’s likely true. But is there any empirical evidence?
Even underclocked Intel desktop chips are massively faster.
Yes, ARM is certainly competitive. But I don't know how much of that is down to Apple being good at making chips rather than the architecture itself.
Qualcomm of course makes decent chips but it’s not like they are that much ahead of x86 on laptops.
Even in Apple's case, if you only care about raw CPU power instead of performance per watt, the M series is not that great compared to AMD/Intel.