Posted by thawawaycold 12/19/2025
I think that’s the clearest explanation of FPGAs I’ve ever seen.
I agree with another commenter: I think there are parallels to "the bitter lesson" here. There's little reason for specialized solutions when increasingly capable general-purpose platforms are getting faster, cheaper, and more energy efficient with every passing month. Another software engineering analogy is that you almost never need to write in assembly because higher-level languages are pretty amazing. Don't get me wrong, when you need assembly, you need assembly. But I'm not wishing for an assembly programming renaissance, because what would be the point of that?
FPGAs were a niche solution when they first came out, and they're arguably even more niche now. Most people don't need to learn about them and we don't need to make them ubiquitous and cheap.
This is such a severe problem that even now, the (20+ year old) H.264 is the only codec you can safely assume every end user will be able to play, and H.264 consumes 2x (if not more) bandwidth compared to modern codecs at the same perceived image quality. There are still large subsets of users who cannot play any codec newer than that without falling back to (heavy and power-intensive) software decoding. Being able to simply load a new video codec into hardware would be revolutionary, and that's only one possible use case.
That relies on "FPGAs everywhere", which is much further out than "GPUs everywhere".
I'm not sure where the state of the art is on this, but given the way that codecs work (the bitstream splits into tiles, each of which is numerically heavy but can be decoded separately), how far along are hybrid codecs, where the GPU does the heavy lifting with its general-purpose cores rather than the fixed-function decoder pipeline?
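The tile-splitting structure described above is easy to sketch in plain Python. This is a toy illustration, not a real decoder: `decode_tile` here is a hypothetical stand-in for whatever numerically heavy per-tile kernel the GPU's general-purpose cores would actually run.

```python
from concurrent.futures import ThreadPoolExecutor

def decode_tile(tile):
    # Stand-in for the heavy per-tile work (entropy decoding, inverse
    # transform, prediction); here it just flips each byte so the
    # "work" is observable.
    return [b ^ 0xFF for b in tile]

def decode_frame(tiles):
    # Tiles are independent, so they can be fanned out to parallel
    # workers -- on a GPU this would be something like one workgroup
    # per tile rather than one thread per tile.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(decode_tile, tiles))

frame = decode_frame([[0x00, 0x10], [0xF0, 0xFF]])
```

One wrinkle this sketch hides: in-loop filters (e.g. deblocking) can cross tile boundaries and run after the independent passes, which is part of why fully hybrid GPU decoders are harder than "just parallelize the tiles."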
Like, I get the aesthetic appeal, and I accept that there is a small subset of uses where an FPGA really makes a difference. But in the general case, it's a bit like getting upset at people for using an MCU when a 555 timer would do. Sure, except doing it the "right" way is actually slower, more expensive, and less flexible, so why bother?
In the same vein, no one is writing a smartwatch software stack in 100% bare-metal assembly, although in the hands of a capable developer, I'm sure it could prolong battery life.
My Ryzen agrees — the fans just spun up like it’s hitting 10,000 rpm.