Posted by thawawaycold 3 days ago
Gowin and Efinix, like Lattice, have some very interesting new FPGAs that they've innovated hard on, but which are still only so-so available.
Particularly with AI about, having open source stacks really should be a major door-opener. There could be an OpenROAD-style moment for FPGAs!
They see themselves as CAD software companies. The chip is just a copy-protection dongle.
Yes, that's certainly a big misconception. Maybe not the one the author meant to call out, but... yes, a big misconception indeed.
I agree with another commenter: I think there are parallels to "the bitter lesson" here. There's little reason for specialized solutions when increasingly capable general-purpose platforms are getting faster, cheaper, and more energy efficient with every passing month. Another software engineering analogy is that you almost never need to write in assembly because higher-level languages are pretty amazing. Don't get me wrong, when you need assembly, you need assembly. But I'm not wishing for an assembly programming renaissance, because what would be the point of that?
FPGAs were a niche solution when they first came out, and they're arguably even more niche now. Most people don't need to learn about them and we don't need to make them ubiquitous and cheap.
This is such a severe problem that even now, the 20+ year old H.264 is the only codec you can safely assume every end user will be able to play, and H.264 consumes 2x the bandwidth (if not more) of modern codecs at the same perceived image quality. There are still large subsets of users who cannot play anything newer without falling back to (heavy, power-hungry) software decoding. Being able to simply load a new video codec into hardware would be revolutionary, and that's only one possible use case.
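To make the fallback problem concrete, here's a rough sketch of the kind of capability check a streaming site does before picking a codec. This is a hypothetical illustration, not anything from the comment: it assumes a browser client with the Media Capabilities API, and the codec strings, resolution, and bitrate are just placeholders.

```typescript
// Sketch: prefer a modern codec only if the client can decode it
// efficiently (i.e. almost certainly in hardware); otherwise fall back
// to H.264, the one codec we can assume everyone plays.
// Codec strings and stream parameters below are illustrative.

async function pickCodec(): Promise<string> {
  const candidates = [
    { label: "AV1",  mime: 'video/mp4; codecs="av01.0.08M.08"' },
    { label: "HEVC", mime: 'video/mp4; codecs="hvc1.1.6.L123.B0"' },
  ];

  for (const c of candidates) {
    const info = await navigator.mediaCapabilities.decodingInfo({
      type: "file",
      video: {
        contentType: c.mime,
        width: 1920,
        height: 1080,
        bitrate: 4_000_000,
        framerate: 30,
      },
    });
    // "powerEfficient" is the closest proxy for a hardware decoder;
    // without it, playback hits the heavy software path described above.
    if (info.supported && info.powerEfficient) return c.label;
  }

  // Safe default: H.264 High profile.
  return "H.264";
}
```

If the FPGA-style "load a new codec into hardware" idea were real, the fallback branch would stop being the common case for so many users.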
Like, I get the aesthetic appeal, and I accept that there is a small subset of uses where an FPGA really makes a difference. But in the general case, it's a bit like getting upset at people for using an MCU when a 555 timer would do. Sure, except doing it the "right" way is actually slower, more expensive, and less flexible, so why bother?
In the same vein, no one is writing a smartwatch software stack in 100% bare-metal assembly, although in the hands of a capable developer, I'm sure it could prolong battery life.
My Ryzen agrees — the fans just spun up like it’s hitting 10,000 rpm.