Posted by bhouston 4 days ago
Not surprising, since the few physical RISC-V implementations that are available under-perform a decade-old Raspi by a significant margin, and that platform is not a rocket-ship to begin with.
This is a bit surprising given that all these techniques have been in computer architecture textbooks since at least the 90s.
RISC-V gets to take advantage of being produced in 2024 and absorbing all the clock speed and transistor advantages we get for free today because of 6 decades of transistor production.
> The integrated 240Gb/s [Ed: 30GB/s] TSN-enabled Ethernet switch is far from the chips' only feature: the PIC64HX has no fewer than eight 64-bit SiFive Intelligence X280 RISC-V cores,
https://www.hackster.io/news/microchip-unveils-the-high-perf...
The actual speed of those X280s is TBD, but this seems like a huge bump in what's on the RISC-V market.
Most of the chips we've been looking at are the open sourced C906, a not particularly fancy early core, open sourced in 2021. It wasn't even the highest offering in that release! https://riscv.org/news/2021/10/alibaba-open-sources-four-ris...
Google talked about using the X280 two years ago, and what they were able to get from its vector unit, https://www.sifive.com/blog/sifive-intelligence-x280-as-ai-c...
- 5.75 CoreMarks/MHz
- 3.25 DMIPS/MHz
- 4.6 SpecINT2k6/GHz
This SoC is mostly interesting for its 512-bit-wide vector extension support (probably a single 512-bit vector issue per cycle).
If the price is right this might be usable as a nice little low power number cruncher.
The lockstep mode is also interesting for industry uses.
While possible, it's certainly nothing you'd want to run a desktop on.
I’m all for more architectures but RISC-V has an absurd fandom behind it now. It feels like the fandom behind Linux on the home desktop or Vulkan (which makes sense given the open nature) in the way that they’re trying to manifest its success as reality by just saying that it’ll inevitably be used, while ignoring the hurdles in the way.
That’s not to say those aren’t successes when used, but I often feel the comments that put them on the pedestal don’t acknowledge the immense delta between today and their imagined future, and have very little interest when it’s pointed out.
For all three it comes down to software compatibility and experience of use. The proponents seem to have a “the underlying tech is built and people will flock to it when they open their eyes” attitude, but the first step in fixing the software compatibility gulf is acknowledging it exists and acting on it. FOSS isn’t alone here; Apple makes the same mistake with desktop gaming, and Microsoft with almost any physical product they release that’s not running Windows.
I really do hope that more open platforms and technologies happen, but I feel like the people who unabashedly push them without acknowledging how it needs to happen are doing them a disservice.
I don't understand what the fandom thinks they're going to get out of RISC-V "winning". You're not going to be able to download a new CPU like you can a new Linux kernel. An open source CPU core is useless without a factory to manufacture it.
There's not going to be a GNU equivalent to semiconductor manufacturing. The baseline cost to build a factory is billions of dollars. You also can't just slap any design on any manufacturing node or chemistry. There's a lot of work to get a chip design working on a particular node.
A CPU is a very small part of a functional computing device. It's magical thinking to assume that just because a device was built on an open source CPU core, the overall system will somehow be more open.
Most of the stuff people bitch about being binary blobs will remain so even with a RISC-V CPU core. Anything with a radio will remain a black box for regulatory reasons, even if the baseband core is a RISC-V chip. GPUs will remain black boxes only accessible via their drivers' interface. Peripheral controllers covered by patent pools or branding licenses won't cease to be covered even if the controllers are RISC-V.
If I can use that premise, then all the incongruity makes much more sense.
I do recognize that as a royalty-free and well-supported architecture, very flexible with all the optional extensions, it does have a better shot than others at becoming the standard architecture. But the sheer amount of closed source software written for architectures that keep track of arithmetic flags, which would need to be emulated, is daunting.
Universities and businesses can now cooperate much better; people can work on a project in university and commercialize it far faster.
> You're not going to be able to download a new CPU like you can a new Linux kernel. An open source CPU core is useless without a factory to manufacture it.
A huge number of well designed FPGA cores can actually be downloaded. Not so long ago, good open FPGA cores weren't that common; now there is a wealth of options. And you can get very high quality stuff like OpenTitan and use it yourself.
And designing your own CPU and having it manufactured, more like a PCB, isn't as crazy as it was. It used to cost many millions; now you can get it done for much less. 30 years ago, hobbyists designing their own boards and getting them back within a few days wasn't a thing; now it's common. Almost every hacker conference now has its own PCB per conference.
Google worked with SkyWater Technology to actually open source most of their process. You can use a fully open flow and order your own custom chips.
Contract manufacturing like that barely existed in the 80s, and now it's the standard.
> A CPU is a very small part of a functional computing device. It's some magical thinking to assume just because a device was built on an open source CPU core that somehow the overall system will somehow be more open.
Nobody has this 'magical' thinking; that's a straw man. It's about observing a longer-term trend. Before RISC-V, even having an open core wasn't much of a thing. Now that there are quality cores, people can spend their time on other things. Open implementations of different IPs are increasingly being designed, and university research on new ideas is being done by adding something to an open core.
Companies like Tenstorrent added a cool vector ISA to an open core. CHIPS Alliance is investing in tools like Verilator, and other needed things like TileLink interconnects.
Yes, it's not a guarantee that whoever uses RISC-V makes other things open, but at the very least it doesn't hurt. And the other side of the coin is that it makes it very EASY for people who want to make things open.
> GPUs will remain black boxes only accessible via their drivers' interface.
There are already early attempts at building open RISC-V GPUs that you can interact with however you want. Sure, you're not going to get that from Nvidia, but that doesn't mean it's not valuable.
> Peripheral controllers covered by patent pools or branding licenses won't cease to be covered even if the controllers are RISC-V.
Nobody is disagreeing with that, but it's still the case that a lot of things can actually be open. A position that says 'it's bad because it's not absolutely good' is dumb.
The point is that RISC-V existing and being successful is just one building block in the idea of information and technology being free and sharable, it moves cooperation and competition to a higher level.
It has lots of practical benefits that have already been demonstrated, and RISC-V as a movement has already had a huge impact on everything from the tooling to peripherals. Just look at the RISC-V Foundation, CHIPS Alliance, CORE-V, the PULP Project and so on.
> I don't understand what the fandom thinks they're going to get
A better world mostly, and many practical benefits along the way.
> A better world mostly, and many practical benefits along the way.
That's the magical thinking. There are logical leaps required to get from the current status quo to the proposed future state. A CPU architecture and core design is orthogonal to almost every other market force in the industry.
Let me make sure I'm explicit since people tend to get very tribal when anyone says anything about their "team". I have no problem at all with RISC-V as an architecture. I do not care if my next phone or laptop has an ARM chip or RISC-V chip. As long as my laptop does laptop stuff and my phone does phone stuff the CPU instructions executing do not matter to me. I'm also writing zero assembler for any non-trivial personal or professional project. So long as compilers and toolchains exist for my system the ISA is an academic discussion for me. I have no problem with RISC-V existing or "winning".
In terms of the world being "better" with RISC-V, that's just a weird statement. The architecture doesn't offer anything actually new to the industry. There's nothing fundamental about the architecture that makes anything better. The ISA has implementation gotchas that make for problematic or complicated compiler implementations. Its extensible nature also provides a huge surface area for minor implementation specific incompatibilities. Two similar RISC-V chips may not be drop-in replacements for one another. So it's not like the overall RISC-V design is objectively better than any other ISA.
The open source nature of RISC-V is an academic improvement over closed core designs. I'm a random guy and I have the same access to most of the same compilers as Google, Amazon, or anyone else. I can compile Linux on any commodity computer I own. I don't need a clean room or expensive equipment to do software development or even deployment. If I write something you want to use the marginal cost of acquiring it for you is effectively zero. Open source software is unreasonably effective because of that trivial marginal cost of reproduction.
Unlike the Linux kernel, you can't compile a CPU core, reboot, and get some performance gain. You might be able to build a 4004[0] in your garage, but you're not going to be building a CPU you can drop into your laptop or phone. At least not one that would be able to run at any reasonable speed.
Open source hardware is not bad. It just doesn't solve any of the very real problems of producing hardware. It doesn't obviate the challenges or costs of developing new process nodes or chemistries. It doesn't help the marginal costs of producing hardware. If you're just buying fully finished chips it's not like you're getting a discount because the manufacturer saved some money on the core design. They'll still charge whatever the market will bear and pocket the savings.
The idea that open hardware will make the world better does not seem like a supportable statement. You're not getting a discount on an Android phone because someone patched a bug in the Linux kernel in their spare time.
[0] https://spectrum.ieee.org/the-high-school-student-whos-build...
There is more to life than technical specifications. The change in the license, the development pattern and the business model does actually matter.
I suggest you watch some talks by Krste Asanovic who created the RISC-V project, he explained exactly this point.
> Open source software is unreasonably effective because of that trivial marginal cost of reproduction.
Yes, open hardware isn't as good as open software. We know. Sadly we don't have a universal 3D printer. But that doesn't mean it's worthless.
> Unlike the Linux kernel you can't compile a CPU core and reboot and get some performance gain.
Have you never used a modern FPGA?
> It just doesn't solve any of the very real problems of producing hardware.
It doesn't if you take the narrow point of view of only considering manufacturing, but if you broaden your point of view and look at the whole value chain, it absolutely does. And you know who agrees with me? Tons of companies who have invested in RISC-V and the ecosystem.
If you don't believe me, I suggest you watch this video from Google where they explain why they are doing what they are doing:
https://www.youtube.com/watch?v=EczW2IWdnOM
You can find videos like that from other companies, including hardware companies.
> It doesn't obviate the challenges or costs of developing new process nodes or chemistries.
I didn't know that producing hardware was the same as developing new nodes. In your mind, the only thing that matters is the cost of new node development? Nothing else in the whole world matters to producing computer hardware?
> It doesn't help the marginal costs of producing hardware.
It helps the fixed cost, and the smaller your run is, the more important that is. And it actually does improve marginal cost in many cases, if you don't pay a license fee anymore. Again, go and watch the talks by Krste; he explains a lot of other points that matter around this question, and why RISC-V took off with so many companies, both producers and consumers.
As I pointed out, PCBs went through a similar progression (and it is still going), and this has been incredibly helpful to the whole industry.
> If you're just buying fully finished chips it's not like you're getting a discount because the manufacturer saved some money on the core design. They'll still charge whatever the market will bear and pocket the savings.
If there is a high quality core in the class you are looking for, and each manufacturer has access to that same IP, then competition will drive the IP value of that core to zero. That's basic economics. And that's exactly what groups like CORE-V are trying to do.
This exact same thing happened with software as well. It used to be that you would pay for things like a compiler, because it was an important value-add. But once there is an open source compiler, you can't charge money for one anymore.
Funny how plenty of companies who buy a lot of chips, like industrial manufacturing companies, have invested in the CORE-V project. Yet you claim it provides no value. Do these people just hate money? Or are they doing it out of the goodness of their hearts? Or do they understand something you missed? Consider watching a presentation from Thales on that topic, for example.
> The idea that open hardware will make the world better does not seem like a supportable statement. You're not getting a discount on an Android phone because someone patched a bug in the Linux kernel in their spare time.
Again, you seem to lack a basic understanding of economics. I am not getting a discount for a bug fixed in Linux because Linux is already free, so the discount has already happened. You are literally missing the whole point of open source. The whole point is that bugs are getting fixed DESPITE ME NOT PAYING ANYTHING.
As long as all competitors have access to the same code, none of them can extract value from it but all are still 'forced' to provide it.
Open cooperation has produced a system where the improvement costs are spread incredibly wide and the benefits are spread even wider. To the point where almost nobody can actually demand money for it. And thanks to this economic reality, we all benefit from this process.
The literal exact same process works for hardware designs as well. Chips, before going into manufacturing, are literally just code and configuration. The value of that can be driven down. We are just not as far along, and of course manufacturing has costs.
I don't agree with this comparison (or I am misjudging your intention) - 90% of the people I know who run any sort of Linux desktop (usually developers at work) only don't switch their home desktops because of games. I know we're a tiny minority (and I am typing this from a Windows machine) - but it's nothing like 10x worse in objective terms (e.g. speed).
That’s not a knock against Linux. It’s great but it’s also disingenuous when people push it as the year of Linux on the home desktop.
If you are capable of using an iPad as a gaming device, I literally do not understand how you wouldn't be able to use a GNOME desktop to achieve literally the exact same outcome.
Am I wrong? Getting Steam to use Proton is literally one click in the Steam settings - using an app like Bottles just has you open exes like normal. This is no worse than the Crossover Wine support from the Mac days of yore, if not more streamlined and not fighting against system integrity protection. And your fucking settings app doesn't give you a notification pip for not logging in.
That statement makes no sense. These things aren't in conflict; saying something is 'inevitable' doesn't mean you can't see the hurdles or that you are ignoring them. It's just born out of an understanding of what the hurdles are and how, in time, they can be overcome.
> and have very little interest when it’s pointed out.
Maybe they aren't interested because they know it's a long road, and anytime there is a success it's better to just be excited for a moment and not have somebody come in with the 'but actually'. This is a totally normal social dynamic.
This concept is best addressed in The Big Lebowski: "You're not wrong, you're just an a*hole."
> For all three it comes down to Software Compatibility
And as an industry we have some understanding of how standards with open protocols work and how the situation improves. RISC-V isn't new, and we have some idea of how that process is going and how it is organized. It's reasonable to make assumptions about how this will continue.
We also have a large and strong open software community that is going to focus on these standards. Even for a very large corporation, maintaining its own standard is a huge pain in the ass.
There is a pretty good standards process with lots of people involved that has been making very good progress.
There is also real money and effort put into improving all the upstreams. The RISE project, for example, brings together tons of major companies, universities, distros and so on. And this isn't just about RISC-V; lots of effort is also put into tools, open designs and so on.
RISC-V went from no support to being comparable to long-established ISAs in a pretty short time. It's not unreasonable to project that forward.
> and experience of use
RISC-V has been adopted by a huge number of universities; it's the default now anytime somebody upgrades those courses. Anybody doing things on an FPGA is almost certainly going to use some open RISC-V core. Some FPGA manufacturers are even pushing RISC-V as their example cores. RISC-V is also designed to be easy to learn and get into. It already has a lot of adoption all across the industry. Far more, and far faster, than any other ISA ever.
Experience is gained by people working on projects, and there are lots of them.
> Apple does the same mistake with desktop gaming and Microsoft with almost any physical product they release that’s not running windows.
That's different, because Apple just clearly doesn't care very much. It's a very different situation.
> "the underlying tech is built and people will flock to it when they open their eyes"
No, they are only observing that large corporations all over the world are already moving in that direction. That China and India see the advantage as well. That there are major chip makers who have made it the core of their business. That Europe is making a major investment. That a whole boatload of AI companies are adopting it. A huge number of people have already seen it, and it has been growing fast for a while now; that growth can be measured in a number of ways.
> I really do hope that more open platforms and technologies happen, but I feel like the people who unabashedly push them without acknowledging how it needs to happen are doing them a disservice.
I feel like those people mostly exist in your imagination. When speaking in support of something you are not required to follow it up with a 50-page development plan. You can just like something and be optimistic and that is totally fine, it doesn't mean that person is naive or unaware.
If you have an actual counter argument why optimism isn't warranted, then you can say that. Some people believe RISC-V is badly designed for example. Some people believe fragmentation will kill it. But just to outright state 'people who are optimistic are a problem' is a silly position.
You focused on the academic and niche use cases to try and counter an argument about standard user use cases. Something Linux hasn’t solved precisely because of people like yourself who keep talking about the things a regular user does not care about.
It’s also cute that in doing this, you have to resort to name calling to feel a sense of intellectual superiority.
And btw it's fucking hilarious to call the OS that literally runs on almost every device in the world 'niche'.
> It’s also cute that in doing this, you have to resort to name calling to feel a sense of intellectual superiority.
I was not name calling, I was explaining a concept.
And given this second comment of yours, now I think the concept actually applies.
None of that is relevant to the discussion point at hand which is mass adoption of a new arch and what stands in the way.
And again, the intended derision of saying "Have fun eating at McDonalds"? WTF is that even meant to mean. I'll stop responding to you because I think you are incredibly hostile.
How desirable is this vs. complexity introduced, and can similar benefits more cleanly be achieved at the compiler or software architecture level?
Branch prediction / speculative execution then takes that further by making an assumption of which arm of the branch will be taken and executing the instructions in that arm too. When the instruction for the branch completes fully, and the prediction turned out to be correct, the processor continues as normal. If the prediction turned out to be wrong, the processor throws away the pending changes from the incorrectly speculated instructions and starts again from the correct branch arm.
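The cost of a wrong guess is easy to see from software with the classic sorted-vs-unsorted benchmark. A minimal sketch in C (the function and names are mine, purely illustrative): the loop body contains a data-dependent branch, and the same function runs measurably faster when its input is sorted, because the predictor stops mispredicting, even though the result is identical.

```c
#include <stdint.h>
#include <stddef.h>

/* Sum only the elements that are >= 128. The `if` inside the loop is a
 * data-dependent branch: a speculating CPU guesses its outcome so later
 * instructions can start early, and a wrong guess costs a pipeline flush.
 * With sorted input the branch becomes highly predictable and the loop
 * speeds up, with no change to the computed result. */
uint64_t sum_large(const uint8_t *data, size_t n) {
    uint64_t sum = 0;
    for (size_t i = 0; i < n; i++) {
        if (data[i] >= 128)  /* the branch the predictor must guess */
            sum += data[i];
    }
    return sum;
}
```

Timing `sum_large` over a large random array before and after sorting it is a cheap way to observe the misprediction penalty on real hardware.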
An in-order CPU waits for each instruction to finish before it processes the next one. There's no way to work around that in software. At best the hardware can insert latches into the ALU etc so that instructions take the smallest number of clock cycles that they need (eg multiply takes six cycles but add takes only one).
The alternative is VLIW ISAs that rely on the compiler to encode multiple instructions that can execute in parallel into a single instruction. That didn't work out well in the past for general computing as demonstrated by Itanium etc, though it's still used for restricted domains like DSPs.
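For intuition on what software-side scheduling can and can't do on an in-order pipeline, here is a minimal C sketch (function names are mine): a reduction rewritten with independent accumulators. The naive loop serializes on one dependency chain, so every iteration stalls for the full multiply-add latency; splitting the chain lets multi-cycle operations overlap in the pipeline without any out-of-order hardware, which is the same idea VLIW compilers apply explicitly across whole instruction bundles.

```c
#include <stddef.h>

/* Naive reduction: each iteration depends on the previous `sum`, so an
 * in-order pipelined core must wait out the full floating-point latency
 * every trip around the loop. */
double dot_naive(const double *a, const double *b, size_t n) {
    double sum = 0.0;
    for (size_t i = 0; i < n; i++)
        sum += a[i] * b[i];
    return sum;
}

/* Same reduction with four independent accumulators: the dependency
 * chains are interleaved, so several multi-cycle multiply-adds can be
 * in flight at once even on an in-order core. */
double dot_unrolled(const double *a, const double *b, size_t n) {
    double s0 = 0.0, s1 = 0.0, s2 = 0.0, s3 = 0.0;
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        s0 += a[i]     * b[i];
        s1 += a[i + 1] * b[i + 1];
        s2 += a[i + 2] * b[i + 2];
        s3 += a[i + 3] * b[i + 3];
    }
    for (; i < n; i++)  /* leftover elements */
        s0 += a[i] * b[i];
    return (s0 + s1) + (s2 + s3);
}
```

Note this only hides latency within the window the programmer (or compiler) exposed by hand; it doesn't recover the dynamic scheduling an out-of-order core does automatically across arbitrary code.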
I don't buy this?
ARM is applied in embedded devices or mobile hardware, neither of which Intel really "owned" in the first place. x86 is still the first-class datacenter option and I frankly don't see ARM taking it over on the desktop or server. The people with architectural licenses aren't interested in competing, which leaves Apple as really the only flexible customer, and we all know Apple was going to replace Intel eventually. So... Intel persists. And with DXVK running fine on RISC-V, I'm well within my right to say it's faster than I think: https://youtu.be/5UMUEM0gd34
Honestly AMD has contributed more to Intel's demise than ARM ever did. When I read blog posts like this I really wonder how proximal the author actually is to the industry - it's an assertion that lacks evidence.
s/about/above, s/see/seen