4 things IT leaders should know about RISC-V

RISC-V is widely used in embedded systems and IoT devices that focus on cost, customization, and often, low power. But edge computing opens up new applications. What's next?

When talking about open source, people usually talk about software. Data may be part of the conversation as well. Hardware? Not so often. Especially when it comes to complex devices like microprocessors.

However, we’re starting to see significant projects that open up hardware to a large community of individuals and companies. Perhaps none is bigger than RISC-V.

[ Read also: Enterprise open source and edge computing: 4 stats to know ]

What is RISC-V?

The first thing to know about RISC-V is exactly what it is – and what it isn’t.

When we recall historical open source processor projects, a design like OpenSPARC may come to mind. In 2005, Sun Microsystems released register-transfer level (RTL) Verilog code for its UltraSPARC T1 microprocessor under the GPL v2 license.

The problem with this approach, in the words of RISC-V International CTO Mark Himelstein, is that “When you hand something whole cloth as open source it's hard for people to really feel ownership around it. The one thing that Linux did was everybody felt pride of ownership. That was really hard to do.”

By contrast, RISC-V is an extensible instruction set architecture. Himelstein adds, “We're totally implementation-independent. We work with other sister organizations that are nonprofit like lowRISC, CHIPS Alliance, and Open Hardware who do specific things in hardware with RISC-V.” As of early 2022, about 300 institutions and corporations were participating in the RISC-V community.
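
To make “extensible instruction set architecture” a bit more concrete: a RISC-V design is described by an ISA string such as rv64imafdc_zicsr, a small mandatory base plus whatever optional extensions a given design implements. The short Python sketch below decodes such a string; the extension descriptions and example string are illustrative only, not a complete catalog.

```python
# Minimal, illustrative sketch: decode a RISC-V ISA string into its base ISA
# and extensions. Extension descriptions and the example string are illustrative only.

EXT_NAMES = {
    "m": "integer multiply/divide",
    "a": "atomic operations",
    "f": "single-precision floating point",
    "d": "double-precision floating point",
    "c": "compressed (16-bit) instructions",
}

def decode_isa(isa: str):
    """Split an ISA string such as 'rv64imafdc_zicsr' into its base and extensions."""
    isa = isa.lower()
    main, _, rest = isa.partition("_")          # multi-letter extensions follow underscores
    width = main[2:4]                           # '32' or '64'
    base = main[4]                              # 'i' (or 'e') names the base integer ISA
    extensions = [EXT_NAMES.get(ch, ch) for ch in main[5:]]
    if rest:
        extensions += rest.split("_")           # e.g. 'zicsr', 'zifencei'
    return f"RV{width}{base.upper()}", extensions

if __name__ == "__main__":
    base_isa, exts = decode_isa("rv64imafdc_zicsr_zifencei")
    print(base_isa)                             # RV64I
    for ext in exts:
        print(" +", ext)
```

The point is that the base is deliberately small; everything else, from floating point to vector processing, is an extension a designer can take or leave.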

Does extensibility imply fragmentation?

Extensibility and diversity have a flip side – fragmentation of designs. Is this a concern?

Part of the answer is that, with many RISC-V cores going into small, custom designs, some degree of fragmentation isn’t really a problem. That said, as Himelstein notes, “I have a bumper sticker statement that says, ‘Innovate. Don't duplicate.’ That's the only thing that keeps us together as a community. Why do you want to go ahead and implement addition and subtraction for the thousandth time?”

That said, RISC-V International recognizes that there is value in standards, especially as RISC-V starts to see increased usage in higher-performance server roles, where standards are needed to attract off-the-shelf software development. The trick is to manage the optionality of the architecture while still providing well-defined compatibility.

This is where the RISC-V Platform Specification comes in. Currently, it defines two sets of platforms: one intended to provide software binary compatibility and the other intended for source code compatibility. The platforms bind together specifications to create interoperable classes of systems and tooling.
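
In practice, software targeting RISC-V often needs to ask which extensions a particular core actually implements; on Linux, /proc/cpuinfo typically reports an isa field for RISC-V systems. The sketch below shows the kind of check involved, using a hypothetical set of required extensions standing in for what a platform or profile might mandate; it is not the Platform Specification's own mechanism.

```python
# Minimal sketch: check whether a RISC-V core reports the single-letter extensions
# a prebuilt binary expects. The REQUIRED set is hypothetical, not a real platform spec.

REQUIRED = {"m", "a", "f", "d", "c"}   # hypothetical extensions assumed by a binary build

def read_isa_string(cpuinfo_path="/proc/cpuinfo"):
    """Return the 'isa' field from /proc/cpuinfo on a RISC-V Linux system, if any."""
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.lower().startswith("isa"):
                    return line.split(":", 1)[1].strip()
    except OSError:
        pass
    return None

def missing_extensions(isa: str, required=REQUIRED):
    """List required single-letter extensions that the ISA string does not report."""
    main = isa.lower().split("_", 1)[0]        # ignore multi-letter extensions here
    present = set(main[4:])                    # letters after 'rv32'/'rv64'
    if "g" in present:                         # 'g' is shorthand for 'imafd'
        present |= set("imafd")
    return sorted(required - present)

if __name__ == "__main__":
    isa = read_isa_string() or "rv64imac"      # fall back to an example string off-target
    gaps = missing_extensions(isa)
    print("isa:", isa)
    print("missing:", ", ".join(gaps) if gaps else "none")
```

A binary-compatible platform definition makes this sort of per-extension guesswork unnecessary: if a system claims the platform, compiled software can assume the corresponding baseline is present.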

Where is RISC-V being used?

RISC-V is already widely used. The initial focus has been on embedded systems and IoT applications, which put a particular premium on cost, customization, and, often, low power.

However, changes are happening at the edge of the network. As the quantity of data flowing into the network multiplies, so does the need to filter and analyze it locally rather than shipping it all off to a central location. Data is also increasingly used to take rapid action at the network edge while avoiding the latency of traversing a geographically dispersed network. For these reasons and others, increasingly powerful compute and hefty storage are needed at the network edge, and RISC-V is a good candidate for tasks such as machine learning workloads there.

[ Want best practices for AI workloads? Get the eBook: Top considerations for building a production-ready AI/ML environment. ]

Calista Redmond, CEO of RISC-V International, told me: “With the shift to edge computing, we are seeing a massive investment in RISC-V across the ecosystem, from multinational companies like Alibaba, Andes Technology, and NXP to startups like SiFive, Esperanto Technologies, and GreenWaves Technologies designing innovative edge-AI RISC-V solutions."

Where is RISC-V headed?

2021 was something of a landmark year for RISC-V, with over 2 billion cores deployed by the end of the year, according to RISC-V International. It also saw a significant ramp-up in standardization work: in 2020, RISC-V International produced one specification; in 2021, it produced 16. Himelstein described the process as one of prioritizing, putting governance in place, getting help where it was needed, and “trying to push things over the goal line.”

In addition to building on existing design wins and extending the instruction set architecture in ways that matter to RISC-V International’s membership, an ongoing priority is building up the ecosystem. The ecosystem – compilers, optimizers, libraries, hypervisors, and operating systems – is a big part of any instruction set architecture's success, as x86 and ARM have clearly demonstrated. While important pieces exist today for RISC-V, it remains an area of active work.

Closely related to this is sharing work and functioning as a community. Over time, it’s become clear that much of the power of open source software lies not just in the fact that it preserves user freedoms but also in the fact that the open source development model is an extremely effective way to develop software. RISC-V has arguably succeeded so far because it’s an open and flexible instruction set architecture. However, more widespread collaborative development in the vein of Linux and Kubernetes could accelerate its adoption even further.

[ How can automation free up more staff time for innovation? Get the free eBook: Managing IT with Automation. ]

Gordon Haff is Technology Evangelist at Red Hat where he works on product strategy, writes about trends and technologies, and is a frequent speaker at customer and industry events on topics including DevOps, IoT, cloud computing, containers, and next-generation application architectures.