
004 2021-03-15

The future of computing with RISC-V, FPGAs, and Forth

As pointed out by many other people, software is regressing and getting absurdly bloated. Our computers have become exponentially faster and more powerful, yet our software is slower than ever. Just try using Slack in Firefox on a 64-bit ARM computer with 4GB RAM. Let me know in 5 minutes once you've finished loading the chat page.

To many this is unacceptable, but we've never been able to do anything about it, because nobody wants to rewrite Linux from scratch. Nobody wants to recreate the web browser. Most people think we're doomed and stuck with what we have. Just deal with it.

But I disagree. With the pairing of RISC-V, FPGAs, and Forth, it is now possible to reinvent what needs to be reinvented. We now have the tools and knowledge to carry our ideas forward and to rebuild computing the way it should be, from the ground up.


I first learned about FPGAs about six years ago (quite late to the party, I know), when C. Wolf presented a talk about Project IceStorm at 32C3. It blew my mind that a piece of hardware could be completely reprogrammed to do absolutely anything you want. It blew my mind that some genius out there reverse engineered an FPGA and wrote tools to fully reprogram it. That's one for the history books, so I quickly purchased a Lattice iCE40 eval board to start learning this revolutionary technology.

To me, the genius of FPGAs is that we can now take back control of our CPUs. I love the thought of running a "soft" CPU that's free of hidden secrets and functionality you didn't ask for, such as the Intel ME rootkit.


The only problem was actually writing that CPU. How do you even begin creating an instruction set and a full CPU (with all its components: registers, memory, ALU, and so on)? Enter RISC-V, another marvel of technology with an incredibly promising future.

The idea behind RISC-V was to use decades of prior knowledge and experience to design a fully open source CPU instruction set for 32 and 64-bit computing. It borrows much of its design from previously closed ISAs, such as MIPS, but it improves on many things while giving people the freedom to build upon it.

The openness of the ISA has drawn many critics, and some have noted the dangers of blindly trusting RISC-V "because it's open". But that's where FPGAs come in. With a RISC-V CPU core written for a target FPGA, it is now possible to get a fully open source CPU on a practically open source FPGA, without worrying that a private corporation or government has slipped in a backdoor to snoop on your data.

However, once RISC-V is running on your FPGA, you still need to write code for it. A real-time operating system? Linux?

Well, since we're already re-inventing the wheel (a much better and open wheel, by the way), why not re-think the OS?


This is where Andrew's and Matthias's Forth implementations come in. Both have created their own Forth running on RISC-V "hardware". Andrew's design is aimed at tiny microcontrollers, while Matthias's targets existing 32-bit RISC-V CPUs and FPGA "soft" CPUs.

Both implementations are interesting and serve different purposes, and I plan to follow them closely.

So the big question is: why Forth? I read the book Thinking Forth a few years ago, around the same time I picked up PicoLisp. I didn't read it to learn Forth, though, but because it discusses some interesting and long-forgotten computer science concepts which are still perfectly valid today (e.g. decision tables).

It wasn't until I discovered Andrew's Bronzebeard project, which includes a Forth implementation for RISC-V, that I realized I should totally learn Forth and start using it on my FPGA.

Forth has properties that almost no other programming language has. For example, it's extremely simple, even more so than PicoLisp. It's portable across a plethora of systems, and more importantly, it lets you design a system or OS from scratch, tailored to your hardware, that does only the things which need to be done. It can be seen as a bare-minimum set of building blocks for creating ever greater building blocks.
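To give a taste of that building-block style, here is a minimal sketch using only standard Forth core words (it should run as-is on any ANS Forth system, such as Gforth); each new word is defined in terms of the ones before it:

```forth
\ A new "building block": square a number using the stack.
: square ( n -- n*n )  dup * ;

\ A bigger block built from the smaller one.
: sum-of-squares ( a b -- a*a+b*b )  square swap square + ;

3 4 sum-of-squares .  \ prints 25
```

That is the whole mechanism: the colon definition extends the dictionary, and the system you end up with is exactly the set of words you chose to define.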

This excites me

I now see the future of computing. Of course the existing systems will likely continue as before, but we're now in a phase where there can and likely will be a new set of computers and systems, designed by people who care, not by companies trying to profit. They will be designed to solve specific tasks, safely and reliably. They will be regularly reprogrammed as needs change, and they will be fully open source for the end-user who wants to tweak them.

We still have some work to do, but I think with the combination mentioned above, the goal is definitely attainable.

Time to get to work!
2021-03-15 02:47:23 UTC