Back in the 1990s, Reduced Instruction Set Computing (RISC) architectures were widely seen as the future. UNIX workstations from Silicon Graphics and Sun Microsystems boasted powerful RISC CPUs at the time, but they were prohibitively expensive and died out in the early 2000s as a result. Only the Acorn RISC Machine (ARM) architecture used in low-power devices (e.g. PDAs) survived past that era, thanks to the rise of mobile devices that relied on battery power.
Of course, the ARM CPUs used in mobile devices got far more powerful after the iPhone hit the market. Since then, iPhones, Android phones, tablets, cars, smart assistants, and nearly every other Internet-connected device have relied on ARM, which licenses its architecture to manufacturers, allowing them to easily modify and extend its features and design. By the mid-2010s, multi-core ARM CPUs were starting to become powerful enough for general-purpose desktop computing, and Microsoft announced Windows 10 on ARM in 2017. I saw this as a clear sign that we’d see ARM proliferate on the desktop soon afterwards, and I even wrote a blog post on it: Get ready for the chip wars.
Unfortunately, I was wrong. Windows 10 on ARM didn’t really take off at all, and it still hasn’t. While some low-end Chromebooks use ARM, the majority of them use low-end Intel CPUs. And no one in their right mind would consider the Raspberry Pi a general-purpose desktop computer.
Around that same time, cloud datacenters started experimenting with high-core-count ARM CPUs designed to run large compute workloads at lower power consumption. This combination of compute performance and power efficiency is ideal for cloud computing, as shown by the Graviton ARM CPUs that Amazon uses in part of its AWS cloud today. The only problem was (and is) mainstream adoption. Without ARM on the desktop, it’s difficult for ARM to gain traction in the cloud, because software developers remain bound to an Intel/AMD desktop and development mentality.
So, following several years of rumours, when Apple announced in 2020 that they’d be ditching Intel for their own custom ARM CPU, it should have spurred a lot of interest in ARM. But it didn’t, because Apple technology is a walled garden, and moving to ARM was largely seen as a way to keep people in their ecosystem while boosting their already high profit margins. And given Apple’s actions over the past 5 years, many of us in the tech industry (myself included) thought Apple wouldn’t do a great job of the transition and would essentially push underpowered Chromebook equivalents to their faithful fan base with deep pockets.
Fortunately, I was wrong. The Apple Silicon M1 ARM CPU released last fall was incredibly well designed for general-purpose desktop computing and quickly wowed everyone in the tech community. I started using it this past month and had to eat the Apple ARM jokes I’d made over the past year ;-) Nearly all software for the macOS platform is available in native ARM-compiled form, and the desktop experience is otherwise close to flawless.
So it ended up being Apple - not Microsoft - that was first to bring ARM to the desktop. I think many software vendors and hardware manufacturers weren’t willing to bet a lot of money on ARM in an Intel/AMD-dominated desktop landscape, and were waiting for someone to go first and succeed. Now that that has happened, they’ll follow suit. We’re already starting to see more custom ARM CPUs hit the market, and it’s just a matter of time before we see more powerful ARM-based Chrome OS, Linux and Windows PCs.
ARM on the desktop is here to stay this time, and that’s a good thing for both desktop and cloud computing. I imagine we’ll start to see a proliferation of mainstream ARM desktops over the next few years, as well as a large shift to ARM-based servers both in the cloud and within organizations.