x86 is deprecated


ARM will be the dominant CPU architecture going forward. In a few years, x86 will be relegated to specialty use cases. All new consumer devices and most cloud services will be based on ARM. Old workloads will probably run on x86 for decades, because nobody will bother to port and upgrade them, but a decent share will run on ARM under emulation without issue.

How did we get here?

ARM started out as the upstart competitor optimized for low power and great performance per watt, so it ended up in smartphones and tablets. Now performance per watt matters everywhere - datacenters and mobile devices are where the volume is. Meanwhile, the performance gap is closing. x86 is still ahead in many single-threaded workloads that you just want completed as fast as possible, but once ARM crosses that Rubicon, there will be very few cases where anyone intentionally chooses x86 as a CPU architecture.

Most major hardware OEMs are either shipping ARM or looking into it. The battery life implications are too hard to ignore. The Apple M1 means Apple is unlikely to ever release another x86 device. Microsoft has been partnering with Qualcomm to design ARM CPUs for years now, and Windows 10 on ARM has x86-64 emulation in preview as of December 2020. In other words, we are about one or two hardware cycles away from the mass availability of ARM devices and platforms that support ~99% of consumer use cases [1].

The major cloud providers will eventually be forced to migrate to ARM for the cost savings. At AWS there's already work underway to switch internal workloads over to ARM, and Azure may be quietly testing ARM for internal workloads too. The savings are just too great to pass up. Google Cloud will come around once it starts caring about margins.

For customer-facing workloads, Amazon Graviton2 VMs will soon be a reasonable default, given they can be 40% faster and 20% cheaper than their x86 counterparts. For many cloud services that aren't VMs, the ARM switch will be completely transparent, since HTTP and other layer 7 protocols don't care what your underlying CPU architecture is. Neither do layers 1 through 6, for that matter.

On a semi-related note, new Intel CEO Pat Gelsinger is taking over while under assault from all sides. Reports are that Intel has been in trouble for years, with fat margins that are increasingly unsustainable and under attack from AMD, Amazon, and now Apple. Then you have dedicated fabs (mostly TSMC) that have been pulling away and are now probably a full generation ahead of Intel. So in summary: Intel makes CPUs on a legacy architecture that consumes vastly more power than the competition, is slowly losing its performance edge with nothing on the horizon to reverse the trend, has manufacturing capability as much as 5 years behind the state of the art, and still couldn't produce 10nm chips in mass quantities with decent yield as of early 2021. Yikes.

[1] I'm not sure how x86 on ARM virtualization will work, which may be important for some workstation scenarios so developers can build and test all their software. But the single most important event of the last 2 years has been Apple getting ARM laptops to developers, which lit a fire under folks like Docker to get cracking on ARM support.
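One practical consequence of this shift is that build and test scripts can no longer assume the host is x86-64. A minimal sketch (the helper name and alias table are my own, not from any particular tool) of the kind of architecture check multi-arch build tooling ends up needing, in Python:

```python
import platform

# Map the various names OSes report for the same ISA onto canonical labels.
# (macOS reports "arm64", Linux reports "aarch64"; both are 64-bit ARM.
# Similarly "x86_64" and "AMD64" both mean 64-bit x86.)
ARCH_ALIASES = {
    "x86_64": "amd64",
    "amd64": "amd64",
    "aarch64": "arm64",
    "arm64": "arm64",
}

def host_arch() -> str:
    """Return a canonical architecture label for the current machine."""
    machine = platform.machine().lower()
    return ARCH_ALIASES.get(machine, machine)

print(host_arch())  # e.g. "amd64" on a typical x86-64 box, "arm64" on an M1
```

Build systems and container tooling do essentially this normalization internally, which is why a single `docker build` can target either architecture once the images and dependencies exist for both.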