Meltdown, Spectre, and the Costs of Unchecked Innovation

Spectre fixes forced browsers to break the compatibility covenant of the web. Other unchecked technologies could cause even deeper damage.

When the blockbuster twin security exploits known as Meltdown and Spectre appeared in early 2018, Mozilla was among the first to respond, retroactively changing several behaviors of Firefox to help prevent them.

Both attacks rely on high-speed timing measurements to extract sensitive information, so somewhat counterintuitively, the patches had to decrease the speed of seemingly mundane computations. The first change reduced the resolution of the browser performance API, which had previously been able to time the behavior of a page precisely enough to be used in an attack; the second removed SharedArrayBuffer, a new kind of data structure atop which similar timers could be trivially rebuilt. Similar changes were soon implemented by Microsoft for the Internet Explorer and Edge browsers, and by WebKit, the browser engine used to build Safari, Mobile Safari, Android Browser, and the dedicated browsers embedded in many other devices. As of this writing, SharedArrayBuffer is disabled in all major browsers.

Backpedaling on established features of the internet was necessary, but also strange and unexpected. The web is, among other things, a decentralized specification: It is an agreement about how to build things, and then also how to run the things that have been built. In order for a new feature to meaningfully exist on the web, developers and browsers and standards bodies must all first come to an understanding about how it will work. Once you add something to that agreement, you can't remove it, because you have no idea what problems might arise, or even in which far-flung corners they may appear.

In contrast, technology systems and programming languages that operate in narrower contexts—on a specific server, for example, or inside a specific app—can successfully withstand more dramatic changes to their behaviors. Any upgrade-related malfunctions are localized, and accordingly easy to fix. There are no such promises with a distributed web, though, so its technologies have always evolved in ways that maintain backwards compatibility. This is why old web pages pretty much always continue to work in newer browsers.

Spectre forced browsers to finally break the compatibility covenant of the web. It's entirely possible that no meaningful projects relying on those features even exist, and even if they do, there may still be simple, safer workarounds. Nonetheless, such a prominent episode in which the internet broke its own code retroactively comes with a cost, at least ideologically. The web can't quite be trusted as an infallible platform to the extent it had been.

Breaking Changes

There's a common practice in software engineering called semantic versioning, whereby the officially published new releases of software tools and packages are given slightly complex version numbers—think the more nuanced "version 2.4.1" instead of moving straight to "version 3"—so that a bump to the leading number can signal to both users and automated systems that something important has changed. The system might no longer work the same as it did previously. These are referred to as "breaking changes," and they serve as safety checks, or at least warning flags.

We don't know how widespread the impacts were, but the patches made to web browsers to help protect against Meltdown and Spectre do meet the technical definition of breaking changes for the entire web. Of course, by now the web is far too old and chaotic to be subject to any versioning or planning whatsoever. That's precisely why, until now, it had always opted to safely preserve backwards compatibility.

Both Meltdown and Spectre are caused by the widespread use of a technique called "speculative execution," in which processors eagerly and proactively execute instructions before the program actually needs them. Speculation makes results available sooner, but the primary discovery of Meltdown and Spectre was that it is insufficiently secured, and thus provides a way to leak sensitive information. Meltdown most notably affects Intel hardware, on which speculative execution was previously assumed to be safe, and attempts to mitigate it at the software level can carry a marked performance cost. This is not just about sluggish laptops: many cloud service providers charge clients varying rates that reflect the computational burden of the contract, so Meltdown and Spectre may show up as an increase in technical budgets, paid out as a literal dollar amount to services that now have to run more slowly as a result of the patch.

The only real fix for Meltdown is to eventually physically replace all the chips, a change which will take at least a full hardware generation to propagate. Spectre is more sophisticated, and may have no real fix at all. We might not have realized it until recently, but the speed and power of our computers until now has always been a lie, built atop a foundation that must now be undone if we also want to remain safe.

Beyond Bits

Responding to Meltdown and Spectre required immediately decreasing the speeds of both processors and online timing measurements, thereby reversing advances we thought we'd made both in hardware and with the general sophistication of the web as a platform.

Disruptive ideas can likewise attain a pace that manifests outside the silicon: In New York, Uber's reach and market dominance emerged fast enough to capitalize on the ongoing financial struggles of the subway system, and Airbnb's alongside the rising barriers to home ownership for the modern middle class.

Even if Intel wouldn't quite agree that Moore's law is over, its real-world performance benefits may be substantially erased after Meltdown and Spectre are tamed. The long-standing computing trope that should be even more concerning in this context, however, is "it's all just ones and zeroes." We're not just talking about bits once those bits drive our robots, drones, and 3-D printers. New technologies now often manifest in the real world, since for now that is still where most of the money is, but even Bitcoin melts the polar icecaps.

On the mind-boggling cosmic scale, unchecked technologies will shape our ability to create and edit organisms; on a more tangible level, these exploits slowed down both the internet and our hardware. In both fields, we had quite literally been racing toward something terrible.

We’ve built technology too quickly for our own good, quantifiable now in dollars and microseconds, using a wide range of tools and metrics even though SharedArrayBuffer is no longer around to take the measurements. Anything that seeks to reshape the infrastructure built by our past selves deserves our most aggressive scrutiny, regulation, and suspicion. If backtracking overeager technology is already proving so catastrophic for the cheap chips in our laptops and phones, then we certainly have no hope of reversing its changes to our homes, cities, and oceans. Some things can't be patched or safely versioned. We just have to get it right the first time.
