One of the underlying principles of the Internet's design is robustness through decentralization: large parts of the physical infrastructure can fail, and yet the packets will still flow. The great irony is that the Internet's fragility comes from somewhere else entirely: the winner-take-all dynamics of software itself.

Heartbleed illustrates this in the extreme. While a handful of open-source TLS libraries exist, one in particular, OpenSSL, protects the majority of the Internet. This is because it is the default SSL library for the two most popular web servers, Apache and Nginx, which together run more than 70% of the busiest sites on the web.

So why such concentration? The world of software isn't constrained by locality: multiple libraries that serve no distinct ideological or technical niche are wasteful, duplicated effort. And this brings us to the brutal reality of software physics: open-source efforts tend toward collaboration and consolidation (ideological and technical concerns being equal), concentrating development into a few important libraries, and thus concentrating usage in the wild.

The unfortunate side effect is a fragile Internet. Vulnerabilities can, do, and will exist in any given library. If we had a multitude of libraries, we'd have a multitude of vulnerabilities, but no single attack would be so devastating. Instead, software physics pushes us toward fewer, higher-quality libraries, with devastating consequences when one of them is compromised.

Heartbleed is indeed one such terrifying vulnerability, and it won't be the last so long as the physics of software development and deployment remain the same.