Moore's Law is probably dead

Long live the other technology laws (here are ten you may not have heard of)

Is there any concept in technology circles more talked about than Moore’s Law? First set out in 1965 by Gordon Moore, who went on to co-found Intel, Moore’s Law predicted the doubling of transistors per chip every two years. Originally based on only five data points, it turned out to be an astonishingly accurate prediction, recently celebrating its 50th anniversary to much media acclaim. It’s served as an iron rule of innovation and a figurehead for the accelerating speed of progress. With the global economy increasingly dominated by things with transistor chips inside them, the implications of Moore’s Law have been profound.

However, we’ve been seeing signs for some time now that our assumptions about Moore’s Law can’t be taken for granted. As journalists like John Markoff of the New York Times have pointed out, technology companies are finding it increasingly difficult to keep pace, and while we’re still pulling off some impressive engineering feats that technically keep Moore’s Law alive, the benefits haven’t been passed on to consumers for many years now. In July 2015, for example, Intel said its two-year upgrade cycle would stretch to 2.5 years to accommodate the new challenges. The pace of acceleration for computing power, at least, is slowing as semiconductor features approach the atomic scale.

Some, like Ray Kurzweil, argue that this is nothing to worry about, since Moore’s Law is part of a much larger process of technological progress stretching far back in time. Kurzweil claims that before the integrated circuit even existed, four ancestor technologies (electromechanical, relay, vacuum tube and transistor) all improved along the same trajectory. He formulated this as the Law of Accelerating Returns, and he believes its logical conclusion is that one day we will all get to live forever, finally say sorry to our departed parents, and commune in an ecstatic godlike state with the universe.

It’s also worth pointing out the common (but mistaken) belief that Moore’s Law makes predictions about all forms of technology, when it was originally intended to apply only to semiconductor circuits. A lot of people now use Moore’s Law in place of the broader ideas put forth by Kurzweil. This is a mistake: Moore’s Law is not a scientific imperative, but rather a very accurate predictive model. Moore himself says that his predictions as applied to integrated circuits will no longer hold after about 2020, when integrated circuit geometry will be about one atom thick. Whether other technologies such as biochips or nanotechnology will come to the forefront to move digital progress forward at that point is still unclear.

That’s not what this article is about though. Instead, it’s about what happens when multiple technology laws begin to take hold at the same time. I use this language loosely, since most of these ‘laws’ are, like Moore’s, better described as predictions or observations of patterns. What’s interesting is just how many of them are out there.

Koomey’s Law, for example, states that the energy needed for a given amount of computation halves every 18 months. This less well-known trend in electrical efficiency has been remarkably stable since the 1950s, long before the microprocessor was invented. Currently, it’s actually running faster than Moore’s Law, with the number of computations per kilowatt-hour doubling approximately every 1.57 years, compared with Moore’s which, right now, is doubling closer to every two years.
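The gap between the two laws compounds quickly. A back-of-the-envelope sketch in Python, using only the doubling periods quoted above:

```python
# Compare Koomey's Law (efficiency doubling every ~1.57 years) with
# Moore's Law (transistor count doubling every ~2 years) over a decade.
# The doubling periods are the figures quoted in the text; the rest is
# plain exponential arithmetic.

def fold_increase(years: float, doubling_time_years: float) -> float:
    """How many times a quantity grows after `years` of steady doubling."""
    return 2 ** (years / doubling_time_years)

decade = 10
koomey = fold_increase(decade, 1.57)   # computations per kWh
moore = fold_increase(decade, 2.0)     # transistors per chip

print(f"Koomey over 10 years: {koomey:.0f}x")   # ~83x
print(f"Moore over 10 years:  {moore:.0f}x")    # 32x
```

Over a single decade the efficiency curve already pulls more than twice as far ahead as the transistor curve.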

Kryder’s Law is the storage equivalent of Moore’s; it states that the number of bits we can cram onto a shrinking hard drive also doubles roughly every 18 months. In 1980, Seagate introduced the world’s first 5.25-inch hard disk drive (remember floppy disks?), which could store up to 5MB of data at a price tag of US$1500. Today, 35 years later, you can buy a 6000GB drive from the same company for $600. That represents a million-fold increase in capacity, combined with a seven-fold decrease in price (accounting for inflation). Not even silicon chips can boast that kind of progress.
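Those Seagate numbers are easy to sanity-check. A quick sketch; note that the inflation multiplier is my own rough assumption, not a figure from the article:

```python
# Sanity-check the numbers quoted above: 5 MB for $1500 in 1980 versus
# 6000 GB for $600 in 2015. The inflation factor (~2.9x from 1980 to
# 2015 dollars) is an assumed rough CPI multiplier.

capacity_1980_mb = 5
capacity_2015_mb = 6000 * 1000          # 6000 GB expressed in MB
price_1980, price_2015 = 1500, 600
inflation_1980_to_2015 = 2.9            # assumed, for illustration

capacity_gain = capacity_2015_mb / capacity_1980_mb
real_price_drop = (price_1980 * inflation_1980_to_2015) / price_2015

print(f"Capacity gain: {capacity_gain:,.0f}x")     # ~1.2 million-fold
print(f"Real price drop: {real_price_drop:.1f}x")  # ~7-fold
```

Both of the article's headline figures check out: roughly a million-fold capacity gain and a seven-fold real price drop.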

Communications technologies are also progressing exponentially. If you look at the number of possible simultaneous conversations (voice or data) that can be conducted over a given area of the radio spectrum, it turns out these have doubled every 30 months for the past 104 years. This observation was made by a guy named Marty Cooper, probably the most influential man nobody has ever heard of. He’s the father of the mobile phone; the modern-day equivalent of Alexander Graham Bell. While working for Motorola in the 1970s he looked at the cellular technology used in car phones and decided that it ought to be small enough to be portable. Not only did he conceive of the mobile phone (citing Star Trek as his inspiration), he subsequently led the team that developed it and brought it to market in 1983. He was also the first person in history to make a handheld cellular phone call in public.

Marty Cooper, hanging out with a brick (AP/Eric Risberg)

Cooper’s Law is even more remarkable than Moore’s Law. It’s held true since the first ever radio transmission by Marconi in 1895. The radio technology of a century ago meant that only about 40 separate conversations could be accommodated on the surface of the earth; the effectiveness of personal communications has improved by over a trillion times since then. And unlike Moore’s Law, there’s no hard physical limit, because there’s no cap on how often radio spectrum can be re-used: capacity can be expanded indefinitely by dividing coverage into ever-smaller cells and re-using the same frequencies in each.
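The trillion-fold figure follows directly from the doubling period. A quick check:

```python
# Cooper's Law: simultaneous conversations per unit of spectrum double
# every 30 months. Check that ~104 years of that really does yield the
# "over a trillion times" improvement claimed above.

years = 104
doublings = years * 12 / 30            # one doubling per 30 months
improvement = 2 ** doublings

print(f"{doublings:.1f} doublings -> {improvement:.2e}x improvement")
```

Roughly 42 doublings in 104 years gives an improvement north of three trillion, comfortably matching the claim.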

Running side by side with this theoretically unlimited increase in wireless capacity is an exponential increase in the amount of data we can transmit through optical fiber cables. Butters’ Law says that the amount of data coming out of an optical fiber doubles every nine months, meaning that the cost of transmitting a piece of data on an optical network halves over the same period. Unfortunately, that rate of progress doesn’t quite filter down to us as consumers: Nielsen’s Law states that the bandwidth available to average home users only doubles once every 21 months. Still, it’s an exponential function, and is the reason why telecoms companies have been able to make so much money while still bringing down the cost of data traffic. Anyone remember dial-up modems? Imagine trying to stream Netflix on a connection that still announces itself with a chorus of screeches and static.
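Plugging in the two doubling periods shows just how quickly backbone capacity and home bandwidth drift apart. A rough sketch over a single decade:

```python
# Butters' Law (fiber capacity doubles every 9 months) versus Nielsen's
# Law (home bandwidth doubles every 21 months): how far apart do they
# drift over ten years? Pure exponential arithmetic from the figures above.

months = 120                                  # ten years
butters = 2 ** (months / 9)                   # backbone fiber capacity
nielsen = 2 ** (months / 21)                  # home user bandwidth

print(f"Fiber capacity: {butters:,.0f}x")
print(f"Home bandwidth: {nielsen:,.0f}x")
print(f"Gap after a decade: {butters / nielsen:,.0f}x")
```

After ten years the backbone has grown roughly ten-thousand-fold while the home connection has grown about fifty-fold, which is the margin the telecoms companies get to play with.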

So what happens when you combine increased connectivity with falling costs? Well, you get larger networks; and according to something known as Reed’s Law, the utility of large networks (particularly social networks) scales exponentially with the number of participants. The precise size of that increase is a topic of debate. Metcalfe’s Law, for example, states that the value of a telecommunications network is proportional to the square of the number of connected users. In other words, if your network is ten people, its value is 100; if that network doubles to 20 people, its value quadruples to 400.
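The two valuation rules are easy to compare side by side. This little sketch uses the standard formulations (n squared for Metcalfe, 2 to the power n for Reed, counting possible sub-groups):

```python
# Metcalfe's Law values a network at n^2, matching the worked example
# above (10 users -> 100, 20 users -> 400). Reed's Law instead counts
# the possible sub-groups that can form, giving 2^n growth.

def metcalfe(n: int) -> int:
    return n ** 2

def reed(n: int) -> int:
    return 2 ** n

print(metcalfe(10), metcalfe(20))   # 100 400
print(reed(10), reed(20))           # 1024 1048576
```

Doubling the user base quadruples a Metcalfe network but multiplies a Reed network a thousand-fold, which is why the debate over which law applies matters so much to social platforms.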

The enormous value of network effects is why the digital divide may disappear far more quickly than most people realise; it’s why media companies like Quartz are excited about the mobile web (see their Next Billion events), and it’s why Alphabet are pushing hard on Project Loon. The sooner these kinds of companies get everyone connected, the more value they derive from the exponential increase in the size of the global internet.

Even more interestingly, it may be that it’s the tail wagging the dog. Steve Jurvetson, a well known venture capitalist, suggests that the rapid advance in communications technology only makes sense once you substitute ‘ideas’ for participants. In other words, technology is driving its own progress by steadily expanding our capacity to bring ideas together. The implication is that the genie’s already out of the bottle; we couldn’t stop the march of increased connectivity even if we wanted to.

What about other technology fields outside personal computing devices and communications? Well, if you’ve been following renewable energy trends for any time you’ll be familiar with Swanson’s Law. This states that the cost of the photovoltaic cells needed to generate solar power falls by 20% with each doubling of global manufacturing capacity. It’s represented below, in a now famous graph showing the fall of solar costs in the last forty years.
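Swanson’s Law is simple enough to express in a few lines. The starting cost and the number of doublings below are illustrative assumptions, not figures from the chart:

```python
# Swanson's Law: photovoltaic cell cost falls 20% with every doubling of
# global manufacturing capacity, i.e. cost = start * 0.8^doublings.
# The $1.80/W starting point and six doublings are assumed for illustration.

def swanson_cost(start_cost: float, doublings: int) -> float:
    """Cost per watt after `doublings` doublings of cumulative capacity."""
    return start_cost * 0.8 ** doublings

for d in range(7):
    print(f"{d} doublings: ${swanson_cost(1.80, d):.2f}/W")
```

Six capacity doublings from an assumed $1.80 per watt lands below the 50-cent threshold mentioned below, which is the whole point: the cost curve depends only on how much capacity the world builds.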

Today the panels used to make solar-power plants cost $0.36 per watt of capacity, an 80% decline since 2008. Power-station construction costs add about $4 to that, but these, too, are falling as builders work out how to do the job better. And running a solar power station is cheap because the fuel is free. Obama’s former Energy Secretary Steven Chu says that solar becomes price-competitive with fossil fuels at a cost of around 50 cents per watt. Thanks to Swanson’s Law, we’re already over the tipping point when deciding whether to use fossil fuels or renewables to build new power stations. We’re starting to see this effect take hold: in the last year, 11 out of 20 OECD countries have reduced their carbon emissions, and 2014 was the first year ever that global carbon emissions flatlined while economic growth continued.

In the field of biotechnology, advances are also exponential. In 1990 the US government set out to complete one of the most ambitious scientific projects ever undertaken: mapping the human genome. It committed more than $3.5 billion and gave itself 14 years to complete the work. Seven years in, only 1% of the genome had been sequenced and more than half the funding was gone. The government and sponsors started panicking. Yet by 2003 the Human Genome Project was finished, ahead of schedule and $500 million under budget. This was made possible by exponential improvements in genome sequencing technology, which past a certain point started outpacing Moore’s Law. This kind of progress is astonishing when you think about it: it cost three billion dollars and took 13 years to sequence the first human genome. Today it takes only a few days and costs around $1000.
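A rough calculation shows why sequencing is said to have outpaced Moore’s Law. The fifteen-year window below is my own rough assumption for the period over which costs fell:

```python
# The cost of sequencing a genome fell from roughly $3 billion (the first
# genome) to about $1000 today. What halving time would a steady
# exponential decline imply? The ~15-year window is an assumption.

import math

cost_ratio = 3e9 / 1e3                       # 3-million-fold drop
years = 15                                   # assumed elapsed time
halvings = math.log2(cost_ratio)
halving_time_months = years * 12 / halvings

print(f"{halvings:.1f} halvings -> one halving every "
      f"{halving_time_months:.1f} months")
```

Under that assumption the cost halved every eight to nine months, roughly three times faster than Moore’s two-year cadence.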

Source: (Alex Pearlman/Motherboard)

Then there’s Haitz’s Law, which states that every decade the amount of light generated per LED increases by a factor of 20, while the cost per lumen (the unit of useful light emitted) falls by a factor of ten, for a given wavelength of light. Exponential improvements in LED technology mean it’s becoming the dominant way we produce light. At home we get brighter, more efficient lighting at lower cost, while commercially LED lighting can now be used for more specialized applications such as large stadiums and amphitheatres. The result is lower electricity consumption, reduced carbon emissions, and less of the toxins, such as mercury, found in older lighting technology. It’s no surprise that these kinds of technologies are improving so rapidly: they all run on silicon, the foundation of the semiconductor materials found in computers and communications networks.
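Haitz’s Law also reduces to two exponentials. The baseline values of 1.0 here are just normalization, not real lumen or cost figures:

```python
# Haitz's Law: per decade, light output per LED package rises ~20x and
# cost per lumen falls ~10x. Project a few decades forward from an
# arbitrary normalized baseline.

def haitz(decades: int, output0: float = 1.0, cost0: float = 1.0):
    """Return (light output, cost per lumen) after `decades` decades."""
    return output0 * 20 ** decades, cost0 / 10 ** decades

for d in range(4):
    out, cost = haitz(d)
    print(f"after {d} decades: output {out:,.0f}x, cost/lumen {cost:g}x")
```

Three decades of Haitz’s Law means eight-thousand-fold more light at a thousandth of the cost per lumen, which is roughly the trajectory LEDs have followed since the 1970s.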

So what does it all mean? It means we need to reset our intuition. The rate of change is getting faster every day, yet few of us have factored this acceleration into our expectations of the future. Our recency bias, the assumption that the future will continue in the same manner as the recent past, is more wrong than it’s ever been. Change, once incremental and predictable, now comes in massive and unexpected waves, crossing huge milestones in shorter and shorter time periods.

Of course, one of the difficulties in making predictions about the future is that it’s almost impossible to know what people will actually do with new technologies. It’s easy to predict a fall in the costs of computing power; it’s far more difficult to predict that this will lead to things like drone racing leagues or cures for cancer.

In short: expect the unexpected.