The performance of the world’s fastest supercomputer (according to the Top500 list) has plateaued in recent years, raising the question of what’s causing the deceleration in progress. In addition, the signs of a slowing Moore’s Law continue to proliferate, with Intel delaying production of its next-generation 10nm process technology. The heart of the issue may be that physics filled the gas tank of technological development around the turn of the twentieth century, and we’re now running on fumes.
Theories Fueling Innovation
In 1831, British physicist Michael Faraday (1791–1867) discovered electromagnetic induction, which allows, for instance, a spinning turbine and some magnets to generate electricity. This momentous development paved the way for the harnessing of electrical energy, forming the foundation of many modern technologies. Scottish scientist James Clerk Maxwell (1831–1879), known for his eponymous equations, brought together the work of Faraday and others to form a concise description of classical electromagnetic phenomena, including the waves that carry information wirelessly.
When classical electromagnetic theory failed to explain a number of physical phenomena, however, the stage was set for German physicist Max Planck to take the first major steps in developing quantum theory at the turn of the twentieth century. During subsequent decades, this theory became the topic of intense study and expansion by some of the greatest names in physics (Einstein, Schrödinger, Bohr, Feynman, and so on), eventually leading to a detailed model of semiconducting materials like silicon. One of the remarkable results, when mixed with engineering, was the transistor and the integrated circuit.
These theories are the foundation of the trend known as Moore’s Law, which describes the improvement in chip technology in terms of transistor count, which roughly doubles every two years. (The exact definition seems to vary depending on whom you ask, but this rough one is approximately representative.) In this form, Moore’s Law describes an exponential increase in transistor density: a trend that any physicist would reject as unsustainable. The question isn’t whether Moore’s Law will end, but when. Some of the signs have already appeared.
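Before turning to those signs, it’s worth putting the exponential in concrete terms. Under the rough definition above, a chip holding N₀ transistors today holds about

N(t) = N₀ · 2^(t/2)

transistors t years later: a factor of 2^5 = 32 per decade, and roughly a million (2^20) over four decades.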
No Unending Exponential Increases
The world’s population, for instance, looks on some scales like an exponential curve, a feature that has driven all sorts of fearmongering about starvation and so on as resources per capita dwindle. But behind the scenes is a declining growth rate that, according to some forecasts, may well lead to a surprisingly fast decline in population. The point here is simply that exponential increases are unsustainable for a variety of reasons, and the switch from growth to plateau or even decline can be sudden and unexpected.
Lauren J. Young asked at IEEE Spectrum, “Why aren’t supercomputers getting faster like they used to?” According to a Top500 chart of supercomputer performance, the fastest system in the world has stagnated in capability for several years. (Note that the chart is a log plot; a straight line there corresponds to an exponential curve on a standard linear plot.) Of course, such plateaus have precedent: they seem to have occurred regularly since the 1990s, with each one followed by an improvement spurt of similar duration. Perhaps more telling, however, is that the data representing the sum of the capabilities of all 500 systems, as well as the data representing the capability of the slowest system to make the list (#500), exhibit a clear deviation from the historically linear (in logarithmic terms) increase. In other words, the growth of supercomputing capability seems to be leveling off, at least in the sense that it is losing its exponential character.
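To spell out the log-plot equivalence: if the logarithm of performance P grows linearly in time t, say log₁₀ P = a·t + b, then P itself grows exponentially, since P = 10^b · (10^a)^t. A straight line on the Top500 chart therefore encodes exponential growth, and a line bending toward horizontal encodes the loss of that exponential character.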
Young notes, “There are a number of technical aspects and economic factors that interfere with supercomputing improvements. Experts disagree on the cause, but the result could be a slowing of the pace of improvement in some scientific fields.” Deviation from Moore’s Law is likely a culprit, but so is economics. Young quoted John Gunnels, IBM senior manager of Data Centric Systems, as asking, “Can somebody make a computer that has higher performance? Probably. But it would take a lot more money and power than someone would be willing to supply.” That’s no surprise: economics will always be a greater limiter of innovation than technical possibility, since improvement is ever possible but often unaffordable. In that sense, Moore’s Law cannot be divorced from economics: regardless of whether the next process technology is possible, it is of no value if it is economically (or otherwise) impractical.
So what’s really at the foundation of the possible slowdown in computer progress?
Running on Empty
Although physics has made notable progress in the century since Planck started down the road of quantum theory, it has failed to reproduce the kind of revolutionary advancements that this theory and classical electromagnetic theory represented. Young perhaps understated the matter by saying that “the result could be a slowing of the pace of improvement in some scientific fields.” Today, theoretical physics is bogged down in string theory, a pie-in-the-sky model that has yet to offer any testable hypotheses, let alone something that engineers can use to serve the market. In other words, the revolutionary progress described by Moore’s Law is chugging along on hundred-year-old developments in science. To be sure, quantum computing is, in theory at least, an outworking of quantum physics, but again, not everything that’s conceivable is possible, let alone practical. In fact, its status as the heir apparent of classical computing raises questions; far too often, what’s expected fails to deliver. Revolutions tend to be unexpected rather than clearly forecast.
That’s not to say there’s no more room for working out the implications of classical and quantum theories; legions of graduate students (and professional scientists, of course) continue to do so to this day, although their work on average seems to be growing less momentous with each passing year. Absent some new theoretical foundation for progress, the result may well be that innovation becomes a much slower and more difficult process. Should Moore’s Law break down completely, leaving, say, the 7nm process node as the “ultimate” chip-fabrication technology, improvements in compute and power efficiency will no longer be simply a matter of waiting a couple of years. For decades, the industry has had time on its side in this regard. Eventually, that trend will cease, although the precise date of the end is uncertain.
Innovation won’t end regardless of what happens with Moore’s Law, but it will become more subtle and hard won. Dreams of the “singularity” and artificial intelligence will likely be dashed, and computers will revert to their rightful reputation as dumb machines that happen to be good at repetitive number crunching. They may be able to fake intelligence, and fake it quite well, but they are not the purpose of existence.
Decades of incredible progress in computer technology rest on a theoretical foundation that is aging quickly. It should be no surprise that a century (or more) after the initial development of classical electrodynamics and quantum theory, engineers are approaching the mileage limit of what these principles enable. The slowing of Moore’s Law is just one outworking, and it seems to be manifesting itself in the world of supercomputing. Is innovation dead? Of course not. But we may be entering a new world in which the development of technology is no longer the regular headline it once was.
Image courtesy of sirexkat