Moore’s Law has defied predictions of its imminent demise for decades now, and it seems to still be going strong. Most technology observers would admit that Moore’s Law must eventually come to an end (at least in terms of current silicon manufacturing technology), and some theoretical work has even attempted to prove an ultimate limit on computing power regardless of the technology (“Computers Faster Only for 75 More Years”). So what would happen if technological progress in computing grinds to a halt? Would there be anything left to live for?
The Limits of Physics
Few consumers or company workers can remember a time when computer technology wasn’t changing quickly. A computer bought just two or three years previously would be outdated, and keeping pace with current software generally required an upgrade of some kind—or a complete replacement. Much of this technology growth is the result of Moore’s Law (okay, technically Moore’s Law just describes it): roughly stated, computing power for a given cost doubles approximately every two years. So, eventually computer technology will reach sci-fi proportions, right? Well, maybe not.
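The rough doubling rule stated above can be put in simple numerical terms. This is a toy illustration of the exponential form of the law, not a model of actual hardware trends; the function name and the two-year doubling period are assumptions for the sketch:

```python
def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Return the computing-power multiplier after `years`, assuming one
    doubling every `doubling_period` years (the rough form of Moore's Law)."""
    return 2.0 ** (years / doubling_period)

# A decade is about five doublings: roughly a 32x increase per dollar.
print(moores_law_factor(10))  # 32.0
```

At that rate, the machine on your desk would be obsolete in a handful of years, which is exactly the upgrade treadmill the paragraph above describes.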
For semiconductor (silicon) manufacturing, the limits of computer technology may already be in sight (“Single-Atom Transistor: Good or Bad News for Moore’s Law?”). Eventually, the size of transistors bumps up against the size of atoms: imagine trying to build a microscopic castle out of normal-size Lego blocks and you have some idea of the problem. Many scientists and engineers have pinned their hopes on quantum computing for an extension of Moore’s Law. In its current state, quantum computing offers some hope, but it must overcome numerous technological hurdles—including feasibility for mass manufacturing and broad use—to compete with silicon technology. Yes, some government-funded lab somewhere might develop a quantum computer that blows traditional semiconductor-based computers out of the water, but if it doesn’t offer performance per watt per dollar similar to (or, preferably, better than) a typical Intel-based desktop, it doesn’t have any bearing on Moore’s Law.
You may well be thinking something along the lines of, “Well, how many times have we heard that technological progress will stop, only to see it keep on moving ahead?” True, indeed. But just because all the previous predictions were wrong doesn’t mean the latest ones are as well. Even if we were to grant that computing power could potentially scale to infinity, other factors also pose limitations.
Few people alive today were around when automobiles were introduced. But imagine you witnessed the amazing move from horse-drawn carriages to self-powered machine vehicles. You might have even thought that the next step would be flying cars (which are now the butt of jokes in the comments section of many an article discussing the limits of technology). Flying cars don’t seem to be beyond our technological capability, but they pose numerous practical problems, such as the greater danger of a part failure, much greater cost, and infrastructure challenges (where’s the left lane and where’s the right?).
In particle physics, the quest for greater insight into the underlying stuff that constitutes the physical world requires ever more powerful accelerators—equipment that must invariably be funded not through private research or donations, but through taxes (the only revenue source capable of supporting these projects). But as profligate government spending leads to broken economies, inflation, or a combination of both, citizens will start to wonder whether their inability to buy food is really worth hearing yet again how discovery of the Higgs boson is just a few years away.
So, what do these situations mean for computing? Simply this: physical limitations are not the only consideration. At some point, economic limitations may also prohibit further development. Development of a new process technology for semiconductor manufacturing involves years of research and billions of dollars of infrastructure and other investment—all before the first products begin rolling off the assembly line.
Now consider that if you own a desktop computer that’s two or three years old, you probably can’t get something that’s really all that much better now—sure, a new machine may offer some benefits, but chances are (unless you’re an extreme gamer) you wouldn’t even begin to tap into those upgraded capabilities. For most consumers and even most company employees, web browsing, productivity software (Microsoft Office, Adobe products and so on) and solitaire are what really counts—and the computers of a few years ago can still handle those tasks just fine.
The question then is whether the market can sustain Moore’s Law. Cloud computing offers some impetus, as the centralization of computing resources into billion-dollar data centers means there’s still an appetite for denser, less power-hungry computer technology. Computer technology markets for end users (whether consumers or businesses) seem to be dividing largely into mobile and cloud segments—that is, mobile devices and infrastructure (servers and such). The performance of desktop computers just doesn’t have the appeal it once did.
What Would a Future Without Moore’s Law Look Like?
Even if, say, 4nm process technology (or pick your favorite number—and even add a third dimension if you like, à la Intel’s FinFET technology) is the best that silicon manufacturing can do, that doesn’t mean that innovation comes to an end. It does mean that one of the main sources of faster, smaller and lower-power silicon will have dried up, however. The focus would then be forced toward better use of existing tools rather than simply use of better tools. Some good things might even come out of it: developers of bloatware would be forced to make better (i.e., less buggy and more efficient) products, not just larger ones.
Of course, the computer technology industry would also undergo some serious changes. Perhaps AMD would even stand more of a fighting chance against Intel, since it has for so long had to make the best of its situation despite a process technology lag behind its dominating competitor. This industry would likely begin to resemble the automobile industry: fewer real innovations and more focus on repackaging of the same technology, albeit with some small upgrades in this or that area.
Intel CEO Paul Otellini said, “The world needs Moore’s Law to continue and Intel is committed to make this happen” (“Warning: Moore’s Law ends in two generations”). Likely, the world will do fine if Moore’s Law fails—it will simply have to make some adjustments. In fact, one might even argue that after about a century of unbelievable technological progress, the world is due for a pause that forces it to consider what it all means. Is the multi-billion-dollar computer and electronic infrastructure industry ultimately really just about delivering more videos to people anywhere and anytime (and in high definition)? A little philosophical introspection can’t hurt, especially coming out of a century or so of the most barbaric wars (aided by technology, computer and otherwise) history has ever witnessed.
The end of Moore’s Law seems to be a moving target, so making a firm prediction is a pointless venture. Don’t be surprised when that end arrives, however, as it probably will sometime in the next 10 to 20 years. (But then, it might take even longer.) Whatever the time frame, the end of Moore’s Law doesn’t mean the world has nothing left to live for. It may even turn out that the end of Moore’s Law is really just a pause, or new developments simply require more time and effort. Either way, companies will no doubt have plenty of room to optimize existing technologies and focus on better software. Moore’s Law could potentially end as the result of having reached a physical limit, but economic or other limitations could just as likely be the stumbling block that trips it up.
Photo courtesy of Marcin Wichary