Looking back at the way the Industrial Revolution altered the market for many job types, it’s easy to extend that thinking to the “Digital Revolution” and foresee humans being all but entirely replaced by robots and artificial intelligence (or robots driven by artificial intelligence). A recent TechRepublic article, titled “Why AI could destroy more jobs than it creates, and how to save them,” poses this dystopian vision as one possible future, but it downplays or ignores technical and economic realities that belie the notion that technological creations will replace humanity.
No Exponential Curves
Truly unending exponential growth has no place in the real world. Eventually, every rising exponential curve runs into limits that cause it to level off or reverse. An exponentially growing population is eventually capped by resource limits, exponentially rising asset prices (think the stock market) can be crushed by profit-taking or a bursting bubble, and so on.
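To make that concrete, here’s a minimal numerical sketch comparing a pure exponential curve with a logistic curve, which grows exponentially at first but levels off at a carrying capacity (all parameter values here are illustrative, not drawn from any real data):

```python
# A minimal numerical sketch of the point above: pure exponential growth
# versus logistic growth, which starts out exponential but levels off at
# a carrying capacity K. All parameter values here are illustrative.
import math

r, K, p0 = 0.5, 1000.0, 1.0  # growth rate, resource limit, starting value
for t in range(0, 31, 5):
    exponential = p0 * math.exp(r * t)
    logistic = K / (1 + ((K - p0) / p0) * math.exp(-r * t))
    print(f"t={t:2d}  exponential={exponential:14.1f}  logistic={logistic:7.1f}")
```

Both curves look identical early on; by t = 30 the exponential has exploded past three million while the logistic has flattened out at its limit of 1,000.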
In the case of computer technology, Moore’s Law is running into the limits of transistor size and other physical characteristics. Some engineers argue that it’s already dead and that semiconductor progress is becoming more linear. But the TechRepublic article cites the book The Second Machine Age as saying, “The accumulated doubling of Moore’s Law, and the ample doubling still to come, gives us a world where supercomputer power becomes available to toys in just a few years, where ever-cheaper sensors enable inexpensive solutions to previously intractable problems, and where science fiction keeps becoming reality.” Nevertheless, even ignoring the likely overstatement of the amount of gas still in the Moore’s Law tank, computers (and robots) still largely fail at astoundingly simple things, like recognizing objects and performing certain manual tasks (for example, picking up an egg and tossing it back and forth without breaking it).
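As a back-of-envelope check on the doubling rhetoric: if price/performance doubles every two years (one common statement of Moore’s Law), then closing a factor-F performance gap takes log2(F) doublings. The gap factors below are illustrative, not taken from the article:

```python
# Back-of-envelope arithmetic for the "doubling" rhetoric quoted above:
# if price/performance doubles every two years (one common statement of
# Moore's Law), a factor-F performance gap closes after log2(F) doublings.
# The gap factors below are illustrative, not taken from the article.
import math

doubling_period_years = 2.0
for gap in (10, 100, 1_000, 1_000_000):
    years = doubling_period_years * math.log2(gap)
    print(f"a {gap:>9,}x gap closes in about {years:5.1f} years")
```

Even taking the doubling at face value, a million-fold gap between a supercomputer and a toy takes roughly four decades to close, not “just a few years.”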
Computerized, Robotized Jobs?
The article states, “For most of the second half of the twentieth century the economic value generated in the US—the country’s productivity—grew hand-in-hand with the number of workers. But in 2000 the two measures began to diverge.” The graphs below, obtained courtesy of the Federal Reserve Bank of St. Louis, show that productivity (estimated as real GDP divided by total private hours worked) rises fairly steadily through about 2010, while the number of private employees essentially stops growing and begins to look more cyclical starting at the end of 2000.
Figure: Estimate of U.S. productivity (real GDP divided by private hours worked).
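For readers who want to reproduce a rough version of this estimate, the sketch below divides two FRED series. The series IDs are my assumptions: GDPC1 (real GDP) and HOANBS (nonfarm business sector hours) are plausible stand-ins for “real GDP divided by total private hours worked,” not necessarily the exact series behind the graphs.

```python
# A rough sketch for reproducing the productivity estimate in the figure:
# real GDP divided by hours worked. The FRED series IDs are assumptions --
# GDPC1 (real GDP) and HOANBS (nonfarm business sector hours) are plausible
# stand-ins for "real GDP divided by total private hours worked," not
# necessarily the exact series behind the original graphs.
from pandas_datareader import data as pdr

series = pdr.DataReader(["GDPC1", "HOANBS"], "fred",
                        start="1970-01-01", end="2014-12-31")
productivity = series["GDPC1"] / series["HOANBS"]  # output per unit of hours
print(productivity.dropna().tail())
```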
Now, class, what happened at the end of the year 2000? The technology bubble burst, leading to a recession. The number of employees then recovered, only to be hammered back down by the Great Recession of 2008–2009. The number is climbing again, but given the likelihood of new asset bubbles forming thanks to Federal Reserve money printing and near-zero interest rates (i.e., free money), it will probably be hammered down once more when the next bubble bursts.
Given the roughly linear growth of the U.S. population over the timespan shown in the figures above, the slackening in the number of private employees seems problematic. But given the financial realities, attributing this change to technology seems unwarranted. Yet the book says, “Unlike much of the 20th century we’re now seeing a falling ratio of employment to population and that’s something that concerns us. We don’t think it’s inevitable but we do think that many of the underlying trends in technology are likely to accelerate this so it’s something we need to pay some serious attention to.”
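For the curious, the employment-to-population ratio the authors cite can be inspected directly, assuming FRED’s EMRATIO series (the civilian employment-population ratio) is an acceptable proxy for the measure they have in mind:

```python
# The "ratio of employment to population" can be inspected directly.
# Assumption: FRED's EMRATIO series (civilian employment-population ratio)
# is an acceptable proxy for the measure the book's authors have in mind.
from pandas_datareader import data as pdr

emratio = pdr.DataReader("EMRATIO", "fred",
                         start="1990-01-01", end="2014-12-31")
print(emratio.resample("A").mean().round(1))  # annual averages
```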
Furthermore, the article adds, “For the first time since the Great Depression, over half the total income in the United States went to the top 10 percent of Americans in 2012.” But again, the financial explanation holds far more water than the notion that technology is a net destroyer of jobs. For every tractor that replaces several farm workers, numerous other jobs are created manufacturing, repairing, and maintaining that tractor. (A good farmer knows that the ratio of time spent fixing a tractor to time spent using it is disturbingly high.) Attempting to explain the wealth gap without considering the Federal Reserve’s “free money” policy is preposterous; after all, only ultra-wealthy individuals and companies have access to that money.
“The prediction that society is heading towards a period of technological unemployment was made decades ago by John Maynard Keynes, who forecast it as an inevitable outcome of society discovering ways to make labor more efficient more rapidly than finding new uses for labor,” according to the article. But the current Federal Reserve policies and the enormous government debt (which is likely to be involved the next time a bubble bursts) are outworkings of Keynesian economic theory; clearly Keynes wasn’t quite up to snuff on the economic realities.
So, the issue of technology is actually drowned out by the financial realities, which are more likely driving the shaky growth in private employment. Technology by itself may have some impact, but that impact is uncertain in light of the larger trends.
Descending Into Absurdity
Overreaction to a perceived threat (think the inane “war on terror” in the U.S.) leads to all manner of perverse outcomes. According to the article, overcoming the threat that technology poses to jobs requires fundamentally changing education: “Better education, [sic] doesn’t mean continuing to teach the same subjects in the same way, and certainly not focusing primarily on the three Rs—reading, writing, and arithmetic.” From a job-skills perspective, subjects like art, history, and language may not be at the top of an employer’s list, but from a life perspective, they are critical.
Ultimately, despite leaps in technology, humans are still hard-pressed to live past 100 years. Diseases that killed many people decades ago, such as cancer, are still killing many today. If life is nothing more than technology, a few decades of “working for the man,” and narrowing the wealth gap, then perhaps humanity deserves to be enslaved by machines; what difference does it make, anyway, if everyone is essentially dead already?
But technology is a tool and will always remain so. Some individuals and organizations may treat it like a deity or a controlling force, but despite all the glitz, glamor, bells, and whistles, computers are still just stupid machines that are occasionally good at faking it. Ironically, the problems that society faces are opaque to those who learn only a hard skill (even a technical one that requires lots of brainpower) and little of the three Rs (particularly reading), but clearer to those who read, study history, and think critically. The threat to jobs comes less from technology, which can be as beneficial as it is “destructive,” and more from social engineers, politicians, and financial barons, who would no doubt prefer an illiterate population that believes it is (or will be) enslaved to machines.