“You can see the computer age everywhere but in the productivity statistics,” quipped Robert Solow in 1987. Solow was a renowned MIT economics professor and the author of the workhorse “Solow model” of GDP growth that dominated the field. The Apple II had been launched a decade earlier, and the personal computer was in the early stages of phenomenal adoption throughout American homes and businesses. In late December 1982, Time magazine named the personal computer its “Machine of the Year,” the first non-human recognized in place of the usual “Person of the Year.” The magazine’s January 1983 issue featured the personal computer on its cover, heralding a new machine poised to shape the future. Solow recognized the importance of computers, but the hard truth was that PCs weren’t having an outsize effect on economic growth.
25-Year Average Annual Growth Rates in GDP per capita (1800–2025)
(Each row: interval, starting level → ending level, annualized growth rate)
1800–1825: $1,100 → $1,200, ~0.35%
1825–1850: $1,200 → $1,600, ~1.10%
1850–1875: $1,600 → $2,000, ~0.90%
1875–1900: $2,000 → $3,500, ~2.10%
1900–1925: $3,500 → $5,000, ~1.40%
1925–1950: $5,000 → $10,800, ~2.90%
1950–1975: $10,800 → $22,000, ~2.85%
1975–2000: $22,000 → $38,000, ~2.15%
2000–2025: $38,000 → $56,000, ~1.56%
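To make the table’s arithmetic concrete, here is a minimal sketch (mine, not drawn from any official source) of how an annualized rate is derived from a starting level, an ending level, and a 25-year span. Because the dollar levels above are rounded, the implied rates are approximate.

```python
# A quick check of the arithmetic behind the table above: the annualized
# (compound) growth rate implied by a starting level, an ending level, and
# a 25-year span. The dollar levels are rounded, so results are approximate.
def annualized_growth(start, end, years=25):
    """Compound annual growth rate implied by two levels."""
    return (end / start) ** (1 / years) - 1

# The most recent row: roughly $38,000 rising to $56,000 over 2000-2025.
print(f"{annualized_growth(38_000, 56_000):.2%}")  # prints 1.56%
```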
In his book The Great Stagnation (2011), Tyler Cowen argued that society had “picked the low-hanging fruit” of innovation, leading to slower growth. It was a nice metaphor, reflecting the consensus view that scientific advance was getting harder. Cowen was hardly the first to notice the general deceleration, though. Princeton’s Paul Krugman made the first big splash alerting the general public to the slowdown in his 1990 book The Age of Diminished Expectations. Krugman reflected on a so-called golden age of growth that ended abruptly in 1973. Both he and the novelist Stephen King might have been guilty of nostalgia for the years when they were in their twenties, what psychologists call the “reminiscence bump,” but in this case the GDP metrics said the same thing.
Krugman’s early diagnosis was prescient. The oil crisis of the 1970s and the subsequent recession led to profound confusion about the American economy. Was it a standard short-term recession in the business cycle? Or was it a long-term sea change in the economy? Or was it some kind of political mismanagement? Without a doubt, the reigning Keynesian ideology was rudely and suddenly discredited by the subsequent stagflation. The lesson seemed to be: No, government leaders cannot “manage” an economy by fine-tuning aggregate demand. As Milton Friedman reflected decades later, “During the 1970s, you had a combination that under Keynesian analysis could not exist. You had high inflation and high unemployment at the same time—named stagflation.” A thousand essays agree that simple Keynesianism died that decade.
There wasn’t just one recession causing the confusion; there were three. Three recessions, each severe in its own way, shook America and the wider world during the nine years from November 1973 to November 1982. The period was bookended by Republican presidents, but it was Jimmy Carter, the one-term Democrat in the middle, whose reputation took the blame.
November 1973 – March 1975: The Oil Shock Recession.
January 1980 – July 1980: The Energy Crisis I Recession.
July 1981 – November 1982: The Energy Crisis II Recession.
When the dust settled - meaning when the data was revised and finalized - a clearer picture emerged: the three shocks in the business cycle masked a deeper trend, a fundamental deceleration of GDP growth. Whether the numbers were assessed in terms of simple labor productivity (output per hour worked) or the more nuanced total-factor productivity (the residual growth rate after accounting for increases in labor and capital), the golden age was over. Why?
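Before weighing the candidate answers, it helps to see what that “residual” means in practice. Here is a minimal back-of-the-envelope sketch of the Solow residual; the growth rates and the 0.3 capital share are illustrative assumptions, not measured U.S. figures.

```python
# A back-of-the-envelope sketch of total-factor productivity measured as a
# Solow residual: the output growth left over after crediting the growth of
# capital and labor. The capital share and all growth rates below are
# illustrative assumptions, not measured U.S. data.
def solow_residual(output_growth, capital_growth, labor_growth, capital_share=0.3):
    """TFP growth = g_Y - alpha * g_K - (1 - alpha) * g_L (rates as decimals)."""
    return (output_growth
            - capital_share * capital_growth
            - (1 - capital_share) * labor_growth)

# A hypothetical golden-age decade: fast output growth, a large residual.
print(f"{solow_residual(0.040, 0.040, 0.015):.2%}")  # 1.75%
# A hypothetical post-1973 decade: slower output growth, a much smaller residual.
print(f"{solow_residual(0.025, 0.030, 0.015):.2%}")  # 0.55%
```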
Krugman speculated that the American economy was less productive on purpose, in the sense that major regulations affecting pollution and labor rights caused output growth to slow, but for good reasons. He’s beaten that drum since 1990. For instance, in a relatively recent New York Times essay, Krugman discussed a fifty-year regulatory reduction of workplace injuries, quoting a retrospective report from the U.S. Department of Labor: “The incidence rate of nonfatal injuries and illnesses among private industry workplaces occurred at a rate of 10.9 cases per 100 full-time equivalent workers in 1972 and 2.8 cases in 2018.” As Krugman wryly opined, that sure looks like progress.1
Ask most social scientists for the most in-depth exploration of the great productivity slowdown, and you’ll almost surely be directed to a single book. Robert Gordon’s The Rise and Fall of American Growth, published in 2016, is a monumental work of economic history and a New York Times best-seller. It was showered with glowing reviews and multiple book awards. This large-scale history of innovation belongs to my favorite genre - grand economic history - like Charles Van Doren’s epic A History of Knowledge: Past, Present, and Future (1992), Francis Fukuyama’s The End of History and the Last Man (1992), and Jared Diamond’s Guns, Germs, and Steel: The Fates of Human Societies (1997). The problem is that most of these deep tomes chronicle century after century of marvelous ingenuity and then come to the final chapter, where they have to shift from historical analysis to forecasting. And they universally conclude, wrongly, that there’s nothing more to invent.
Consider this New York Review of Books review of Gordon’s magnum opus by the great William D. Nordhaus:
"The book is without peer in providing a statistical analysis of the uneven pace of growth and technological change, in describing the technologies that led to the remarkable progress during the special century, and in concluding with a provocative hypothesis that the future is unlikely to bring anything approaching the economic gains of the earlier period.” (italics added)
Let’s set aside the psychoanalysis of why economists are bad at predicting the technological future when they are steeped in the statistical wizardry of purely mathematical trend prediction. It’s kind of hilarious (“Wages have been rising inexorably for two thousand years. I expect them to stop suddenly tomorrow. Same goes for crop yields. Same goes for energy production. Same goes for …”) but not relevant. What matters is that even our smartest wizards cannot see the future.
Gordon documented the amazing impacts of the internal combustion engine, the microchip, and satellite communications. But he could not imagine AI. It was literally beyond his capacity. Instead, he claimed that there were headwinds “sufficiently strong to leave virtually no room for growth over the next 25 years in median disposable real income per person.” (emphasis added)
Is it fair to say Gordon’s 2016 claim was shattered less than a decade after it was published? ChatGPT is changing everything, is it not?
Maybe. Techno-pessimists will make the counterargument of Robert Solow: Where is the GDP growth? “You can see the computer age everywhere but in the productivity statistics,” said Solow in 1987. 1987! That year was at the peak of the Reagan boom, before the Clinton boom of the 1990s. The eventual figures for that quarter century, 1975–2000, proved Solow right: growth was 2.15% per year, much slower than the 2.85% pace of 1950–75.
Our current quarter century, 2000–2025 - the era of the iPhone, Google search, and Amazon’s multiple revolutions, not to mention immunotherapy, CRISPR, mRNA, and cloning - has even slower growth: just 1.56% per year, roughly half the pace of 1950–75. What is happening?
I’m not sure the optimists have a good answer. The usual suspects are:
Technology slowdown is real (this is the low-hanging fruit hypothesis). Maybe we’re overhyping our inventions. The evidence before our eyes is that the computer-internet-mobile diffusion of digital communications is the biggest cultural shift in human history, but maybe our eyes are wrong.
Demographic trends. The notion is that an older population means fewer brains per capita doing active research. This seems speculative at best, and the upsurge in patent activity seems to contradict the theory. Besides, if anything, a smaller population with the same amount of capital is de facto more productive.
Policy & uncertainty. An evergreen complaint is that current President and Congress are bad economic managers, that growth will pick up when our team is back in power. This critique doesn’t stand the literal test of time: a 25 year cycle has a steady mix of Republicans and Democrats. Don’t forget that the Keynesian humbling of the 1970s was a rebuke to an economic philosophy that was dominant in both political parties, not just one.
Investment shortfalls. There’s an argument that public infrastructure isn’t being built at the scale of the highways of the 1950s and the railroads of the 1880s. Two problems with that thesis: (1) the communications infrastructure buildout of recent decades is unparalleled (e.g., Starlink, 5G, TCP/IP); and (2) a second interstate highway system wouldn’t deliver anything like the productivity boost of the first. Plus, the investment numbers from the private sector seem robust.
Structural economy shift. This answer has some teeth. As consumption shifts from goods to services, it’s harder to see productivity gains. Maybe impossible. How can GDP measure tastier, more eclectic dining options? It sounds odd to call the increased consumption of raw fish a great technological breakthrough, but if you’ve dined in Kansas City on a plate of sashimi that was swimming in the deep Pacific twelve hours ago, well, you’re living in an age of miracles. The data can’t see that. Nor can the data capture the value of much better haircuts or massages.
Measurement issues. This one I like, as you know. What if the gains of modern growth are qualitative in nature? That’s my theory.
My theory is that progress is commoditized.
A microwave oven in 1967 was a new appliance, with a price of $495 (equivalent to roughly $4,000 today). Nearly sixty years on, much improved microwaves are available for less than $100 at Walmart.
Using 1980 as a reference year, the cost of household appliances has dropped significantly. A typical midrange personal computer cost around $3,000 in 1980, which is equivalent to approximately $10,650 in 2023 dollars after adjusting for inflation. Today, a midrange PC that is better by orders of magnitude can be purchased for less than $1,000, a roughly 90% decrease in real price. The computing power that a penny buys now (one hundred GFLOPS, or giga floating-point operations per second) would have cost billions of dollars in 1980. Similarly, a dishwasher that cost around $700 in 1980 (or $2,485 in today’s dollars) now sells for roughly $400. A Kenmore window-unit air conditioner was priced at the equivalent of $1,400 (nominally $349) in the 1980 Sears summer catalog, while the equivalent unit at Walmart today is offered for one-sixth the cost.2
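For readers who want to check the conversions, here is a minimal sketch of the CPI arithmetic. The index levels are approximations I have plugged in (about 82 for 1980 and 293 for the reference year the figures above imply), so treat the outputs as ballpark conversions rather than official statistics.

```python
# A sketch of the inflation arithmetic used above. The CPI-U levels are
# assumed approximations (about 82 for 1980 and 293 for the early-2020s
# reference year the paragraph's figures imply), so the outputs are ballpark
# conversions, not official statistics.
def real_price(nominal_price, cpi_then, cpi_now):
    """Express an old nominal price in today's dollars."""
    return nominal_price * cpi_now / cpi_then

CPI_1980, CPI_NOW = 82.4, 292.7  # assumed index levels

print(f"$3,000 PC in 1980 -> ${real_price(3000, CPI_1980, CPI_NOW):,.0f} today")
print(f"$700 dishwasher in 1980 -> ${real_price(700, CPI_1980, CPI_NOW):,.0f} today")
```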
These massive reductions highlight how high-utility household goods have become commoditized over time. Yet in the GDP statistics, each unit sold now registers as less output than it once did. There is an adjustment made to real GDP growth to account for quality improvements, to be sure. This hedonic adjustment distinguishes pure price changes from quality improvements, factoring in the features of new products. A newer AC unit, for example, uses half the energy and generates half the noise of its 1980 counterpart, with automatic adjustment and remote controls. The technicians at the Bureau of Economic Analysis try to factor those hedonic adjustments into their price deflators. But there’s only so much quality that can be quantified.
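To make the hedonic idea concrete, here is a minimal sketch of a “time dummy” hedonic regression, one standard way statisticians separate pure price change from quality change. Every number in it (the air-conditioner specs, the prices, the two periods) is invented for illustration; this is not BEA or BLS data or methodology code.

```python
# A minimal sketch of a "time dummy" hedonic regression, one standard way to
# separate pure price change from quality change. All numbers are invented for
# illustration; this is not BEA or BLS data.
import numpy as np

# Each row: [period dummy (0 = 1980, 1 = today), BTU/1000, efficiency rating]
characteristics = np.array([
    [0,  5.0,  8.0],
    [0,  6.0,  9.0],
    [0,  8.0,  9.5],
    [1,  6.0, 12.0],
    [1,  8.0, 12.5],
    [1, 10.0, 13.0],
])
prices = np.array([299.0, 349.0, 429.0, 180.0, 220.0, 260.0])

# Regress log price on a constant, the period dummy, and the characteristics.
X = np.column_stack([np.ones(len(characteristics)), characteristics])
beta, *_ = np.linalg.lstsq(X, np.log(prices), rcond=None)

# exp(dummy coefficient) - 1 is the quality-adjusted price change: how much
# prices fell holding capacity and efficiency constant.
print(f"Quality-adjusted price change: {np.exp(beta[1]) - 1:.0%}")
```

The point of the exercise is only that the adjustment depends entirely on which characteristics the statistician can measure; quieter, smarter, longer-lasting machines leave traces the regression never sees.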
And if we expand the scope, society-wide progress in how GDP is made doesn’t register on the GDP scale at all. The factory floor in America is safer, and its pollution load is lighter. Too bad if conservatives don’t like to hear this, but Paul Krugman got this one right.
If history is truly a guide, we should expect the unexpected when it comes to economic innovation, growth, and broadly shared prosperity. The experts of yesterday predicted doom and devastation from industrial automation - displacing workers without benefiting consumers. They warned that television would dumb down society. They thought personal computers were little more than electronic toys. Instead, the commoditization of progress simply masks rising living standards. I think this is the explanation for the great productivity slowdown. Growth in well-being is far higher than growth in output per capita suggests.
1. Paul Krugman, The New York Times, February 7, 2023: https://www.nytimes.com/2023/02/07/opinion/regulation-productivity-epa-osha.html
2. 1980 Sears Spring/Summer Catalog: https://christmas.musetechnical.com/ShowCatalogPage/1980-Sears-Spring-Summer-Catalog/1019
Thanks to Bob Litan for the shoutout to hedonic adjustments.