Why has new technology failed to boost productivity growth?
By Nicholas Bloom
People's Daily app
August 13, 2018

(Photo: VCG)

Recent US productivity growth has been depressing. In the 1950s American productivity was rising by almost 4 percent a year – a period of incredible progress driven by the rapid expansion of research universities like Harvard, MIT and Stanford, and research labs at firms like General Electric and Ford. But by the 1980s this productivity growth had halved to 2 percent, and it has now fallen to 1 percent per year.

This slowdown has sparked a debate among economists over the sources of the problem. Are statisticians underestimating output? Is the United States mired in “secular stagnation”—a prolonged period of low economic growth caused by too much saving and too little investment? Or are recent innovations simply not as productive for society as those of the past?

In research I carried out with three fellow economists (Chad Jones and Mike Webb of Stanford, and John Van Reenen of MIT), I argue that ideas productivity — the productivity of science and discovery — has been falling for decades. Scientific discoveries and technical advances are getting harder and harder to find, and as a result innovation is slowing down.

The creation of ideas is central to a growing economy. This is driven by two things: the number of researchers – scientists and engineers – and the research productivity of those researchers. Our analysis found that while there is a rising number of researchers, each one is becoming less productive over time. So while research and development efforts have been rising steeply for decades, research productivity — the number of ideas being produced per researcher — has fallen rapidly.
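
To make that decomposition concrete, here is a minimal back-of-envelope sketch in Python. The numbers are purely illustrative, not figures from our study; the point is only that research productivity is idea output divided by research effort, so it must fall whenever effort grows faster than the flow of new ideas.

# Illustrative sketch (hypothetical numbers, not actual data):
# research productivity = idea output / number of effective researchers
def research_productivity(idea_output, researchers):
    return idea_output / researchers

# Suppose the flow of new ideas stays flat while the research workforce triples:
early = research_productivity(idea_output=1.0, researchers=1.0)
late = research_productivity(idea_output=1.0, researchers=3.0)
print(late / early)  # 0.333...: each researcher is now a third as productive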

The analysis revealed that more than 20 times as many Americans are engaged in R&D today as in 1930, yet their average productivity has dropped by a factor of more than 40. The only way the United States has been able to maintain even its current lackluster GDP growth rate has been to throw more and more scientists and engineers at research problems. The US economy has had to double its research effort every 13 years just to sustain the same overall rate of economic growth.
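
A quick back-of-envelope check on those figures, using continuous compounding (my own rough arithmetic rather than numbers from the study): doubling research effort every 13 years means effort growing at ln(2)/13, roughly 5 percent a year, and a 20-fold rise in researchers combined with a 40-fold fall in their productivity implies that the flow of new ideas has roughly halved, consistent with growth sliding from about 2 percent to about 1 percent.

import math

# Doubling research effort every 13 years implies effort grows at an annual rate of:
effort_growth = math.log(2) / 13
print(round(effort_growth, 3))  # ~0.053, i.e. about 5 percent a year

# 20 times the researchers, each roughly 1/40th as productive, implies the flow of
# new ideas (researchers x productivity per researcher) is about half its old level:
print(20 * (1 / 40))  # 0.5, consistent with growth falling from ~2% to ~1%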

One example of this is the power of silicon chips, which has doubled roughly every two years, a phenomenon known as Moore’s Law, after Gordon Moore, the co-founder of the computer chip giant Intel. The resulting advances have enabled the creation of ever more powerful computers, which have affected every aspect of society. But more than 18 times as many researchers are now required just to maintain that regular doubling as in the early 1970s.
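
To get a sense of what that compounding amounts to (a rough calculation of my own; the exact dates below are assumptions, not figures from the study): doubling every two years since the early 1970s works out to a couple of dozen doublings, a gain on the order of ten million-fold in chip power, while the research effort behind each successive doubling is now roughly 18 times what it once was.

# Moore's Law compounding: chip power doubles roughly every two years.
years = 2018 - 1971        # assumed span from the early 1970s to this article's date
doublings = years / 2      # about 23.5 doublings
density_factor = 2 ** doublings
print(f"{doublings:.1f} doublings, roughly {density_factor:,.0f}x the chip power")

# Yet it now takes over 18 times as many researchers to sustain each doubling,
# so productivity per chip researcher has fallen by roughly that factor.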

A similar pattern shows up in the agricultural and pharmaceutical industries. For agricultural yields, research effort went up by a factor of 2 between 1970 and 2007, while research productivity declined by a factor of 4 over the same period, an annual rate of 3.7 percent. For pharmaceuticals, research effort went up by a factor of 9 between 1970 and 2014, while research productivity declined by a factor of 5, an annual rate of 3.5 percent.
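
Those annual rates follow from the totals by simple compounding. As a rough check of my own (the study's exact method may differ slightly), a decline by a factor F over N years corresponds to an average annual rate of about ln(F)/N.

import math

def annual_decline_rate(factor, years):
    # Average annual rate implied by a total decline of `factor` over `years`,
    # assuming continuous compounding: rate = ln(factor) / years.
    return math.log(factor) / years

print(round(annual_decline_rate(4, 2007 - 1970), 3))  # ~0.037 for agriculture
print(round(annual_decline_rate(5, 2014 - 1970), 3))  # ~0.037 for pharmaceuticals (the article quotes ~3.5%)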

We also examined data on over 15,000 US public firms between 1980 and 2015 and found that, even as spending on R&D rose, the vast majority of firms experienced rapid declines in ideas productivity. The average firm now needs 15 times as many researchers as it did 30 years ago to produce the same rate of growth.

So why has the productivity of scientists and engineers fallen so much? The simple reason is that ideas are getting harder to find.

To explain my view, we should travel back to 1750 and the start of the Industrial Revolution in England. Before then, productivity growth was close to zero. Most people in 1700 still worked on farms and were not much more productive than their ancestors under the Romans 2,000 years before. But from the late 1700s until about 1950, productivity growth accelerated. This was the era of “standing on the shoulders of giants” – each new invention, like the steam engine, electric lighting and penicillin, made future inventors more productive. Growth took off as more and more firms created industrial R&D labs – starting with Thomas Edison’s lab in 1876 – while universities began to focus more on science and engineering research. By 1950, however, the tide had begun to turn. The US was reaching its peak productivity growth of around 4 percent per year, and the third phase – “the Apple Tree model” – was setting in. Humanity had already made many of the easiest discoveries, and unearthing new scientific truths started getting harder.

But what about the future? Recent history, rather than science fiction, tends to be the best predictor of the near future. So the next 10 to 20 years are likely to see the same roughly 1 percent productivity growth as the recent past. Compared to the 1950s through the 1990s that is slow, but compared to the longer sweep of history going back 2,000 years it is a blistering rate of advance.

But looking more than 20 years ahead, technology growth is hard to call. We have had at least two big shifts in productivity growth since 1750, and maybe another will be heralded by the information age of the 22nd century.

The author is the William D. Eberle Professor of Economics at Stanford University