Published on 17 July 2023

The cost of computing and the productivity puzzle

While the cost of computing power could in principle account for the apparent productivity slowdown in ICT, the binding constraint on growth more likely lies in organisational factors, writes Lucy Hampton.

Computation is pervasive in the modern economy. Yet information and communications technology (ICT) is one of the largest contributors to the productivity slowdown: productivity growth in the sector more than halved between the periods 1998–2007 and 2008–2019.

This is puzzling given the pace of innovation within the sector. Not only have there been several technical advances (see table below), but data suggest that both businesses and consumers have embraced these new technologies. For example, 53% of all UK businesses purchased cloud services in 2020, and ChatGPT reached one million users in just five days, showing considerable public enthusiasm for artificial intelligence (AI). Economist Robert Solow quipped in 1987 that the computer age was everywhere except in the productivity statistics; more than 30 years later, the Solow paradox looms large.

| Post-2005 innovation | Applications |
| --- | --- |
| Smartphones | Handsets, operating systems |
| 3G–5G networks | Data-enabled network services |
| GPUs, TPUs, H100 | Specialist processors; also reduced instruction set chips (RISC) |
| Deep learning, reinforcement learning | Machine learning advances, e.g. AlphaFold |
| Large language or foundation models | AI systems, e.g. GPT-4, DALL-E |
| Cloud computing | Storage, SaaS (Software as a Service), IaaS, ML |
| Additive manufacture | 3D printing of components |
| Robotics in distribution | Drones, delivery robots |
| Robotic process automation | Automates business processes |
| Augmented and virtual reality | Some existing applications: the “Metaverse”? |
| Quantum computing | At the R&D stage |
| Wearables | Devices, e.g. smart watches, clothing, medical implants |
| Advances in nanotechnology | Construction of miniaturised components in computer chips, sensors etc. |

Potential explanations for the paradox are numerous. Some, including Erik Brynjolfsson, emphasise the complementary investments in intangibles, such as training workers and restructuring business processes, that are required to yield productivity gains from new technology. On this view, the computer age has not appeared in the productivity statistics yet; given time, it will. Others, such as Robert Gordon and Nicholas Bloom, argue that the productivity of innovation itself has slowed down.

Another group points to the difficulty of accurately measuring productivity growth. One such obstacle is estimating the rate of price decline and adjusting it appropriately for quality change. Underestimating the rate of decline would result in real productivity growth being under-measured.
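The arithmetic behind this can be sketched with invented numbers: real output growth is nominal growth deflated by a price index, so an index that understates how fast prices fell understates real growth by the same margin. The figures below are purely illustrative, not drawn from any official series.

```python
# Illustrative sketch of deflation mismeasurement (all numbers invented).
# Real growth is approximately nominal growth minus the price change.

nominal_growth = 0.02           # 2% nominal output growth
true_price_change = -0.20       # true quality-adjusted prices fell 20%
measured_price_change = -0.05   # official index records only a 5% fall

true_real_growth = nominal_growth - true_price_change         # 22%
measured_real_growth = nominal_growth - measured_price_change  # 7%

print(f"true real growth:     {true_real_growth:.0%}")
print(f"measured real growth: {measured_real_growth:.0%}")
```

The gap between the two lines is exactly the under-estimated portion of the price decline: the milder the measured fall in prices, the more real growth is hidden.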

To investigate this possibility for IT, economist Bill Nordhaus constructed very long-run price indices for the period 1950 to 2005 based on the cost of computing performance. The resulting price declines were much steeper than those in official statistics, indicating substantial under-measured progress in computing.

Twenty first century progress in computing

A new working paper by Diane Coyle and me follows Nordhaus in constructing an updated computing price-performance series for the 21st century. We obtain central processing unit (CPU) performance data from SPEC (the Standard Performance Evaluation Corporation), an industry-standard benchmarking body, and match it to price data scraped from the web to obtain price-performance.
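The core calculation is simply price per unit of benchmark performance. The toy example below illustrates the idea; the chip names, prices and scores are invented for illustration and are not taken from the SPEC dataset or the paper.

```python
# Hedged sketch: price-performance = price paid per unit of benchmark
# performance (lower is better). All values below are made up.

def price_performance(price_usd: float, benchmark_score: float) -> float:
    """Cost per unit of benchmark performance."""
    return price_usd / benchmark_score

# Illustrative chips: (label, price in USD, benchmark score)
chips = [
    ("chip_a", 300.0, 10.0),     # older chip: $30 per unit of performance
    ("chip_b", 300.0, 1000.0),   # newer chip: $0.30 per unit of performance
]

for label, price, score in chips:
    print(label, price_performance(price, score))

# Same sticker price, 100x more performance => 100x cheaper computation.
ratio = price_performance(300.0, 10.0) / price_performance(300.0, 1000.0)
print(ratio)
```

The point is that a price index built this way can fall steeply even when sticker prices are flat, because quality improvement does the work.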

The graphs below display the results. The four coloured lines represent the cost of computation for different measures of performance, and the grey dot-dashed line represents the official semiconductor Producer Price Index (PPI).

Source: Coyle and Hampton (2023)

While there appears to have been a slight slowdown around 2010, the overall picture is one of rapid progress in cheap computing that far outpaces official statistics. Depending on the measure used, CPU computing is between 10 and 1,000 times cheaper than it was two decades ago. The PPI, on the other hand, has declined by just a factor of 3.

New ways of computing: GPUs and the cloud

While these declines are impressive, they do not tell the whole story. Focusing on the cost of CPU computing ignores innovation in the ways in which computation can be done. Graphics Processing Units (GPUs), for example, were originally designed for video and image processing but are now widespread in the training of neural networks for AI. It is also increasingly common for firms to access computing resources remotely via the cloud.

To build a more comprehensive picture, the CPU price indices are combined with a GPU price index. This requires information about the relative use of CPUs and GPUs. As no direct evidence exists on this question, we choose a variety of weights to represent different scenarios about the rate at which GPU use has increased.

On conservative assumptions, computation is around 100 times cheaper than it was in 2000. Assuming more rapid growth in GPU use, it is closer to 300 times cheaper. Not only does accounting for innovation in types of chips increase the overall rate of decline, it also partially offsets the slowdown around 2015, as GPU price declines did not slow as much as those for CPUs.
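One simple way to combine two price series under assumed usage shares is a weighted geometric mean. The sketch below is an illustration of that general approach only: the index values and GPU-share scenarios are invented, and the paper's actual aggregation method and weights are not reproduced here.

```python
# Hedged sketch: blending a CPU and a GPU price index (base period = 1.0)
# under scenario weights for the GPU share of computation. Numbers invented.

def combined_index(cpu_index: float, gpu_index: float, gpu_weight: float) -> float:
    """Weighted geometric mean of two price indices."""
    return (cpu_index ** (1.0 - gpu_weight)) * (gpu_index ** gpu_weight)

cpu = 0.01    # CPU prices fell to 1% of their base level (illustrative)
gpu = 0.002   # GPU prices fell faster (illustrative)

for w in (0.1, 0.3, 0.5):  # scenarios for the GPU share
    print(f"GPU share {w:.0%}: combined index {combined_index(cpu, gpu, w):.5f}")
```

The larger the assumed GPU share, the lower the combined index, which is why more aggressive scenarios imply faster overall cheapening.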

Source: Coyle and Hampton (2023), and EpochAI for GPU data

The story is similar for cloud prices, which are less than a tenth of what they were a decade ago when adjusted for performance improvements.

Source: Coyle and Hampton (2023)

What does this mean for productivity?

This work rules out one potential explanation of the productivity slowdown: cheap computation is not a bottleneck in the economy. While there has been a slowdown in price declines since 2015, it does not coincide with the timing of the whole-economy slowdown, which began in 2008.

What then could be causing the apparent productivity slowdown in ICT?

Firstly, it is important to note how different modern computers are from their older counterparts. While mid-century computers were primarily calculating machines, modern computers are information hubs, connected to each other via networks and a multitude of online platforms. They use sophisticated software to do tasks that would have been unthinkable on older computers. But this expanded functionality comes at a price: increased complexity in use, and interdependencies between inputs. Consider a smartphone camera: on top of the hardware, it requires software to convert sensor signals into a photo, image-processing algorithms, storage, and integration with other apps for editing or sharing, the latter also requiring network connectivity.

These interdependencies in turn create the potential for bottlenecks. The explanation for the slowdown could therefore lie in these increasingly important other inputs. These include not only the processing power considered in our paper but also software, energy, networks, storage and AI, and the broader organisational and human capital required to fully adopt and exploit these.

Complexity also makes measurement harder. Creating a sector-wide price index requires weighting together different services, with different weights resulting in different decline rates. Ideally, these weights would correspond to economic value, but this raises difficult and unanswered questions about where value is generated.

With these issues in mind, it is not surprising that the pace of innovation could be rapid while productivity growth lags behind – perhaps the Solow paradox is no longer such a paradox after all.

Published by Telecommunications Policy: 21st century progress in computing

Working paper: Twenty-first century progress in computing

Image: By Science History Institute, CC BY-SA 3.0.

Gordon Moore, co-founder of Intel and originator of the eponymous Moore’s Law: the observation that the number of transistors on an integrated circuit doubles roughly every two years. It is widely argued that Moore’s Law will soon slow down as transistor sizes approach physical limits.

The views and opinions expressed in this post are those of the author(s) and not necessarily those of the Bennett Institute for Public Policy.
