“Computing performance doubles every couple of years” is the popular rephrasing of Moore’s Law, which describes the 500,000-fold increase in the number of transistors on modern computer chips. But what impact has this 50-year expansion of the technological frontier of computing had on the productivity of firms?
This paper focuses on the surprise change in chip design in the mid-2000s, when Moore’s Law faltered: no longer could it provide ever-faster processors; instead, it provided multicore ones with stagnant speeds. Using the asymmetric impacts of the changeover to multicore, this paper shows that firms whose software usage left them ill-suited to this change benefited much less from later improvements from Moore’s Law.
Each standard deviation of this mismatch between firm software and multicore chips cost firms 0.5–0.7 percentage points in yearly total factor productivity growth. These losses are permanent and, without adaptation, imply a lower long-term growth rate for these firms. These findings may help explain larger observed declines in the productivity growth of users of information technology.