There’s plenty of room at the Top: What will drive computer performance after Moore’s law?

Co-authors: Charles E. Leiserson, Neil C. Thompson, Joel S. Emer, Bradley C. Kuszmaul, Butler W. Lampson, Daniel Sanchez, Tao B. Schardl

Paper: Science, June 2020

Coverage: Technology Review, Spectrum, StateTech Magazine

The miniaturization of semiconductor transistors has driven the growth in computer performance for more than 50 years. As miniaturization approaches its limits, bringing an end to Moore’s law, performance gains will need to come from software, algorithms, and hardware. We refer to these technologies as the “Top” of the computing stack to distinguish them from the traditional technologies at the “Bottom”: semiconductor physics and silicon-fabrication technology. In the post-Moore era, the Top will provide substantial performance gains, but these gains will be opportunistic, uneven, and sporadic, and they will suffer from the law of diminishing returns. Big system components offer a promising context for tackling the challenges of working at the Top.

The Decline of Computers as a General Purpose Technology: Why Deep Learning and the End of Moore’s Law are Fragmenting Computing

Co-author: Svenja Spanuth

Paper, revised version forthcoming in Communications of the ACM

SSRN Top 5 Paper of the Week

Coverage: NextPlatform, InvestorPlace, NZZ

It is a triumph of technology and of economics that our computer chips are so universal: the staggering variety of calculations they can perform makes countless applications possible. But this was not always the case. Computers used to be specialized, doing only narrow sets of calculations. Their rise as a ‘general purpose technology’ (GPT) happened only because of ground-breaking technical advancements by computer scientists like von Neumann and Turing, and because of the virtuous economics common to general purpose technologies, in which product improvement and market growth fuel each other in a mutually reinforcing cycle.

The Economic Impact of Moore’s Law

Paper

“Computing performance doubles every couple of years” is the popular re-phrasing of Moore’s Law, which describes the 500,000-fold increase in the number of transistors on modern computer chips. But what impact has this 50-year expansion of the technological frontier of computing had on the productivity of firms?

This paper focuses on the unexpected change in chip design in the mid-2000s, when Moore’s Law faltered: no longer could it deliver ever-faster processors; instead, it delivered multicore processors with stagnant clock speeds.

The Importance of (Exponentially More) Computing Power

Co-authors: Shuning Ge and Gabriel Manso

Denizens of Silicon Valley have called Moore’s Law “the most important graph in human history,” and economists describe the Moore’s Law-powered I.T. revolution as one of the most important sources of national productivity. But the data substantiating these claims tend to be either abstracted (for example, examining spending on I.T. rather than I.T. itself) or anecdotal. In this paper, we assemble direct evidence of the impact that computing power has had on five domains: two computing bellwethers (chess and Go) and three economically important applications (weather prediction, protein folding, and oil exploration). In line with economic theory, we find that exponential increases in computing power are needed to get linear improvements in outcomes, which helps clarify why Moore’s Law has been so important.
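
As a rough sketch of that relationship (an illustrative functional form, not necessarily the exact specification estimated in the paper): if

outcome ≈ a + b · log(compute),

then raising the outcome by a fixed amount Δ requires multiplying computing power by exp(Δ/b), a constant factor. Repeating the same-sized improvement therefore compounds the compute requirement exponentially, which is why steady, linear progress in these domains has leaned on the exponential growth that Moore’s Law supplied.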