Sanjeev Gupta, Michael Keen, Alpa Shah, Geneviève Verdier, 07 March 2018

Digitalisation has vastly increased our ability to collect and exploit the information that governments use to implement macroeconomic policy. This column argues that governments' growing ability to use the vast amounts of information on financial transactions held in the private sector is already making fiscal policy more efficient and effective. Problems of access to digital technology, cybersecurity risks, and the difficulty of organisational change in the public sector may slow the pace at which these opportunities are exploited.

Domenico Giannone, Michele Lenza, Giorgio Primiceri, 08 February 2018

The availability of large datasets has sparked interest in predictive models with many possible predictors. Using six examples of data from macroeconomics, microeconomics, and finance, this column argues that it is not usually possible to identify sparse models by selecting a handful of predictors from these larger pools. The idea that economic data are informative enough to identify sparse predictive models might be an illusion.
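As a hypothetical illustration (not the authors' code or data), the contrast between sparse and dense prediction can be sketched with simulated data in which every predictor contributes a little: selecting a handful of the most correlated predictors then competes with a ridge-style fit that shrinks all coefficients. All names and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 50  # few observations, many candidate predictors

# Dense truth: every predictor contributes a small amount.
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p) * 0.2
y = X @ beta + rng.standard_normal(n)

# "Sparse" strategy: keep only the k predictors most correlated with y.
k = 5
corr = np.abs(X.T @ y) / n
top_k = np.argsort(corr)[-k:]

# Dense strategy: ridge regression shrinks all p coefficients.
lam = 1.0
beta_dense = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Sparse strategy: ordinary least squares on the selected handful.
beta_sparse = np.linalg.lstsq(X[:, top_k], y, rcond=None)[0]

print(sorted(top_k))  # which predictors survived selection
```

Re-running the selection step on a different random sample typically changes which handful survives, which is one intuition behind the column's claim that the data may not be informative enough to pin down a sparse model.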

Edward Glaeser, Hyunjin Kim, Michael Luca, 17 January 2018

Economic and policy research can often suffer from a scarcity of up-to-date data sources. This column explores the potential for digital data to supplement official government statistics by providing more up-to-date snapshots of the economy. A comparison of data from Yelp with US County Business Patterns data reveals that the Yelp data provide a good indication of underlying economic trends. But although digital data from online platforms offer faster and geographically detailed images of the economy, they should be seen as a complement rather than a substitute for official government statistics.

Hidemichi Fujii, Shunsuke Managi, 16 June 2017

Patent applications are a good indicator of the nature of technological progress. This column compares trends in applications for artificial intelligence patents in Japan and the US. One finding is that the Japanese market appears to be less attractive for applying artificial intelligence technologies, perhaps due to its stricter regulations on the collection and use of data.

Stephen Redding, David Weinstein, 03 October 2016

Big data stands to transform economic measurement in substantial ways. The volume and precision of data available allow economists to revisit the foundational assumptions underpinning common indexes. This column presents a new empirical methodology that leverages big data to translate nominal numbers into real output or welfare. The 'unified approach' nests major price indexes and addresses implicit biases in these measures. An examination using barcode data suggests that standard methods of measuring welfare overstate cost-of-living increases by ignoring new products and demand shifts.

Lucrezia Reichlin, 28 September 2016

How can we build models that handle uncertainty? In this video Lucrezia Reichlin discusses her work on building models that handle multiple time series simultaneously. This video was recorded during the European Economic Association's Congress held in Geneva at the end of August 2016. 

Christian Hansen, 27 May 2016

Big data is changing economics and the way causal relationships are studied. In this video, Christian Hansen and Soumaya Keynes discuss the importance of big data for econometrics. Big data offers far more information, making it easier to draw policy lessons, and gives researchers more flexibility in how they choose control variables, allowing more reliable conclusions to be drawn. This video was recorded in March 2016 during the Royal Economic Society's Annual Conference held at the University of Sussex.

Alberto Cavallo, Roberto Rigobon, 24 April 2016

Big Data is changing the world, even economics. This column describes MIT’s Billion Prices Project and discusses key lessons for both inflation measurement and some fundamental research questions in macro and international economics. Online prices can be used to construct daily price indexes in multiple countries and to help avoid measurement biases. 
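A minimal sketch of the kind of daily index such projects make possible (the products, prices, and index choice below are illustrative assumptions, not the Billion Prices Project methodology): a Jevons index takes the geometric mean of price relatives over products matched across two days of scraped online prices.

```python
from math import prod

# Hypothetical scraped online prices for the same products on two days.
prices_day0 = {"milk": 1.00, "bread": 2.50, "coffee": 8.00}
prices_day1 = {"milk": 1.05, "bread": 2.45, "coffee": 8.40}

# Jevons (geometric-mean) index over matched products, a common
# choice for unweighted elementary price indexes (day 0 = 100).
relatives = [prices_day1[g] / prices_day0[g] for g in prices_day0]
index = prod(relatives) ** (1 / len(relatives)) * 100

print(round(index, 2))  # → 102.61
```

Because online prices can be collected every day, the same calculation can be repeated daily, yielding the high-frequency, multi-country indexes the column describes.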

Charles Bean, 31 March 2016

The increasingly digital 21st century economy – with many zero-priced goods and services – is a challenging place for those striving to measure economic activity. This column reviews some of the main themes of the Bean Report on UK economic statistics. It suggests exploiting new data sources and creating a network of academics, private sector actors, and expert users. This network would undertake research and development into the measurement of the economy and propose experimental statistics to capture new phenomena.
