Pierre Régibeau, 13 March 2021

On 15 December 2020, the European Commission approved the acquisition of Fitbit by Alphabet, subject to a number of commitments. The case caused considerable concern that Google would gain unfair advantages in the online advertising market and entrench its dominance in digital health, with dire consequences for privacy. Critics also feared the acquisition would reduce Google’s incentives to keep its Android ecosystem open to rival wearable products. This column argues that the decision is appropriate, addressing the four main concerns: the proposed theories of harm are either addressed by the commitments or not supported by evidence to the requisite legal standard.

Nicolas Woloszko, 19 December 2020

A prerequisite for good macroeconomic policymaking is timely information on the current state of the economy, particularly when economic activity is changing rapidly. Given that GDP figures are usually only available on a quarterly basis, the current crisis has prompted a search for alternative high‑frequency indicators of economic activity. This column presents evidence from a new tracker developed by the OECD, which uses Google Trends and machine learning to provide real-time estimates of GDP growth in countries all over the world.
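The nowcasting idea behind such trackers can be illustrated in miniature. The sketch below fits a simple linear rule relating quarterly GDP growth to a search-intensity index, then applies it to the latest index value. All numbers are made up for illustration, and the single-predictor linear fit is a stand-in for the OECD's actual machine-learning model, which uses many Google Trends series.

```python
import numpy as np

# Hypothetical data: quarterly GDP growth (%) alongside a Google Trends
# search-intensity index (e.g. queries related to "unemployment benefits").
trends_index = np.array([40.0, 45.0, 55.0, 90.0, 70.0, 50.0])
gdp_growth = np.array([0.5, 0.4, 0.1, -3.0, -1.0, 0.3])

# Fit a simple linear nowcasting rule: growth ~ a + b * index.
X = np.column_stack([np.ones_like(trends_index), trends_index])
coef, *_ = np.linalg.lstsq(X, gdp_growth, rcond=None)

# Nowcast the current quarter from the latest (real-time) index value,
# available well before the official quarterly GDP release.
latest_index = 60.0
nowcast = coef[0] + coef[1] * latest_index
print(round(float(nowcast), 2))
```

The fitted slope is negative: more crisis-related search activity maps into lower estimated growth, which is the signal the real-time tracker exploits between official releases.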

Elena Argentesi, Paolo Buccirossi, Emilio Calvano, Tomaso Duso, Alessia Marrazzo, Salvatore Nava, 04 March 2020

Dominant companies in the digital market may use mergers and acquisitions – especially ‘killer’ or ‘zombie’ acquisitions – and the (under)enforcement of merger control to stifle competition and cement their market dominance. This column analyses acquisition activity by Amazon, Facebook, and Google between 2008 and 2018, and finds that they often targeted very young firms. Because the evolution of young firms is still uncertain, it is difficult for competition authorities to assess the effects of these mergers, especially when the focus is on single acquisitions without considering the overall acquisition strategy.

Neil Cummins, 08 December 2019

Sharp declines in the concentration of declared wealth occurred across Europe and the US during the 20th century. But the rich may have been hiding much of their wealth. This column introduces a new method to measure this hidden wealth, in any form. It finds that between 1920 and 1992, English elites concealed 20-32% of their wealth. Accounting for hidden wealth eliminates one-third of the observed decline in the top 10% wealth share over the past century.

James Poterba, Lawrence H. Summers, 25 September 2019

Martin Feldstein, who passed away in June 2019, was one of the most important applied economists of the last half-century. This column, by two of his students and close colleagues, celebrates his intellectual legacy, outlining his seminal contributions on a wide range of topics in public economics and beyond, his pioneering use of large data sets, and his influential voice in US public policy over many decades. As president of the National Bureau of Economic Research for nearly 30 years, Feldstein advanced the conduct and dissemination of economic research, and helped to create the modern economics profession.


The intensive course in Competition in Digital Markets will be held at the Barcelona GSE from November 20 to 22, 2019. This course offers the opportunity to understand how the digital economy works, and under what conditions competition may not function as it should in this sector. It provides participants with a thorough understanding of how to evaluate the substitutability between different offerings, when to view practices such as tying, exclusive contracts, price-parity clauses, and discriminatory access to platforms as anti-competitive, and in what circumstances those practices are likely to be beneficial.

Course lecturers include leading international competition scholars and practitioners with extensive experience of the application of economic techniques to competition cases in this area:

Giulio Federico (Head of Unit, CET, DG Competition European Commission)

Chiara Fumagalli (Associate Professor of Economics, Bocconi University)

Massimo Motta (Professor of Economics, ICREA-UPF and Barcelona GSE; former Chief Competition Economist, European Commission) - course director

Martin Peitz (Professor of Economics, University of Mannheim)

An Early Bird discount will be offered to participants confirming their attendance before October 20. A reduced course fee is also available to Regulators, Competition Authorities, Academics and Barcelona GSE Alumni.

Lucrezia Reichlin, 15 July 2019

Lucrezia Reichlin discusses the need for parsimonious models that exploit a data-rich environment to capture the simple drivers of the economy from complex data sets.

Laura Veldkamp, 01 March 2019

The digital economy makes it possible for data-savvy firms to grow very large, very quickly. Laura Veldkamp of Columbia Business School tells Tim Phillips about her new project to model the Big Data economy.

Dirk Bergemann, Alessandro Bonatti, 11 October 2018

The growth of social media over the past decade has brought a parallel explosion in the size and value of information markets. This column presents the findings from a comprehensive model of data trading and brokerage. The model identifies three aspects of information markets – the value of information, the nature of competition in these markets, and consumers’ incentives – which are in particular need of further research and understanding. 

Sanjeev Gupta, Michael Keen, Alpa Shah, Geneviève Verdier, 07 March 2018

Digitalisation has vastly increased our ability to collect and exploit the information that governments use to implement macroeconomic policy. The column argues that governments’ ability to use the vast amounts of information on financial transactions held in the private sector is already making fiscal policy more efficient and effective. Problems of access to digital technology, cybersecurity risks, and the difficulty of organisational change in the public sector may slow the pace at which these opportunities are exploited.

Domenico Giannone, Michele Lenza, Giorgio Primiceri, 08 February 2018

The availability of large datasets has sparked interest in predictive models with many possible predictors. Using six examples of data from macroeconomics, microeconomics, and finance, this column argues that it is not usually possible to identify sparse models by selecting a handful of predictors from these larger pools. The idea that economic data are informative enough to identify sparse predictive models might be an illusion.
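The sparse-versus-dense distinction can be made concrete with a small simulation. In the sketch below the true data-generating process is dense: every one of many predictors contributes a little. Selecting the single most correlated predictor then explains far less of the outcome than using the full pool. This is an illustrative toy example of the general point, not the authors' methodology or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 50

# Dense truth: each of the 50 predictors contributes a small amount.
X = rng.standard_normal((n, p))
beta = np.full(p, 0.2)
y = X @ beta + rng.standard_normal(n)

# "Sparse" strategy: keep only the single most correlated predictor.
corrs = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(p)])
best = int(np.argmax(corrs))
r2_sparse = corrs[best] ** 2

# Dense strategy: use all predictors via least squares.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef
r2_dense = 1 - resid.var() / y.var()

print(round(float(r2_sparse), 2), round(float(r2_dense), 2))
```

When the signal is genuinely spread across many weak predictors, no handful of them can be singled out as "the" model, which is the sense in which sparse predictive models may be an illusion.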

Edward Glaeser, Hyunjin Kim, Michael Luca, 17 January 2018

Economic and policy research can often suffer from a scarcity of up-to-date data sources. This column explores the potential for digital data to supplement official government statistics by providing more up-to-date snapshots of the economy. A comparison of data from Yelp with US County Business Patterns data reveals that the Yelp data provide a good indication of underlying economic trends. But although digital data from online platforms offer faster and geographically detailed images of the economy, they should be seen as a complement rather than a substitute for official government statistics.

Hidemichi Fujii, Shunsuke Managi, 16 June 2017

Patent applications are a good indicator of the nature of technological progress. This column compares trends in applications for artificial intelligence patents in Japan and the US. One finding is that the Japanese market appears to be less attractive for the application of artificial intelligence technology, perhaps due to its stricter regulations on the collection and use of data.

Stephen Redding, David Weinstein, 03 October 2016

Big data stands to transform economic measurement in substantial ways. The volume and precision of data available allow economists to revisit the foundational assumptions underpinning common indexes. This column presents a new empirical methodology that leverages big data to translate nominal numbers into real output or welfare. This ‘unified approach’ nests major price indexes and addresses implicit biases in these measures. An examination with barcode data suggests that standard methods of measuring welfare overstate cost-of-living increases by ignoring new products and demand shifts.

Lucrezia Reichlin, 28 September 2016

How can we build models that handle uncertainty? In this video Lucrezia Reichlin discusses her work on building models that handle multiple time series simultaneously. This video was recorded during the European Economic Association's Congress held in Geneva at the end of August 2016. 

Christian Hansen, 27 May 2016

Big data is changing economics and the way causal relationships are studied. In this video, Christian Hansen and Soumaya Keynes discuss the importance of big data for econometrics. Big data offers rich information from which policy lessons can be drawn more easily. It also gives researchers more flexibility, without forcing them to impose a particular set of control variables, allowing more reliable conclusions to be obtained. This video was recorded in March 2016 during the Royal Economic Society’s Annual Conference held at the University of Sussex.

Alberto Cavallo, Roberto Rigobon, 24 April 2016

Big Data is changing the world, even economics. This column describes MIT’s Billion Prices Project and discusses key lessons for both inflation measurement and some fundamental research questions in macro and international economics. Online prices can be used to construct daily price indexes in multiple countries and to help avoid measurement biases. 
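The core mechanics of a daily online price index can be sketched in a few lines. The example below chains an unweighted geometric mean of price relatives (a Jevons-type index) across days of scraped prices. The prices and three-product basket are invented for illustration; the actual Billion Prices Project indexes draw on millions of online prices and more elaborate index formulas.

```python
import math

# Hypothetical daily online prices for three products from one retailer.
prices = {
    "day1": [2.00, 5.00, 10.00],
    "day2": [2.10, 5.00, 9.80],
    "day3": [2.10, 5.25, 9.80],
}

def jevons(base, current):
    """Unweighted geometric mean of price relatives (Jevons index)."""
    ratios = [c / b for b, c in zip(base, current)]
    return math.prod(ratios) ** (1 / len(ratios))

# Chain the daily links into a cumulative index (day1 = 100).
days = ["day1", "day2", "day3"]
index = [100.0]
for prev, cur in zip(days, days[1:]):
    index.append(index[-1] * jevons(prices[prev], prices[cur]))

print([round(v, 2) for v in index])  # [100.0, 100.96, 102.61]
```

Because online prices are observed daily rather than monthly, the same chaining produces a high-frequency inflation signal, and running it in parallel across countries allows cross-country comparison with a consistent methodology.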

Charles Bean, 31 March 2016

The increasingly digital 21st century economy – with many zero-priced goods and services – is a challenging place for those striving to measure economic activity. This column reviews some of the main themes of the Bean Report on UK economic statistics. It suggests the exploitation of new data sources and the creation of a network of academics, private sector actors, and expert users that would undertake research and development into the measurement of the economy and propose experimental statistics to capture new phenomena.


CEPR Policy Research