
The intensive course Competition in Digital Markets will be held at the Barcelona GSE from November 20 to 22, 2019. This course offers the opportunity to understand how the digital economy works, and under what conditions competition in this sector may not function as it should. It provides participants with a thorough understanding of how to evaluate the substitutability between different offerings and of when to view practices such as tying, exclusive contracts, price-parity clauses, and discriminatory access to platforms as anti-competitive, but also of the circumstances in which such practices are likely to be beneficial.

Course lecturers include leading international competition scholars and practitioners with extensive experience applying economic techniques to competition cases in this area:

Giulio Federico (Head of Unit, CET, DG Competition European Commission)

Chiara Fumagalli (Associate Professor of Economics, Bocconi University)

Massimo Motta (Professor of Economics, ICREA-UPF and Barcelona GSE; former Chief Competition Economist, European Commission) - course director

Martin Peitz (Professor of Economics, University of Mannheim)

An Early Bird discount will be offered to participants confirming their attendance before October 20. A reduced course fee is also available to Regulators, Competition Authorities, Academics and Barcelona GSE Alumni.

Lucrezia Reichlin, 15 July 2019

Lucrezia Reichlin discusses the need for parsimonious models that exploit a data-rich environment to capture the simple drivers of the economy from complex data sets.

Laura Veldkamp, 01 March 2019

The digital economy makes it possible for data-savvy firms to grow very large, very quickly. Laura Veldkamp of Columbia Business School tells Tim Phillips about her new project to model the Big Data economy.

Dirk Bergemann, Alessandro Bonatti, 11 October 2018

The growth of social media over the past decade has brought a parallel explosion in the size and value of information markets. This column presents the findings from a comprehensive model of data trading and brokerage. The model identifies three aspects of information markets – the value of information, the nature of competition in these markets, and consumers’ incentives – which are in particular need of further research and understanding. 

Sanjeev Gupta, Michael Keen, Alpa Shah, Geneviève Verdier, 07 March 2018

Digitalisation has vastly increased our ability to collect and exploit the information that governments use to implement macroeconomic policy. This column argues that governments' ability to use the vast amounts of information on financial transactions held in the private sector is already making fiscal policy more efficient and effective. Problems of access to digital technology, cybersecurity risks, and the difficulty of organisational change in the public sector may, however, slow the pace at which these opportunities are exploited.

Domenico Giannone, Michele Lenza, Giorgio Primiceri, 08 February 2018

The availability of large datasets has sparked interest in predictive models with many possible predictors. Using six examples of data from macroeconomics, microeconomics, and finance, this column argues that it is not usually possible to identify sparse models by selecting a handful of predictors from these larger pools. The idea that economic data are informative enough to identify sparse predictive models might be an illusion.
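The sparse-versus-dense point can be illustrated with a toy example (entirely hypothetical data, not from the column): when the truth is "dense", with many predictors each contributing a small amount, an L1-penalised regression that zeroes out most coefficients discards real signal. A minimal numpy-only sketch of lasso via coordinate descent:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent with soft-thresholding."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: leave predictor j out of the current fit.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            # Soft-threshold the univariate coefficient at lam.
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.full(p, 0.2)   # "dense" truth: every predictor matters a little
y = X @ beta_true + rng.standard_normal(n)

beta_lasso = lasso_cd(X, y, lam=30.0)
print("true nonzero coefficients: ", np.count_nonzero(beta_true))
print("lasso nonzero coefficients:", np.count_nonzero(beta_lasso))
```

Raising `lam` here forces a sparser fitted model, but every predictor it drops genuinely carries signal, which is the column's caution about treating sparsity as a property of economic data rather than of the estimator.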

Edward Glaeser, Hyunjin Kim, Michael Luca, 17 January 2018

Economic and policy research can often suffer from a scarcity of up-to-date data sources. This column explores the potential for digital data to supplement official government statistics by providing more up-to-date snapshots of the economy. A comparison of data from Yelp with US County Business Patterns data reveals that the Yelp data provide a good indication of underlying economic trends. But although digital data from online platforms offer faster and geographically detailed images of the economy, they should be seen as a complement rather than a substitute for official government statistics.

Hidemichi Fujii, Shunsuke Managi, 16 June 2017

Patent applications are a good indicator of the nature of technological progress. This column compares trends in applications for artificial intelligence patents in Japan and the US. One finding is that the Japanese market appears to be less attractive for the application of artificial intelligence technology, perhaps due to its stricter regulations on the collection and use of data.

Stephen Redding, David Weinstein, 03 October 2016

Big data stands to transform economic measurement in substantial ways. The volume and precision of the data available allow economists to revisit the foundational assumptions underpinning common indexes. This column presents a new empirical methodology that leverages big data to translate nominal numbers into real output or welfare. ‘The unified approach’ nests major price indexes and addresses implicit biases in these measures. An examination with barcode data suggests that standard methods of measuring welfare overstate cost-of-living increases by ignoring new products and demand shifts.

Lucrezia Reichlin, 28 September 2016

How can we build models that handle uncertainty? In this video Lucrezia Reichlin discusses her work on building models that handle multiple time series simultaneously. This video was recorded during the European Economic Association's Congress held in Geneva at the end of August 2016. 

Christian Hansen, 27 May 2016

Big data is changing economics and the way causal relationships are studied. In this video, Christian Hansen and Soumaya Keynes discuss the importance of big data for econometrics. Big data offers a wealth of information, making it easier to draw policy lessons. It also gives researchers more flexibility in how control variables are chosen, allowing more reliable conclusions to be drawn. This video was recorded in March 2016 during the Royal Economic Society’s Annual Conference held at the University of Sussex.

Alberto Cavallo, Roberto Rigobon, 24 April 2016

Big Data is changing the world, even economics. This column describes MIT’s Billion Prices Project and discusses key lessons for both inflation measurement and some fundamental research questions in macro and international economics. Online prices can be used to construct daily price indexes in multiple countries and to help avoid measurement biases. 
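The daily indexes the column mentions can be sketched with a chained Jevons formula: each day, take the geometric mean of price relatives over products observed on consecutive days, then chain the result. The prices below are hypothetical, purely for illustration:

```python
import math

# Hypothetical daily online prices: product -> price, one dict per day.
daily_prices = [
    {"milk": 1.00, "bread": 2.00, "eggs": 3.00},
    {"milk": 1.02, "bread": 2.00, "eggs": 2.97},    # day 2
    {"milk": 1.02, "bread": 2.06, "coffee": 5.00},  # day 3: eggs exits, coffee enters
]

def chained_jevons(days):
    """Chained daily Jevons index (base day = 100): geometric mean of
    price relatives over products matched in consecutive days."""
    index = [100.0]
    for prev, curr in zip(days, days[1:]):
        matched = set(prev) & set(curr)  # only products seen on both days
        log_rel = sum(math.log(curr[g] / prev[g]) for g in matched) / len(matched)
        index.append(index[-1] * math.exp(log_rel))
    return index

print([round(v, 2) for v in chained_jevons(daily_prices)])  # → [100.0, 100.33, 101.82]
```

Matching products day to day sidesteps the need for expenditure weights, but a matched-model index like this ignores product entry and exit (`coffee` appearing, `eggs` disappearing), one of the biases such projects must confront.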

Charles Bean, 31 March 2016

The increasingly digital 21st century economy – with many zero-priced goods and services – is a challenging place for those striving to measure economic activity. This column reviews some of the main themes of the Bean Report on UK economic statistics. It suggests exploiting new data sources and creating a network of academics, private sector actors, and expert users. This network would undertake research and development into the measurement of the economy and propose experimental statistics to capture the new phenomena.
