Replication and Transparency in Economic Research


In 2003, McCullough and Vinod wrote, “Research that cannot be replicated is not science, and cannot be trusted either as part of the profession’s accumulated body of knowledge or as a basis for policy.”(1)

Twelve years later, the economics profession has made some progress: journals such as the American Economic Review and the Journal of Political Economy maintain data archives for empirical research and provide the material, such as statistical software code, needed to obtain the published results. These journals, however, remain in the minority.(2) The American Economic Review led the initiative for data archives, but it drew criticism after a replication (3) of the famous paper Growth in a Time of Debt (4) revealed that the data availability policy was not binding for its Papers and Proceedings issues, which publish short papers based on presentations at the American Economic Association Annual Meeting. While it is a good sign that this policy has since been changed, the replication and critique was not accepted for publication by the journal, which does not take comments on its Papers and Proceedings.

In fact, many replications, if they are published at all, end up in far less prestigious journals than the original studies, which may have become famous for sensational results that do not hold up under scrutiny. The initiative for a dedicated replication journal could help to overcome the lack of outlets for replication studies,(5) especially since even journals that have policies on replication material often do not enforce them. Journals should appoint editors responsible for reproducibility who carefully check that results are published only when they can be reproduced with the submitted material or, an important step further, when the authors can show how the datasets used for analysis were produced from the raw data. Some journals have a policy that authors must provide this at the request of researchers who want to replicate empirical studies, though in our experience this rarely happens, and better guidelines are needed.

Improved data transparency would greatly help in investigating why researchers often come to different conclusions on the same research questions, and would allow experts to compare and evaluate one another's work. If even those who work on a topic cannot replicate more than half of the published studies on an issue as important as the growth of the US economy,(6) how can politicians be advised, and how can voters be convinced whose expertise to trust? We as economists should invest in transparency because it will help build the credibility we need in order to make use of the knowledge we create.

To help address these issues, we created the ReplicationWiki, which compiles the most comprehensive set of replications to date. The wiki currently covers 280 replications, as well as information on the availability of replication material, data, methods, and software for more than 2,000 empirical studies published in journals with data archives. This helps research, by making it easier to replicate previous studies one would like to build on, and teaching, by easing the tedious search for studies suitable for student projects.(7) The wiki already has more than 100 registered users, and its pages have been accessed more than 850,000 times. We hope that the wiki will continue to be used by researchers around the world, that the option to vote on which studies should be replicated will see more use in the future, and that our workshop after the Annual Meeting of the American Economic Association in January 2016 will launch a long-lasting young scholar initiative that also contributes to the wiki. By replicating many studies and assembling information on further replications, we will help establish replication as a common practice.

Join our workshop on Replication and Transparency in Economic Research

A session on replication will also be held during the 2016 AEA meetings
