Replication and Ethics in Economics: Thirty Years After Dewald, Thursby and Anderson

Paper Session

Saturday, Jan. 7, 2017 1:00 PM – 3:00 PM

Hyatt Regency Chicago, Grand Ballroom CD North
Hosted By: American Economic Association
  • Chair: Deirdre McCloskey, University of Illinois-Chicago

What is Meant by ‘Replication’ and Why Does It Encounter Such Resistance in Economics?

Maren Duvendack, University of East Anglia
Richard Jones, University of East Anglia
Robert Reed, University of Canterbury

Abstract

This paper discusses recent trends in the use of replication in economics. We identify a number of sources of progress, including recent studies that estimate replication rates within the discipline. These studies generally find that replication rates are relatively low, though they may be higher for laboratory experiments in economics. We also identify two web-based resources for replications, the Replication in Economics wiki and The Replication Network. We then consider obstacles to undertaking replication studies in economics, two of which are the lack of publishing outlets and the difficulty of obtaining data and code for published studies. We identify journals that publish replication studies and journals that “regularly” include data and code as supplementary files for their published research. Finally, we highlight replication initiatives in psychology and political science, behind which economics appears to lag. Whether this is because the problems that beset those disciplines are less severe in economics, or because economics is more resistant to replication, remains arguable.

Replication and Economics Journal Policies

Jan H. Hoeffler, University of Goettingen

Abstract

We investigate how the introduction of replication policies at leading economics journals affects citations. As has previously been shown for other social sciences, there is evidence that introducing a replication policy increases a journal’s citations, presumably because readers use the data for their own investigations, and possibly also because of a reliability effect. We see our results as an incentive for journals to introduce and enforce replication policies. Lamentably, only a minority of journals so far enforce their policies in a way that ensures the replicability of most of the empirical work they publish. With several examples we show how replication becomes difficult when policies are not enforced, and we suggest a shared pool of replicability editors as a solution: since no single journal can be expected to have experts for every topic and software package, a joint effort by journals to maintain such a pool could help ensure that each empirical study is published with data, code, and instructions for using them, so that all published results can easily be replicated. Reviewers can join the effort by following the principles of the Agenda for Open Research and declining to fully review empirical work that does not guarantee replicable results. Further study is needed of the citation impact on individual articles, and we suggest a design for such research.

Replication versus Meta-Analysis in Economics: Where Do We Stand 30 Years After Dewald, Thursby and Anderson?

Richard G. Anderson, Lindenwood University
Areerat Kichkha, Lindenwood University

Abstract

Although earlier authors (notably, Ed Kane and Tom Mayer) had emphasized the importance of replication for the intellectual integrity of empirical economic analysis, modern discussion of the issue largely dates from the 1986 American Economic Review paper by Dewald, Thursby and Anderson that summarized their work at the Journal of Money, Credit and Banking. Their work has been extended to other journals by Bruce McCullough and others. Moving beyond replication, some authors have suggested that meta-analysis might be at least as powerful a tool. This paper surveys the literature since Dewald et al. and compares and contrasts its advocacy for replication (using authors’ datasets and programs to recalculate empirical results) with the concept of meta-analysis widely used in the “hard” sciences. The principal difference is that data in empirical economic studies are nonstochastic in the sense that they seldom are created via experiments, whereas in the hard sciences replication often entails repeating the process that generated the data, an inherently stochastic endeavor. In economics, meta-analysis promises to be useful in assessing the importance of studies based on simulated data, including DSGE modeling.

Is Economics Research Replicable? Sixty Published Papers From Thirteen Journals Say “Usually Not”

Andrew C. Chang, Federal Reserve Board
Phillip Li, Office of the Comptroller of the Currency

Abstract

We attempt to replicate 67 papers published in 13 well-regarded economics journals using author-provided replication files that include both data and code. Some journals in our sample require such replication files as a condition of publication; others do not. Aside from 6 papers that use confidential data, we obtain data and code replication files for 29 of 35 papers (83%) that are required to provide such files, compared to 11 of 26 papers (42%) that are not. We successfully replicate the key qualitative result of 22 of 67 papers (33%) without contacting the authors. Excluding the 6 papers that use confidential data and the 2 papers that use software we do not possess, we replicate 29 of 59 papers (49%) with assistance from the authors. Because we are able to replicate fewer than half of the papers in our sample even with help from the authors, we assert that economics research is usually not replicable. We conclude with recommendations on improving the replication of economics research.
Discussant(s)
Bruce McCullough, Drexel University
Jack Tatom, Johns Hopkins University
Stan Liebowitz, University of Texas-Dallas
Maren Duvendack, University of East Anglia
JEL Classifications
  • A1 - General Economics
  • B4 - Economic Methodology