2017 Annual Meeting

Chicago IL
January 6 – 8

The 2017 Annual Meeting was held in Chicago, IL (January 6-8, 2017 - Friday, Saturday, & Sunday).

Webcasts from selected AEA sessions and AEA Poster Presenter Videos are available, compliments of the AEA.

Friday, January 6

AEA Poster Session

Poster Session

Friday, Jan. 6, 2017 – Sunday, Jan. 8, 2017

Hyatt Regency Chicago, Columbus Hall
Hosted By: American Economic Association
  • Poster title: Is Deflation Costly After All? Evidence from Noisy Historical Data
  • Abstract: I study the link between real activity and deflation, taking into account measurement problems in 19th century CPI data. Replications based on modern data show that measurement problems spuriously increase the volatility of inflation as well as the number of deflationary episodes, and they lower inflation persistence. As a consequence, estimates of the link between real activity and deflation may be attenuated because of the errors-in-variables problem. I find that real activity was on average substantially lower during 19th century deflations in the US, after controlling for measurement error using an IV-regression approach. Moreover, the average shortfall in real activity was not significantly different compared to the Great Depression. Using well-measured data for a panel of 17 industrialized economies shows that milder deflations were associated with a lower output gap. But the association with GDP growth is not statistically significant.

Saturday, January 7

Replication and Ethics in Economics: Thirty Years After Dewald, Thursby and Anderson (A1, B4)

Paper Session

Saturday, Jan. 7, 2017 8:00 AM – 10:00 AM

Hyatt Regency Chicago, Grand Ballroom CD North
Hosted By: American Economic Association
  • Chair: Deirdre McCloskey, University of Illinois-Chicago
  • Paper title: What is Meant by ‘Replication’ and Why Does It Encounter Such Resistance in Economics?
  • Keywords: Replication
  • Abstract: This paper discusses recent trends in the use of replications in economics. We identify a number of sources of progress, including the results of recent replication studies that have attempted to identify replication rates within the discipline. These studies generally find that replication rates are relatively low, though they may be higher for laboratory experiments in economics. We also identify two web-based resources for replications, the Replication in Economics wiki and The Replication Network. We then consider obstacles to undertaking replication studies in economics. Two obstacles are the lack of publishing outlets and difficulties in obtaining data and code for published studies. We identify journals that publish replication studies and that “regularly” include data and code as supplementary files for their published research. Finally, we highlight replication initiatives in psychology and political science, behind which economics appears to lag. Whether this is because the problems that beset those disciplines are less severe in economics, or because economics is more resistant to replications, is arguable.
  • Paper title: Replication and Economics Journal Policies
  • Keywords: Replication
  • Abstract: We investigate the impact of the introduction of replication policies at leading journals in economics on citations. As has previously been shown for other social sciences, there is an indication that the introduction of a replication policy increases the number of citations for a journal, presumably because readers use the data for their own investigations, possibly also because of a reliability effect. We see our results as an incentive for journals to introduce and enforce replication policies. Lamentably, only a minority of journals so far enforce their policies in a way that ensures replicability of most of the empirical work. With several examples we show how replication becomes difficult if policies are not enforced, and we suggest a pool of replicability editors as a solution: since it would be too much to expect each journal to have experts for every single topic and software package, a joint effort by journals to create such a pool of experts could help ensure that each empirical study is published with data, code, and instructions on how to use them, so that all published results can easily be replicated. Reviewers can join the effort for replicability by following the principles of the Agenda for Open Research and refusing to comprehensively review empirical work that does not guarantee fully replicable empirical results. Further study is needed to investigate the citation impact on single articles, and we suggest a design for such research.
  • Paper title: Replication versus Meta-Analysis in Economics: Where Do We Stand 30 Years After Dewald, Thursby and Anderson?
  • Keywords: Replication
  • Abstract: Although earlier authors (notably, Ed Kane and Tom Mayer) had emphasized the importance of replication for the intellectual integrity of empirical economic analysis, modern discussion of the issue largely dates from the 1986 American Economic Review paper by Dewald, Thursby and Anderson that summarized their work at the Journal of Money, Credit and Banking. Their work has been extended to other journals by Bruce McCullough and others. Moving beyond replication, some authors have suggested that meta-analysis might be at least as powerful a tool. This analysis surveys the literature since Dewald et al., and compares and contrasts its advocacy for replication (using authors’ datasets and programs to recalculate empirical results) with the concept of meta-analysis widely used in the “hard” sciences. For empirical work, the principal difference is that data in empirical economic studies are nonstochastic in the sense that they seldom are created via experiments. In the hard sciences, replication often entails repeating the process that generated the data, an inherently stochastic endeavor. In economics, meta-analysis promises to be useful in assessing the importance of studies based on simulated data, including DSGE modeling.
  • Paper title: Is Economics Research Replicable? Sixty Published Papers From Thirteen Journals Say “Usually Not”
  • Keywords: Replication
  • Abstract: We attempt to replicate 67 papers published in 13 well-regarded economics journals using author-provided replication files that include both data and code. Some journals in our sample require data and code replication files, and other journals do not require such files. Aside from 6 papers that use confidential data, we obtain data and code replication files for 29 of 35 papers (83%) that are required to provide such files as a condition of publication, compared to 11 of 26 papers (42%) that are not required to provide data and code replication files. We successfully replicate the key qualitative result of 22 of 67 papers (33%) without contacting the authors. Excluding the 6 papers that use confidential data and the 2 papers that use software we do not possess, we replicate 29 of 59 papers (49%) with assistance from the authors. Because we are able to replicate less than half of the papers in our sample even with help from the authors, we assert that economics research is usually not replicable. We conclude with recommendations on improving replication of economics research.

Terrorism, Government Surveillance and Individual Well-Being (F5)

Paper Session

Saturday, Jan. 7, 2017 8:00 AM – 10:00 AM

Swissotel Chicago, St Gallen 3
Hosted By: Peace Science Society International
  • Chair: Solomon W. Polachek, Binghamton University
  • Paper title: Covering the Campaign: Automated Extraction of Election Events in 2014 South Africa
  • Abstract: What constitutes coverage of "election events" by media during political campaigns? Advances in computation allow for automated coding of events in International Relations, but have thus far largely been confined to conflict events (e.g., GDELT). However, classifying events in the electoral context poses new difficulties because what constitutes an "election event" depends on theories of how, and to what extent, media events persuade voters during campaigns. Prior approaches in democratic politics differ in their conceptualization of the role of media in campaigns and how it affects political behavior and electoral outcomes, establishing no consistent framework for defining election coverage. We apply machine learning methods to classify election-related events in a corpus of more than 200,000 news stories and social media posts related to South Africa's 2014 election. We use a theoretically informed classification of election coverage to demonstrate how variations in definitions produce variation in "election-related" or "election salient" codings on a host of campaign activities (including violence, protest, and riots) by the media and social media, resulting in radically distinct representations of the electoral landscape. We discuss the challenges and opportunities this research and method pose for replication to study (electoral and non-electoral) events in other settings.

Replication in Microeconomics (B4, C8)

Paper Session

Saturday, Jan. 7, 2017 10:15 AM – 12:15 PM

Swissotel Chicago, Zurich D
Hosted By: American Economic Association
  • Chair: Muriel Niederle, Stanford University
  • Paper title: Assessing the Rate of Replication in Economics
  • Abstract: We assess the rate of replication for empirical papers in the 2010 American Economic Review. Across seventy empirical papers, we find that 29 percent have one or more citations that partially replicate the original result. While only a minority of papers have a published replication, a majority (60 percent) have either a replication, a robustness test, or an extension. Surveying authors within the literature, we find substantial uncertainty over the number of extant replications.