Improving the Transparency and Credibility of Economics Research

Paper Session

Saturday, Jan. 4, 2020 2:30 PM - 4:30 PM (PST)

Marriott Marquis, Grand Ballroom 13
Hosted By: American Economic Association
  • Chair: Edward Miguel, University of California-Berkeley

Open Science Practices Are on the Rise in Economics

David Birke, University of California-Berkeley
Garret Christensen, U.S. Census Bureau
Rebecca Littman, Massachusetts Institute of Technology
Edward Miguel, University of California-Berkeley
Elizabeth Levy Paluck, Princeton University
Nicholas Swanson, University of California-Berkeley
Zenan Wang, University of California-Berkeley

Abstract

This study provides the first comprehensive assessment of awareness of, attitudes towards, perceived norms regarding, and adoption of open science practices within a broadly representative sample of economics researchers. We observe a steep increase in adoption over the last decade, with an accelerating trend: as of 2017, over 90% of economists had used at least one such practice, up from less than a quarter a decade earlier. Notably, most economists appear to underestimate the trend toward open science in the field. We document extensive variation in the adoption of open science practices across economics subfields.

A Proposed Specification Check for P-Hacking

Abel Brodeur, University of Ottawa
Nikolai Cook, University of Ottawa
Anthony Heyes, University of Ottawa

Abstract

This paper proposes a specification check for p-hacking. More specifically, we advocate the reporting of t-curves and mu-curves (the t-statistics and estimated effect sizes derived from regressions using every possible combination of control variables from the researcher's set) and introduce a standardized and accessible implementation. Our specification check allows researchers, referees, and editors to visually inspect variation in effect sizes, statistical significance, and sensitivity to the inclusion of control variables. We provide a Stata command that implements the specification check. Given the growing interest in estimating causal effects in the social sciences, the potential applicability of this specification check to empirical studies is very large.
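The authors' implementation is a Stata command, which is not reproduced here. As a rough illustration of the underlying idea only, the following Python sketch (using pandas and statsmodels, with hypothetical variable names `y`, `treatment`, and a list of candidate controls) estimates the coefficient of interest and its t-statistic under every subset of controls, which can then be sorted and plotted as t-curves and mu-curves.

```python
# A minimal sketch of the t-curve / mu-curve idea: re-estimate the coefficient
# of interest under every possible subset of candidate control variables.
# Variable names (y, treatment, x1-x3) are hypothetical placeholders, not the
# paper's actual data or command.
from itertools import combinations

import pandas as pd
import statsmodels.api as sm

def spec_curves(df, outcome, treatment, controls):
    """Return estimated effect sizes (mu) and t-statistics for the treatment
    variable across all 2^k combinations of the candidate controls."""
    rows = []
    for k in range(len(controls) + 1):
        for subset in combinations(controls, k):
            X = sm.add_constant(df[[treatment, *subset]])
            fit = sm.OLS(df[outcome], X, missing="drop").fit()
            rows.append({"controls": subset,
                         "mu": fit.params[treatment],
                         "t": fit.tvalues[treatment]})
    return pd.DataFrame(rows)

# Example usage with a hypothetical dataset:
# df = pd.read_csv("data.csv")
# curves = spec_curves(df, "y", "treatment", ["x1", "x2", "x3"])
# curves.sort_values("t")["t"].reset_index(drop=True).plot(title="t-curve")
```

Sorting the estimates before plotting makes it easy to see how much of the effect-size and significance distribution depends on particular control sets, which is the visual check the paper advocates.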

Do Pre-analysis Plans Hamper Publication?

George Ofosu, London School of Economics
Daniel Posner, University of California-Los Angeles

Abstract

A critique of pre-analysis plans (PAPs) is that they generate boring, lab-report-style papers that are disfavored by reviewers and journal editors, and hence hampered in the publication process. To assess whether this is the case, we compare the publication rates of experimental NBER working papers with and without PAPs. We find that papers with PAPs are, in fact, slightly less likely to be published. However, conditional on being published, papers with PAPs are significantly more likely to land in top-5 journals. We also find that journal articles based on pre-registered analyses generate more citations. Our findings suggest that the alleged trade-off between career concerns and the scientific credibility that comes from registering and adhering to a PAP is less stark than is sometimes claimed, and may even tilt in favor of pre-registration for researchers most concerned about publishing in the most prestigious journals and maximizing citations to their work.

Forecasting the Results of Economic Research

Stefano DellaVigna, University of California-Berkeley
Nicholas Otis, University of California-Berkeley
Eva Vivalt, Australian National University

Abstract

Credible identification of social science findings is central to the transparency revolution. We argue that collecting forecasts of research results is a useful addition to the transparency toolkit. Were the experimental results anticipated? How much would experts update relative to their initial forecasts, given the results? We briefly discuss these and other uses of experimental forecasts, using examples from this nascent literature. We then consider practical decisions related to eliciting forecasts, such as the unit of elicitation. Finally, we provide evidence regarding these decisions from a sample of non-expert forecasters.
Discussant(s)
Fiona Burlig, University of Chicago
Michael Gechter, Pennsylvania State University
David McKenzie, World Bank
JEL Classifications
  • B4 - Economic Methodology
  • C1 - Econometric and Statistical Methods and Methodology: General