Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics
Abstract
The credibility revolution in economics has promoted causal identification using randomized control trials (RCT), difference-in-differences (DID), instrumental variables (IV), and regression discontinuity design (RDD). Applying multiple approaches to over 21,000 hypothesis tests published in 25 leading economics journals, we find that the extent of p-hacking and publication bias varies greatly by method. IV (and to a lesser extent DID) are particularly problematic. We find no evidence that (i) papers published in the Top 5 journals are different to others; (ii) the journal "revise and resubmit" process mitigates the problem; (iii) things are improving through time.
Citation
Brodeur, Abel, Nikolai Cook, and Anthony Heyes. 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics." American Economic Review, 110 (11): 3634-60. DOI: 10.1257/aer.20190687
JEL Classification
- A14 Sociology of Economics
- C12 Hypothesis Testing: General
- C52 Model Evaluation, Validation, and Selection