The replicability of some scientific findings has recently been called into question. To contribute data about replicability in economics, we replicated 18 studies published in the American Economic Review and the Quarterly Journal of Economics between 2011 and 2014. All of these replications followed predefined analysis plans that were made publicly available beforehand, and they all have a statistical power of at least 90% to detect the original effect size at the 5% significance level. We found a significant effect in the same direction as in the original study for 11 replications (61%); on average, the replicated effect size is 66% of the original. The replicability rate varies between 67% and 78% for four additional replicability indicators, including a prediction market measure of peer beliefs.
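The abstract's claim that every replication had "a statistical power of at least 90% to detect the original effect size at the 5% significance level" boils down to a standard sample-size calculation. As an illustration only (the paper's own power analyses are not described here), a minimal sketch of the textbook two-sample z-test approximation in stdlib Python; the function name and the example effect sizes are my own, not the authors':

```python
import math
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.90) -> int:
    """Smallest per-group sample size for a two-sided, two-sample z-test
    to detect a standardized mean difference `effect_size` (Cohen's d)
    at significance level `alpha` with the given power."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for a two-sided test
    z_beta = z.inv_cdf(power)           # quantile corresponding to the desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A medium original effect (d = 0.5) requires 85 subjects per group
# for 90% power at the 5% level under this approximation:
print(n_per_group(0.5))  # → 85
```

The practical point: the smaller the original effect, the larger the replication sample must be, which is why high-powered replications of modest effects can require substantially more subjects than the original studies used.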
Thursday, March 31, 2016
Another social science, Economics, looks at itself.
Mindblog recently noted a large study that tested the replicability of findings published in psychology journals: only 36% replicated. The reliability of published results has also been questioned in fields such as medicine, neuroscience, and genetics. Camerer et al. have now tested the replicability of experiments published in two top-tier economics journals, finding that 11 of the 18 studies examined (61%) replicated a significant effect in the original direction, with replicated effect sizes averaging about two-thirds of the originals. Their abstract appears above.