References for Transparency in empirical economic research
Further reading
- Christensen, G., Miguel, E. "Transparency, reproducibility, and the credibility of economics research" Journal of Economic Literature 56:3 (2018): 920–980.
- Doucouliagos, C., Stanley, T. D. "Are all economic facts greatly exaggerated? Theory competition and selectivity" Journal of Economic Surveys 27:2 (2013): 316–339.
Key references
- [1] Leamer, E. E. "Let's take the con out of econometrics" American Economic Review 73:1 (1983): 31–43.
- [2] Brodeur, A., Lé, M., Sangnier, M., Zylberberg, Y. "Star wars: The empirics strike back" American Economic Journal: Applied Economics 8:1 (2016): 1–32.
- [3] Chang, A. C., Li, P. Is Economics Research Replicable? Sixty Published Papers from Thirteen Journals Say “Usually Not.” Federal Reserve Board Finance and Economics Discussion Paper No. 2015-083, 2015.
- [4] Birke, D. Open Science Practices Are on the Rise Across Four Social Science Disciplines. Presented at the Berkeley Initiative for Transparency in the Social Sciences Annual Conference, December 10, 2018.
- [5] Goodhill, G. J. "Practical costs of data sharing" Nature 509 (2014): 33.
- [6] Gerber, A., Malhotra, N. "Do statistical reporting standards affect what is published? Publication bias in two leading political science journals" Quarterly Journal of Political Science 3:3 (2008): 313–326.
- [7] Brodeur, A., Cook, N., Heyes, A. Methods Matter: P-Hacking and Causal Inference in Economics. IZA Discussion Paper No. 11796, 2018.
- [8] Mueller-Langer, F., Fecher, B., Harhoff, D., Wagner, G. G. "Replication studies in economics—How many and which papers are chosen for replication, and why?" Research Policy 48:1 (2019): 62–83.
- [9] Camerer, C. F., Dreber, A., Forsell, E. "Evaluating replicability of laboratory experiments in economics" Science 351:6280 (2016): 1433–1436.
- [10] Blanco-Perez, C., Brodeur, A. "Publication bias and editorial statement on negative findings" BITSS Preprints (2018).
- [11] Kidwell, M. C., Lazarević, L. B., Baranski, E. "Badges to acknowledge open practices: A simple, low-cost, effective method for increasing transparency" PLOS Biology 14:5 (2016).
- [12] Olken, B. A. "Promises and perils of pre-analysis plans" Journal of Economic Perspectives 29:3 (2015): 61–80.
- [13] Casey, K., Glennerster, R., Miguel, E. "Reshaping institutions: Evidence on aid impacts using a preanalysis plan" Quarterly Journal of Economics 127:4 (2012): 1755–1812.
Additional References
- Coffman, L. C., Niederle, M. "Pre-analysis plans have limited upside, especially where replications are feasible" Journal of Economic Perspectives 29:3 (2015): 81–98.
- Franco, A., Malhotra, N., Simonovits, G. "Publication bias in the social sciences: Unlocking the file drawer" Science 345:6203 (2014): 1502–1505.
- Gertler, P., Galiani, S., Romero, M. "How to make replication the norm" Nature 555:7698 (2018): 580.
- Ioannidis, J. P. A. "Why most published research findings are false" PLoS Medicine 2:8 (2005): e124.
- McCloskey, D. N., Ziliak, S. T. "The standard error of regressions" Journal of Economic Literature 34:1 (1996): 97–114.
- McCullough, B. D. "Got replicability? The Journal of Money, Credit and Banking Archive" Econ Journal Watch 4:3 (2007): 326–337.
- McCullough, B. D., McGeary, K. A., Harrison, T. D. "Lessons from the JMCB archive" Journal of Money, Credit, and Banking 38:4 (2006): 1093–1107.
- Merton, R. K. The Sociology of Science: Theoretical and Empirical Investigations. Chicago: University of Chicago Press, 1973.
- Miguel, E., Camerer, C., Casey, K. "Promoting transparency in social science research" Science 343:6166 (2014): 30–31.
- Nosek, B. A., Alter, G., Banks, G. C. "Promoting an open research culture" Science 348:6242 (2015): 1422–1425.
- Open Science Collaboration "An open, large-scale, collaborative effort to estimate the reproducibility of psychological science" Perspectives on Psychological Science 7:6 (2012): 657–660.
- Rosenthal, R. "The file drawer problem and tolerance for null results" Psychological Bulletin 86:3 (1979): 638–641.
- Simonsohn, U., Nelson, L. D., Simmons, J. P. "P-curve: A key to the file-drawer" Journal of Experimental Psychology: General 143:2 (2014): 534–547.
- Stanley, T. D. "Beyond publication bias" Journal of Economic Surveys 19:3 (2005): 309–345.
- Stanley, T. D., Doucouliagos, H. "Picture this: A simple graph that reveals much ado about research" Journal of Economic Surveys 24:1 (2010): 170–191.
- Vivalt, E. "Specification searching and significance inflation across time, methods and disciplines" Oxford Bulletin of Economics and Statistics 81:4 (2019): 797–816.
- Ziliak, S. T., McCloskey, D. N. The Cult of Statistical Significance: How the Standard Error Costs Us Jobs, Justice, and Lives. Ann Arbor: University of Michigan Press, 2008.
- Zimmermann, C. On the Need for a Replication Journal. Federal Reserve Bank of St. Louis Research Division Working Paper No. 2015-016A, 2015.