Reproducibility and transparency can be regarded (at least in experimental research) as a hallmark of research (DFG, 2017). The ability to reproduce research results in order to check their validity is an important cornerstone of research that helps to guarantee the quality of research and to build on existing knowledge. The digital turn has brought more opportunities to document, share and verify research processes and outcomes. Consequently, there is an increasing demand for more transparency with regard to research processes and outcomes (Munafò et al., 2017). This fits well with the open scholarship agenda, which requires, among other things, open software, open data, and open access to publications, even if openness alone does not guarantee replicability (Chen et al., 2018). But although most researchers agree that reproducibility and transparency are desirable goals and part of good scientific practice, efforts to enhance reproducibility are still not part of everyday research.
Therefore, the purpose of this new Knowledge Exchange activity is to conduct a gap analysis and investigate researchers' needs in making research outputs more reproducible, and how infrastructures (both technical and social) can support them. The main focus is on the requirements that enable researchers to publish reproducible research output. As a sub-goal, KE would like to explore disciplinary differences and map different research areas on a spectrum of reproducibility.
In line with the KE strategy, we will explore the social and technical dimensions as described in the Open Scholarship Framework, a model for addressing specific aspects within Open Scholarship. The social dimension can relate to incentive structures (publication bias favouring novel, provocative results; no rewards for sharing data or methods alongside publications), poor training in research methods, cognitive biases (Munafò et al., 2017), or perceived advantages of not sharing research results (Collins & Tabak, 2014). The technical dimension relates to infrastructures, such as journals or technical solutions, that help researchers conduct, document, and publish their research in a transparent and reproducible way throughout the whole research process (Konkol, Nüst & Goulier, 2020).
- Chen, X., Dallmeier-Tiessen, S., Dasler, R., Feger, S., Fokianos, P., Gonzalez, J. B., Hirvonsalo, H. et al. (2018). Open is not enough. Nature Physics, 15, 113-119. https://doi.org/10.1038/s41567-018-0342-2
- Collins, F. S. & Tabak, L. A. (2014). NIH plans to enhance reproducibility. Nature, 505, 612-613.
- Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) (2017). Replicability of Research Results: A Statement by the German Research Foundation, 1-5.
- Konkol, M., Nüst, D., & Goulier, L. (2020). Publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication. Research Integrity and Peer Review, 5, 10. https://doi.org/10.1186/s41073-020-00095-y
- Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N. et al. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1, 0021. https://doi.org/10.1038/s41562-016-0021
Slide Deck: Literature Findings
This slide deck presents an overview of the research reproducibility landscape, focusing on the publication and dissemination phases of the research process. The slides are based on an extensive literature review of almost 130 sources, including academic articles and grey literature from around the world. Topics covered include definitions and terminology, benefits of publishing reproducible research outputs, barriers and possible solutions, and the role of funders and infrastructure providers. This literature review will inform the next stages of the Knowledge Exchange activity on publishing reproducible research outputs (KE-PRRO).
The Task & Finish Group for Publishing Reproducible Research Output
The activity is led by Juliane Kant (DFG) and Anna Mette Morthorst (DeiC). The Task & Finish Group for this activity consists of researchers and infrastructure experts:
- Birgit Schmidt, Göttingen State and University Library (expert lead) [ORCID: https://orcid.org/0000-0001-8036-5859]
- Birte Christensen-Dalsgaard, Aarhus University [ORCID: https://orcid.org/0000-0002-7172-6875]
- Daniel Nüst, University of Münster [ORCID: https://orcid.org/0000-0002-0024-5046]
- Jeroen Sondervan, Utrecht University (expert co-lead) [ORCID: https://orcid.org/0000-0002-9866-0239]
- Matthew Jacquiery, University of Oxford
- Pierre Carl Langlais, Paris Sorbonne-CELSA [ORCID: https://orcid.org/0000-0001-9035-1127]
- Saskia Woutersen, Leiden University Library [ORCID: https://orcid.org/0000-0003-0120-266X]
- Verena Heise, University of Oxford / Hanse-Wissenschaftskolleg [ORCID: https://orcid.org/0000-0002-5625-4128]
- Yrsa Neuman, Åbo Akademi University [ORCID: https://orcid.org/0000-0002-9323-0694]