Publishing Reproducible Research Output

We are investigating the need to make research outputs more reproducible and how infrastructures can support them.

1 September 2020 – 1 September 2021

Reproducibility and transparency can be regarded (at least in experimental research) as hallmarks of research (DFG, 2017). The ability to reproduce research results in order to check their validity is an important cornerstone of research that helps to guarantee the quality of research and to build on existing knowledge. The digital turn has brought more opportunities to document, share, and verify research processes and outcomes. Consequently, there is an increasing demand for more transparency with regard to research processes and outcomes (Munafò et al., 2017). This fits well with the open scholarship agenda, which requires, among other things, open software, open data, and open access to publications, even if openness alone does not guarantee replicability (Chen et al., 2019). Yet despite the fact that most researchers agree that reproducibility and transparency are desirable goals and part of good scientific practice, efforts to enhance reproducibility are still not part of everyday research.

Therefore, the purpose of this new Knowledge Exchange activity is to conduct a gap analysis and investigate researchers' needs in order to make research outputs more reproducible, and to examine how infrastructures (both technical and social) can support them. The main focus is on the requirements that enable researchers to publish reproducible research output. As a sub-goal, KE would like to explore disciplinary differences and map different research areas onto a spectrum of reproducibility.

In line with the KE strategy, we will explore the social and technical dimensions as described in the Open Scholarship Framework, a model for addressing specific aspects of Open Scholarship. The social dimension can relate to incentive structures (publication bias favoring novel, provocative results; no rewards for sharing data or methods alongside publications), poor training in research methods, cognitive biases (Munafò et al., 2017), or perceived advantages of not sharing research results (Collins & Tabak, 2014). The technical dimension relates to infrastructures, such as journals or technical solutions, that help researchers conduct, document, and publish their research in a transparent and reproducible way throughout the whole research process (Konkol, Nüst & Goulier, 2020).


  • Chen, X., Dallmeier-Tiessen, S., Dasler, R., Feger, S., Fokianos, P., Gonzalez, J. B., Hirvonsalo, H. et al. (2019). Open is not enough. Nature Physics, 15, 113-119.
  • Collins, F. S. & Tabak, L. A. (2014). NIH plans to enhance reproducibility. Nature, 505, 612-613.
  • Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) (2017). Replicability of Research Results: A Statement by the German Research Foundation, 1-5.
  • Konkol, M., Nüst, D., & Goulier, L. (2020). Publishing computational research: A review of infrastructures for reproducible and transparent scholarly communication. Research Integrity and Peer Review, 5, 10.
  • Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N. et al. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1, 0021.

The Task & Finish Group for Publishing Reproducible Research Output

The activity is led by Juliane Kant (DFG) and Anna Mette Morthorst (DeiC). The Task & Finish Group for this activity consists of researchers and infrastructure experts.

A collaboration between

  • DFG
  • CSC – IT Center for Science
  • Jisc
  • CNRS
  • SURF
  • DeiC