Improving reproducibility of geospatial conference papers – lessons learned from a first implementation of reproducibility reviews
DOI: https://doi.org/10.7557/5.5601
Keywords: reproducible research, peer review, open science, reproducibility
Abstract
See RECORDING (starts at 00:30:30).
In an attempt to increase the reproducibility of contributions to a long-running and established geospatial conference series, the 23rd AGILE Conference on Geographic Information Science 2020 (https://agile-online.org/conference-2020) for the first time provided guidelines on preparing reproducible papers (Nüst et al., 2020) and appointed a reproducibility committee to evaluate the computational workflows of accepted papers (https://www.agile-giscience-series.net/review_process.html). Here, the committee’s members report on the lessons learned from reviewing 23 accepted full papers and outline future plans for the conference series. In summary, six submissions were partially reproduced by reproducibility reviewers, whose reports are published openly on OSF (https://osf.io/6k5fh/). These papers are promoted with badges on the proceedings’ website (https://agile-giss.copernicus.org/articles/1/index.html).
Compared to previous years’ submissions (cf. Nüst et al., 2018), the guidelines and increased community awareness markedly improved reproducibility. However, the reproduction attempts also revealed problems, most importantly insufficient documentation. This was partly mitigated by the non-blind reproducibility review, conducted after paper acceptance, in which interaction between reviewers and authors can provide the input and attention needed to increase reproducibility. At the same time, the reviews showed that anonymisation and public repositories, when properly documented, can enable a successful reproduction without any interaction, as was the case for one manuscript. Individual and organisational challenges due to the COVID-19 pandemic and the conference’s eventual cancellation compounded the teething problems. Nevertheless, even under normal circumstances, future iterations will have to reduce reviewers’ effort to remain sustainable, ideally through more readily executable workflows and a larger reproducibility committee.
Furthermore, we discuss possible changes to the reproducibility review process and the challenges they entail. Reproducibility reports could be made available to “regular” reviewers, or the reports could be given equal consideration in acceptance/rejection decisions. Insufficient information or invalid arguments for not disclosing material could then lead to a submission being rejected or not being sent out for peer review. Further organisational improvements include publishing reviewers’ activities in public databases, making the guidelines mandatory, and collecting data on the tools and repositories used, the effort spent, and the communication between authors and reviewers.
Finally, we summarise the revision of the guidelines, including their new section for reproducibility reviewers, and the status of the initiative “Reproducible Publications at AGILE Conferences” (https://reproducible-agile.github.io/initiative/), which we connect to related undertakings such as CODECHECK (Eglen et al., 2019). The AGILE Conference’s experiences may help other communities to transition towards more open and reproducible research publications.
References
Eglen, S., and Nüst, D. 2019. CODECHECK: An open-science initiative to facilitate the sharing of computer programs and results presented in scientific publications. Septentrio Conference Series, (1). https://doi.org/10.7557/5.4910
Nüst, D., Granell, C., Hofer, B., Konkol, M., Ostermann, F. O., Sileryte, R., and Cerutti, V. 2018. Reproducible research and GIScience: an evaluation using AGILE conference papers. PeerJ 6:e5072. https://doi.org/10.7717/peerj.5072
Nüst, D., Ostermann, F. O., Sileryte, R., Hofer, B., Granell, C., Teperek, M., Graser, A., Broman, K., and Hettne, K. M. 2020. AGILE Reproducible Paper Guidelines. https://doi.org/10.17605/OSF.IO/CB7Z8