Improving reproducibility of geospatial conference papers – lessons learned from a first implementation of reproducibility reviews

Authors

DOI:

https://doi.org/10.7557/5.5601

Keywords:

reproducible research, peer review, open science, reproducibility

Abstract

See the conference recording (the talk starts at 00:30:30).

In an attempt to increase the reproducibility of contributions to a long-running and established geospatial conference series, the 23rd AGILE Conference on Geographic Information Science 2020 (https://agile-online.org/conference-2020) for the first time provided guidelines on preparing reproducible papers (Nüst et al., 2020) and appointed a reproducibility committee to evaluate computational workflows of accepted papers (https://www.agile-giscience-series.net/review_process.html). Here, the committee’s members report on the lessons learned from reviewing 23 accepted full papers and outline future plans for the conference series. In summary, six submissions were partially reproduced by reproducibility reviewers, whose reports are published openly on OSF (https://osf.io/6k5fh/). These papers are promoted with badges on the proceedings’ website (https://agile-giss.copernicus.org/articles/1/index.html).

Compared to previous years’ submissions (cf. Nüst et al. 2018), the guidelines and increased community awareness markedly improved reproducibility. However, the reproduction attempts also revealed problems, most importantly insufficient documentation. This was partly mitigated by the non-blind reproducibility review, conducted after paper acceptance, where interaction between reviewers and authors can provide the input and attention needed to increase reproducibility. At the same time, the reviews showed that anonymisation and public repositories, when properly documented, can enable a successful reproduction without any interaction, as was the case with one manuscript. Individual and organisational challenges due to the COVID-19 pandemic and the conference’s eventual cancellation added to the teething problems. Nevertheless, even under normal circumstances, future iterations will have to reduce the reviewers’ efforts to be sustainable, ideally through more readily executable workflows and a larger reproducibility committee.

Furthermore, we discuss changes to the reproducibility review process and their challenges. Reproducibility reports could be made available to “regular” reviewers, or the reports could be weighed equally in acceptance/rejection decisions. Insufficient information or invalid arguments for not disclosing material could then lead to a submission being rejected or not being sent out to peer review at all. Further organisational improvements include publishing reviewers’ activities in public databases, making the guidelines mandatory, and collecting data on the tools and repositories used, the effort spent, and the communications involved.

Finally, we summarise the revision of the guidelines, including their new section for reproducibility reviewers, and the status of the initiative “Reproducible Publications at AGILE Conferences” (https://reproducible-agile.github.io/initiative/), which we connect to related undertakings such as CODECHECK (Eglen et al., 2019). The AGILE Conference’s experiences may help other communities to transition towards more open and reproducible research publications.


Author Biographies

Daniel Nüst, Institute for Geoinformatics, University of Münster, Münster, Germany

Daniel Nüst is a researcher at the Spatio-temporal Modelling Lab at the Institute for Geoinformatics (ifgi) at the University of Münster. He works on open tools and processes to enable and improve computational reproducibility in the project Opening Reproducible Research (o2r, https://o2r.info/).

Frank O. Ostermann, University of Twente, The Netherlands

Frank Ostermann is an Assistant Professor at the Geo-Information Processing department of the University of Twente’s Faculty of Geo-Information Science and Earth Observation. He works mostly on novel sources of geoinformation (social media, volunteered geographic information) and their use in citizen science, but has a keen interest in their impact on scientific reproducibility.

Carlos Granell, Universitat Jaume I, Spain

Carlos Granell is a postdoctoral researcher at the Universitat Jaume I, Spain. His research interests lie in multi-disciplinary applications of Geographic Information Science, spatial analysis and visualization of new forms of spatial data, and reproducible research practices.

Alexander Kmoch, University of Tartu, Estonia

Alexander Kmoch is a Marie Skłodowska-Curie Individual Fellow (MSCA) at the University of Tartu. His interests include web- and cloud-based geoprocessing, web-services for environmental and geo-scientific data sharing (OGC standards), modelling workflows, and interactive geo-scientific visualisation. In the H2020 project GLOMODAT Alex is currently improving standardised data preparation, parameterization, and parallelisation for hydrological and water quality modelling across scales.

References

Eglen, S., and Nüst, D. 2019. CODECHECK: An open-science initiative to facilitate the sharing of computer programs and results presented in scientific publications. Septentrio Conference Series, (1). https://doi.org/10.7557/5.4910

Nüst, D., Granell, C., Hofer, B., Konkol, M., Ostermann, F. O., Sileryte, R., and Cerutti, V. 2018. Reproducible research and GIScience: an evaluation using AGILE conference papers. PeerJ 6:e5072. https://doi.org/10.7717/peerj.5072

Nüst, D., Ostermann, F. O., Sileryte, R., Hofer, B., Granell, C., Teperek, M., Graser, A., Broman, K., and Hettne, K. M. 2020. AGILE Reproducible Paper Guidelines. https://doi.org/10.17605/OSF.IO/CB7Z8

Published

2020-09-24

How to Cite

Nüst, D., Ostermann, F., Granell, C., & Kmoch, A. (2020). Improving reproducibility of geospatial conference papers – lessons learned from a first implementation of reproducibility reviews. Septentrio Conference Series, (4). https://doi.org/10.7557/5.5601