Reproducible research and GIScience: an evaluation using AGILE conference papers

Institute for Geoinformatics, University of Münster, Münster, Germany
Institute of New Imaging Technologies, Universitat Jaume I de Castellón, Castellón, Spain
Interfaculty Department of Geoinformatics - Z_GIS, University of Salzburg, Salzburg, Austria
Faculty of Geo-Information Science and Earth Observation (ITC), University of Twente, Enschede, The Netherlands
Faculty of Architecture and the Built Environment, Delft University of Technology, Delft, The Netherlands
DOI
10.7287/peerj.preprints.26561v1
Subject Areas
Science Policy, Computational Science, Data Science, Spatial and Geographic Information Science
Keywords
GIScience, Open Science, Reproducible Research, Data Science, AGILE, reproducible conference publications, Open Access
Copyright
© 2018 Nüst et al.
Licence
This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ Preprints) and either DOI or URL of the article must be cited.
Cite this article
Nüst D, Granell C, Hofer B, Konkol M, Ostermann FO, Sileryte R, Cerutti V. 2018. Reproducible research and GIScience: an evaluation using AGILE conference papers. PeerJ Preprints 6:e26561v1

Abstract

The demand for reproducibility of research is on the rise in disciplines concerned with data analysis and computational methods. In this work, existing recommendations for reproducible research are reviewed and translated into criteria for assessing the reproducibility of articles in the field of geographic information science (GIScience). Using a sample of GIScience research from the Association of Geographic Information Laboratories in Europe (AGILE) conference series, we assess the current state of reproducibility of publications in this field. Feedback on the assessment was collected by surveying the authors of the sampled papers. The results show that reproducibility levels are low. Although authors support the ideals of reproducible research, the incentives to put them into practice are too small. We therefore propose concrete actions for individual researchers and the AGILE conference series to improve transparency and reproducibility, such as imparting data and software skills, an award, paper badges, author guidelines for computational research, and Open Access publications.

Author Comment

This is a preprint submission to PeerJ Preprints.

Supplemental Information

Analysis workflow

A PDF rendering of the analysis document for viewing.

DOI: 10.7287/peerj.preprints.26561v1/supp-2

Analysis workflow

R Markdown document with the code to conduct the analysis and create the figures of the paper.

DOI: 10.7287/peerj.preprints.26561v1/supp-3

References of evaluated publications

DOI: 10.7287/peerj.preprints.26561v1/supp-5

Anonymised survey responses

DOI: 10.7287/peerj.preprints.26561v1/supp-6