Testing Open Peer Review for Conferences
Peer review is a crucial quality assurance step in the scientific method. Traditional single- or double-blinded reviewing can result in bias, does not scale well, and offers few incentives for reviewers to do this important work.
Open peer review (OPR) challenges multiple aspects of this traditional method: (i) open identity (author and reviewer identities are known), (ii) open participation (larger communities are involved in the process), (iii) open pre-review (early versions are published before reviewing), (iv) open report (the review itself is published), and (v) open final-version comments (a public forum for discussion after publication).
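The five OPR dimensions above can be thought of as independent switches a venue turns on or off. A minimal sketch of how a CMS might represent such a configuration (all names are illustrative, not taken from any existing system):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OPRConfig:
    """Per-venue flags for the five open-peer-review dimensions."""
    open_identity: bool = False        # author/reviewer identities are known
    open_participation: bool = False   # larger community may take part in reviewing
    open_pre_review: bool = False      # early versions published before reviewing
    open_report: bool = False          # the reviews themselves are published
    open_final_comments: bool = False  # public discussion after publication

def enabled_dimensions(cfg: OPRConfig) -> list[str]:
    """Return the names of the OPR dimensions a venue has switched on."""
    return [name for name, value in vars(cfg).items() if value]

# Example: a venue enabling four of the five dimensions.
example_venue = OPRConfig(open_identity=True, open_participation=True,
                          open_report=True, open_final_comments=True)
```

Modeling the dimensions as orthogonal flags reflects that a venue can adopt them selectively, as the two pilots below did.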
During the H2020 project OpenUP, we conducted a pilot study applying OPR to conference settings. Two separate venues were used to test various OPR ideas, their acceptance within the community and their practicability: the 2nd European Machine Vision Forum (EMVA) 2017 and the eHealth2018 Best Master Student Paper Contest.
For EMVA, we tested novel approaches covering open identity, open participation, open report, and open final-version comments. For eHealth2018, different variations of open identity, open participation, and open report were evaluated.
The submission, review, and decision process is typically handled by dedicated conference management software (CMS). The proposed OPR workflows could not be handled by any existing CMS out of the box. Thus, we created a fork of the popular CMS HotCRP and added support for all experimental OPR workflows needed during the pilot.
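The core change such a fork requires is relaxing the CMS's review-visibility rules according to the enabled OPR settings. The following sketch shows the general shape of that decision; the roles and rules are hypothetical illustrations, not HotCRP's actual internals or API:

```python
# Hypothetical review-visibility check for an OPR-enabled CMS.
# Roles and rules are illustrative only; HotCRP implements this differently.

def review_visible_to(viewer_role: str, review_published: bool,
                      open_report: bool, open_participation: bool) -> bool:
    """Decide whether a viewer may read a given review under OPR settings."""
    if viewer_role in ("chair", "review_author"):
        return True  # chairs and the review's own author always see it
    if viewer_role == "pc_member":
        return True  # programme committee members see all reviews
    if viewer_role == "community":
        # Community members may read only published reviews, and only when
        # the venue enabled open report or open participation.
        return review_published and (open_report or open_participation)
    return False  # anyone else (e.g. anonymous visitors) sees nothing
```

In a traditional blinded workflow the `community` branch would simply return `False`; the OPR fork's work is largely in adding and gating branches like this one.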
All pilots were conducted successfully. Feedback was collected through interviews held directly at the conference venues and through survey questionnaires. Overall, the feedback for all tested OPR concepts was positive. The open identity aspects resulted in more transparency, but some participants feared that they also skewed reviews to be artificially positive (to prevent repercussions). The open participation aspects were very well received. These concepts may allow peer review to scale better: attendance at many conferences is growing strongly while the number of expert reviewers is not increasing at the same rate. Open report and open final-version comments were seen as positive, but their actual usage was very limited.
Our modified CMS software has been released as open source and is available to the public for future use, to foster OPR in conference settings.
Copyright (c) 2018 Oliver Zendel, Matthias Schörghuber, Michela Vignoli
This work is licensed under a Creative Commons Attribution 4.0 International License.