Testing Open Peer Review for Conferences
DOI: https://doi.org/10.7557/5.4535
Keywords: Open Peer Review, Conference Management Software, OpenUP
Abstract
Peer review is a crucial quality assurance step in the scientific method. Traditional single- or double-blind reviewing can result in bias, does not scale well, and offers little incentive for reviewers to do this important work.
Open peer review (OPR) challenges multiple aspects of this traditional method: (i) open identity (author and reviewer identities are known), (ii) open participation (larger communities are involved in the process), (iii) open pre-review (early versions are published before reviewing), (iv) open report (the reviews themselves are published), and (v) open final-version comments (a public forum for discussion after publication).
During the H2020 project OpenUP, we conducted a pilot study applying OPR to conference settings. Two separate venues were used to test various OPR ideas and to assess their acceptance within the community and their practicability: the 2nd European Machine Vision Forum (EMVA) 2017 and the eHealth2018 Best Master Student Paper Contest.
For EMVA 2017, we tested novel approaches that included open identity, open participation, open report, and open final-version comments. For eHealth2018, different variations of open identity, open participation, and open report were evaluated.
The submission, review, and decision process of a conference is typically handled by dedicated conference management software (CMS). The proposed OPR workflows could not be handled by existing CMSs out of the box. We therefore created a fork of the popular CMS HotCRP and added support for all experimental OPR workflows needed during the pilot.
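To illustrate the kind of configuration such a fork has to expose, the following minimal sketch models the five OPR aspects as per-venue review settings. This is a hypothetical, illustrative Python sketch only; HotCRP itself is written in PHP, and the names used here (OPRSettings, may_review) are not taken from the actual fork.

```python
from dataclasses import dataclass

@dataclass
class OPRSettings:
    """Hypothetical per-venue switches for the OPR aspects tested in the pilot."""
    open_identity: bool = False        # author and reviewer identities are mutually visible
    open_participation: bool = False   # registered community members may also review
    open_pre_review: bool = False      # submissions are published before reviewing starts
    open_report: bool = False          # review texts are published alongside the paper
    open_final_comments: bool = False  # public discussion forum after publication

def may_review(settings: OPRSettings, is_pc_member: bool) -> bool:
    """Under open participation, community members may review in addition to the PC."""
    return is_pc_member or settings.open_participation

# Example: the EMVA 2017 pilot combined open identity, open participation,
# open report, and open final-version comments.
emva_2017 = OPRSettings(open_identity=True, open_participation=True,
                        open_report=True, open_final_comments=True)
print(may_review(emva_2017, is_pc_member=False))  # True
```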
All pilots were conducted successfully. Feedback was gathered through interviews held directly at the conference venues and through survey questionnaires. Overall, the feedback on all tested OPR concepts was positive. The open identity aspects resulted in more transparency, but some participants feared that this also skewed reviews to be artificially positive (to avoid repercussions). The open participation aspects were very well received; these concepts may allow peer review to scale better, as attendance at many conferences is growing strongly while the number of expert reviewers is not increasing at the same rate. Open report and open final-version comments were seen as positive, but their actual usage was very limited.
Our modified CMS software has been released as open source and is publicly available for future use to foster OPR in conference settings.