First OPR-Module for repositories
Converting open access repositories into functional evaluation platforms: bringing quality control back to the scientific community
Supported by OpenAIRE, Open Scholar has coordinated the development of the first Open Peer Review Module (OPRM) for open access repositories.
Find out more about the features this module can bring to your repository:
Quality control through enhanced evaluation: Open access repositories already offer various, mostly quantitative, metrics such as the number of visits and downloads. These metrics, however, give only a limited picture of the quality of a research output. The OPRM lets peers explicitly evaluate hosted research items and combines their judgements with usage metrics to assess quality and relevance more meaningfully.
Open and transparent: Following the ‘openness’ paradigm, the system is open and transparent: reviews are published in full text alongside the primary publications, and the identity of the reviewers is disclosed to the authors as well as to the public. This enhances accountability and makes it possible to officially credit the reviewing work.
Crediting reviews: To give reviews a place in the scientific reputation system and to weight the importance of each review, the module offers a reputation engine that takes several different metrics into account.
How does it work?
The Open Peer Review Module (OPRM) can be installed on existing DSpace institutional or other repositories. It works as an overlay service that enables the evaluation of digital research works hosted in these repositories.
The OPRM contains different subsystems offering various functions:
- Via the Invitations subsystem, the author can invite internal or external peers to review the work. Reviewers can also be invited by administrators or, in line with the open review idea, volunteer themselves.
- The reviews are created within the Reviews subsystem. Its forms can be adjusted and allow the evaluation of the article; in addition, metadata can be added and a license chosen. When the form is completed, a new item is created in the submission workflow, which can then be modified and finalized by the administrator.
- The Annotation subsystem adds an inter-reviewer level, allowing reviewers to comment on or annotate other reviews of the same research object.
- A research work's reputation is calculated by combining metrics such as visits and downloads with weighted ratings (opinions) from reviewers and users. The weighting of these opinions takes the reputation of the reviews into account.
- A review's reputation combines the weighted evaluations (judgements) of other reviewers with the ratings given by users. The weighting of these judgements depends on the reputation of the reviewer.
- An individual author's reputation is calculated from the reputation of his or her papers, taking into account whether the author is a sole author or a co-author.
- The reviewer's reputation is, in turn, calculated from the evaluations that his or her reviews receive.
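To make the interplay of these reputation scores concrete, here is a minimal sketch in Python of how such a reputation engine might combine usage metrics with weighted ratings. All function names, weights, and normalisation choices below are illustrative assumptions, not the OPRM's actual implementation:

```python
# Hypothetical sketch of a reputation engine in the spirit of the OPRM.
# Weights, caps, and the co-authorship discount are assumed values.

def work_reputation(visits, downloads, reviewer_ratings, usage_weight=0.3):
    """Combine usage metrics with reviewer ratings weighted by review reputation.

    reviewer_ratings: list of (rating_0_to_5, review_reputation_0_to_1) pairs.
    Returns a score between 0 and 5.
    """
    # Normalise usage metrics into a 0-5 score (illustrative cap at 1000 events).
    usage_score = 5.0 * min(visits + downloads, 1000) / 1000
    total_weight = sum(rep for _, rep in reviewer_ratings)
    if total_weight:
        rated = sum(r * rep for r, rep in reviewer_ratings) / total_weight
    else:
        rated = 0.0
    return usage_weight * usage_score + (1 - usage_weight) * rated

def author_reputation(paper_reputations):
    """Average the author's paper reputations; co-authored papers contribute
    a halved share (an assumed discount for co-authorship).

    paper_reputations: list of (reputation, is_sole_author) pairs.
    """
    shares = [(rep, 1.0 if sole else 0.5) for rep, sole in paper_reputations]
    total = sum(w for _, w in shares)
    return sum(rep * w for rep, w in shares) / total if total else 0.0

# Example: a work with no recorded usage but one highly reputed 4/5 review.
score = work_reputation(0, 0, [(4.0, 1.0)])  # 0.3 * 0 + 0.7 * 4.0 = 2.8
```

The key idea from the description above is the mutual weighting: ratings count more when the review behind them has earned reputation, and reviews earn reputation from the judgements of other (themselves weighted) reviewers.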
[Figure: Reputation Engine]
→ Find out more about the project at:
https://github.com/arvoConsultores/Open-Peer-Review-Module/wiki