- Title
Characterization of Comments About bioRxiv and medRxiv Preprints.
- Authors
Carneiro, Clarissa França Dias; da Costa, Gabriel Gonçalves; Neves, Kleber; Abreu, Mariana Boechat; Tan, Pedro Batista; Rayêe, Danielle; Boos, Flávia Zacouteguy; Andrejew, Roberta; Lubiana, Tiago; Malički, Mario; Amaral, Olavo Bohrer
- Abstract
Key Points
Question: What is the content of the comments posted on the bioRxiv and medRxiv preprint platforms?
Findings: In this cross-sectional study, 7.3% of preprints from 2020 had received at least 1 comment (mean follow-up, 7.5 months), with a median length of 43 words. Criticisms, corrections, or suggestions (most commonly regarding interpretation, methodological design, and data collection) were the most prevalent types of content in these comments, followed by compliments and questions.
Meaning: This study found that, although rare, when comments were present on the preprint platforms, they addressed relevant topics that would be expected to emerge from peer review.

Importance: Preprints have been increasingly used in biomedical science, and a key feature of many platforms is public commenting. The content of these comments, however, has not been well studied, and it is unclear whether they resemble those found in journal peer review.
Objective: To describe the content of comments on the bioRxiv and medRxiv preprint platforms.
Design, Setting, and Participants: In this cross-sectional study, preprints posted on the bioRxiv and medRxiv platforms in 2020 were accessed through each platform's application programming interface on March 29, 2021, and a random sample of preprints containing between 1 and 20 comments was evaluated independently by 3 evaluators using an instrument to assess their features and general content.
Main Outcomes and Measures: The numbers and percentages of comments from authors or nonauthors were assessed, and the comments from nonauthors were assessed for content. These nonauthor comments were assessed to determine whether they included compliments, criticisms, corrections, suggestions, or questions, as well as their topics (eg, relevance, interpretation, and methods). Nonauthor comments were also analyzed to determine whether they included references, provided a summary of the findings, or questioned the preprint's conclusions.
Results: Of 52 736 preprints, 3850 (7.3%) received at least 1 comment (mean [SD] follow-up, 7.5 [3.6] months), and the 1921 assessed comments (from 1037 preprints) had a median length of 43 words (range, 1-3172 words). Criticisms, corrections, or suggestions, present in 694 of 1125 comments (61.7%), were the most prevalent content, followed by compliments (n = 428 [38.0%]) and questions (n = 393 [35.0%]). Criticisms most often concerned interpretation (n = 286), methodological design (n = 267), and data collection (n = 238), whereas compliments were mainly about relevance (n = 111) and implications (n = 72).
Conclusions and Relevance: In this cross-sectional study of preprint comments, topics commonly associated with journal peer review were frequent. However, only a small percentage of preprints posted on the bioRxiv and medRxiv platforms in 2020 received comments on these platforms. A clearer taxonomy of peer review roles would help to describe whether postpublication peer review fulfills them.

This cross-sectional study assesses the content of comments posted on the bioRxiv and medRxiv platforms in 2020, using a predefined taxonomy based on qualitative studies of peer review.
- Subjects
PROFESSIONAL peer review; RESEARCH; MANUSCRIPTS; PREPRINTS; CROSS-sectional method; CRITICISM; CLASSIFICATION; SOCIAL media; WORLD health; HEALTH attitudes; DESCRIPTIVE statistics; RESEARCH funding; STATISTICAL sampling; CONTENT analysis; WORLD Wide Web
- Publication
JAMA Network Open, 2023, Vol. 6, Issue 8, e2331410
- ISSN
2574-3805
- Publication type
Article
- DOI
10.1001/jamanetworkopen.2023.31410