- Title
Development and Validation of a Tool to Assess the Quality of Clinical Practice Guideline Recommendations.
- Authors
Brouwers, Melissa C.; Spithoff, Karen; Kerkvliet, Kate; Alonso-Coello, Pablo; Burgers, Jako; Cluzeau, Françoise; Fervers, Béatrice; Graham, Ian; Grimshaw, Jeremy; Hanna, Steven; Kastner, Monika; Kho, Michelle; Qaseem, Amir; Straus, Sharon; Florez, Ivan D.
- Abstract
Key Points
Question: Is it possible to create a tool to specifically evaluate the quality of clinical practice guideline recommendations?
Findings: In this cross-sectional study of 322 international stakeholders, the Appraisal of Guidelines Research and Evaluation–Recommendations Excellence (AGREE-REX) tool was developed to appraise guidelines for clinical practice. All participants rated the tool as usable and agreed that it represents a valuable addition to the clinical practice guidelines enterprise.
Meaning: A panel of stakeholders agrees that the AGREE-REX tool may provide information about the methodologic quality of guideline recommendations and may help in the implementation of clinical practice guidelines.

Importance: Clinical practice guidelines (CPGs) may lack rigor and suitability to the setting in which they are to be applied. Methods to yield clinical practice guideline recommendations that are credible and implementable remain to be determined.
Objective: To describe the development of AGREE-REX (Appraisal of Guidelines Research and Evaluation–Recommendations Excellence), a tool designed to evaluate the quality of clinical practice guideline recommendations.
Design, Setting, and Participants: A cross-sectional study of 322 international stakeholders representing CPG developers, users, and researchers was conducted between December 2015 and March 2019. Advertisements to participate were distributed through professional organizations as well as through the AGREE Enterprise social media accounts and their registered users.
Exposures: Between 2015 and 2017, participants appraised 1 of 161 CPGs using the Draft AGREE-REX tool and completed the AGREE-REX Usability Survey.
Main Outcomes and Measures: Usability and measurement properties of the tool were assessed with 7-point scales (1 indicating strong disagreement and 7 indicating strong agreement). Internal consistency of items was assessed with the Cronbach α, and the Spearman-Brown reliability adjustment was used to calculate reliability for 2 to 5 raters.
Results: A total of 322 participants (202 female participants [62.7%]; 83 aged 40-49 years [25.8%]) rated the survey items on a 7-point scale. All 11 items were rated as easy to understand (with a mean [SD] ranging from 5.2 [1.38] for the alignment of values item to 6.3 [0.87] for the evidence item) and easy to apply (with a mean [SD] ranging from 4.8 [1.49] for the alignment of values item to 6.1 [1.07] for the evidence item). Participants provided favorable feedback on the tool's instructions, which were considered clear (mean [SD], 5.8 [1.06]), helpful (mean [SD], 5.9 [1.00]), and complete (mean [SD], 5.8 [1.11]). Participants considered the tool easy to use (mean [SD], 5.4 [1.32]) and thought that it added value to the guideline enterprise (mean [SD], 5.9 [1.13]). Internal consistency of the items was high (Cronbach α = 0.94). Positive correlations were found between the overall AGREE-REX score and the implementability score (r = 0.81) and the clinical credibility score (r = 0.76).
Conclusions and Relevance: This cross-sectional study found that the AGREE-REX tool can be useful in evaluating CPG recommendations, differentiating among them, and identifying those that are clinically credible and implementable for practicing health professionals and decision makers who use recommendations to inform clinical policy.

This cross-sectional study examines a tool for assessing the credibility and implementability of clinical practice guidelines.
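The two statistical measures named in the methods, the Cronbach α for internal consistency and the Spearman-Brown adjustment for projecting single-rater reliability to 2 to 5 raters, can be illustrated with a short sketch. This is not the authors' analysis code, and the ratings below are invented for demonstration only; the study's own data are not reproduced here.

```python
# Illustrative sketch of the two reliability statistics mentioned in the
# abstract. The `ratings` matrix is hypothetical, not study data.

def cronbach_alpha(scores):
    """Cronbach's alpha: scores is a list of respondent rows,
    each row a list of item scores (rows x items)."""
    k = len(scores[0])  # number of items

    def var(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

def spearman_brown(r_single, n_raters):
    """Reliability of the mean of n_raters, given single-rater reliability."""
    return n_raters * r_single / (1 + (n_raters - 1) * r_single)

# Hypothetical 7-point ratings: 4 respondents x 3 items
ratings = [
    [5, 6, 5],
    [6, 6, 7],
    [4, 5, 5],
    [7, 6, 6],
]
print(round(cronbach_alpha(ratings), 2))
print(round(spearman_brown(0.5, 2), 2))  # 2 raters at r = 0.5 -> 0.67
```

The Spearman-Brown formula shows why the study reports reliability for 2 to 5 raters: averaging more independent appraisers raises the reliability of the combined score above that of any single rater.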
- Subjects
STATISTICAL correlation; EXPERIMENTAL design; RESEARCH methodology; MEDICAL protocols; QUALITY assurance; RESEARCH evaluation; RESEARCH funding; CROSS-sectional method; RESEARCH methodology evaluation; DATA analysis software; DESCRIPTIVE statistics
- Publication
JAMA Network Open, 2020, Vol. 3, Issue 5, p. e205535
- ISSN
2574-3805
- Publication type
Article
- DOI
10.1001/jamanetworkopen.2020.5535