- Title
Assessment of multiple-choice questions by item analysis for medical students' examinations.
- Authors
Nojomi, Marzieh; Mahmoudi, Maryam
- Abstract
Background: Multiple-choice questions (MCQs) are a common assessment method, and it is crucial to design them carefully. Therefore, this study aimed to perform an item analysis of MCQ exams in clerkship tests for general medicine students. Methods: In a cross-sectional study, a total of 1202 MCQs designed for fourth-year clerkship medical students in the second semester of 2019 were analyzed. Difficulty and discrimination indices of student scores and taxonomy levels were then computed. Furthermore, the prepared standard structural Millman checklist was utilized. Results: Of the 1202 MCQs, most questions (666; 55.39%) had an acceptable difficulty index. In terms of the discrimination index (DI), 530 (44.09%) questions had an average discrimination coefficient. Additionally, 215 (17.88%) had a negative or poor DI and required revision or elimination from the test bank. Of the 1202 MCQs, 669 (50.7%) were designed at a lower cognitive level (taxonomy I), 174 (14.5%) belonged to taxonomy II, and 419 (34.8%) to taxonomy III. Moreover, among the structural flaws of the Millman checklist, the most common was a lack of negative choices for stems (1127; 93.8%), while vertical options (376; 31.3%) were the least common. Conclusion: Based on the results, it is recommended that easy questions, items with a negative or poor DI, questions at Bloom's taxonomy level I, and questions with structural flaws be reviewed and reconstructed to improve the quality of the question banks. Holding training courses on designing test questions could effectively improve the quality of the questions.
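The abstract reports difficulty and discrimination indices but does not state how they were computed. A minimal sketch of the conventional classical-test-theory formulas is shown below, assuming the standard definitions (difficulty index p = proportion of examinees answering correctly; discrimination index D = difference in correct-response rates between the top and bottom 27% of examinees ranked by total score). The function name and the 27% cutoff are illustrative assumptions, not details taken from the paper.

```python
def item_analysis(responses, frac=0.27):
    """Classical item analysis for one MCQ item.

    responses: list of (total_score, correct) pairs, one per student,
               where correct is 1 if the student answered this item
               correctly, else 0.
    Returns (p, d): difficulty index and discrimination index.
    """
    n = len(responses)
    # Size of the upper/lower groups (conventionally the top/bottom 27%).
    k = max(1, int(n * frac))
    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    upper, lower = ranked[:k], ranked[-k:]
    # Difficulty index: proportion of all students answering correctly.
    p = sum(c for _, c in responses) / n
    # Discrimination index: correct-rate gap between upper and lower groups.
    d = (sum(c for _, c in upper) - sum(c for _, c in lower)) / k
    return p, d
```

Under common rules of thumb, items with p between roughly 0.3 and 0.7 are considered acceptably difficult, and items with D below about 0.2 (or negative) are flagged for revision, which matches the kind of screening the abstract describes.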
- Subjects
EDUCATIONAL tests & measurements -- Evaluation; MEDICAL education standards; TEST design; MEDICAL students; RESEARCH methodology evaluation; CROSS-sectional method; CLASSIFICATION; RATING of students; PSYCHOMETRICS; RESEARCH funding; QUALITY assurance; DESCRIPTIVE statistics; DATA analysis software; MEDICAL education; EVALUATION
- Publication
Research & Development in Medical Education, 2022, Vol 11, Issue 1, p1
- ISSN
2322-2719
- Publication type
Article
- DOI
10.34172/rdme.2022.024