- Title
Exploring the Impact of Random Guessing in Distractor Analysis.
- Authors
Jin, Kuan‐Yu; Siu, Wai‐Lok; Huang, Xiaoting
- Abstract
Multiple‐choice (MC) items are widely used in educational tests. Distractor analysis, an important procedure for checking the utility of the response options within an MC item, can be readily implemented in the framework of item response theory (IRT). Although random guessing is a common behavior among test‐takers answering MC items, none of the existing IRT models for distractor analysis has accounted for its influence. In this article, we propose a new IRT model that separates the influence of random guessing from response option functioning. A brief simulation study was conducted to examine the parameter recovery of the proposed model. To demonstrate its effectiveness, the new model was applied to the mathematics tests of the Hong Kong Diploma of Secondary Education Examination (HKDSE) from 2015 to 2019. The results of the empirical analyses suggest that the complexity of item content is a key factor in inducing students' random guessing. The implications and applications of the new model to other testing situations are also discussed.
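The abstract does not give the model's functional form, but the general idea it describes, mixing a random-guessing process with ordinary response option functioning, can be sketched as a mixture of a uniform guessing distribution with Bock's nominal response model. The function name, parameterization, and mixture structure below are illustrative assumptions, not the authors' actual model.

```python
import numpy as np

def option_probs(theta, a, c, g):
    """Illustrative guessing-augmented distractor model (NOT the authors'
    exact specification): with probability g the examinee guesses uniformly
    over the K options; otherwise options function under Bock's nominal
    response model with slopes `a` and intercepts `c`.

    theta : float, examinee ability
    a, c  : arrays of per-option slopes and intercepts
    g     : probability of random guessing on this item
    """
    z = a * theta + c
    nrm = np.exp(z - z.max())        # softmax, shifted for numerical stability
    nrm /= nrm.sum()
    K = len(a)
    return g / K + (1.0 - g) * nrm   # mixture of uniform guessing and NRM

# Hypothetical four-option item: option 0 is the key
probs = option_probs(theta=0.5,
                     a=np.array([1.2, -0.3, -0.5, -0.4]),
                     c=np.array([0.8, 0.1, -0.4, -0.5]),
                     g=0.2)
# probs sums to 1; a larger g pulls all option probabilities toward 1/K
```

Under this kind of mixture, a nonzero guessing weight flattens the option-choice curves, which is one way a model could keep guessing behavior from contaminating the estimated distractor parameters.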
- Subjects
DIPLOMAS (Education); APPLIED mathematics; STOCHASTIC processes; SECONDARY education; ITEM response theory
- Publication
Journal of Educational Measurement, 2022, Vol. 59, Issue 1, p. 43
- ISSN
0022-0655
- Publication type
Article
- DOI
10.1111/jedm.12310