Analysis of distractor difficulty in multiple-choice items
- Cite this article as:
- Revuelta, J. Psychometrika (2004) 69: 217. doi:10.1007/BF02295941
Two psychometric models are presented for evaluating the difficulty of the distractors in multiple-choice items. They are based on the criterion of rising distractor selection ratios, which facilitates the interpretation of subject and item parameters. Statistical inferential tools are developed in a Bayesian framework: modal a posteriori estimation via an EM algorithm, and model evaluation by monitoring posterior predictive replications of the data matrix. An educational example with real data illustrates the application of the models and compares them with the nominal categories model.
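The posterior predictive check mentioned above can be sketched in general terms: draw parameter values from the posterior, generate a replicated data set from each draw, and compare a test statistic across replications with its observed value. The snippet below is a minimal illustration of this generic idea, not the paper's models; the Bernoulli/Beta setup, the simulated data, and the choice of test statistic are all assumptions made purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed selections of one distractor (1 = selected);
# simulated here, not taken from the paper's data.
observed = rng.binomial(1, 0.3, size=200)
n = len(observed)
t_obs = observed.sum()  # test statistic T = total number of selections

# Illustrative posterior for the selection ratio: Beta posterior
# under a uniform prior on a Bernoulli rate (an assumed toy model).
posterior_draws = rng.beta(1 + t_obs, 1 + n - t_obs, size=1000)

# Posterior predictive replications: one replicated data set per draw,
# recording the same test statistic for each replication.
replicated_totals = np.array(
    [rng.binomial(1, p, size=n).sum() for p in posterior_draws]
)

# Posterior predictive p-value: proportion of replications with
# T(replicated) >= T(observed); extreme values flag model misfit.
ppp = np.mean(replicated_totals >= t_obs)
print(ppp)
```

With a well-specified model, the posterior predictive p-value tends to fall near the middle of the unit interval; values close to 0 or 1 indicate that the model fails to reproduce that feature of the data.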