
The optimal number of options for multiple-choice questions on high-stakes tests: application of a revised index for detecting nonfunctional distractors

Advances in Health Sciences Education

Abstract

Research suggests that the three-option format is optimal for multiple-choice questions (MCQs). This conclusion is supported by numerous studies showing that most distractors (i.e., incorrect answers) are selected by so few examinees that they are essentially nonfunctional. However, nearly all of these studies have defined a distractor as nonfunctional if fewer than 5% of examinees select it. A limitation of this definition is that the proportion of examinees available to choose any distractor depends on overall item difficulty. This is especially problematic for mastery tests, which consist of items that most examinees are expected to answer correctly. Under the traditional definition, a five-option MCQ answered correctly by more than 90% of examinees can have at most one functional distractor, because fewer than 10% of examinees remain to be divided among the four distractors. The primary purpose of the present study was to evaluate an index of nonfunctionality that is sensitive to item difficulty. A secondary purpose was to extend previous research by studying distractor functionality within the context of professionally developed credentialing tests. Data were analyzed for 840 five-option MCQs. Results based on the traditional definition were consistent with previous research, indicating that most MCQs had only one or two functional distractors. In contrast, the newly proposed index indicated that nearly half (47.3%) of all items had three or four functional distractors. Implications for item and test development are discussed.
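To make the difficulty ceiling concrete, here is a minimal sketch in Python contrasting the traditional 5% rule with one plausible difficulty-adjusted alternative: counting a distractor as functional when it is chosen by at least 5% of the examinees who answered the item incorrectly. The adjusted rule, the function names, and the item data below are illustrative assumptions, not the exact index proposed in the article.

# Hypothetical illustration: traditional vs. difficulty-adjusted
# definitions of a "functional" distractor. The adjusted rule shown
# here (chosen by >= 5% of examinees who missed the item) is an
# assumption for illustration, not necessarily the article's index.

def count_functional_traditional(distractor_rates, threshold=0.05):
    # A distractor is functional if chosen by >= threshold of ALL examinees.
    return sum(rate >= threshold for rate in distractor_rates)

def count_functional_adjusted(distractor_rates, p_correct, threshold=0.05):
    # Assumed adjusted rule: condition each selection rate on the
    # proportion of examinees who answered incorrectly (1 - p_correct).
    proportion_incorrect = 1.0 - p_correct
    if proportion_incorrect <= 0:
        return 0  # no examinees were available to choose any distractor
    return sum(rate / proportion_incorrect >= threshold
               for rate in distractor_rates)

# A hypothetical easy five-option item: 92% of examinees answer correctly,
# and the remaining 8% spread across the four distractors.
p_correct = 0.92
distractor_rates = [0.04, 0.02, 0.01, 0.01]

print(count_functional_traditional(distractor_rates))          # 0
print(count_functional_adjusted(distractor_rates, p_correct))  # 4

Under the traditional rule, this easy item can never show more than one functional distractor, since less than 10% of examinees remain to be shared among four options; conditioning on the examinees who missed the item removes that ceiling.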



Acknowledgements

The authors express their gratitude to NBME for supporting this research. However, the opinions expressed here are those of the authors and do not necessarily reflect the position of NBME or the United States Medical Licensing Examination.

Author information


Corresponding author

Correspondence to Mark R. Raymond.

Ethics declarations

Conflict of interest

The authors have no conflicts of interest to report. The American Institutes for Research reviewed the study and determined that it is exempt from IRB review and oversight.


About this article


Cite this article

Raymond, M.R., Stevens, C. & Bucak, S.D. The optimal number of options for multiple-choice questions on high-stakes tests: application of a revised index for detecting nonfunctional distractors. Adv in Health Sci Educ 24, 141–150 (2019). https://doi.org/10.1007/s10459-018-9855-9

