
Medical Science Educator, Volume 29, Issue 1, pp 41–43

Student-Written Multiple-Choice Questions—a Practical and Educational Approach

  • Manan P. Shah
  • Benjamin R. Lin
  • Ming Lee
  • Daniel Kahn
  • Estebes Hernandez
Short Communication

Abstract

The purpose of our student-led project was to fulfill junior medical students’ demand for instructive, curriculum-specific practice questions while providing a learning experience and teaching opportunity for participating senior students. Eleven second-year students were taught how to write high-quality multiple-choice questions through an interactive workshop. Subsequently, they were instructed to write questions with detailed explanations for their assigned lecture topics. Thirty-four student-written and faculty-reviewed questions were combined with 16 purely faculty-written questions to create a 50-question exam. No significant difference in question difficulty was found between the student-written (79.5% correct) and faculty-written (84.0% correct) questions (p = 0.37). The discrimination index and point-biserial correlation were higher for student-written (0.29 and 0.32) than for faculty-written (0.17 and 0.25) questions (p < 0.01 and p < 0.05, respectively). The test-takers learned key course topics, while the test-writers reviewed key first-year objectives and refined their test-taking strategies. The project provided a model for feasibly developing comprehensive, high-quality, and curriculum-specific questions.
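The item statistics reported above (difficulty as proportion correct, discrimination index, and point-biserial correlation) can be computed from a scored response matrix. The helper below is an illustrative sketch, not the authors' analysis code; it uses NumPy and takes the discrimination index as the difference in proportion correct between the top and bottom 27% of examinees by total score (Kelley's upper/lower-group rule).

```python
import numpy as np

def item_statistics(responses):
    """Classical item analysis for a 0/1-scored response matrix.

    responses: (n_examinees, n_items) array of 0/1 item scores.
    Returns per-item difficulty (proportion correct), discrimination
    index (upper 27% minus lower 27% proportion correct), and the
    point-biserial correlation between each item and the total score.
    """
    responses = np.asarray(responses, dtype=float)
    n, k = responses.shape
    totals = responses.sum(axis=1)

    # Difficulty: proportion of examinees answering each item correctly.
    difficulty = responses.mean(axis=0)

    # Discrimination index: p(correct) in top 27% minus bottom 27%,
    # with groups formed by ranking examinees on total score.
    order = np.argsort(totals)
    g = max(1, int(round(0.27 * n)))
    lower, upper = responses[order[:g]], responses[order[-g:]]
    discrimination = upper.mean(axis=0) - lower.mean(axis=0)

    # Point-biserial: Pearson r between item score (0/1) and total score.
    pbis = np.array([np.corrcoef(responses[:, j], totals)[0, 1]
                     for j in range(k)])
    return difficulty, discrimination, pbis
```

A difficulty near 0.8 with a discrimination index or point-biserial above roughly 0.2, as reported for the student-written items, is conventionally considered an acceptable item.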

Keywords

Student-written exams · Practice questions · Test item construction · Student-led initiative · Formative · Medical education

Notes

Compliance with Ethical Standards

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

This project was deemed exempt by the Institutional Review Board (IRB).

Informed Consent

Not applicable.


Copyright information

© International Association of Medical Science Educators 2018

Authors and Affiliations

  1. David Geffen School of Medicine at UCLA, Los Angeles, USA
  2. Educational Assessment and Research, David Geffen School of Medicine at UCLA, Los Angeles, USA
  3. Department of Medicine, David Geffen School of Medicine at UCLA, Los Angeles, USA
