Advances in Health Sciences Education, Volume 18, Issue 5, pp 945–961

Multiple choice questions can be designed or revised to challenge learners’ critical thinking

  • Rochelle E. Tractenberg (email author), Collaborative for Research on Outcomes and Metrics and Departments of Neurology, Biostatistics, Bioinformatics & Biomathematics, and Psychiatry, Georgetown University Medical Center
  • Matthew M. Gushta, Wireless Generation
  • Susan E. Mulroney, Department of Pharmacology & Physiology, Georgetown University Medical Center
  • Peggy A. Weissinger, School of Medicine, Georgetown University


Multiple choice (MC) questions from a graduate physiology course were evaluated by cognitive-psychology (but not physiology) experts and analyzed statistically in order to test the independence of content expertise and cognitive complexity ratings of MC items. Integrating higher order thinking into MC exams is important but widely known to be challenging, perhaps especially when content experts must think like novices; expertise in the domain (content) may actually impede the creation of higher-complexity items. Three cognitive psychology experts independently rated the cognitive complexity of 252 multiple-choice physiology items using a six-level cognitive complexity matrix synthesized from the literature. Rasch modeling was used to estimate item difficulties. The complexity ratings and difficulty estimates were then analyzed together to determine the relative contributions (and independence) of complexity and difficulty to the likelihood of a correct answer on each item. Cognitive complexity was found to be statistically independent of the difficulty estimates for 88% of items. Using the complexity matrix, modifications were identified that would increase the complexity of some items by one level without affecting their difficulty. Cognitive complexity can thus be rated effectively by non-content experts. The six-level complexity matrix, if applied by faculty peer groups trained in cognitive complexity but lacking domain-specific expertise, could improve the complexity targeted in item writing and revision. Targeting higher order thinking with MC questions can be achieved without changing item difficulties or other test characteristics, but this may be less likely if content experts are left to assess items within their own domains of expertise.
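The abstract cites Rasch modeling without stating the model itself. Under the Rasch (one-parameter logistic) model, the probability that person i answers item j correctly is P(x_ij = 1) = 1 / (1 + exp(-(theta_i - b_j))), where theta_i is the person's ability and b_j the item's difficulty. The sketch below is a minimal illustration of this idea, not the authors' actual analysis: the sample sizes, simulated data, and the joint maximum likelihood routine are all assumptions made for demonstration purposes.

```python
# Illustrative sketch only: estimate Rasch (1PL) item difficulties from a
# hypothetical dichotomous person x item response matrix. Not the study's
# actual data or estimation procedure.
import numpy as np

rng = np.random.default_rng(0)

n_persons, n_items = 500, 20
true_theta = rng.normal(0.0, 1.0, n_persons)   # latent person abilities
true_b = rng.normal(0.0, 1.0, n_items)         # latent item difficulties

# Simulate responses under the Rasch model:
# P(correct) = 1 / (1 + exp(-(theta - b)))
p = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - true_b[None, :])))
X = (rng.random((n_persons, n_items)) < p).astype(float)

# Joint maximum likelihood via alternating gradient ascent.
theta = np.zeros(n_persons)
b = np.zeros(n_items)
for _ in range(3000):
    pred = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    resid = X - pred                   # gradient of the log-likelihood
    theta += 0.5 * resid.mean(axis=1)  # update person abilities
    b -= 0.5 * resid.mean(axis=0)      # update item difficulties
    b -= b.mean()                      # anchor the scale (identifiability)

# Recovered difficulties should correlate strongly with the true values.
print("corr(true_b, est_b) =", round(np.corrcoef(true_b, b)[0, 1], 3))
```

In the study, item difficulty estimates of this kind were analyzed jointly with the raters' complexity scores to test whether the two dimensions are statistically independent. Joint maximum likelihood is used here only for brevity; dedicated IRT software would typically be preferred in practice.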


Keywords: Cognitive complexity · Higher order thinking · Multiple-choice test items · Assessment