
Do sequentially-presented answer options prevent the use of testwiseness cues on continuing medical education tests?

Published in Advances in Health Sciences Education.

Abstract

Testwiseness (the ability to find subtle cues towards the solution by simultaneously comparing the available answer options) threatens the validity of multiple-choice (MC) tests. Discrete-option multiple-choice (DOMC) has recently been proposed as a computerized alternative to the MC testing format and presumably allows for better control of testwiseness. It is based on a sequential rather than simultaneous presentation of answer options: the test taker has to decide on the correctness of one option after another until the item has been answered either correctly or incorrectly. Test items that have been criticized for being susceptible to testwiseness strategies are used in continuing medical education (CME) programs aimed at developing and maintaining the knowledge of medical professionals. In Experiment 1, with 48 adults without special medical education, presenting answer options sequentially reduced the use of testwiseness cues on a CME test compared to their simultaneous presentation, as shown by a significant interaction of answer format and the availability of cues (p = .01, η² = 0.13). This result did not depend on a hint towards potential cues to the solution, as it also held when another 86 adults were not informed of the presence of testwiseness cues in Experiment 2 (p < .001, η² = 0.14). The result was replicated with 106 medical students and medical doctors in Experiment 3. Items were answered correctly more often in the MC condition (71%) than in the DOMC condition (47%), but only when items contained cues to their solution. A significant interaction between answer format and the availability of cues (p = .02, η² = 0.05) showed that the sequential DOMC answer format allows for better control of testwiseness than traditional MC testing.
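The DOMC answering procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names and the exact termination rule (accepting the key ends the item correctly, while accepting a distractor or rejecting the key ends it incorrectly) are assumptions based on the abstract's description of the format.

```python
import random


def administer_domc_item(options, key, respond):
    """Present answer options one at a time, in random order.

    `respond(option)` returns True if the test taker judges the shown
    option to be correct. The item ends as soon as the test taker
    accepts an option (correct only if it is the key) or rejects the
    key (incorrect). Returns True for a correctly answered item.
    """
    order = random.sample(options, len(options))
    for option in order:
        if respond(option):
            # Accepted the key -> correct; accepted a distractor -> incorrect.
            return option == key
        if option == key:
            # Rejected the key -> incorrect; remaining options are not shown.
            return False
    return False  # Not reached when the key is among the options.
```

Because the test taker never sees all options side by side, cues that emerge only from comparing options (for instance, the longest option often being the key) are unavailable under this rule.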



Conflict of interest

The research and writing of this article were conducted independently of Exam Innovations. Neither author has a conflict of interest.

Corresponding author

Correspondence to Sonja Willing.

Cite this article

Willing, S., Ostapczuk, M., & Musch, J. Do sequentially-presented answer options prevent the use of testwiseness cues on continuing medical education tests? Advances in Health Sciences Education, 20, 247–263 (2015). https://doi.org/10.1007/s10459-014-9528-2
