
Optimization of answer keys for script concordance testing: should we exclude deviant panelists, deviant responses, or neither?

Advances in Health Sciences Education

Abstract

The Script Concordance Test (SCT) uses a panel-based, aggregate scoring method that aims to capture the variability in experienced practitioners' responses to particular clinical situations. This scoring method is a key determinant of the tool's discriminatory power, but deviant answers could diminish the reliability of scores by introducing measurement error. Our objectives were (1) to investigate the effects on SCT psychometrics of excluding either deviant panelists or deviant answers from the test's scoring key, and (2) to propose a method for performing such exclusions. Using an SCT in radiation oncology, we examined three methods for reducing panel response variability. One method ('outliers') removed panel members with very low total scores. The two other methods ('distance-from-mode' and 'judgment-by-experts') excluded widely deviant responses to individual questions from the test's scoring key. We compared the effects of these methods on score reliability, on correlations between original and adjusted scores, and on between-group effect sizes (panel vs. residents, panel vs. students, and residents vs. students). With a large panel (n = 45), the optimization methods had no effect on score reliability, correlations, or effect sizes. With a smaller panel (n = 15), no significant effect on reliability or correlation was observed, but effect sizes varied significantly across samples. Measurement error resulting from deviant panelist responses on SCTs is therefore negligible, provided the panel is sufficiently large (>15 members). If removal of deviant answers is nonetheless judged necessary, the distance-from-mode strategy is recommended.
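The aggregate scoring and the distance-from-mode strategy described above can be sketched in code. This is a minimal illustration, not the authors' exact implementation: it assumes panel responses on a 5-point Likert scale (coded −2 to +2), the standard SCT convention of giving each option credit proportional to the number of panelists who chose it (with the modal answer worth 1 point), and a hypothetical `max_distance` threshold for excluding widely deviant responses from the key.

```python
from collections import Counter

def build_key(panel_responses, max_distance=None):
    """Build a scoring key for one SCT item from the panel's responses.

    Credit for each option = (panelists choosing it) / (modal count),
    so the modal answer earns 1 point and minority answers partial credit.
    If max_distance is given, panel responses farther than that from the
    modal response are excluded first (the distance-from-mode strategy).
    """
    counts = Counter(panel_responses)
    mode, _ = counts.most_common(1)[0]
    if max_distance is not None:
        counts = Counter({r: n for r, n in counts.items()
                          if abs(r - mode) <= max_distance})
    modal_count = max(counts.values())
    return {r: n / modal_count for r, n in counts.items()}

def score_item(key, examinee_response):
    """An examinee earns the key's credit for the chosen option (0 if none)."""
    return key.get(examinee_response, 0.0)

# Illustration: a 6-member panel answers one item as [1, 1, 1, 0, 2, -2].
# With max_distance=1, the lone -2 response (3 points from the mode of 1)
# is dropped from the key; an examinee answering -2 then scores 0.
key = build_key([1, 1, 1, 0, 2, -2], max_distance=1)
```

Under this scheme, excluding a deviant response never changes the modal answer's full credit; it only removes the partial credit an examinee would otherwise earn for matching the deviant panelist.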



Author information

Correspondence to Bernard Charlin.


Cite this article

Gagnon, R., Lubarsky, S., Lambert, C., & Charlin, B. (2011). Optimization of answer keys for script concordance testing: Should we exclude deviant panelists, deviant responses, or neither? Advances in Health Sciences Education, 16, 601–608. https://doi.org/10.1007/s10459-011-9279-2

