Abstract
A testlet is a cluster of questions related to a single content area. An F-type testlet is a specific type of linear testlet that presents an evolving scenario. No study has carried out a factor analysis of an F-type testlet exam in the context of medical education and related disciplines. We aimed to determine to what extent disciplinary domains account for the variability of student performance on an online case-based F-type testlet variant. Final-year undergraduate medical students (N = 441) took an online exam consisting of ten case-based F-type testlets: six in pediatrics and four in internal medicine. An exploratory factor analysis revealed a two-factor structure. Pediatrics testlets loaded on factor 1 (loadings between 0.56 and 0.77), and internal medicine testlets loaded on factor 2 (loadings between 0.65 and 0.79). The results showed that disciplinary domains account for the variability of performance on F-type testlets, suggesting that context specificity persists in this type of exam. To evaluate students' clinical reasoning skills more thoroughly, medical educators must still ensure that clinical reasoning exams comprehensively represent all relevant content areas.
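The kind of two-factor structure reported above can be illustrated with a minimal simulation. The sketch below uses entirely hypothetical data (not the study's dataset): it generates scores for six "pediatrics" and four "internal medicine" testlets driven by two separate latent abilities, then applies the Kaiser criterion (eigenvalues of the correlation matrix greater than 1) to count retainable factors, one common first step in exploratory factor analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 441  # sample size matching the study

# Hypothetical latent abilities: one per disciplinary domain
ped_ability = rng.normal(size=n)
im_ability = rng.normal(size=n)

# Six pediatrics testlets load on factor 1, four internal medicine on factor 2
# (0.7 = assumed signal weight, 0.5 = assumed noise weight; illustrative only)
ped = np.column_stack([0.7 * ped_ability + 0.5 * rng.normal(size=n) for _ in range(6)])
im = np.column_stack([0.7 * im_ability + 0.5 * rng.normal(size=n) for _ in range(4)])
scores = np.hstack([ped, im])

# Eigenvalues of the inter-testlet correlation matrix;
# the Kaiser criterion retains factors with eigenvalue > 1
corr = np.corrcoef(scores, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]
n_factors = int(np.sum(eigenvalues > 1.0))
print(n_factors)  # → 2 under this simulated two-domain structure
```

A full EFA would follow this step with factor extraction and rotation (e.g. via a dedicated package such as `factor_analyzer`) to obtain the loadings themselves.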
Data Availability
The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.
Acknowledgements
We would like to express our gratitude to final year students of Gazi University Faculty of Medicine who participated in this study.
Ethics declarations
Ethical Approval
The study was performed in accordance with the ethical standards as laid down in the 1964 Declaration of Helsinki and its later amendments or comparable ethical standards. Gazi University Faculty of Medicine granted the approval (May 11, 2020; E.54035).
Competing Interests
The authors declare no competing interests.
About this article
Cite this article
Kıyak, Y.S., Budakoğlu, I.İ., Bakan Kalaycıoğlu, D. et al. Exploratory Factor Analysis of a Computerized Case-Based F-Type Testlet Variant. Med Sci Educ. 33, 1191–1196 (2023). https://doi.org/10.1007/s40670-023-01876-y