The compatibility principle: on philosophies in the assessment of clinical competence

  • Reflections
  • Published in Advances in Health Sciences Education

Abstract

The array of different philosophical positions underlying contemporary views on competence, assessment strategies and justification has led to advances in assessment science. Challenges may arise when these philosophical positions are not considered in assessment design. These can include (a) a logical incompatibility leading to varied or difficult interpretations of assessment results, (b) an “anything goes” approach, and (c) uncertainty regarding when and in what context various philosophical positions are appropriate. We propose a compatibility principle that recognizes that different philosophical positions commit assessors/assessment researchers to particular ideas, assumptions and commitments, and applies a logic of philosophically informed, assessment-based inquiry. Assessment is optimized when its underlying philosophical position produces congruent, aligned and coherent views on constructs, assessment strategies, justification and their interpretations. As a way forward we argue that (a) there can and should be variability in the philosophical positions used in assessment, and these should be clearly articulated to promote understanding of assumptions and make sense of justifications; (b) we focus on developing the merits, boundaries and relationships within and/or between philosophical positions in assessment; (c) we examine a core set of principles related to the role and relevance of philosophical positions; (d) we elaborate strategies and criteria to delineate compatible from incompatible positions; and (e) we articulate a need to broaden knowledge/competencies related to these issues. The broadened use of philosophical positions in assessment in the health professions affects the “state of play” and can undermine assessment programs. This may be overcome with attention to the alignment between underlying assumptions/commitments.



Author information


Correspondence to Walter Tavares.


Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Tavares, W., Kuper, A., Kulasegaram, K. et al. The compatibility principle: on philosophies in the assessment of clinical competence. Adv in Health Sci Educ 25, 1003–1018 (2020). https://doi.org/10.1007/s10459-019-09939-9

