
Simulation-based assessment in anesthesia: an international multicentre validation study

  • Tobias C. Everett (corresponding author)
  • Ralph J. McKinnon
  • Elaine Ng
  • Pradeep Kulkarni
  • Bruno C. R. Borges
  • Michael Letal
  • Melinda Fleming
  • M. Dylan Bould
  • for the MEPA Collaborators
Reports of Original Investigations

Abstract

Purpose

Simulated clinical events provide a means of evaluating a practitioner’s performance in a manner that is standardized across all candidates tested. We sought to provide evidence for the validity of simulation-based assessment tools in simulated pediatric anesthesia emergencies.

Methods

Nine centres in two countries recruited subjects to participate in simulated operating room events. Participants ranged in anesthesia experience from junior residents to staff anesthesiologists. Performances were video recorded for review and scored by specially trained, blinded, expert raters. The rating tools consisted of scenario-specific checklists and a global rating scale that allowed the rater to make a judgement about the subject’s performance and, by extension, their preparedness for independent practice. The reliability of the tools was classified as “substantial” (intraclass correlation coefficients ranged from 0.84 to 0.96 for the checklists and from 0.85 to 0.94 for the global rating scale).
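
As a concrete illustration of the reliability analysis (a minimal sketch in Python with made-up ratings, not the study’s data or analysis code), a two-way random-effects intraclass correlation coefficient such as ICC(2,1) of Shrout and Fleiss can be computed from the ANOVA mean squares of a subjects-by-raters score matrix:

  import numpy as np

  def icc_2_1(ratings):
      """ICC(2,1): two-way random effects, absolute agreement, single rater
      (Shrout & Fleiss). `ratings` is an (n_subjects, k_raters) matrix."""
      n, k = ratings.shape
      grand = ratings.mean()
      ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()  # subjects
      ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()  # raters
      ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
      ms_rows, ms_cols = ss_rows / (n - 1), ss_cols / (k - 1)
      ms_err = ss_err / ((n - 1) * (k - 1))
      return (ms_rows - ms_err) / (
          ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

  # Hypothetical example: five video-recorded performances, three raters each.
  scores = np.array([[4, 4, 5],
                     [2, 2, 3],
                     [5, 5, 5],
                     [3, 3, 3],
                     [1, 2, 1]], dtype=float)
  print(round(icc_2_1(scores), 2))  # 0.91 for these illustrative numbers

Averaging across raters, as in ICC(2,k), would yield higher values; the abstract does not specify which form underlies the coefficients reported here.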

Results

Three hundred and ninety-one simulation encounters were analysed. Senior trainees and staff significantly outperformed junior trainees (P = 0.04 and P < 0.001, respectively). The effect size of grade (junior trainee vs senior trainee vs staff) on performance was classified as “medium” (partial η² = 0.06). Performance deficits were observed across all grades of anesthesiologist, particularly in two of the scenarios.
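
To make the effect-size figure concrete (again a minimal sketch with invented scores, not the study data): for a between-subjects factor, partial η² is the ratio SS_effect / (SS_effect + SS_error), and Cohen’s conventional benchmarks treat roughly 0.01 as small, 0.06 as medium, and 0.14 as large, which is why partial η² = 0.06 reads as a “medium” effect:

  import numpy as np

  # Hypothetical checklist scores for three grades of anesthesiologist
  junior = np.array([47.0, 54, 65, 71, 58])
  senior = np.array([50.0, 58, 69, 74, 59])
  staff = np.array([53.0, 61, 71, 76, 64])
  groups = [junior, senior, staff]

  grand = np.concatenate(groups).mean()
  ss_effect = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)  # between groups
  ss_error = sum(((g - g.mean()) ** 2).sum() for g in groups)        # within groups

  # For a one-way layout, partial eta squared equals plain eta squared.
  partial_eta_sq = ss_effect / (ss_effect + ss_error)
  print(round(partial_eta_sq, 2))  # 0.08 here: "medium" by Cohen's benchmarks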

Conclusions

This study supports our simulation-based anesthesiologist assessment tools across several domains of validity. We also describe residual challenges to the validity of these tools, offer some notes of caution regarding the intended consequences of their use, and identify opportunities for further research.


Notes

Acknowledgements

Neil Cowie MD, site lead investigator, Department of Anesthesia, University of Saskatchewan, Saskatoon, Canada. The MEPA Collaborators: Christopher Marsh, Department of Anesthesia, Royal United Hospital, Bath, Somerset, UK; David Heather, Department of Anesthesia, Middlemore Hospital, Auckland, New Zealand; Vesna Colovic, Department of Anesthesia, Royal Manchester Children’s Hospital, Manchester, UK; Zsuzsanna Kulcsar, Department of Anesthesia, Royal Manchester Children’s Hospital, Manchester, UK; Riley Boyle, Department of Anesthesia, Stollery Children’s Hospital, Edmonton, Canada.

Conflict of interest

The authors have no conflicts of interest to declare. TE sits on a committee responsible for implementing the Canadian National Anesthesia Simulation Curriculum (CanNASC). Portions of this study were presented at the International Meeting on Simulation in Healthcare (2016), Orlando, Florida, USA, and at the Canadian Anesthesiologists’ Society annual meeting (2016), Vancouver, British Columbia, Canada.

Editorial responsibility

This submission was handled by Dr. Steven Backman, Associate Editor, Canadian Journal of Anesthesia.

Author contributions

Tobias C. Everett and M. Dylan Bould were involved in study conception and design, data cleaning, statistical analysis, and manuscript drafting. Tobias C. Everett, Ralph J. McKinnon, Elaine Ng, Pradeep Kulkarni, Bruno C. R. Borges, Michael Letal, Melinda Fleming, and M. Dylan Bould were involved in data acquisition, and in critical review and editing of the manuscript.

Funding

This work was supported by: Canadian Anesthesiologists Society (Toronto, Canada) (CAS-2011-060); Royal College of Physicians and Surgeons of Canada Medical Education Research Grant (Ottawa, Canada) (2011MERG); Academy for Innovation in Medical Education (Ottawa, Canada); and Department of Anesthesia of the University of Ottawa (Canada) (#9680).

Supplementary material

12630_2019_1488_MOESM1_ESM.pdf — eTable: REB approval institutions and numbers (PDF, 59 KB)


Copyright information

© Canadian Anesthesiologists' Society 2019

Authors and Affiliations

  • Tobias C. Everett (1) (corresponding author)
  • Ralph J. McKinnon (2)
  • Elaine Ng (1)
  • Pradeep Kulkarni (3)
  • Bruno C. R. Borges (4)
  • Michael Letal (5)
  • Melinda Fleming (6)
  • M. Dylan Bould (7)
  • for the MEPA Collaborators
  1. Department of Anesthesia and Pain Medicine, The Hospital for Sick Children, University of Toronto, Toronto, Canada
  2. Department of Anesthesia, Royal Manchester Children’s Hospital, Manchester, United Kingdom
  3. Department of Anesthesia, Stollery Children’s Hospital, University of Alberta, Edmonton, Canada
  4. Department of Anesthesia, McMaster Children’s Hospital, McMaster University, Hamilton, Canada
  5. Department of Anesthesia, Alberta Children’s Hospital, University of Calgary, Calgary, Canada
  6. Department of Anesthesia, Queen’s University, Kingston, Canada
  7. Department of Anesthesia, Children’s Hospital of Eastern Ontario, University of Ottawa, Ottawa, Canada
