Simulation-based assessment of anesthesiology residents’ competence: development and implementation of the Canadian National Anesthesiology Simulation Curriculum (CanNASC)
The specialty of anesthesiology will soon adopt the Competence By Design (CBD) approach to residency education developed by the Royal College of Physicians and Surgeons of Canada (RCPSC). A foundational component of CBD is frequent and contextualized assessment of trainees. In 2013, the RCPSC Anesthesiology Specialty Committee assembled a group of simulation educators, representing each of the 17 Canadian anesthesiology residency programs, to form the Canadian National Anesthesiology Simulation Curriculum (CanNASC) Task Force. The goals were to develop, implement, and evaluate a set of consensus-driven standardized mannequin-based simulation scenarios that every trainee must complete satisfactorily prior to completion of anesthesiology residency and certification. Curriculum development followed Kern’s principles and was accomplished via monthly teleconferences and annual face-to-face meetings. The development and implementation processes included the following key elements: 1) Curriculum needs assessment: 368 of 958 invitees (38.4%) responded to a national survey, resulting in 64 suggested scenario topics. Use of a modified Delphi technique yielded seven important and technically feasible scenarios. 2) Scenario development: All scenarios have learning objectives drawn from the National Curriculum for Canadian Anesthesiology Residency. Standardized scenario templates were created, and the content was refined and piloted. 3) Assessment: A validated Global Rating Scale (GRS) is the primary assessment tool, informed by scenario-specific checklists (created via a modified Delphi technique) and by the Anesthesia Non-Technical Skills GRS. 4) Implementation: Standardized implementation guidelines, pre-brief/debrief documents, rater training videos, a rater training guide, and commentary were generated. National implementation of the scenarios and program evaluation are currently underway.
It is highly feasible to achieve specialty-based consensus on the elements of a national simulation-based curriculum. Our process could be adapted by any specialty interested in implementing a simulation-based curriculum incorporating competency-based assessment on a national scale.
Keywords: Residency Program; Global Rating Scale; Residency Education; Task Force Member; Anesthesiology Resident
The Royal College of Physicians and Surgeons of Canada (RCPSC) is shifting to a competency-based medical education (CBME) and assessment scheme termed Competence By Design (CBD). A recent review article and accompanying editorial in the Journal provide a comprehensive overview of the rationale underlying this change as well as the potential opportunities and challenges associated with CBD in anesthesiology.1,2 The specialty of anesthesiology will adopt CBD in residency beginning in July 2017. The RCPSC Fundamental Innovations in Residency Education mechanism has allowed two CBD programs to launch in advance of the RCPSC schedule, namely, University of Ottawa3 in July 2015 and Dalhousie University in July 2016.
In Canada, anesthesiology residents in all 17 programs have access to simulation-based training, but the level of exposure varies by program. In 2010, the RCPSC introduced simulation-assisted oral examinations.7 Resource limitations currently prohibit the incorporation of mannequin-based stations as a component of the final RCPSC examination. As such, an alternative paradigm for integrating simulation into the evolving curriculum and assessment of competence is required.
In 2013, the RCPSC Anesthesiology Specialty Committee assembled a group of simulation educators, representing each of the 17 residency programs, to form the Canadian National Anesthesiology Simulation Curriculum (CanNASC) Task Force (membership listed in the Appendix). The overall goals of the Task Force are to develop, implement, and evaluate a set of consensus-driven standardized simulation scenarios (mannequin-based) that every trainee must complete satisfactorily prior to completion of anesthesiology residency and certification. This paper describes the CanNASC development and implementation processes.
Canadian National Anesthesiology Simulation Curriculum development
The Ottawa Health Science Network Research Ethics Board waived the requirement for review of this project. Curriculum development was based on the principles described by Kern.8 Monthly teleconferences, twice yearly face-to-face meetings, and an engaged Task Force facilitated curriculum development and implementation. A description of some of the key elements of this project follows.
Simulation resource survey
A national survey of simulation resources was conducted amongst Task Force members to assess the feasibility of implementation across the country. Survey results confirmed that all programs had sufficient personnel and equipment to implement a standardized mannequin-based simulation curriculum.
Needs assessment for curriculum content
A literature search was first conducted to identify topics taught in existing anesthesiology simulation curricula.9,10 A needs assessment survey was then developed to identify clinical events, the management of which would be crucial to competence as a consultant anesthesiologist in Canada. The survey was subsequently distributed via FluidSurveys™ to every Canadian resident trainee, program director, simulation instructor, and residency program committee member. Three hundred sixty-eight of 958 invitees responded (38.4%), resulting in 64 unique suggested scenario topics.
Task Force members reached consensus on curriculum content using a modified Delphi technique11 focusing on two criteria: 1) The topic addressed a gap in the anesthesia training program (defined as a subject that is considered important but is suboptimally taught and/or assessed in the program). 2) Teaching and/or assessing the subject is best done with the use of a resource-intensive full-body mannequin. A four-point scale was used: 1 = definitely should not be included; 2 = probably should not be included; 3 = probably should be included; 4 = definitely should be included. A score of ≤ 2 was defined as non-endorsement, and topics not endorsed by ≥ 80% of the Task Force were excluded. Seven scenario topics met the cut-off criteria. The Task Force began with the top five scenarios.
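The endorsement rule above can be sketched in a few lines of code. This is an illustrative sketch only, not software the Task Force used; the topic names and individual ratings below are hypothetical.

```python
# Filtering Delphi-round topics by the >= 80% endorsement rule described
# in the text: a score of <= 2 on the four-point scale counts as
# non-endorsement, and a topic is kept only when >= 80% of raters endorse it.

def endorsed(ratings, threshold=0.80):
    """Return True when >= threshold of raters scored the topic 3 or 4."""
    supporting = sum(1 for r in ratings if r >= 3)
    return supporting / len(ratings) >= threshold

# Hypothetical ratings from a 17-member Task Force round
round_votes = {
    "anaphylaxis": [4] * 15 + [3, 2],              # 16/17 endorse -> kept
    "malignant_hyperthermia": [4] * 14 + [3, 3, 2],  # 16/17 endorse -> kept
    "rare_topic": [3] * 10 + [2] * 7,              # 10/17 endorse -> excluded
}

kept = [topic for topic, votes in round_votes.items() if endorsed(votes)]
```

In a real Delphi process this filter would be reapplied over successive rounds until the remaining list stabilizes.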
Scenario development process
It was critical to standardize the delivery of each scenario to ensure all candidates experience an identical assessment scenario regardless of their centre. Thus, a standardized scenario template specifying medical background, setup and equipment needs, and a single page storyboard summary was created for each scenario. Standardized pre-brief and debrief documents were also created.
All scenarios have learning objectives grounded in the National Curriculum for Canadian Anesthesiology Residency.12 Task Force subcommittees designed the scenarios, which were then refined iteratively by the whole committee. Scenarios were piloted, modified as necessary based on analysis of the pilot, and then finally ratified.
Managing Emergencies in Pediatric Anesthesia global rating scale13 (selected anchors):
- Very poor (appears to be a novice)
- Borderline & unsatisfactory
- Borderline but satisfactory
- Excellent (appears to be highly expert)
Scenario-specific checklists were created using a modified Delphi technique, with the initial items generated from a literature review, including existing published guidelines. Items endorsed by < 80% of respondents were discarded. Four rounds were required to reach consensus for each scenario, resulting in checklists with 10-16 items. Items had to be observable performance markers that raters could specifically identify from the learners’ actions or verbal statements. Specific items on each checklist were mandatory and would lead to failure of the scenario if omitted. An item required 100% consensus by the Task Force to be designated mandatory. Checklist items were scored using a three-point scale (0 = not done; 1 = done improperly/partially, or out of sequence; 2 = done appropriately).
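The checklist scoring rule can be made concrete with a small sketch. The three-point item scale and the rule that omitting a mandatory item fails the scenario are from the text; the item names and the example performance are hypothetical.

```python
# Scoring a scenario-specific checklist as described in the text:
# 0 = not done; 1 = done improperly/partially, or out of sequence;
# 2 = done appropriately. Omitting any mandatory item fails the scenario.

MANDATORY = {"call_for_help", "give_epinephrine"}  # hypothetical mandatory items

def score_checklist(item_scores):
    """Return (total score, passed). item_scores maps item name -> 0/1/2.
    A score of 0 on any mandatory item means the scenario is failed
    regardless of the total."""
    failed_mandatory = any(item_scores.get(item, 0) == 0 for item in MANDATORY)
    total = sum(item_scores.values())
    return total, not failed_mandatory

performance = {
    "call_for_help": 2,
    "give_epinephrine": 0,  # mandatory item omitted -> scenario failed
    "secure_airway": 1,
}
total, passed = score_checklist(performance)  # total == 3, passed is False
```

Note that in CanNASC the checklist informs the GRS rating rather than producing the pass/fail decision by itself; this sketch shows only the mandatory-item logic.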
Rater training was accomplished by developing a “how-to” rater training guide and two training videos. Two experienced raters then created a commentary in the format of a transcribed rater training session discussing the scoring, thought processes, and final score given by each rater.
Canadian National Anesthesiology Simulation Curriculum implementation
Standardized elements of the CanNASC implementation guidelines and their timing:
- Pre-brief document to each participant: at the time of booking the simulation session and again 7 days prior to the session
- Faculty rater training using standardized videos and rater training guide: a minimum of 1 day prior to the session
- Standardized debriefing using the scenario-specific guide: immediately after the simulation session
- Participant electronic feedback survey: e-mailed immediately after the session, with a reminder several days later
- Implementation data entry survey: immediately after ≥ 2 raters evaluated the participants
- Faculty scenario feedback form: immediately after all scenarios were delivered
Data collection and analysis
Each program created a unique participant number that contained no personal identifiers and entered the results into a FluidSurveys form. The key linking identification numbers to trainee names was kept locally and not shared nationally; thus, the master database was anonymized.
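The scheme above, in which a local key maps trainees to non-identifying participant numbers and only the de-identified number leaves the site, can be sketched as follows. The names, the site code, and the ID format are hypothetical; only the local-key/shared-ID split reflects the text.

```python
# Pseudonymization sketch: each program keeps a local key file linking
# trainee names to random study IDs; only the study ID is entered into
# the national database, so the master dataset is anonymized.

import secrets

local_key = {}  # trainee name -> study ID; stored locally, never shared

def study_id_for(trainee_name, site_code):
    """Return a stable, non-identifying participant number for a trainee."""
    if trainee_name not in local_key:
        local_key[trainee_name] = f"{site_code}-{secrets.token_hex(4)}"
    return local_key[trainee_name]

pid = study_id_for("Dr. Example", "OTT")           # hypothetical name/site
assert study_id_for("Dr. Example", "OTT") == pid   # stable across lookups
assert "Example" not in pid                        # no personal identifier
```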
To ensure standardized delivery of the scenario and to collect feedback for continuing quality improvement, data were collected on participant performance and key processes, including the use of the pre-brief document and debriefing guide, incidence of scenario technical difficulties, rater calibration, and duration of the scenario and debrief. Resident feedback was solicited.
Adherence to implementation procedures
All sites correctly delivered the pre-brief document and used the debriefing guide, and 92% encountered no technical difficulties. Of the nine instances of technical difficulties, six were due to equipment, two were due to procedure timing, and one was a facility issue (fire alarm). All sites conducted rater calibration with a minimum of two raters at each site. The mean (SD) scenario and debrief durations were 11.9 (1.7) min and 17.3 (7.8) min, respectively.
One hundred fourteen residents (89 postgraduate year [PGY]-5 and 25 PGY-4) participated. The overall pass rate, defined as a MEPA GRS score ≥ 4, was 79%. The pass rate among PGY-5 residents was significantly higher than for PGY-4 residents (83% vs 64% pass rate, respectively; P < 0.04). This provides evidence for the discriminatory validity of our tools. Residents were given immediate formative feedback during debriefing. Reasons for failing the scenario were collected and fed back to each program to aid in planning future educational initiatives. Consistent with CBD principles, those who failed were allowed to remediate, as achievement of competence is the ultimate goal of CBD.
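The reported comparison can be checked back-of-the-envelope. The pass counts below (74/89 PGY-5, 16/25 PGY-4) are inferred from the reported percentages, and a two-proportion z-test stands in for whatever test the authors actually used; this is a sketch, not their analysis.

```python
# Two-proportion z-test on the inferred pass counts, in pure Python.

import math

def two_proportion_z(x1, n1, x2, n2):
    """Return (z statistic, two-sided p-value) for comparing x1/n1 vs x2/n2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via erf
    p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_two_sided

z, p = two_proportion_z(74, 89, 16, 25)  # 83% vs 64%
# z is about 2.08 and p is about 0.038, consistent with the reported P < 0.04
```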
Resident feedback on the CanNASC simulation session, reported as % agreement, defined as a score ≥ 4 on a five-point Likert scale (1 = strongly disagree; 5 = strongly agree):
- The information received prior to the session was adequate
- The simulator responded like a real patient
- The simulation team behaved in an appropriate and believable manner during the scenario
- The scenario prompted realistic responses from me, like treating a real patient
- The scenario was realistic
- The scenario was relevant to my clinical practice
- The debriefing session clarified important issues of the scenario
- The debriefing session enhanced my knowledge
- The session increased my confidence in treating patients when a crisis occurs
- Simulation should be used as one of several assessment modalities during my residency
- Simulation is an appropriate tool to assess management of the anesthesia topic covered in THIS session
The shift to CBME demands increased frequency and variety of valid and reliable assessments. It is feasible to achieve consensus on the elements of a national simulation curriculum, including assessment, for anesthesiology trainees.
The CanNASC scenarios have been successfully implemented in every residency program in Canada, with a universal commitment to increase the provision of scenarios even further in the following academic years. Collection of data on this scale from standardized content provides evidence with which to support (or refute) the reliability and validity of the process. Program evaluation and quality improvement frameworks have been built into the project. Additionally, data on performance will exist not only for individual trainees but also for training programs, which can help assure universities, regulators, and the public that the quality of training and certification is uniformly high across the country. The RCPSC Specialty Committee considers CanNASC a useful assessment modality for high-risk, low-frequency clinical events and is discussing the inclusion of CanNASC within the assessment blueprint for CBD residency training.
While initially designed for education and assessment during residency, there has also been great interest among continuing education (CE) providers to use these scenarios to meet the demands for simulation-based CE programs. Furthermore, the experience accumulated with developing the CanNASC scenarios may help form a basis for the evolution of maintenance of certification processes for practicing anesthesiologists.
The RCPSC approach to the future of medical education is clear: “CBD is a multi-year program to implement a CBME approach to residency education and specialty practice in Canada…”.15 This initiative requires major changes to all residency education programs, including an increased quantity and diversity in assessment modalities during specialty training. The CanNASC has been implemented at each program site and allows nationwide comparisons and performance benchmarking. Our development and implementation processes could be adapted or adopted by any specialty interested in implementing a simulation-based curriculum incorporating competency-based assessment on a national scale.
The authors sincerely thank the former members of the CanNASC Task Force for their contributions to this initiative: Natalie Buu, Rich Cherry, Gilles Chiniara, Keith Drader, Natalie Dupuis, Viren Naik, and Narendra Vakharia. M. Chiu was supported by the Ottawa Hospital Anesthesia Alternate Funds Association. We also acknowledge the support provided by Susan Brien and Kimberley Ross of the RCPSC Practice Performance and Innovation Unit and the in-kind support provided to many members of the Task Force by the Association of Canadian University Departments of Anesthesiology. We dedicate this publication to the memory of Neil Cowie, one of the founding members of the Task Force.
M Chiu was supported by The Ottawa Hospital Anesthesia Alternate Funds Association and is a Simulation Educator with the Royal College of Physicians and Surgeons of Canada (RCPSC). G Peachey, TL Bosma, and J Burjorjee received departmental support for their academic time. This project was supported by the RCPSC Practice Performance and Innovation Unit.
Conflicts of interest
There are no conflicts of interest or disclosures to declare.
Michelle Chiu, Jordan Tarshis, and Tobias Everett contributed substantially to the conception and design of the manuscript, the analysis of data, and drafting the article. Michelle Chiu, Jordan Tarshis, Tobias Everett, Andreas Antoniou, T. Laine Bosma, Jessica E. Burjorjee, Neil Cowie, Simone Crooks, Kate Doyle, David Dubois, Rachel Fisher, Megan Hayter, Genevieve McKinnon, Diana Noseworthy, Noel O’Regan, Greg Peachey, Arnaud Robitaille, Michael Sullivan, Marshall Tenenbein, and Marie-Helene Tremblay contributed substantially to the acquisition and interpretation of data. Andreas Antoniou, T. Laine Bosma, Jessica E. Burjorjee, Neil Cowie, Simone Crooks, Kate Doyle, David Dubois, Rachel Fisher, Megan Hayter, Genevieve McKinnon, Diana Noseworthy, Noel O’Regan, Greg Peachey, Arnaud Robitaille, Michael Sullivan, Marshall Tenenbein, and Marie-Helene Tremblay also contributed to the analysis of the data.
This submission was handled by Dr. Steven Backman, Associate Editor, Canadian Journal of Anesthesia.
- 8. Kern DE, Thomas PA, Hughes MT. Curriculum Development for Medical Education: A Six-Step Approach. 2nd ed. Baltimore, MD: The Johns Hopkins University Press; 2009.
- 13. Everett TC, Ng E, Power D, et al. The Managing Emergencies in Paediatric Anaesthesia global rating scale is a reliable tool for simulation-based assessment in pediatric anesthesia crisis management. Pediatr Anesth 2013; 23: 1117-23.
- 15. Royal College of Physicians and Surgeons of Canada. Available from URL: http://www.royalcollege.ca/rcsite/cbd/cbd-tools-resources-e (accessed July 2015).