Abstract
The specialty of anesthesiology will soon adopt the Competence By Design (CBD) approach to residency education developed by the Royal College of Physicians and Surgeons of Canada (RCPSC). A foundational component of CBD is frequent and contextualized assessment of trainees. In 2013, the RCPSC Anesthesiology Specialty Committee assembled a group of simulation educators, representing each of the 17 Canadian anesthesiology residency programs, to form the Canadian National Anesthesiology Simulation Curriculum (CanNASC) Task Force. The goals were to develop, implement, and evaluate a set of consensus-driven, standardized, mannequin-based simulation scenarios that every trainee must complete satisfactorily prior to completion of anesthesiology residency and certification. Curriculum development followed Kern’s principles and was accomplished via monthly teleconferences and annual face-to-face meetings. The development and implementation processes included the following key elements: 1) Curriculum needs assessment: 368 of 958 invitees (38.4%) responded to a national survey, yielding 64 suggested scenario topics. A modified Delphi technique narrowed these to seven important and technically feasible scenarios. 2) Scenario development: All scenarios have learning objectives drawn from the National Curriculum for Canadian Anesthesiology Residency. Standardized scenario templates were created, and the content was refined and piloted. 3) Assessment: A validated Global Rating Scale (GRS) is the primary assessment tool, informed by scenario-specific checklists (created via a modified Delphi technique) and the Anesthesia Non-Technical Skills GRS. 4) Implementation: Standardized implementation guidelines, pre-brief/debrief documents, rater training videos, a rater training guide, and an accompanying commentary were generated. National implementation of the scenarios and program evaluation are currently underway. It is highly feasible to achieve specialty-based consensus on the elements of a national simulation-based curriculum. Our process could be adapted by any specialty interested in implementing a simulation-based curriculum incorporating competency-based assessment on a national scale.
Background
The Royal College of Physicians and Surgeons of Canada (RCPSC) is shifting to a competency-based medical education (CBME) and assessment scheme termed Competence By Design (CBD). A recent review article and accompanying editorial in the Journal provide a comprehensive overview of the rationale underlying this change as well as the potential opportunities and challenges associated with CBD in anesthesiology.1,2 The specialty of anesthesiology will adopt CBD in residency beginning in July 2017. The RCPSC Fundamental Innovations in Residency Education mechanism has allowed two CBD programs to launch in advance of the RCPSC schedule, namely, University of Ottawa3 in July 2015 and Dalhousie University in July 2016.
One of the foundational components of CBME is the frequent and contextualized assessment of trainees.4 Miller describes the attainment of competency via a staged progression from “knows” and “knows how” to “shows how” and “does”.5 Different methods of assessment can be matched to the various stages of competency (Figure). Simulation scenarios afford learners deliberate practice of crisis management at no risk to patient safety or quality of care and, if designed for evaluation, permit assessment at the “shows how” level.6
Figure: Miller’s pyramid of competence matched to assessment methods (based on: Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990; 65(9 Suppl): S63-7).5
In Canada, anesthesiology residents in all 17 programs have access to simulation-based training, but level of exposure varies by program. In 2010, the RCPSC introduced simulation-assisted oral examinations.7 Resource limitations currently prohibit the incorporation of mannequin-based stations as a component of the final RCPSC examination. As such, an alternative paradigm for integrating simulation into the evolving curriculum and assessment of competence is required.
In 2013, the RCPSC Anesthesiology Specialty Committee assembled a group of simulation educators, representing each of the 17 residency programs, to form the Canadian National Anesthesiology Simulation Curriculum (CanNASC) Task Force (membership listed in the Appendix). The overall goals of the Task Force are to develop, implement, and evaluate a set of consensus-driven standardized simulation scenarios (mannequin-based) that every trainee must complete satisfactorily prior to completion of anesthesiology residency and certification. This paper describes the CanNASC development and implementation processes.
Canadian National Anesthesiology Simulation Curriculum development
The Ottawa Health Science Network Research Ethics Board waived the requirement for review of this project. Curriculum development was based on the principles described by Kern.8 Monthly teleconferences, twice yearly face-to-face meetings, and an engaged Task Force facilitated curriculum development and implementation. A description of some of the key elements of this project follows.
Simulation resource survey
A national survey of simulation resources was conducted amongst Task Force members to assess the feasibility of implementation across the country. Survey results confirmed that all programs had sufficient personnel and equipment to implement a standardized mannequin-based simulation curriculum.
Needs assessment for curriculum content
A literature search was first conducted to identify topics taught in existing anesthesiology simulation curricula.9,10 A needs assessment survey was then developed to identify clinical events, the management of which would be crucial to competence as a consultant anesthesiologist in Canada. The survey was subsequently distributed via FluidSurveys™ to every Canadian resident trainee, program director, simulation instructor, and residency program committee member. Three hundred sixty-eight of 958 invitees responded (38.4%), resulting in 64 unique suggested scenario topics.
Task Force members reached consensus on curriculum content using a modified Delphi technique11 focusing on two criteria: 1) The topic addressed a gap in the anesthesia training program (defined as a subject that is considered important but is suboptimally taught and/or assessed in the program). 2) Teaching and/or assessing the subject is best done with the use of a resource-intensive full-body mannequin. A four-point scale was used: 1 = definitely should not be included; 2 = probably should not be included; 3 = probably should be included; 4 = definitely should be included. Endorsement was defined as a score of 3 or 4; topics endorsed by < 80% of the Task Force were excluded. Seven scenario topics met the cut-off criteria, and the Task Force began with the top five.
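As a concrete illustration of this filtering step, the minimal sketch below applies the four-point scale and the 80% endorsement threshold to one hypothetical round of votes; the topic names and vote tallies are invented for illustration and do not reflect actual CanNASC data.

```python
# A minimal sketch of the Delphi endorsement filter; topic names and vote
# tallies are hypothetical. The 4-point scale and 80% threshold follow the
# process described in the text.

ENDORSED = {3, 4}  # "probably" or "definitely should be included"

def endorsement_rate(votes):
    """Return the fraction of Task Force members endorsing a topic."""
    return sum(v in ENDORSED for v in votes) / len(votes)

# One hypothetical round of votes from a 17-member Task Force.
round_votes = {
    "Topic A": [4, 4, 3, 4, 4, 3, 4, 4, 4, 3, 4, 4, 4, 3, 4, 4, 4],
    "Topic B": [2, 1, 3, 2, 2, 1, 2, 3, 2, 2, 1, 2, 2, 3, 2, 2, 2],
}

retained = {
    topic: endorsement_rate(votes)
    for topic, votes in round_votes.items()
    if endorsement_rate(votes) >= 0.80
}
print(retained)  # only topics endorsed by >= 80% advance to the next round
```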
Scenario development process
It was critical to standardize the delivery of each scenario to ensure all candidates experience an identical assessment scenario regardless of their centre. Thus, a standardized scenario template specifying medical background, setup and equipment needs, and a single page storyboard summary was created for each scenario. Standardized pre-brief and debrief documents were also created.
All scenarios have learning objectives grounded in the National Curriculum for Canadian Anesthesiology Residency.12 Task Force subcommittees designed the scenarios, which were then refined iteratively by the whole committee. Scenarios were piloted, modified as necessary based on analysis of the pilot, and then finally ratified.
Assessment strategy
Concurrently with scenario development, the Task Force discussed the assessment strategy. The goal was a simple, reliable, and valid criterion-based assessment system from which competence could be inferred. Ultimately, it was decided to use the Managing Emergencies in Pediatric Anesthesia Global Rating Scale (MEPA GRS) as the primary tool for assessment of competence (Table 1). The MEPA GRS has robust evidence to support its validity in a similar context.13 Scenario-specific checklists and the Anesthesia Non-Technical Skills (ANTS) score14 were used to help raters decide on the final MEPA GRS score and to aid in debriefing the scenario. The checklists and ANTS were not directly used in the final assessment but helped guide rater judgement. Assessment, by at least two raters, could be conducted live or via video review at the discretion of the program.
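To make the structure of this strategy concrete, the sketch below models one candidate’s assessment record with the MEPA GRS as the score of record and the checklist and ANTS observations as supporting context only. The field names, scale values, and consensus rule are assumptions made for illustration; the paper prescribes no data format.

```python
# Hypothetical record structure for one candidate's assessment under the
# strategy above: the MEPA GRS is the score of record; checklist and ANTS
# observations inform but do not replace it.

from dataclasses import dataclass
from typing import Optional

PASS_THRESHOLD = 4  # a final MEPA GRS score of >= 4 was counted as a pass

@dataclass
class ScenarioAssessment:
    candidate_id: str
    scenario: str
    grs_by_rater: dict          # rater -> MEPA GRS score (>= 2 raters required)
    checklist_points: int       # supporting information for the raters
    ants_by_category: dict      # ANTS category -> rating; supporting only
    final_grs: Optional[int] = None  # consensus score agreed upon by the raters

    def passed(self) -> bool:
        if self.final_grs is None:
            raise ValueError("raters have not yet agreed on a final GRS score")
        return self.final_grs >= PASS_THRESHOLD

record = ScenarioAssessment(
    candidate_id="P01-3f9c2a1b",
    scenario="CanNASC scenario 1",
    grs_by_rater={"rater_1": 4, "rater_2": 5},
    checklist_points=24,
    ants_by_category={"task management": 3, "team working": 4},
    final_grs=4,
)
print(record.passed())  # True
```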
Scenario-specific checklists were created using a modified Delphi technique, with the initial items generated from a literature review, including existing published guidelines. Items endorsed by < 80% of respondents were discarded. Four rounds were required to reach consensus for each scenario, resulting in checklists with 10-16 items. Items had to be observable performance markers that raters could specifically identify from the learners’ actions or verbal statements. Specific items on each checklist were mandatory and would lead to failure of the scenario if omitted. An item required 100% consensus by the Task Force to be designated mandatory. Checklist items were scored using a three-point scale (0 = not done; 1 = done improperly/partially, or out of sequence; 2 = done appropriately).
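The checklist logic described above lends itself to a simple scoring sketch. The items and summary format below are hypothetical; in practice, the checklist summary informed the raters’ MEPA GRS judgement rather than producing a score of record.

```python
# A minimal sketch of checklist scoring, assuming hypothetical items; the
# 3-point item scale and the mandatory-item rule follow the text above.

from dataclasses import dataclass

@dataclass
class ChecklistItem:
    description: str
    mandatory: bool  # omitting a mandatory item fails the scenario
    score: int       # 0 = not done; 1 = improper/partial/out of sequence; 2 = appropriate

def summarize(items):
    omitted_mandatory = [i.description for i in items if i.mandatory and i.score == 0]
    return {
        "points": sum(i.score for i in items),
        "max_points": 2 * len(items),
        "omitted_mandatory": omitted_mandatory,
        "automatic_fail": bool(omitted_mandatory),
    }

# Hypothetical three-item excerpt (real checklists contain 10-16 items).
checklist = [
    ChecklistItem("Calls for help", mandatory=True, score=2),
    ChecklistItem("Verbalizes the diagnosis", mandatory=True, score=0),
    ChecklistItem("Communicates plan to the team", mandatory=False, score=1),
]
print(summarize(checklist))
# {'points': 3, 'max_points': 6,
#  'omitted_mandatory': ['Verbalizes the diagnosis'], 'automatic_fail': True}
```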
Rater training was accomplished by developing a “how-to” rater training guide and two training videos. Two experienced raters then created a commentary in the format of a transcribed rater training session discussing the scoring, thought processes, and final score given by each rater.
Canadian National Anesthesiology Simulation Curriculum implementation
All scenarios, storyboards, pre-brief/debrief documents, assessment rubrics, and data collection surveys were translated into both official languages. Standardized implementation guidelines, composed of the elements described in Table 2, were developed. National rollout of the first CanNASC scenario occurred during the 2014-2015 academic year at all 17 programs in Canada. Each program controlled the timing, location, and local resources involved in delivering the scenario. The target cohort was senior residents, corresponding to the Core or Transition to Practice stages of the new CBD curriculum.
Data collection and analysis
Each program created a unique participant number for each trainee that contained no personal identifiers and entered the results into a FluidSurveys form. The key linking identification numbers to trainee names was kept locally and not shared nationally; thus, the master database was anonymized.
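A minimal sketch of this de-identification scheme follows. The program code and ID format are assumptions, and a local CSV file stands in for the FluidSurveys form.

```python
# A minimal sketch of the de-identification scheme: the name-to-number key
# stays local; only anonymized numbers travel to the national database.

import csv
import secrets

def assign_participant_number(program_code):
    """Generate a participant number containing no personal identifiers."""
    return f"{program_code}-{secrets.token_hex(4)}"

# The local key is kept at the program site and is never shared nationally.
local_key = {name: assign_participant_number("P01")
             for name in ["Resident A", "Resident B"]}

# Only anonymized numbers and results are submitted nationally.
with open("national_submission.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["participant_number", "mepa_grs", "pass"])
    for number in local_key.values():
        writer.writerow([number, "", ""])  # scores completed by the raters
```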
To ensure standardized delivery of the scenario and to collect feedback for continuing quality improvement, data were collected on participant performance and key processes, including the use of the pre-brief document and debriefing guide, incidence of scenario technical difficulties, rater calibration, and duration of the scenario and debrief. Resident feedback was solicited.
Adherence to implementation procedures
All sites correctly delivered the pre-brief document and used the debriefing guide, and 92% of sessions proceeded without technical difficulty. Of the nine instances of technical difficulty, six were due to equipment, two to procedure timing, and one to a facility issue (fire alarm). All sites conducted rater calibration, with a minimum of two raters at each site. The mean (SD) scenario and debrief durations were 11.9 (1.7) min and 17.3 (7.8) min, respectively.
Resident performance
One hundred fourteen residents (89 postgraduate year [PGY]-5 and 25 PGY-4) participated. The overall pass rate, defined as a MEPA GRS score ≥ 4, was 79%. The pass rate among PGY-5 residents was significantly higher than among PGY-4 residents (83% vs 64%, respectively; P < 0.04), providing evidence for the discriminative validity of our tools. Residents were given immediate formative feedback during debriefing. Reasons for failing the scenario were collected and fed back to each program to aid in planning future educational initiatives. Consistent with CBD principles, residents who failed were allowed to remediate, as achievement of competence is the ultimate goal of CBD.
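For readers interested in the underlying arithmetic, the sketch below reconstructs the pass-rate comparison from the reported percentages (83% of 89 ≈ 74 passes; 64% of 25 = 16 passes). The paper does not name its statistical test, so an uncorrected chi-square test of independence is shown as one plausible choice.

```python
# Reconstructing the PGY-5 vs PGY-4 pass-rate comparison; counts are derived
# from the reported percentages, and the choice of test is an assumption.

from scipy.stats import chi2_contingency

#           pass  fail
table = [[74, 15],   # PGY-5: 74/89 ~ 83%
         [16,  9]]   # PGY-4: 16/25 = 64%

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # p ~ 0.04, consistent with the report
```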
Resident feedback
One hundred eight residents completed the feedback surveys (95% response rate). They were asked to indicate their level of agreement with a series of statements using a five-point Likert scale (1 = strongly disagree; 5 = strongly agree) (Table 3). While the primary purpose of the sessions was performance evaluation, the residents also valued the educational component of the debriefing. Residents considered the simulation team and scenario realistic and the scenario relevant to their clinical practice. Seventy-seven percent of respondents supported the concept that simulation should be used as an assessment modality during residency.
Future directions
The shift to CBME demands increased frequency and variety of valid and reliable assessments. It is feasible to achieve consensus on the elements of a national simulation curriculum, including assessment, for anesthesiology trainees.
The CanNASC scenarios have been successfully implemented in every residency program in Canada, with a universal commitment to increase the provision of scenarios further in subsequent academic years. Collecting data on this scale from standardized content provides evidence with which to support (or refute) the reliability and validity of the process. Program evaluation and quality improvement frameworks have been built into the project. Additionally, performance data will exist not only for individual trainees but also for training programs, which can help assure universities, regulators, and the public that the quality of training and certification is uniformly high across the country. The RCPSC Specialty Committee considers CanNASC a useful assessment modality for high-risk, low-frequency clinical events and is discussing the inclusion of CanNASC within the assessment blueprint for CBD residency training.
While initially designed for education and assessment during residency, there has also been great interest among continuing education (CE) providers to use these scenarios to meet the demands for simulation-based CE programs. Furthermore, the experience accumulated with developing the CanNASC scenarios may help form a basis for the evolution of maintenance of certification processes for practicing anesthesiologists.
Conclusions
The RCPSC approach to the future of medical education is clear: “CBD is a multi-year program to implement a CBME approach to residency education and specialty practice in Canada…”.15 This initiative requires major changes to all residency education programs, including an increased quantity and diversity in assessment modalities during specialty training. The CanNASC has been implemented at each program site and allows nationwide comparisons and performance benchmarking. Our development and implementation processes could be adapted or adopted by any specialty interested in implementing a simulation-based curriculum incorporating competency-based assessment on a national scale.
References
1. Fraser AB, Stodel EJ, Chaput AJ. Curriculum reform for residency training: competence, change, and opportunities for leadership. Can J Anesth 2016; 63: 875-84.
2. Levine MF, Shorten G. Competency-based medical education: its time has arrived. Can J Anesth 2016; 63: 802-6.
3. Stodel E, Wyand A, Crooks S, Moffett S, Chiu M, Hudson CC. Designing and implementing a competency-based training program for anesthesiology residents at the University of Ottawa. Anesthesiol Res Pract 2015; 2015: 713038.
4. Carraccio C, Englander R, Van Melle E, et al. Advancing competency-based medical education: a charter for clinician-educators. Acad Med 2015; 91: 645-9.
5. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990; 65(9 Suppl): S63-7.
6. Naik VN, Brien SE. Review article: Simulation: a means to address and improve patient safety. Can J Anesth 2013; 60: 192-200.
7. Blew P, Muir JG, Naik VN. The evolving Royal College examination in anesthesiology. Can J Anesth 2010; 57: 804-10.
8. Kern DE, Thomas PA, Hughes MT. Curriculum Development for Medical Education: A Six-Step Approach. 2nd ed. Baltimore, MD: The Johns Hopkins University Press; 2009.
9. Weller J, Morris R, Watterson L, et al. Effective management of anaesthetic crises: development and evaluation of a college-accredited simulation-based course for anaesthesia education in Australia and New Zealand. Simul Healthc 2006; 1: 209-14.
10. Price JW, Price JR, Pratt DD, Collins JB, McDonald J. High-fidelity simulation in anesthesiology training: a survey of Canadian anesthesiology residents’ simulator experience. Can J Anesth 2010; 57: 134-42.
11. Clayton MJ. Delphi: a technique to harness expert opinion for critical decision-making tasks in education. Educ Psychol 1997; 17: 373-86.
12. Spadafora SM, Houston P, Levine M. A national curriculum in anesthesia: rationale, development, implementation, and implications. Can J Anesth 2012; 59: 636-41.
13. Everett TC, Ng E, Power D, et al. The Managing Emergencies in Paediatric Anaesthesia global rating scale is a reliable tool for simulation-based assessment in pediatric anesthesia crisis management. Paediatr Anaesth 2013; 23: 1117-23.
14. Fletcher G, Flin R, McGeorge P, Glavin R, Maran N, Patey R. Anaesthetists’ Non-Technical Skills (ANTS): evaluation of a behavioural marker system. Br J Anaesth 2003; 90: 580-8.
15. Royal College of Physicians and Surgeons of Canada. Available from URL: http://www.royalcollege.ca/rcsite/cbd/cbd-tools-resources-e (accessed July 2015).
Acknowledgements
The authors sincerely thank the former members of the CanNASC Task Force for their contributions to this initiative: Natalie Buu, Rich Cherry, Gilles Chiniara, Keith Drader, Natalie Dupuis, Viren Naik, and Narendra Vakharia. M. Chiu was supported by the Ottawa Hospital Anesthesia Alternate Funds Association. We also acknowledge the support provided by Susan Brien and Kimberley Ross of the RCPSC Practice Performance and Innovation Unit and the in-kind support provided to many members of the Task Force by the Association of Canadian University Departments of Anesthesiology. We dedicate this publication to the memory of Neil Cowie, one of the founding members of the Task Force.
Funding
M Chiu was supported by The Ottawa Hospital Anesthesia Alternate Funds Association and is a Simulation Educator with the Royal College of Physicians and Surgeons of Canada (RCPSC). G Peachey, TL Bosma, and J Burjorjee received departmental support for their academic time. This project was supported by the RCPSC Practice Performance and Innovation Unit.
Conflicts of interest
There are no conflicts of interest or disclosures to declare.
Author contributions
Michelle Chiu, Jordan Tarshis, and Tobias Everett contributed substantially to the conception and design of the manuscript, the analysis of data, and drafting the article. Michelle Chiu, Jordan Tarshis, Tobias Everett, Andreas Antoniou, T. Laine Bosma, Jessica E. Burjorjee, Neil Cowie, Simone Crooks, Kate Doyle, David Dubois, Rachel Fisher, Megan Hayter, Genevieve McKinnon, Diana Noseworthy, Noel O’Regan, Greg Peachey, Arnaud Robitaille, Michael Sullivan, Marshall Tenenbein, and Marie-Helene Tremblay contributed substantially to the acquisition and interpretation of data. Andreas Antoniou, T. Laine Bosma, Jessica E. Burjorjee, Neil Cowie, Simone Crooks, Kate Doyle, David Dubois, Rachel Fisher, Megan Hayter, Genevieve McKinnon, Diana Noseworthy, Noel O’Regan, Greg Peachey, Arnaud Robitaille, Michael Sullivan, Marshall Tenenbein, and Marie-Helene Tremblay also contributed to the analysis of the data.
Editorial responsibility
This submission was handled by Dr. Steven Backman, Associate Editor, Canadian Journal of Anesthesia.
Appendix
Current Canadian National Anesthesiology Simulation Curriculum (CanNASC) Task Force Membership
Name | Institution
---|---
Antoniou, Andreas | Western University
Bosma, Laine | University of British Columbia
Burjorjee, Jessica | Queen’s University
Chiu, Michelle (Chair) | University of Ottawa
Crooks, Simone | University of Ottawa
Doyle, Kate | University of Alberta
Dubois, David | Université de Sherbrooke
Everett, Tobias | University of Toronto
Fisher, Rachel | McGill University
Hayter, Megan | University of Calgary
Kawchuk, Joann | University of Saskatchewan
McKinnon, Genevieve | Dalhousie University
Noseworthy, Diana | Northern Ontario School of Medicine
O’Regan, Noel | Memorial University
Peachey, Greg | McMaster University
Robitaille, Arnaud | Université de Montréal
Sullivan, Michael (RC Specialty Committee Chair) | Southlake Regional Health Centre
Tarshis, Jordan (RC Member at large) | University of Toronto
Tenenbein, Marshall | University of Manitoba
Tremblay, Marie-Helene | Université Laval
Keywords
- Residency Program
- Global Rating Scale
- Residency Education
- Task Force Member
- Anesthesiology Resident