
Place de la simulation aux examens de réanimation

Simulation in intensive care exams

  • Article Original / Original Article
Réanimation

Résumé

In France, the implementation of simulation as a method of training and assessment in healthcare remains very heterogeneous. At present, no standardized simulation-based examination is organized, or even recommended, at the national level to validate the competencies required to practise as an intensivist.

In response to a legitimate demand from society, and out of moral duty towards it, medical teachers and professionals in the various specialties must guarantee a minimum level of competence among practising physicians. Simulation makes it possible to assess, in a relevant way, medical competencies that are not explored by other assessment methods. Incorporating simulation into examinations could improve the validity of these assessments.

The main medical simulation methods available are standardized patients, mannequins, and virtual patients. The simplest methods in each category are the most extensively studied, and their feasibility makes them well suited to interregional examinations. Because of their low cost, proven validity, and high reproducibility, standardized patients, partial-task mechanical trainers, multiple-choice clinical cases, and script concordance tests would be the first-line simulation methods for examinations of intensive care trainees. The role and mode of use of more complex tools, such as high-fidelity mannequins and complex virtual patients, remain to be defined.

After a formative evaluation phase to validate the quality of the examinations, simulation could be incorporated into the assessment of future intensivists.

Abstract

The use of simulation for the education and evaluation of French medical students is still heterogeneous. To date, simulation is not recommended for the tests required to obtain certification in critical care medicine (CCM) in France. In response to a growing public demand for a safer healthcare system, teachers and medical societies must guarantee that physicians are competent and able to provide optimal care to patients.

Traditional assessment methods, such as multiple-choice exams or continuing medical education exercises, may not be appropriate to evaluate all the competencies required for excellence in medical practice. Simulation provides relevant tools for assessing specific skills not tested by other methods. Incorporating simulation into the tests required to obtain certification in CCM may improve their validity and provide society with a better guarantee of healthcare quality.

The main methods of medical simulation are standardized patients, mannequins, and virtual reality. The simplest techniques are the most strongly validated for evaluating small groups of students accurately and easily, and would be appropriate for local or interregional examinations. Due to their low cost, high validity, and great reproducibility, standardized patients, low-fidelity mannequins, multiple-choice questions, and script concordance tests should be used as first-line options. The role of more complex techniques, such as high-fidelity mannequins and complex virtual reality, remains to be determined.

We suggest that after a first step of evaluation, simulation methods may be part of the future validation tests for the certification in CCM in France.


Author information

Correspondence to C. Clec’h.


Cite this article

Clec’h, C., Préau, S. Place de la simulation aux examens de réanimation. Réanimation 23, 698–705 (2014). https://doi.org/10.1007/s13546-014-0931-8
