Introduction

The rapid evolution of medical knowledge, decreased time for medical training, and ethical concerns about patients as educational subjects have increased the complexity of medical decision making and medical training [1, 2]. The virtual patient (VP) is a computer program that simulates real-life clinical scenarios and allows learners to emulate the roles of health care providers to make clinical decisions (reviewed in [3, 4]). While VP cases are widely available online, few focus on medical students and junior residents as their target audience.

In addition to the knowledge and technical expertise medical trainees must acquire, there are intrinsic competencies with significant impact on health-care delivery and patient satisfaction [5]. In the 1990s, the Royal College of Physicians and Surgeons of Canada developed the “Canadian Medical Education Directives for Specialists” (CanMEDS) framework (reviewed in [6]). Training programs have implemented curricula to integrate the framework [6, 7]; however, there remains a paucity of literature on effective means of integrating the CanMEDS framework in medical education.

To standardize clinical exposure and improve trainees’ knowledge and perceived preparedness to manage core internal medicine cases, we developed a pool of VP cases to simulate internal medicine presentations. We used quantitative and qualitative analyses to evaluate the effectiveness of one of our VP cases. We also evaluated the role of VP cases as a tool for the integrated teaching of CanMEDS competencies.

Main text

Methods

Module design

We selected a VP case from a pool developed by physicians at the University of Toronto. Each module begins with defined learning objectives, followed by the case and guided questions. Drop-down menus provide suggested responses to questions, and discussion points highlight concepts pertinent to evidence-based medicine and key psychosocial factors. For this study, we used a case on diagnosis and management of upper gastrointestinal bleed (UGIB).
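As a minimal illustration, the elements described above (learning objectives, case narrative, guided questions with drop-down suggested responses, and discussion points) could be represented in a simple data model; the Python sketch below is hypothetical and does not describe the platform actually used in this study.

    # Illustrative sketch of a VP case module's structure; the class and field
    # names are hypothetical, not the software used in this study.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class GuidedQuestion:
        prompt: str                      # question posed to the learner
        suggested_responses: List[str]   # responses revealed via a drop-down menu
        discussion_points: List[str]     # evidence-based and psychosocial teaching points

    @dataclass
    class VPCaseModule:
        title: str
        learning_objectives: List[str]
        case_narrative: str
        questions: List[GuidedQuestion] = field(default_factory=list)

    # Hypothetical instantiation mirroring the UGIB case structure
    ugib_module = VPCaseModule(
        title="Upper gastrointestinal bleed (UGIB)",
        learning_objectives=["Develop an approach to the diagnosis of UGIB",
                             "Outline the initial management of UGIB"],
        case_narrative="A patient presents to the emergency department with melena...",
        questions=[GuidedQuestion(
            prompt="What are your initial management priorities?",
            suggested_responses=["Obtain intravenous access and begin fluid resuscitation",
                                 "Type and screen; consider transfusion"],
            discussion_points=["Evidence for restrictive transfusion thresholds in UGIB"])],
    )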

Module evaluation

We invited University of Toronto trainees to participate in VP case evaluation. Trainees completed a demographic questionnaire and rated their perceived confidence in their diagnostic and management abilities (Additional file 1: Additional materials—Questionnaires). Items were measured on a 5-point scale ranging from 1 (“poor” confidence) to 5 (“excellent” confidence). Next, we randomized trainees to complete a VP case (intervention arm) or a PowerPoint presentation (control arm). After module completion, participants completed the confidence questionnaire again, as well as a 10-item multiple-choice test assessing knowledge acquisition (Additional file 1: Additional materials—Questionnaires).

We invited trainees who completed the post-test to participate in audiotaped focus groups and provide open-ended feedback, particularly about their experiences learning the integrated non-medical-expert CanMEDS roles. We conducted two focus groups (5 participants each) with medical students and residents, using a semi-structured interview method. Each meeting lasted between 60 and 120 min. Meetings were audiotaped and transcribed.

The methods were approved by the institutional research ethics board at the University of Toronto. Written consent was obtained from all participants.

Data analysis

We analyzed responses to the multiple-choice questions with the Wilcoxon signed-rank test in Excel. Free-text responses were analyzed using grounded theory to identify common themes. Two authors (SJ and JW) independently transcribed and analyzed the focus group audio recordings using the “framework” technique (described in [8]).
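As a minimal illustration, the same non-parametric tests can be run outside of Excel; the sketch below uses Python with SciPy on hypothetical placeholder values, not the study data.

    # Illustrative sketch of the non-parametric tests used in this study, run
    # with SciPy rather than Excel; all values below are hypothetical placeholders.
    from scipy import stats

    # Hypothetical paired scores for the Wilcoxon signed-rank test
    scores_before = [4, 6, 5, 7, 6, 5, 6]
    scores_after = [6, 7, 5, 8, 7, 6, 8]
    w_stat, w_p = stats.wilcoxon(scores_before, scores_after)
    print(f"Wilcoxon signed-rank: statistic={w_stat}, P={w_p:.3f}")

    # Hypothetical between-arm confidence changes for a Mann-Whitney U test
    # (the test cited for the between-arm comparison in the Results)
    vp_arm_change = [1, 2, 1, 0, 2, 1]
    control_arm_change = [0, 1, 1, 0, 1, 0]
    u_stat, u_p = stats.mannwhitneyu(vp_arm_change, control_arm_change, alternative="two-sided")
    print(f"Mann-Whitney U: statistic={u_stat}, P={u_p:.3f}")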

Results

Quantitative analysis

A total of 52 participants completed the study. Baseline characteristics were similar between groups (Table 1). A majority had used online learning modules, but most of these were non-interactive (Table 1). The groups did differ in baseline perceived confidence: trainees randomized to the VP case reported lower confidence in their ability to diagnose and manage UGIB (Table 1).

Table 1 Participant characteristics (% of participants in each intervention group)

Table 2 summarizes participants’ post-module objective knowledge acquisition and clinical reasoning scores. Trainees performed similarly on each test question, including those pertaining to non-medical-expert CanMEDS competencies (Table 2). The change in confidence for clinical handover trended towards being higher in the VP-case arm (Mann–Whitney P = 0.051, Table 2). Overall, there were no significant differences in participants’ perceived confidence (Additional file 2: Table S1 and Additional file 3: Table S2).

Table 2 Comparing objective assessment scores (median [IQR1, IQR3])

We asked participants for feedback on the VP case. Although trainees felt the VP case and the PowerPoint presentation had similar learning value, they preferred the VP case as a learning resource (Additional file 4: Table S3 and Additional file 5: Table S4).

Qualitative analysis

To further evaluate how medical students and residents used the VP cases in their learning, especially with respect to the integration of non-medical-expert CanMEDS competencies, we organized two trainee focus groups. The baseline characteristics of these participants are shown in Additional file 6: Table S5. Seven categories emerged regarding VP cases as a learning resource and as a vehicle for integrating CanMEDS competencies. We divided each category into subcategories and allocated each comment to one of these subcategories, repeating the charting for both focus groups with the same subcategories. The interview technique was refined iteratively after the first focus group. We (JW and SJ) compared our independent analyses of the focus group comments and found them to be in agreement. The categories and subcategories of the framework technique are presented in Table 3 and below.

Table 3 Focus group categories

Category 1. Trainees are looking for practical resources beyond didactic lectures

Focus group participants emphasized the limited time available to learn a vast amount of knowledge. The transition points from pre-clerkship to clerkship, and from senior medical student to intern, were felt to be especially challenging.

I think one of the hardest parts when you are beginning is … we get inundated with so much information. We don’t know what is important and what is not.

Category 2. Learning needs differ based on level of training

Pre-clinical students were looking to gain experience with a practical approach to a clinical presentation.

I’m focusing less about the UGIB content and more about the experience – I think I took a lot out of it that way

Most junior medical students wanted repeated practice with ward skills. Senior medical students confirmed that they felt inadequately prepared, which detracted from their learning experience.

Things like handover, writing admission orders, writing a prescription, etc. - we didn’t get taught that until the week before clerkship started.

Senior medical students were looking for efficient ways to refresh their knowledge, especially in preparation for their licensing exams. They emphasized that an interactive platform would be more effective than passive review of lectures or of the scientific literature.

The consensus was that learning resources that provide concise, practical, evidence-based information would be useful to help build confidence in diagnostic and management skills. Trainees emphasized the importance of a simulation setting where they can safely practice skills without consequences for patient care.

It is nice to go through early in clerkship, to have a place to safely practice things without judgement or killing a patient.

Category 3. VP cases are a useful adjunct to didactic lectures

Compared with commonly used resources, medical students appreciated that the VP case simulated a real-world clinical scenario.

Toronto Notes is … comprehensive but a book of lists with no emphasis on what’s common, what you should prioritize. This realistic case scenario which takes you through the steps in practical terms is more useful.

Residents appreciated the teaching of practical, clinically relevant details.

I like the specifics - like doses and timelines, like 72 hours, how many milligram. At a resident level that’s what we need to know.

Participants also liked the integration of evidence-based medicine, and appreciated the user-friendly, interactive aspect of the case.

And the format you used with the dropdown menus – they gave you a chance to think about the question and then the answer was there.

Category 4. Suggestions for improvement

Trainees felt the case was lengthy and contained extraneous detail. Participants also wanted a greater level of interactivity.

Category 5. There are challenges to current approaches to CanMEDS training

Participants felt that the way CanMEDS breaks down the concept of the physician is reductionist and not realistic.

CanMEDS is trying to make an abstract thing concrete and it does not make sense. If you try to focus on communicator role in our job as a physician, it is not doing it justice. We communicate all the time, it is hard to take it out of context and isolate it.

Category 6. VPs may represent a useful tool for integrating CanMEDS

The consensus was that VP cases may be a useful resource to integrate CanMEDS roles in medical education. Trainees appreciated that they were learning CanMEDS competencies throughout the VP case without realizing it.

It was a surprise. For example, writing admission orders. Those are really useful for clerkship. Nice way to integrate it without being explicit.

They also liked that multiple competencies were covered with one concept.

I like that about [the] cases. Like handover includes…communication, collaboration…

Category 7. Simulations cannot replace real-world experience of patient care

Most trainees felt that, although VP cases provided a useful adjunct, many of the CanMEDS competencies are best achieved through real-world experience.

Discussion

Our goal in creating VP cases was to facilitate the transition from senior medical student to first-year internal medicine resident. This was based on consensus among colleagues and on studies reporting that 41–60% of medical graduates feel clinically unprepared at graduation [9,10,11]. Interactive VP cases allow medical educators to facilitate learning in an environment that does not compromise patient safety [12]. We hoped that our VP cases could complement medical curricula by helping trainees become comfortable with assessing and managing common, key presentations in a protected environment.

Our results indicated that VP cases did not significantly affect knowledge acquisition for either medical expert or non-medical expert CanMEDS topics (Table 2). This is consistent with a meta-analysis of 201 studies summarizing the effect of internet-based instruction for medical trainees [13]. Despite its modest effects on knowledge acquisition, a majority of participants enjoyed using VP cases as a resource to help them prepare for, and reinforce, clinical experiences. Trainees’ preference for VP cases, especially over traditional curriculum adjuncts, is important, as learner engagement can significantly improve the effectiveness of technology-enhanced simulation [14]. Other features of simulation-based training shown to be effective in medical education, including cognitive interactivity and repetitive practice [13], were aspects of our VP case that study participants appreciated. It is possible that trainees perceive VP cases as a useful resource because their learning can be customized to their actions within the case, resulting in unique learning trajectories. For example, in our study junior trainees focused on learning an approach to the consultation process, whereas senior trainees reviewed their medical knowledge. In addition, junior trainees concentrated on non-medical expert CanMEDS competencies, such as writing admission orders, whereas senior trainees enjoyed learning about evidence-based medicine. Based on current adult learning theories, such personalized and interactive instruction methods may be more powerful and efficient than didactic education [15].

Although trainees agreed that non-medical expert CanMEDS roles are important, they consistently expressed dissatisfaction with existing CanMEDS curricula, finding the approaches reductionist and artificial. Trainees appreciated incorporation of CanMEDS topics in the VP case, and especially that multiple CanMEDS competencies were introduced without disrupting the flow of the case. VP cases may provide an exciting new arena where CanMEDS competencies can be introduced or reinforced.

There are several advantages to integrating VP cases in medical education, including cost benefits, cases that closely match real-life situations, the ability to create collections of similar cases, seamless integration of CanMEDS competency training, and the ability to create VP cases for the presentations with which trainees should ideally gain competence. Future work will concentrate on enhancing cases based on this feedback and on developing cases that provide real-time feedback or introduce different challenges based on the trainee’s level. We hope to create a larger pool of cases to standardize trainees’ exposure to common and atypical internal medicine presentations.

Limitations

Our analysis is limited by a small sample size and selective participation. Another limitation is the use of self-assessment to evaluate changes in knowledge and confidence in managing UGIB, although analysis of objective knowledge scores corroborated the subjective reports. Lastly, our study was limited to trainees at the University of Toronto and extrapolated from a single VP case. It would be interesting to evaluate whether our findings are reproducible with different VP cases and at other medical training programs.