Dear Editor,

Although they care for many hospitalized patients with difficult venous access (DVA), internal medicine (IM) residents do not routinely receive formal training in ultrasound-guided peripheral intravenous catheter (USG-PIV) placement [1]. USG-PIV is an effective method to establish vascular access and reduces the use of advanced catheters among DVA patients [2]. Feasible strategies to train IM residents in USG-PIV placement are needed.

In a 2013 survey of IM residency program leadership, the point-of-care ultrasound (POCUS) applications identified as most useful were procedurally based; however, only 25% of respondents reported offering formal POCUS curricula to their residents [3]. The lack of IM resident training in procedural POCUS applications, such as USG-PIV, was further illustrated in a 2019 systematic review of published USG-PIV training curricula. While 16 of the 23 studies described training of emergency medicine (EM) physicians and other emergency department (ED) staff, none described the training of IM physicians [1].

Training in USG-PIV is well described in the ED, as patients with DVA comprise up to one-third of ED patients [4]. As a result, EM residents have many opportunities to hone this skill. Since a resident-as-teacher approach has proven effective in USG-PIV training, it would be reasonable to leverage the procedural proficiency of EM residents to benefit the education of IM residents [5].

We created a resident-led, interdepartmental training program where EM residents taught USG-PIV placement to IM residents. This study sought to evaluate short-term educational outcomes of our program and to use skills acquisition outcomes to validate a USG-PIV procedural training checklist we had previously developed via the Delphi method [6].

Methods

In 2015, a USG-PIV training program was developed jointly between the EM and IM departments at our hospital with the support of both EM and IM residency program leadership. Enrollment was offered to postgraduate year (PGY)-2 and PGY-3 IM residents during the second half of each academic year, starting in January 2016 and ending in June 2018. PGY-3 and PGY-4 EM residents served as instructors for didactic components of the program. Having completed the PGY-2 EM role, which focuses on procedural skills development in the ED including USG-PIV placement, PGY-3 and PGY-4 EM residents had extensive procedural experience and were thus well qualified to teach this skill. The ED at the study site serves over 130,000 patients annually.

The training program consisted of didactic and hands-on sessions in the hospital simulation center, as well as an ED-based practical component. Didactic session content was approved by the EM ultrasound faculty. During the 2-h didactic session, participants viewed a published instructional video about USG-PIV placement and completed a locally developed attitudinal questionnaire [7]. They then underwent supervised instruction in USG-PIV placement using vascular access simulators (Blue Phantom, Sarasota, FL). During the 2016–2017 academic year, the didactic session also included baseline skills assessments, which consisted of participants performing video-recorded USG-PIV placement. The video recordings captured two views via a split screen: one window showed the participant's hand movements and use of the vascular access simulator, while the other displayed the real-time sonographic image, as shown in Fig. 1. Participants' faces were not visible, and they were asked to verbalize procedural steps that were necessary for USG-PIV placement in the clinical setting but not applicable in the simulation-based setting. Due to the limited resources of the simulation center, video recordings were only obtained during the 2016–2017 academic year.

Fig. 1 Skills assessment split screen. An example of the split screen view captured during the video-recorded skills assessments

Within 4 weeks of completing the didactic session, IM participants completed the practical component, during which they spent four 8-h shifts in the ED performing USG-PIV placement when it was necessary for clinical care. They were asked to keep a log of their USG-PIV placement attempts. Procedural supervision was provided by PGY-2, PGY-3 or PGY-4 EM residents, or EM faculty. At the conclusion of their practical component, IM participants were asked to complete their second attitudinal questionnaire and, during the 2016–2017 academic year, their second video-recorded skills assessment.

Questionnaires and video recordings were paired; the last four digits of each participant’s telephone number were used as unique identifiers to allow for repeated-measures analysis. Only complete pairs of questionnaires and video recordings were included in the analysis.

Educational outcomes included changes in both self-reported attitudes and USG-PIV placement skills among IM participants. Skills acquisition was assessed using video recordings. Two EM ultrasound specialists (K.P. & J.R.) asynchronously rated the video recordings using a modified Global Rating Scale (mGRS) as well as our published USG-PIV procedural training checklist. Global Rating Scales are well-accepted instruments in the assessment of residents' procedural skills [8]. A similar approach was utilized by Hartman et al. in the validation of a procedural training checklist for ultrasound-guided central venous catheter placement [9]. Our USG-PIV procedural training checklist was generated via the Delphi method but remains unvalidated [6]. For the skills assessments, raters were blinded to each other’s ratings, and video recordings were randomized.

All statistical analyses were performed using NCSS v21.0.3 (NCSS, LLC; Kaysville, UT). The Wilcoxon signed-rank test was used to analyze ordinal data in the repeated-measures design. Quadratic-weighted κ was calculated to assess interrater reliability. Spearman correlation was used to assess for a relationship between changes in mGRS scores and changes in procedural training checklist scores. Changes in the scores were calculated by subtracting baseline measures from post-program measures. For all comparisons, a p value ≤ 0.05 was considered significant.
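For readers wishing to reproduce this analytic approach, the three tests described above can be sketched in open-source software. The following is an illustrative example using SciPy and scikit-learn on synthetic data; all scores below are hypothetical, and the study itself used NCSS, not Python.

```python
# Illustrative sketch of the statistical approach described above, run on
# synthetic data. The actual study analyses were performed in NCSS v21.0.3.
import numpy as np
from scipy.stats import wilcoxon, spearmanr
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired ordinal scores (baseline vs. post-program) for 10 trainees
baseline = np.array([2, 3, 2, 1, 3, 2, 2, 3, 1, 2])
post = np.array([4, 4, 3, 3, 5, 4, 3, 4, 3, 4])

# Wilcoxon signed-rank test for the repeated-measures comparison
w_stat, w_p = wilcoxon(baseline, post)

# Quadratic-weighted kappa for interrater reliability on an ordinal scale
rater1 = np.array([3, 4, 2, 5, 3, 4, 2, 3, 4, 5])
rater2 = np.array([3, 4, 3, 5, 2, 4, 2, 3, 5, 5])
kappa = cohen_kappa_score(rater1, rater2, weights="quadratic")

# Spearman correlation between score changes on the two instruments
# (change = post-program score minus baseline score)
delta_checklist = post - baseline
delta_mgrs = np.array([2, 1, 0, 2, 1, 2, 1, 1, 2, 1])  # synthetic mGRS changes
rho, rho_p = spearmanr(delta_checklist, delta_mgrs)

print(f"Wilcoxon p = {w_p:.4f}")
print(f"Quadratic-weighted kappa = {kappa:.3f}")
print(f"Spearman rho = {rho:.3f}")
```

The quadratic weighting penalizes larger disagreements between raters more heavily, which is appropriate for ordinal rating instruments such as the mGRS and the procedural training checklist.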

The Institutional Review Board at Boston University School of Medicine approved all study protocols, including the informed consent process.

Results

In total, 36 IM residents participated in the program; 25 (69%) were in their PGY-2 year and 11 (31%) were in their PGY-3 year. Twenty-one participants (58%) completed both the pre- and post-program questionnaires. During the 2016–2017 academic year, 10 of 11 (91%) participants recorded a pair of skills assessment videos. Twelve participants (33%) kept a log; the median number of attempted USG-PIVs was 10 (interquartile range 9–12). Interrater reliability was moderate for ratings performed using the procedural training checklist (κ = 0.538; 95% confidence interval 0.286–0.789) and substantial for ratings performed using the mGRS (κ = 0.622; 95% confidence interval 0.499–0.745).

Results of the attitudinal questionnaires are summarized in Table 1. Ratings using the mGRS and the procedural training checklist are summarized in Table 2. A strong positive correlation was found between changes in the procedural training checklist and changes in the summed mGRS score (Spearman ρ = 0.702; 95% confidence interval 0.328–0.886; p < 0.01).

Table 1 Attitudinal outcomes among program participants who completed questionnaires
Table 2 Skills acquisition outcomes among program participants who created video recordings

Discussion

This resident-led USG-PIV procedural training program demonstrated improvements in both attitudinal and skills outcomes. Our previously developed procedural training checklist was validated via its correlation with the mGRS, an established procedural assessment instrument.

After completing the program, IM residents reported greater comfort both in performing USG-PIV placement and in teaching this skill to their colleagues. The perceived importance of POCUS training among IM residents was high prior to their participation in the training program and remained so afterwards, emphasizing the program's acceptability to the participants. As captured by the rated video recordings, IM participants demonstrated improvement in USG-PIV placement skills. The median number of USG-PIV placement attempts was 10, which matches the number of supervised attempts required by most of the USG-PIV training programs reported by van Loon et al. [1].

The USG-PIV procedural training checklist was validated, as changes in its scores correlated strongly with changes in the mGRS scores. Because Global Rating Scales are well established in surgical training but require modification prior to application elsewhere, there is an educational role for a dedicated, validated USG-PIV training checklist [10].

This program followed a resident-led resident-as-teacher model, which has previously been described as a successful educational strategy [11]. Such an approach allowed for minimal ongoing faculty involvement, improving the program's feasibility. Use of video recordings further enhanced feasibility by allowing for asynchronous evaluation of procedural skills. Resources of the hospital simulation center were critical to the program's implementation.

There are several limitations to our study. As this pilot program focused on feasibility and implementation, only short-term outcomes were captured. To more definitively understand the long-term impact of this approach, longitudinal data collection would be necessary. Additionally, just over half of the participants completed attitudinal questionnaires, and the video-recorded skills assessment was offered only to a subset of the total participants. To mitigate the effect of these limitations, we utilized nonparametric statistical analysis. Participation in the program was voluntary, and attitudinal outcomes were self-reported, so the risk for both selection bias and confirmation bias was present. Program efficacy may not have been the same across different baseline skill levels. Finally, study activities took place at a single site and required dedicated audiovisual and simulation-specific resources, limiting the generalizability of any conclusions.

The use of video recordings was necessary for the asynchronous assessment of IM participants' technical skills. It allowed the evaluators to perform ratings using multiple instruments without requiring repeat procedural attempts by participants. This approach may be especially novel and timely given the impact of the COVID-19 pandemic on medical education [12]. In addition, the video-recording technique utilized in this study often made it difficult to identify IM participants' race and gender, although the recorded audio sometimes made gender identifiable. Further refinement of this approach may help mitigate the bias that is well-described in procedural skills evaluation, and similar technology-based efforts have been described in surgical education and in USG-PIV skills assessment [13,14,15]. In future work, strategies to further de-identify trainees such as voice modulation or automated transcription could be evaluated.