Screening, Brief Intervention and Referral to Treatment (SBIRT), is an evidence-based approach to screening and early intervention for those at risk of substance use disorders. With the ongoing health concerns related to COVID-19, there is an increased need for social workers who can competently deliver evidence-based interventions, such as SBIRT, via telehealth. Due to the COVID-19 pandemic, traditional SBIRT training approaches using face-to-face (FTF) instruction and FTF simulated practice may not be a safe or feasible way to develop students’ SBIRT- related skills. This study explores 35 social work graduate students’ experiences of learning SBIRT skills in a remote learning format and subsequently delivering a SBIRT intervention to a live “client” via a peer-to-peer simulated telehealth session. Overall, students reported that the shift from FTF to remote learning made learning SBIRT skills difficult, and that providing brief intervention and referral was the most difficult step of the simulated SBIRT telehealth intervention. Qualitative feedback indicates that overall, students found the simulated telehealth sessions a valuable learning experience, but also reported that richer educational experiences would have resulted from additional practice opportunities and real time feedback. Implications for future research, simulation-based education and clinical practice are discussed.
Introduction to SBIRT
Screening, Brief Intervention and Referral to Treatment (SBIRT) is an evidence-based approach for identification and early intervention for those at risk of substance use disorders (SUDs) (Babor et al. 2007; Madras et al. 2009; SAMHSA 2011). The SBIRT intervention generally takes 10–15 min to complete and can be conducted with clients of any age that may be at risk of substance misuse. While the majority of SBIRT is conducted in healthcare related settings (Agerwala and McCance-Katz 2012), it is also appropriate for use in child welfare and educational settings (Curtis et al. 2014; Mitchell et al. 2012; Wright et al. 2016). The SBIRT intervention is comprised of four steps, which are delivered in the following order: (1) quickly building rapport and inviting the client to have a conversation about health; (2) using a standardized tool to conduct a universal substance use screening; (3) scoring the screening and providing the client with feedback on the meaning of their scores; and (4) providing a brief intervention and, if indicated, a referral for further treatment (Babor et al. 2007). SBIRT can be delivered either in person or remotely using a secure online telehealth platform, such as doxy.me, Teladoc or Zoom for Healthcare. The combination of flexibility of use across multiple practice settings (Sacco et al. 2017) combined with its effectiveness (Babor et al. 2017) has led to increased adoption of SBIRT as an important component of clinical social work training.
Due to the ongoing “opioid crisis” and continually high levels of substance misuse in the general population, it has become increasingly important for future social workers to be able to competently identify and assist clients with substance use concerns (Ashenberg Straussner and Senreich 2002; Berger and DePaolo 2015; Makhaira 2014; Vakharia 2014). Begun and Clapp (2016) assert that reducing alcohol (and drug) misuse and their consequences should be one of the foremost goals of social work, due to the widespread impact that problematic substance use has on all parts of society. Social workers can expect to encounter individuals/families with SUDs across practice settings, including schools, the justice system, the child welfare system and physical and/or behavioral healthcare settings. As such, learning evidence-based approaches to address alcohol and drug misuse should be an important component of social work education (Ashenberg Straussner and Senreich 2002; Harwood et al. 2004; Osborne et al. 2012; Quinn 2010; Vakharia 2014).
SBIRT Training Using Simulated Sessions
SBIRT in social work education is generally taught as part of field based instruction or as part of a direct practice class. Traditionally, SBIRT training consists of classroom-based instruction about each intervention step, followed by a simulated practice component with a live “standardized” client, often times an actor or someone who had already completed the class. Simulations are designed to help students learn the different skill sets needed for each step of the SBIRT intervention, while providing them with “hands on” practice. These live simulations may be video recorded so that instructors can provide more structured and in-depth feedback on students’ performance. Students can also review how competently they conducted the SBIRT session and reflect further on their skill development.
A working definition of what constitutes a simulation offered by Bland et al. (2011) states that a simulation is “a dynamic process involving the creation of a hypothetical opportunity that incorporates an authentic representation of reality, facilitates active student engagement and integrates the complexities of practical and theoretical learning with opportunity for repetition, feedback, evaluation and reflection” (p. 668). Accordingly, the authors conceptualize simulations as different from traditional role plays often used in social work classrooms, as they: (1) tend to be longer in duration—whereas most role plays are very brief usually 3–5 min, simulated sessions may last between 15 and 20 min or longer, (2) require students to prepare in advance—students are given their “role” 1 to 2 weeks prior to the simulation and are asked to closely review the information on the client they would be portraying ahead of time to add a level of authenticity and realism to the simulated client interaction, whereas role plays tend to be impromptu, requiring minimal preparation and thus may lack the requisite authenticity, and (3) feature “clients” with more complex social histories and clinical symptom presentation. Simulations may utilize other students in class as clients (standardized peer-to-peer simulation), students who have already completed SBIRT training acting as clients (standardized client simulation) or real actors (standardized actor client simulation), depending on the program size and resource availability.
During the last 5 years, there has been a proliferation of research on the impact of SBIRT training and simulated practice on social work skill acquisition, as well as on students’ perceptions of these simulations. Osborne et al. (2012, 2016) were some of the first social work scholars to incorporate SBIRT training into the social work curriculum, as a way of expanding addictions education. They found that SBIRT training improved social work student’s perception of their ability to assess and intervene with clients experiencing alcohol use disorders (AUDs). These findings were supported by Pugatch et al. (2015), who also found the implementation of SBIRT training to be an acceptable and feasible way to incorporate substance use curriculum into social work education. Multiple other studies indicate that participating in SBIRT training with simulated practice enhanced nursing, social work and other behavioral health care students’ SBIRT skills, as well as improved their attitudes towards those who are misusing substances (Neander et al. 2018; Putney et al. 2017; Sacco et al. 2017; Sampson et al. 2018; Senreich et al. 2017; Tanner et al. 2012).
Technology Enhanced Simulations and Telehealth
The continued growth of fully online MSW programs (CSWE 2019), coupled with health risks associated with in-person client interactions during the COVID-19 pandemic, highlight the increasing need for social workers who can competently deliver telehealth services (Smith et al. 2020). Current safety recommendations promoting remote learning may require students to learn and perform the SBIRT skills without the normal face-to-face (FTF) interaction in a traditional classroom or field-based setting.
There is an emerging body of research concerning online SBIRT training and the use of virtual standardized client simulations to practice SBIRT skills (Huttar and BrintzenhofeSzoc 2020). Virtual standardized client simulations are similar to live standardized client simulations, except that the “standardized clients” are computerized avatars with varying levels of interactively. Studies focusing on web-based SBIRT training for health care providers (i.e., nurses, primary care providers and medical/nursing students) using virtual standardized client simulations found this training approach was associated with more positive attitudes toward individuals who use substances, as well as increased SBIRT-related knowledge and skills (Bradley et al. 2012; Stoner et al. 2014; Tanner et al. 2012). Further, these studies found that engaging in these simulations increased users’ confidence in being able to competently deliver the SBIRT intervention, and that the simulations were feasible to implement and rated highly by users (Koetting and Freed 2017; Puskar et al. 2016; Wamsley et al. 2018).
Similarly, the Sampson et al. (2018, in press) investigated the use of virtual standardized client simulations to improve brief behavioral health assessment skills with a sample of social work and counseling students. They found that the use of standardized virtual clients to practice brief mental health assessment skills was a feasible training approach that produced outcomes similar to those found with traditional FTF standardized actor-client simulations. Hitchcock et al. (2019) found that virtual client simulations were associated with increases in social work and nursing students’ confidence, competency, and readiness to provide the SBIRT intervention to adolescents, while Boyle and Pham (2019) found that virtual client simulations increased social work students SBIRT skills and knowledge.
After reviewing articles included in two recent reviews on the use of simulation in social work education (Huttar and BrintzenhofeSzoc 2020; Kourgiantakis et al. 2020), coupled with a systematic online search of the SBIRT and simulation literature, the authors were unable to locate any empirical articles focusing on SBIRT simulations that were conducted via a telehealth platform, where both parties were non-virtual (human) entities interacting live via computer from different locations. This article begins to address this gap by providing feedback from two cohorts of MSW students about their experiences of conducting a simulated SBIRT telehealth session after transition from FTF to remote learning due to COVID-19. One cohort was enrolled in a course on treatment of substance use disorders (hereafter SUT), the other was enrolled in a specialized training program focusing on integrated behavioral health care (IBH). Rather than to evaluate the effectiveness of this approach to SBIRT training (data to be presented in a forthcoming article), the authors sought to better understand two cohorts’ experiences of learning the four SBIRT skills in a fully online format, along with their perceptions of how difficult it was to engage in a simulated SBIRT session requiring them to deliver each step of the SBIRT intervention via an online video conference platform. As this study was exploratory in nature, the authors also sought to determine if any demographic factors or prior online learning experiences were associated with how students rated the difficulty level of executing each of the four SBIRT skills within a simulated client session, and if technology-specific factors, such as absence of reliable internet, were associated with students’ difficulty rating.
Context and Background of the SBIRT Telehealth Simulation
Two cohorts of students from urban, public MSW programs in the Southwestern United States completed an anonymous online survey about their experiences learning SBIRT skills in an online environment and delivering a simulated SBIRT telehealth intervention. Both cohorts had originally been enrolled in FTF programming in Spring 2020. However, due to COVID-19, their coursework moved online for the remainder of semester in early March. This move required faculty to incorporate new approaches to content delivery and assessment related to the SBIRT modules that were scheduled towards the end of the semester. Prior to COVID-19, students would be expected to complete a live simulated SBIRT session at the end of the spring semester with a standardized client portrayed by a doctoral student. However, the transition to online virtual learning occurred prior to initiation of the standardized client training. Those who had volunteered to serve as standardized clients indicated that they did not have the necessary bandwidth to complete the training or simulated sessions, given concerns about COVID-related disruptions to their own coursework and dissertation efforts. Thus, the decision was made by the authors to use standardized peer-to-peer simulations in place of the standardized client simulations, and that these simulations would be conducted as if the social worker and client were engaging in a live telehealth session. This decision not only made the end of the semester evaluations feasible; it also allowed the instructors to incorporate content about conducting effective telehealth sessions into the students’ coursework, adding another timely dimension to their academic preparation.
Two weeks prior to the simulated sessions, students in each class were divided into two groups. Each of the two groups was given a unique and detailed standardized client scenario to use when they were acting as the client. The two client scenarios were the same for each cohort. Each scenario included demographic characteristics for the client, along with a detailed psychosocial and health history. Students within the same cohort but with different client scenarios were then paired with one another to complete their simulated telehealth sessions. Students were also given a “tip sheet” (available from the authors upon request) used in the past for training “standard actor clients” on best practices for accurately depicting a client during a simulated session. These tips included how to ground themselves to reduce anxiety, information on how they should react to the interviewer’s questions, and ways to accurate demonstrate affect through subtle changes to tone of voice, eye contact and posture. Students were encouraged to reach out to their instructors if they had additional questions about how to skillfully present as the client, or if they needed any other assistance in this area. Following assignment of the simulation scenarios, students received 3 h of synchronous SBIRT instruction via the Zoom video conferencing platform, supplemented by video examples showing: (1) how to effectively conduct a brief substance use screening using a standardized tool; (2) how to discuss the results of this screening with the client in a non-judgmental way that would build the client’s motivation to change; and (3) how to conduct a brief intervention, such as providing education or creating a risk reduction step, and if needed refer for additional services.
For their simulated session, students were asked to create a 10–12-min video demonstrating their SBIRT skills with a client with whom they would meet remotely using the Zoom online video conferencing/telehealth platform. All students were then given a detailed rubric concerning the skillsets that they would be assessed on, as well as tips on how to competently execute each of the SBIRT skill sets. Finally, after completing the SBIRT telehealth session, students were required to watch their simulated sessions, rate their own performance using the rubric, and reflect on aspects of the interaction that they found challenging. All students received a written summary of individualized feedback on their simulated telehealth sessions, from the first author (University 1) and the first and the third authors (University 2).
Thirty-seven full time MSW students (first year, second year and advanced standing) originally enrolled in FTF courses were invited to participate in this study; 15 students enrolled in a class on SUD treatment at University 1, and 22 students enrolled in a class on integrated behavioral health care at University 2. They were asked to complete an anonymous online survey at the end of the Spring 2020 semester to offer feedback on their experiences of learning and practicing SBIRT skills in an online environment and to assist with ongoing internal quality improvement efforts related to these classes. This survey could be accessed via an online Qualtrics link sent out via email by the first author. Students were told that their responses would be anonymized and no IP addresses would be collected, as this option was disabled by the authors. Participation in the survey was not required but strongly encouraged. A total of 35 MSW students (15 SUT students and 20 IBH students) completed the survey, yielding a 95% participation rate.
The survey instrument consisted of 28 questions about conducting a simulated SBIRT telehealth session and the unexpected transition to fully online learning. It took, on average, 21 min to complete. The survey included (1) demographic questions, (2) two questions about the number of online undergraduate or graduate classes they had completed prior to the spring 2020 semester, (3) five questions specific to how COVID-19 impacted their classroom instruction and field-based training, (4) two questions specific to how shifting to remote instruction impacted their learning about addiction or integrated care, (5) eight questions each about how easy or difficult it was for them to learn each of the four SBIRT skills in an online environment and why, (6) three open-ended questions asked students to share their perceptions of how completion of a simulated SBIRT telehealth session impacted their overall SBIRT skill development. In addition, they were asked for suggestions on how the process could be improved in the future to maximize its impact on SBIRT skill development and to describe what they liked and did not like about conducting a simulated SBIRT telehealth session.
Upon completion of the 2020 spring semester, the first author received approval from University 1′s Institutional Review Board to complete a secondary data analysis on these deidentified data for purposes of publication. After approval, a reciprocity agreement with University 2 was put in place to include the data from the students attending University 2. All students enrolled in these two courses, regardless of survey participation, were notified via email that these data would be published and were given the opportunity to have their unique responses removed from the data set. No students requested for their data to be removed.
Survey responses were exported from Qualtrics into SPSS version 25. Quantitative data were cleaned and checked for assumptions of normality. Less than 1% of all data were missing, and pairwise deletion was used for variables containing missing data. Frequencies and percentages were calculated for demographics and for survey questions that had binary response options. Means and standard deviations were calculated for all scale measures. Fisher’s exact tests were used to test if any demographic characteristics were associated with binary ratings of difficulty for each step of the SBIRT process. Independent t-test were used to determine if any demographic characteristics were associated with students rating of how much their learning related to SBIRT skills was impacted by the transition from FTF to online learning. The t-test is robust in relation to assumptions of normality, even when sample sizes are small.
Inductive content analysis (Hsiu and Shannon 2005) was used to synthesize qualitative responses to the three open-ended questions, where students elaborated on their experiences conducted simulated telehealth sessions. Content analysis allows for methodical classification of text and description of patterns within the data. This methodology is appropriate for analyzing a sample of simple open-ended survey questions, where there was no opportunity for researchers to ask follow-up or probing questions that may lead to the breadth or depth of content found in an interview (Elo and Kyngäs 2008; Hsiu and Shannon 2005; Oxhandler and Giardina 2017). Data were counted by two authors to determine the frequency of responses for each open-ended question. Data were then independently coded by the same two authors using descriptive open coding methods to identify emergent codes, followed by a round of pattern coding, which consolidated content into categories as appropriate based on shared or overlapping content. The input from a third author was used to resolve discrepancies between coders.
Thirty-five students who were enrolled in FTF MSW programs participated in the study. As seen in Table 1, the sample was predominately female (n = 27, 77.1%) and was largely representative of the racial/ethnic student makeup at these two institutions (University 1, 2020; University 2, 2020) with 31.4% (n = 11) of participants identifying as White/Caucasian, 28.6% (n = 10) as Black/African American, 22.9% (n = 8) as Latinx, 11.4% (n = 4) as Asian/Pacific Islander and 5.7% (n = 2) as multi-racial. One-quarter (n = 9, 25.7%) of participants indicated that English was not their native language and 5.7% (n = 2) were international students. The mean age of participants was 31.2 years (SD 8.1). Over three-quarters (n = 27, 77.1%) of participants reported completing at least one online graduate or undergraduate class in the past, with a mean of 3.5 online undergraduate (SD 7.8) and 3.1 graduate (SD 4.3) classes completed prior to the spring 2020 semester. Nearly one-quarter (n = 8, 23.9%) reported that they lacked access to adequate internet or technology to support the transition to online learning at least some of the time. Finally, 14.3% of the sample (n = 5) reported that they, or someone close to them, had been diagnosed with COVID-19.
To determine if participants’ demographic variables such as gender or ethnicity were associated with how difficult they found it to learn and execute each of the SBIRT skills in a remote learning environment, Chi Square and Fisher’s exact tests were executed. No significant differences were found on how students rated the difficulty level of each skill in relation to their demographic characteristics. As seen in Table 2, 40.0% of participants (n = 14) indicated that they thought the shift to online education made it more difficult to learn SBIRT skills (than if they had done it in a face to face format), with 22.9% (n = 8) indicating that this shift made it much more difficult. Nearly two-thirds (n = 23, 65.7%) of participants indicated that the skills of brief intervention and referral were difficult to practice/master in an online telehealth environment. However, participants also reported that they found it easy to practice/master the SBIRT skills of rapport building (n = 24, 68.6%), substance use screening (n = 20, 57.1%) and scoring/offering feedback (62.9%) in an online/telehealth environment. None of the participants indicated that all 4 steps of the SBIRT process were easy to execute within the online environment, with 20.0% of them (n = 7) describing all 4 steps of the SBIRT process as difficult to execute remotely.
Table 3 presents comparisons of the mean negative impact of the transition to remote learning on SBIRT skill development by demographics. Participants who were non-native English speakers reported that the shift to remote learning had less of a negative impact on their SBIRT skill development than native English speakers, t (33) = 3.18, p = 0.03, g = 1.2, 95% CI [0.89, 1.52]. This effect size was large. Participants who reported that all of the SBIRT skills were difficult to practice/execute in an online telehealth environment also reported that the shift to remote learning had a more negative impact on their SBIRT skill acquisition, t (33) = 2.26, p = 0.03, g = 0.91, 95% CI [0.58, 1.26]. The effect size associated with this difference was also large. No significant differences were found by cohort (IBH or SUT) or other demographic characteristics such as race/ethnicity, gender, prior remote education experience, availability of needed technology or if they or someone close to them had been impacted by COVID-19.
Responses of both cohorts of participants were aggregated to get a broader view of students’ experiences conducting a peer-to-peer simulated telehealth session, and their perceptions of how the unexpected transition to remote learning impacted their SBIRT skill development. Participants reflected on the perceived benefits of training future social workers in SBIRT delivered via a telehealth platform using simulated practice sessions, as well as possible improvements for future SBIRT telehealth training. Several content areas emerged from these qualitative data.
When asked about their perceptions of how simulated telehealth sessions helped them to practice their SBIRT skills, students mentioned various benefits of the peer-to-peer simulated telehealth session. In general, students agreed that SBIRT was an important intervention approach and that it was critical for MSW students to learn how to screen for and appropriately address substance use issues. They reported it was beneficial to their future clinical practice to be able to apply their brief intervention skills, learn more about telehealth and have an opportunity to recognize and reflect their own strengths and weaknesses after completing the simulated session.
Benefits of SBIRT Training
Among 35 valid responses, 12 students shared that the simulated practice helped to establish a strong understanding of the SBIRT process, and to be able practice the associated skill sets. As one student stated:
It helped to jump right in and practice this clinical skill. I know I have a lot to learn, but every time an attempt is made learning and applying new skills, I am sharpening my techniques and becoming more adept. (#8).
Another noted, “I see the value and efficiency of the SBIRT process when meeting with patients in time constraints.” (#7). Eight of 35 students said that watching their own simulation allowed them to recognize their strengths and identify areas for improvement, which they may otherwise not be able to do. They appreciated the opportunity to watch and re-watch their telehealth session and evaluate their own performance. One student shared the following:
It was helpful in allowing me to recognize areas I am less confident in and which areas I need to work on. It was helpful in catching countertransference. I also think it was helpful in showing me that I need to focus more effort on practicing efficient assessments in a primary care setting as my assessment lasted longer than I was hoping for. (#12).
The simulations also helped students to be aware of different ways to ask questions and identify skills that require more fine-tuning. Things mentioned were formality of language used when speaking with clients, facial expressions, amount of hand movements, and relaying unnecessary information. One student specifically described the areas of assessment he/she needs to work on:
I learned where I can cut back a lot on rambling/talking too much/over explaining. I also saw where I could speak more, and how to shape the conversation differently that would make the flow make more sense and allow the client to answer questions easier. (#6).
The majority of students reported that telehealth training was not part of their normal classroom or field-based instruction, but they believe telehealth skills were important to learn, given the current COVID-19 pandemic. They also conceptualized telehealth skills as a means to better serve vulnerable communities in the future. “Telehealth is a real need for underserved communities, so it forced me to practice in a format I am not used to doing.” (#19).
SBIRT Telehealth Likes and Dislikes
When asked what they liked and did not like about doing a SBIRT telehealth session, students shared both their favorite and uncomfortable parts of the process. Students indicated that they liked the instructors’ feedback and learning about telehealth. However, they felt uncomfortable recording and watching themselves, and experienced distress related to using a telehealth platform like Zoom. Five out of 33 students specifically mentioned that they liked receiving detailed feedback from their instructors and found it to be useful for the growth and development of their professional skills. One expressed their appreciation of feedback on their performance and explained, “How could I grow personally or professionally without feedback? I've been willing to be coached all my life.” (#18) Another student stated that the feedback was important for when they were going to work with actual clients in the field, stating, “Receiving constructive criticism helped me keep in mind what I should do for future/real practice.” (#9) Another student stated:
I liked that it forced me to get out of my comfort zone and also learn about how to introduce the technology to a client. I really like the training video that was provided on how to maintain confidentiality through technology and how to address this with the client. (#18).
Nine out of the 33 student participants indicated difficulties with technology, including unstable internet connection, screen sharing problems, and overall low levels of familiarity with using Zoom to communicate remotely. Among these nine responses, four students expressed frustration due to lack of control over the internet connection quality. For example, “The internet connection presented lots of challenges. Our video kept freezing and we had to start over. I liked practicing with my partner but felt that the technology made it less helpful.” (#4) Moreover, three students indicated they had difficulty with using the “screen share” option to share information with the client, as a part of the brief intervention. One student explained, “I didn't like having to juggle all of the documents we had to share on my screen, because I feel that it took away from the interaction and my ability to be present with the client. (#11) Finally, six out of 33 students indicated that they felt uncomfortable recording themselves. They were not used to talking in front of the camera or seeing themselves on screen, so it made the client interaction more difficult. One student simply said, “I just feel uncomfortable hearing my own voice and seeing my own face (on the screen).” (#25).
Suggestions for Improvement
Students offered suggestions on what could be improved to maximize the impact of the simulated session on SBIRT skill development. These included more opportunities to practice each step of the SBIRT process, prior to conducting a full SBIRT telehealth session. Additionally, doing live simulations that included real time feedback from the instructor. Although students were given a number of videos to watch where professionals demonstrated the SBIRT skills, two students suggested for the instructors themselves to do a video demonstration of the SBIRT intervention:
I wish I could have had experienced watching the professors act it out for us, so we could have seen an example of an interaction that fit all the requirements they were looking for and give us the opportunity to ask specific questions about why they did or didn't go a certain direction with a client. (#12).
They also indicated a preference for feedback in real time, instead of receiving feedback a few days after conducting the SBIRT session. As one student stated, “Maybe a live, in-class role play where instructors critique us live rather than a video and being critiqued after.” (#1).
This work begins to fill gaps in the literature regarding SBIRT simulations provided in a telehealth format, by presenting quantitative and qualitative data from two cohorts of MSW students that transitioned from FTF to remote learning during the COVID-19 pandemic. Students shared their perceptions about how this transition impacted their SBIRT skill development and also provided feedback on conducting a peer-to-peer simulated SBIRT telehealth session. Characteristics associated with the impact of the adjustment from FTF to remote learning were also examined.
The majority of the participants reported that it was easy to execute the first three SBIRT skills (building rapport, conducting and scoring a standardized substance use screening, and providing the client with feedback on the results of the screening) as part of the telehealth session. This was not surprising, given that rapport building is a foundational skill taught in all social work programs. Thus, students’ confidence in their ability to effectively execute this skill via any mode of delivery (FTF in the same room or remotely via a telehealth platform) may have been less impacted by the move to remote learning than their confidence in less familiar SBIRT skills. Similarly, administering a standardized substance use screening and providing feedback to the client follows an orderly, step-by-step process, which may not be as affected by mode of delivery. In contrast, students consistently indicated that the skill set of “brief intervention and referral” was the most difficult to learn and execute within the online environment, presumably because it does not follow a template or a standard process. Another reason brief intervention and referral may have been difficult to execute is that these strategies are not typically learned and practiced in foundation social work classes. Some students may have had a full year of gaining confidence and practice with building rapport, but limited exposure to brief intervention and referral to treatment. Moreover, what constitutes “brief intervention and referral” differs for each client, as clients have different needs and goals, requiring knowledge of the SBIRT process and the integration of multiple clinical skills. In the future, instructors may wish to incorporate opportunities for additional simulation-based practice of this particular component of the SBIRT process, to ensure that students acquire both an adequate knowledge base and practice synthesizing multiple clinical skills.
Accordingly, instructors may consider placing more emphasis on building student competence in the final step of the SBIRT process, which is essential to helping the client make meaningful behavioral change.
Over half of the students (62.9%, n = 22) indicated the shift to remote learning made it either more difficult or much more difficult to learn the SBIRT skills and effectively deliver the SBIRT intervention via telehealth. More research is needed to determine whether it was the abrupt and involuntary nature of the shift to remote instruction that made skill acquisition in this area seem more difficult, or whether learning SBIRT in an online environment is generally more difficult than learning it FTF. In addition, all results should be interpreted within the context of life during the first few months of COVID-19 and the subsequent school and statewide shutdowns. This is an important layer of context that no doubt affected students’ capacity for remote instruction and new skill acquisition. Additional research comparing students’ perceptions of learning SBIRT via different modes of instruction, as well as comparisons of associated learning outcomes, is needed. Moreover, research on using telehealth-based simulations as a clinical training tool is also indicated, to determine whether the resulting learning transfer is similar to that of traditional FTF instruction or other online simulated learning approaches. It is worth noting that over half the sample affirmed that moving online mid-semester made new skill acquisition difficult. As clinical social work instructors, we must be alert to the potential stress and disruption caused by changes to the learning environments to which students are accustomed, and strive to bridge any potential gaps in understanding.
Some study results were inconsistent with prior research in this area. For example, the responses of students with no prior online education experience were not significantly different from those of students who reported having previously completed at least one online course. Prior research indicates that students who are more familiar with online learning tend to perform better on measures of learning outcomes and to have more positive perceptions of the online course experience (Kauffman 2015; Lee et al. 2011; Wang et al. 2013). This was not the case with our sample, which was an unanticipated finding. Using a median split, we further explored whether those who had completed three or more online classes had significantly different perceptions about the level of difficulty associated with learning SBIRT skills online, or whether they rated the transition to remote learning as having more of a negative impact on their SBIRT skill development. Again, no significant differences were found. These results indicate that familiarity with remote/online learning is only one component influencing students’ perceptions of how easy or difficult it is to learn practice-based skills within an online/remote learning environment. Prior work indicates that additional factors, such as instructors’ comfort with remote instruction methods, student motivation, individual learning styles, and instructional design, also play a role in how students perceive the online learning environment (Kauffman 2015; Wang et al. 2013). Instructors who wish students to engage in simulation-based learning within a remote learning context may benefit from additional training in effective online instructional methods. Instructors must be able to model comfort with technology and competently demonstrate the skill sets they demand of the students.
Instructors may also consider conducting their own simulated sessions, using the same telehealth platform the students will use (such as Zoom, Canvas or Microsoft Teams), prior to the beginning of the semester. Doing so would prepare them to help students troubleshoot technology-based difficulties should they arise, and would also serve as a model for how to conduct an effective simulated telehealth session.
The authors were surprised that non-native English speakers reported that the transition to online education impacted them less than it did native English speakers, which is also inconsistent with prior research (Kauffman 2015; Lee et al. 2011). On average, they reported that this change had little impact on their ability to practice and master SBIRT skills. It is possible that these students, out of the necessity of having to navigate educational systems in English rather than in their native language, may have developed a greater degree of flexibility in adapting to challenges in their educational environment than native English speakers who may not have been required to develop these skill sets in the past (Chamberlin-Quinlisk 2010; Koh et al. 2014). Likewise, non-native English speakers may have developed educational strategies related to being multilingual that allowed them to more easily adapt to the changes presented by the transition from FTF to remote learning. Additional exploration of native and non-native English speakers’ ability to adapt to unexpected educational challenges is warranted.
Overall, qualitative feedback indicates that students felt that participating in these recorded standardized peer-to-peer simulations provided valuable opportunities to be self-reflective and receive feedback specific to the SBIRT process. They noted that it was helpful to observe the skills they applied well and also consider what they might have done differently in a session. Active self-reflection is key to building competency in clinical practice skills, as well as building confidence concerning intervention delivery (Rosen et al. 2017; Sampson et al. 2018, in press). Telehealth simulations, such as those described in this study, may provide a unique opportunity for this level of self-reflection, which is seldom afforded in field-based practice due to issues associated with client confidentiality. Future research may wish to examine whether implementing a series of simulations, each focusing on one step of the SBIRT process, enhances student SBIRT skill development and self-efficacy by allowing students to practice, make mistakes, and learn from those missteps in a low-stakes context.
Students found instructors’ feedback on their simulated telehealth sessions to be valuable to their global competency development. Receiving specific feedback from instructors prompted them to consider adjustments that could be made as they prepare to practice in real-world situations. However, a couple of students indicated a preference for real-time feedback, rather than feedback after the session. This suggestion, which would combine simulations with real-time feedback, may strengthen the learning experience both for the students conducting the sessions and for peers observing them live or watching others’ pre-recorded sessions. Accordingly, social work educators should attempt to structure their courses to include both synchronous and asynchronous feedback methods for evaluating student performance during simulations.
Students also indicated that, given the current public health crisis, learning about telehealth was an important part of their education and training, since they may be expected to complete telehealth sessions in their field placements due to ongoing health and safety concerns associated with face-to-face clinical interactions (CSWE 2019). Further, a few students noted that engaging in the SBIRT process through telehealth simulations provided valuable experience that may help them better serve currently underserved communities. Similar to other studies on telehealth (Carlton et al. 2020; Cook 2012; Joseph et al. 2011), the qualitative data illustrated that many students struggled with the technology aspects of delivering a telehealth session. These results suggest that when teaching about telehealth in the future, instructors may need to intentionally build in extra “low stakes” opportunities to practice using the telehealth software prior to having students conduct a full simulated SBIRT telehealth session. Introducing the telehealth software (in this case Zoom) earlier in the semester may also help address students’ perception that being on a video screen (whether recorded or not) is more uncomfortable than being in the same physical space with a client. Some struggles frequently associated with remote learning and telehealth, such as unstable internet connections, slow internet speeds or overloaded networks, cannot be avoided and may continue to prove frustrating to students engaging in remote instruction. Prior to initiating the first online lesson or module, instructors should note that these “glitches” may occur not only in the educational realm, but also when conducting telehealth sessions with actual clients. Instructors may also want to offer guidance on how to troubleshoot these issues should they occur, particularly for those new to remote learning.
As the COVID-19 pandemic continues, it has become increasingly important for all social work students to have at least some telehealth experience as part of their basic competency development.
The authors solicited feedback from students concerning their experiences of remote learning once, at the end of the semester. Students who ranked all SBIRT skills as difficult to learn and execute in an online environment also reported that their overall learning and skill acquisition were more significantly negatively impacted by this transition. These students reported struggling both with the transition to online learning and with successfully demonstrating SBIRT skills during a simulated telehealth session. Early identification of students who are struggling with online learning is key to connecting them with the resources needed to support their success. In the future, instructors may consider soliciting student feedback multiple times throughout the semester, particularly when an event significantly alters the normal course of educational operations. Soliciting ongoing feedback from students would allow instructors to engage in more one-on-one coaching with students experiencing greater challenges related to their educational attainment.
Several study limitations existed. This was an exploratory study, and the results are largely descriptive. These data were drawn from samples of graduate students at two large public urban southwestern universities. Both programs were located in the same state and had similar student body demographics; however, the results may not generalize to the larger population of MSW students. Although we tested for potential differential responses among groups, the conclusions that can be drawn from these data should be interpreted with care due to the small sample size and the use of non-randomized sampling. Despite the higher levels of ecological validity and consistency in presentation that result from using non-peers as clients for simulated sessions (Bogo et al. 2014; Kourgiantakis et al. 2020; Sampson et al. in press), it was not feasible for this to occur at either institution due to the COVID-19 pandemic. Standardized peer-to-peer client simulations were used in place of simulations using standardized clients (actors or other students not in the class) who had received formal training on how to accurately and consistently portray the “client.” It is recommended to use traditional (non-peer) standardized clients when logistically possible, to minimize the potential impact that prior familiarity with one’s “client” may have on the clinical interaction. It is important to interpret these results within the context of the abrupt shift to remote learning, coupled with significant risks to the health of students and their friends and family, due to the COVID-19 outbreak in early spring of 2020. These results may differ from those that would have been found at a time when larger environmental factors, such as a public health crisis, were not present.
However, these challenges also provided the instructors with a unique opportunity to gather important feedback from students who normally would not be participating in online learning, or engaging in telehealth simulations, about how the process of learning and delivering the SBIRT intervention via a simulated telehealth session could be improved.
The current findings support prior research on the use of technology-based simulation in social work education as a feasible learning approach that is largely acceptable to students (Boyle and Pham 2019; Huttar and BrintzenhofeSzoc 2020; Sampson et al. in press). Students found that the simulations provided them with an opportunity to evaluate their own skills and receive feedback from instructors, which enhanced their confidence and skills for future practice settings. However, these results also indicate that, although more and more students have been exposed to online education as a regular part of course delivery, some students still perceive practice-based education as more challenging when delivered in an online/remote format. Indeed, this concern has already been voiced by social work faculty in relation to remote/online practice-based education (Reamer 2013; Smith et al. 2018b; Vernon et al. 2009). Nonetheless, an emerging body of research supports the effectiveness of remote/online educational methods on practice-based learning outcomes, provided the course is thoughtfully designed and well-executed (Cummings et al. 2013; Hamilton 2017; Kurzman 2013, 2019; Petracchi et al. 2005; Regan 2016). Findings from this study indicated that most students were receptive to technology-based simulations, and at least some acknowledged the potential of applying what they learned during simulations within practice settings to improve access for marginalized communities. The use of simulations with evidence-informed interventions, such as SBIRT, is an asset to social work education as our society grows increasingly reliant on technology for communication and health services.
As there remains a great amount of uncertainty concerning the safety of traditional face-to-face instructional methods and clinical service provision, it is more important now than ever to rethink the discourse concerning online education and its ability to prepare social work students for real-life practice. This, coupled with the fact that the vast majority of MSW programs now offer hybrid or fully online instruction (CSWE 2019), makes it imperative that social work doctoral programs put more emphasis on preparing future social work educators to use technology to deliver course content and engage in technology-enhanced simulation-based learning methods.
The current state of public health is providing social workers with a unique opportunity to show their adaptability and willingness to (literally) meet clients where they are by providing clinical services in new ways. Our study shows that it is possible to deliver an evidence-based intervention via telehealth while providing a modern learning opportunity for clinical skill development that most students received favorably. Telehealth services have the potential to address the substantial health disparities that impact residents of rural areas and members of historically marginalized communities who are less likely to present in person in traditional health care settings, due to logistical concerns such as transportation and stigma around mental health and substance misuse (Benavides-Vaello et al. 2013; Miller 2005; Ohinmaa et al. 2010; Zhou et al. 2020). Previously, engagement in telehealth and other technology-enhanced service provision methods presented some unique challenges in terms of client privacy and confidentiality (McCarty and Clancy 2002). However, the security of telehealth platforms has improved substantially over the past few years, as more and more providers have expanded their services to include telehealth (Blandford et al. 2020; Dorsey and Topol 2016).
Teaching future social work professionals to effectively use technology to enhance both their education and their practice via the integration of simulation-based learning will continue to situate the social work profession as the leading provider of direct mental health and substance abuse services. With over 700,000 practicing social workers in the United States (Bureau of Labor Statistics, U.S. Department of Labor 2020), and countless more worldwide, we are uniquely positioned to continue to address the ever-increasing behavioral health needs of our communities via the implementation of SBIRT and other evidence-based interventions (Smith et al. 2018a, b; Warner and Acquavita 2019). Future research should continue to investigate the impact of different training modalities on social workers’ SBIRT skill development and explore the impact of telehealth-based clinical simulations on clinical skill acquisition.
Agerwala, S. M., & McCance-Katz, E. F. (2012). Integrating screening, brief intervention, and referral to treatment (SBIRT) into clinical practice settings: A brief review. Journal of Psychoactive Drugs, 44(4), 307–317. https://doi.org/10.1080/02791072.2012.720169.
Ashenberg Straussner, S. L., & Senreich, E. (2002). Educating social workers to work with individuals affected by substance use disorders. Substance Abuse, 23(S1), 319–340. https://doi.org/10.1080/08897070209511524.
Babor, T. F., Del Boca, F., & Bray, J. W. (2017). Screening, brief intervention and referral to treatment: Implications of SAMHSA’s SBIRT initiative for substance abuse policy and practice. Addiction, 112, 110–117. https://doi.org/10.1111/add.13675.
Babor, T. F., McRee, B. G., Kassebaum, P. A., Grimaldi, P. L., Ahmed, K., & Bray, J. (2007). Screening, brief intervention, and referral to treatment (SBIRT) toward a public health approach to the management of substance abuse. Substance Abuse, 28(3), 7–30. https://doi.org/10.1300/J465v28n03_03.
Begun, A. L., & Clapp, J. D. (2016). Reducing and preventing alcohol misuse and its consequences: A Grand Challenge for social work. The International Journal of Alcohol and Drug Research, 5(2), 73–83. https://doi.org/10.7895/ijadr.v5i2.223.
Benavides-Vaello, S., Strode, A., & Sheeran, B. C. (2013). Using technology in the delivery of mental health and substance abuse treatment in rural communities: A review. The Journal of Behavioral Health Services & Research, 40(1), 111–120. https://doi.org/10.1007/s11414-012-9299-6.
Berger, L., & Di Paolo, M. (2015). Screening, brief intervention, and referral to treatment (SBIRT): An interview with Scott Caldwell, MA, and Darla Spence Coffey, PhD. Journal of Social Work Practice in the Addictions, 15(2), 219–226. https://doi.org/10.1080/1533256X.2015.1029418.
Bland, A. J., Topping, A., & Wood, B. (2011). A concept analysis of simulation as a learning strategy in the education of undergraduate nursing students. Nurse Education Today, 31, 664–670. https://doi.org/10.1016/j.nedt.2010.10.013.
Blandford, A., Wesson, J., Amalberti, R., AlHazme, R., & Allwihan, R. (2020). Opportunities and challenges for telehealth within, and beyond, a pandemic. The Lancet Global Health, 8(11), e1364–e1365. https://doi.org/10.1016/S2214-109X(20)30362-4.
Bogo, M., Rawlings, M., Katz, E., & Logie, C. (2014). Using simulation in assessment and teaching: OSCE adapted for social work. Alexandria: Council on Social Work Education.
Boyle, S., & Pham, T. (2019). The Innovative Integration of SBIRT training using standardized clients and computer simulation in social work education. European Journal of Education, 2(2), 50–56. https://doi.org/10.26417/ejed-2019.v2i2-62.
Bradley, T. T., Wilhelm, S. E., Rossie, K. M., & Metcalf, M. P. (2012). Web-based SBIRT skills training for health professional students and primary care providers. Substance Abuse, 33(3), 316–320. https://doi.org/10.1080/08897077.2011.640151
Bureau of Labor Statistics, U.S. Department of Labor. (2020). Occupational outlook handbook: Social workers. Retrieved September 9, 2020 from https://www.bls.gov/ooh/community-and-social-service/social-workers.htm
Carlton, B., Abedini, N., & Fratkin, M. (2020). Telemedicine in the time of coronavirus. Journal of Pain and Symptom Management, 60(1), E12–E15. https://doi.org/10.1016/j.jpainsymman.2020.03.019.
Chamberlin‐Quinlisk, C. (2010). Language learner/native speaker interactions: exploring adaptability in intercultural encounters. Intercultural Education, 21(4), 365–377. https://doi.org/10.1080/14675986.2010.506704.
Cohen, J. (1992). A power primer. Psychological Bulletin, 112, 155–159. https://doi.org/10.1037/0033-2909.112.1.155.
Cook, R. (2012). Exploring the benefits and challenges of telehealth. Nursing Times, 108(24), 16–17.
Council on Social Work Education (CSWE). (2019). 2018 statistics on social work education in the United States. Alexandria, VA: Council on Social Work Education.
Curtis, B. L., McLellan, A. T., & Gabellini, B. N. (2014). Translating SBIRT to public school settings: An initial test of feasibility. Journal of Substance Abuse Treatment, 46(1), 15–21. https://doi.org/10.1016/j.jsat.2013.08.001.
Cummings, S. M., Foels, L., & Chaffin, K. M. (2013). Comparative analysis of distance education and classroom-based formats for a clinical social work practice course. Social Work Education, 32(1), 68–80. https://doi.org/10.1080/02615479.2011.648179.
De Winter, J. C. (2013). Using the Student’s t-test with extremely small sample sizes. Practical Assessment, Research & Evaluation, 18, 1–12. https://doi.org/10.7275/e4r6-dj05.
Dorsey, E. R., & Topol, E. J. (2016). State of telehealth. New England Journal of Medicine, 375(2), 154–161. https://doi.org/10.1056/NEJMra1601705.
Elo, S., & Kyngäs, H. (2008). The qualitative content analysis process. The Journal of Advanced Nursing, 62, 107–115.
Hamilton, L. (2017). Distance education in social work: A review of the literature. Professional Development: The International Journal of Continuing Education, 20(2), 45–56.
Harwood, H. J., Kowalski, J., & Ameen, A. (2004). The need for substance abuse training among mental health professionals. Administration and Policy in Mental Health and Mental Health Services Research, 32(2), 189–205. https://doi.org/10.1023/B:APIH.0000042746.79349.64.
Hitchcock, L. I., King, D. M., Johnson, K., Cohen, H., & Mcpherson, T. L. (2019). Learning outcomes for adolescent SBIRT simulation training in social work and nursing education. Journal of Social Work Practice in the Addictions, 19(1–2), 47–56. https://doi.org/10.1080/1533256X.2019.1591781.
Hsieh, H. F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15, 1277–1288. https://doi.org/10.1177/1049732305276687.
Huttar, C. M., & BrintzenhofeSzoc, K. (2020). Virtual reality and computer simulation in social work education: A systematic review. Journal of Social Work Education, 56(1), 131–141. https://doi.org/10.1080/10437797.2019.1648221.
Joseph, V., West, R. M., Shickle, D., Keen, J., & Clamp, S. (2011). Key challenges in the development and implementation of telehealth projects. Journal of Telemedicine and Telecare, 17(2), 71–77. https://doi.org/10.1258/jtt.2010.100315.
Kauffman, H. (2015). A review of predictive factors of student success in and satisfaction with online learning. Research in Learning Technology, 23, 1–13. https://doi.org/10.3402/rlt.v23.26507.
Koetting, C., & Freed, P. (2017). Educating undergraduate psychiatric mental health nursing students in screening, brief intervention, referral to treatment (SBIRT) using an online, interactive simulation. Archives of Psychiatric Nursing, 31(3), 241–247. https://doi.org/10.1016/j.apnu.2016.11.004.
Koh, E., Hong, H., & Seah, J. (2014). Learner adaptivity: An initial conceptualization. In Adaptivity as a Transformative Disposition (pp. 15–30). Springer, Singapore.
Kourgiantakis, T., Sewell, K. M., Hu, R., Logan, J., & Bogo, M. (2020). Simulation in social work education: A scoping review. Research on Social Work Practice, 30(4), 433–450. https://doi.org/10.1177/1049731519885015.
Kurzman, P. A. (2013). The evolution of distance learning and online education. Journal of Teaching in Social Work, 33(4–5), 331–338. https://doi.org/10.1080/08841233.2013.843346.
Kurzman, P. A. (2019). The Current Status of Social Work Online and Distance Education. Journal of Teaching in Social Work, 39(4–5), 286–292. https://doi.org/10.1080/08841233.2019.1660117.
Lee, S. J., Srinivasan, S., Trail, T., Lewis, D., & Lopez, S. (2011). Examining the relationship among student perception of support, course satisfaction, and learning outcomes in online learning. The Internet and Higher Education, 14(3), 158–163. https://doi.org/10.1016/j.iheduc.2011.04.001.
Madras, B. K., Compton, W. M., Avula, D., Stegbauer, T., Stein, J. B., & Clark, H. W. (2009). Screening, brief interventions, referral to treatment (SBIRT) for illicit drug and alcohol use at multiple healthcare sites: comparison at intake and 6 months later. Drug and Alcohol Dependence, 99(1–3), 280–295. https://doi.org/10.1016/j.drugalcdep.2008.08.003.
Makhaira, S. P. (2014). Incorporating substance use content into social work curricula: Opioid overdose as a micro, mezzo, and macro problem. Social Work Education, 33(5), 692–698. https://doi.org/10.1080/02615479.2014.919093.
McCarty, D., & Clancy, C. (2002). Telehealth: Implications for social work practice. Social Work, 47(2), 153–161. https://doi.org/10.1093/sw/47.2.153.
Miller, A. S. (2005). Adolescent alcohol and substance abuse in rural areas: How telehealth can provide treatment solutions. Journal of Addictions Nursing, 16(3), 107–115. https://doi.org/10.1080/10884600500196701.
Mitchell, S. G., Gryczynski, J., Gonzales, A., Moseley, A., Peterson, T., O’Grady, K. E., & Schwartz, R. P. (2012). Screening, brief intervention, and referral to treatment (SBIRT) for substance use in a school-based program: Services and outcomes. The American Journal on Addictions, 21(s1), S5–S13. https://doi.org/10.1111/j.1521-0391.2012.00299.x.
Neander, L. L., Hanson, B. L., Edwards, A. E., Shercliffe, R., Cattrell, E., Barnett, J. D., et al. (2018). Teaching SBIRT through simulation: Educational case studies from nursing, psychology, social work, and medical residency programs. Journal of Interprofessional Education & Practice, 13, 39–47. https://doi.org/10.1016/j.xjep.2018.08.002.
Ohinmaa, A., Chatterley, P., Nguyen, T., & Jacobs, P. (2010). Telehealth in substance abuse and addiction: Review of the literature on smoking, alcohol, drug abuse and gambling. Edmonton: Institute of Health Economics.
Osborne, V., Benner, K., Snively, C., Vinson, D., & Horwitz, B. (2012). Teaching screening, brief intervention, and referral to treatment to social work students. Addiction Science & Clinical Practice. https://doi.org/10.1186/1940-0640-7-S1-A64.
Osborne, V. A., Benner, K., Sprague, D. J., & Cleveland, I. N. (2016). Simulating real life: Enhancing social work education on alcohol screening and brief intervention. Journal of Social Work Education, 52(3), 337–346. https://doi.org/10.1080/10437797.2016.1174629.
Oxhandler, H. K., & Giardina, T. D. (2017). Social workers’ perceived barriers to and sources of support for integrating clients’ religion and spirituality in practice. Social Work, 62(4), 323–332. https://doi.org/10.1093/sw/swx036.
Petracchi, H., Mallinger, G., Engel, R., Rishel, C. W., & Washburn, C. (2005). Evaluating the efficacy of traditional and web-assisted instruction in an undergraduate social work practice class. Journal of Technology in Human Services, 23(3/4), 299–310. https://doi.org/10.1300/J017v23n03_09.
Pugatch, M., Putney, J., O’Brien, K. H. M., Rabinow, L., Weitzman, E., & Levy, S. (2015). Integrating substance use training into social work education. Addiction Science & Clinical Practice, 10(1), 1–2. https://doi.org/10.1186/1940-0640-10-S1-A49.
Puskar, K., Kane, I., Lee, H., Mitchell, A. M., Albrecht, S., Frank, L., et al. (2016). Interprofessional Screening, Brief Intervention, and Referral to Treatment (SBIRT) education for registered nurses and behavioral health professionals. Issues in Mental Health Nursing, 37(9), 682–687. https://doi.org/10.1080/01612840.2016.1198946.
Putney, J. M., O’Brien, K. H., Collin, C. R., & Levine, A. (2017). Evaluation of alcohol screening, brief intervention, and referral to treatment (SBIRT) training for social workers. Journal of Social Work Practice in the Addictions, 17(1–2), 169–187. https://doi.org/10.1080/1533256X.2017.1412978.
Quinn, G. (2010). Institutional denial or minimization: Substance abuse training in social work education. Substance Abuse, 31(1), 8–11. https://doi.org/10.1080/08897070903442475.
Reamer, F. G. (2013). Distance and online social work education: Novel ethical challenges. Journal of Teaching in Social Work, 33(4–5), 369–384. https://doi.org/10.1080/08841233.2013.828669.
Regan, J. A. C. (2016). Innovators and early adopters of distance education in social work. Advances in Social Work, 17(1), 113–115. https://doi.org/10.18060/21091.
Rosen, D., McCall, J., & Goodkind, S. (2017). Teaching critical self-reflection through the lens of cultural humility: An assignment in a social work diversity course. Social Work Education, 36(3), 289–298. https://doi.org/10.1080/02615479.2017.1287260.
Sacco, P., Ting, L., Crouch, T. B., Emery, L., Moreland, M., Bright, C., et al. (2017). SBIRT training in social work education: Evaluating change using standardized patient simulation. Journal of Social Work Practice in the Addictions, 17(1–2), 150–168. https://doi.org/10.1080/1533256X.2017.1302886.
SAMHSA. (2011). Screening, Brief Intervention and Referral to Treatment (SBIRT) in behavioral healthcare. Rockville: SAMHSA.
Sampson, M., Parrish, D. E., & Washburn, M. (2018). Assessing MSW students’ integrated behavioral health skills using an objective structured clinical examination. Journal of Social Work Education, 54(2), 287–299. https://doi.org/10.1080/10437797.2017.1299064.
Sampson, M. M., Washburn, M., & Parrish, D. E. (in press). Evaluation of a youth focused integrated behavioral health training program: A comparison of three cohorts of MSW students. Journal of Social Work Education.
Senreich, E., Ogden, L. P., & Greenberg, J. P. (2017). Enhancing social work students’ knowledge and attitudes regarding substance-using clients through SBIRT training. Journal of Social Work Education, 53(2), 260–275. https://doi.org/10.1080/10437797.2016.1266979.
Smith, K., Jeffery, D., & Collins, K. (2018b). Slowing things down: Taming time in the neoliberal university using social work distance education. Social Work Education, 37(6), 691–704. https://doi.org/10.1080/02615479.2018.1445216.
Smith, A. C., Thomas, E., Snoswell, C. L., Haydon, H., Mehrotra, A., Clemensen, J., & Caffery, L. J. (2020). Telehealth for global emergencies: Implications for coronavirus disease 2019 (COVID-19). Journal of Telemedicine and Telecare. https://doi.org/10.1177/1357633X20916567.
Smith, D. C., Egizio, L. L., Bennett, K., Windsor, L. C., & Clary, K. (2018a). Teaching empirically supported substance use interventions in social work: Navigating instructional methods and accreditation standards. Journal of Social Work Education, 54(sup1), S90–S102. https://doi.org/10.1080/10437797.2018.1434438.
Stoner, S. A., Mikko, A. T., & Carpenter, K. M. (2014). Web-based training for primary care providers on screening, brief intervention, and referral to treatment (SBIRT) for alcohol, tobacco, and other drugs. Journal of Substance Abuse Treatment, 47(5), 362–370. https://doi.org/10.1016/j.jsat.2014.06.009.
Tanner, T. B., Wilhelm, S. E., Rossie, K. M., & Metcalf, M. P. (2012). Web-based SBIRT skills training for health professional students and primary care providers. Substance Abuse, 33(3), 316–320. https://doi.org/10.1080/08897077.2011.640151.
Turner, H. M., & Bernard, R. M. (2006). Calculating and synthesizing effect sizes. Contemporary Issues in Communication Science and Disorders, 33(Spring), 42–55. https://doi.org/10.1044/cicsd_33_S_42.
University 1 (2020). At a Glance. Retrieved July 2020, from https://uh.edu/socialwork/
University 2 (2020). Enrollment and Student Profile. Retrieved July 2020, from https://www.uta.edu/academics/schools-colleges/social-work/about/student-profile
Vakharia, S. P. (2014). Incorporating substance use content into social work curricula: Opioid overdose as a micro, mezzo, and macro problem. Social Work Education, 33(5), 692–698. https://doi.org/10.1080/02615479.2014.919093.
Vernon, R., Vakalahi, H., Pierce, D., Pittman-Munke, P., & Adkins, L. F. (2009). Distance education programs in social work: Current and emerging trends. Journal of Social Work Education, 45(2), 263–276. https://doi.org/10.5175/JSWE.2009.200700081.
Wamsley, M., Satterfield, J. M., Curtis, A., Lundgren, L., & Satre, D. D. (2018). Alcohol and drug Screening, Brief Intervention, and Referral to Treatment (SBIRT) training and implementation: Perspectives from 4 health professions. Journal of Addiction Medicine, 12(4), 262–272. https://doi.org/10.1097/ADM.0000000000000410.
Wang, C. H., Shannon, D. M., & Ross, M. E. (2013). Students’ characteristics, self-regulated learning, technology self-efficacy, and course outcomes in online learning. Distance Education, 34(3), 302–323. https://doi.org/10.1080/01587919.2013.835779.
Warner, L. A., & Acquavita, S. P. (2019). Introduction to the special issue: The use of screening, brief intervention, and referral to treatment by social workers. Journal of Social Work Practice in the Addictions, 19(1–2), 1–9. https://doi.org/10.1080/1533256X.2019.1590707.
Wright, T. E., Terplan, M., Ondersma, S. J., Boyce, C., Yonkers, K., Chang, G., & Creanga, A. A. (2016). The role of screening, brief intervention, and referral to treatment in the perinatal period. American Journal of Obstetrics and Gynecology, 215(5), 539–547. https://doi.org/10.1016/j.ajog.2016.06.038.
Zhou, X., Snoswell, C. L., Harding, L. E., Bambling, M., Edirippulige, S., Bai, X., & Smith, A. C. (2020). The role of telehealth in reducing the mental health burden from COVID-19. Telemedicine and e-Health, 26(4), 377–379. https://doi.org/10.1089/tmj.2020.0068.
Washburn, M., Zhou, S., Sampson, M. et al. A Pilot Study of Peer-to-Peer SBIRT Simulation as a Clinical Telehealth Training Tool During COVID-19. Clin Soc Work J 49, 136–150 (2021). https://doi.org/10.1007/s10615-021-00799-8