1 Introduction

Innovation in education has been an emerging interest in recent decades, with an initial focus on technologies and a more recent added focus on pedagogical aspects (Jacobson & Reimann, 2010). In the digital era, online education has gained wide acceptance in society, especially after the extensive implementation of emergency remote education during 2020–2021 (Tulaskar & Turunen, 2022), and its continued use can contribute to more sustainable education. Several parameters are identified in the literature as important for successful remote education, among which communication and interaction are highlighted (Tulaskar & Turunen, 2022; Vlachopoulos & Makri, 2019). Experimental and instrumentation laboratory exercises are among the most challenging teaching forms to conduct remotely. Early remote laboratories (Ryan et al., 2013) focused on collaborative exercises among high-resource centers, such as the Virtual Laboratory Project at the National Aeronautics and Space Administration for education using space simulators, or on web-based simulation laboratories for dangerous or expensive experiments, such as the Virtual Microscope (The Open University, 2012).

In more recent implementations of remote laboratories (Potkonjak et al., 2016), simulation laboratories are reported to be appreciated by students when used provisionally to introduce experimental setups, and are suggested as an enrichment to the ordinary engineering curriculum (Kapilan et al., 2021) or as preparatory material for the physical lab (Motejlek & Alpay, 2023). More advanced works have employed virtual reality (VR) or augmented reality (AR) to replace experiments within the natural sciences, e.g., anatomy, chemistry and mathematics, as well as in engineering education and medicine (Bajči et al., 2019; Blum et al., 2012; Bork et al., 2021; Hamilton et al., 2021; Rojas-Sánchez et al., 2023). Web-based AR methods (Garzón, 2021) can be used through widely available smartphones and are useful for creating a hybrid virtual and physical environment with dynamic or animated content (Hincapie et al., 2021), and for increasing the 3D perception and motivation of students in self-studies and distance learning (Gurevych et al., 2021). However, their application to instrumentation or remote hands-on labs is still limited (Liccardo et al., 2021; Wu et al., 2021). During the pandemic in 2020–2021, when courses had to be swiftly converted to remote variants, numerous innovative approaches were implemented (Kapilan et al., 2021). Among these, a provisional but interesting approach was to distribute material that students could use at home, a so-called home laboratory, and then to arrange online discussion groups (Dewilde & Li, 2021; O’Neill, 2021); however, this design is feasible only when the distributed materials are safe, affordable and can be acquired in quantity.

The next level of design in virtual or online learning environments has collaborative, multi-user, team-based and interactive aspects. Interaction in various dimensions, namely learner interaction with tutors, peers and course content, is an important aspect of online education; however, its effective implementation remains a challenge (Lazou & Tsinakos, 2022; Lowenthal, 2023; Vlachopoulos & Makri, 2019). Earlier proposed communication structures are asynchronous, or only partially interactive, and provide teamwork opportunities through wikis, repositories and textual artifacts used to communicate ideas (Jacobson & Reimann, 2010). In recent years, digital platforms more suitable for interactive meetings and group work have come into use. Among these, Zoom (Zoom Video Communications, Inc.) gained popularity over similar platforms mainly because it offered breakout rooms. Studies based on Zoom breakout-room teaching (Haase, 2021; O’Neill, 2021; Ong & Yamashiro, 2022) report that students appreciated the format but also describe a slower pace of group work, meaning that less work can be accomplished (O’Neill, 2021). Dewilde and Li (2021) also reported a need for additional hours when the lab was held remotely but did not mention whether this was specifically due to the Zoom sessions.

Low interactivity has been identified as a factor that reduced the effectiveness of emergency remote education (Tulaskar & Turunen, 2022). There are few publications on innovative approaches for increasing interaction in ways that impact learned engineering skills. One example is Trace Matrix, proposed for improving coordination in remote lab experimental exercises (Ahrar et al., 2021). Multi-dimensionality, in terms of providing multiple strategies for communication, positively affects interaction (Vlachopoulos & Makri, 2019). For instance, gamification or game-based learning is frequently mentioned as a successful strategy for increasing interaction. Gamification, defined as the use of game elements in a non-game context, has shown promise for engaging students and increasing their learning performance (Dahalan et al., 2023; Ekici, 2021). However, it is important to implement gamification in accordance with the intended educational purpose (Dichev & Dicheva, 2017).

At the Department of Biomedical Engineering at Linköping University, only the computer-based laboratories were offered in a remote variant during the 2020–2021 pandemic. According to course evaluations, these computer-based laboratories were conducted successfully via Zoom, but the students’ experience was less positive when the remote variants were compared with the previous on-site computer laboratory work. The students experienced the remote laboratory group work as less interactive and found that the educational goals of the laboratory exercises were not fulfilled to the expected extent. The main identified reasons were that: (1) interaction between the teacher and the students, and among the students themselves, was reduced, and (2) sharing the experimental program and protocols within the groups was not easy.

Considering this feedback, a different laboratory structure was implemented in a freestanding course at Linköping University, ‘Safety in Hospitals’, with the aim of enabling interactive remote education on electrical safety in biomedical engineering education that could contribute to higher sustainability. To fulfill this goal, a real-life problem-based laboratory layout was designed, implemented and evaluated using a user acceptance questionnaire. The main considerations in the laboratory’s design were that it should offer: (1) use of common instrumentation and a common experimental protocol; (2) a collaborative environment for online groups, with enhanced interaction among peers (given the importance of student peer-learning) and between the teacher and the students; (3) real-life problem-based content; (4) suitability for integrating internal and external students and non-student participants, in terms of access to digital platforms through the restricted computers in various hospitals; and (5) sustainability and accessibility, in terms of substantially reduced costs for participants and their employers, laboratory material at no expense, reduced climate impact from travel, and the possibility of future laboratory updates.

2 Materials and methods

2.1 Design and implementation

The study was performed on an annually offered freestanding course for three consecutive years, 2021–2023. The course structure, the students (here referred to as course participants), the equipment needed for the laboratory, the digital learning environment (Gather Town), and the laboratory and group work assignments are described in detail in the sections below.

2.2 Course

The course title is ‘Safety in Hospitals’, worth three higher education credits (ECTS). It is offered once a year as a freestanding course for biomedical engineering program students or freelancers. The course is important in the biomedical and clinical engineering profession and can be accredited for certification in Sweden. The course participants are a mixture of students and industry and hospital personnel. Well-functioning remote laboratory work is necessary because the ability to travel to another city to perform an experimental laboratory exercise can be periodically limited, especially for healthcare personnel with an excessive burden of duties in their daily occupation. In this course, peer-learning can lead to a fruitful exchange of experiences, as many of the participants have expertise from different departments of hospitals across the country.

The laboratory in this course concerns medical device electrical safety. In the previous on-site variant of the laboratory, the exercises included theoretical principles of electrical safety, measurement of skin contact impedance and leakage current, testing of the protective earth, and modeling a 4–5 lead system on a power distribution system simulator that is not widely accessible (Fig. 1a). In the updated remote design, the laboratory is instead based on a real-life situation. In the remote laboratory, the participants (1) choose a device available at their workplace, (2) perform a safety test on the chosen device and (3) interpret the results and validate the electrical safety of the device. In practice, the electrical safety of a medical device is checked by running a test based on electrical standards. In such a test, a list of electrical parameters is measured by an electrical safety analyzer and compared to the permitted limits in a specific electrical standard. Any value that exceeds the permitted limits is marked in the analysis protocol of the electrical safety analyzer. Most modern electrical safety analyzers perform this test automatically; however, it is of great importance to choose the settings and connect the electrical safety analyzer and the test object correctly.

Fig. 1

(a) The simulator system used in the earlier version of the laboratory, (b) the setup for the remote laboratory, including a medical product (ECG system), and (c) an electrical safety analyzer, where I indicates the ECG electrode connections and II the connection to protective earth or any exposed surface on the enclosure

2.3 Course participants

Everyone with a high school degree, including a grade in Swedish language skills, could register for the course through the Swedish university admission system. In a separate form, the admitted persons were asked about their profession and city of residence/work; however, answering this form was not obligatory. Based on the information submitted via this form, most of the course participants were biomedical or clinical engineers/technicians, graduated either recently or long ago, but some had other educational backgrounds, e.g., students in nursing or cognitive science. According to the registration form, in each year of the course, participants came from 17 to 24 geographical locations, making a total of 42 unique locations (41 in Sweden, 1 in Finland). During 2021, 2022 and 2023, respectively, 35, 27 and 25 persons actively took part in and completed the course. Among the active course participants, one, six and four persons chose to perform the laboratory on site at the course institute for the years 2021–2023, respectively.

2.4 Laboratory equipment and measurement protocol

The laboratory module focused on electrical safety control of medical devices, where a standard electrical safety analyzer, e.g., the ESA612 (Fluke® Biomedical, USA), and a medical device available in most hospitals could be used (Fig. 1b-c) to measure a set of electrical parameters and compare these with the allowed limits given by an electrical standard. The medical device to test was recommended to be a class II device, e.g., an electrocardiography (ECG) system or a patient monitoring system. However, a wide variety of devices could be used, and the course participants had the possibility to check the suitability of their systems with the teachers before the course started. The electrical safety analyzer was connected to the medical device before the test. An example of the device connections is given in (McMahon, 2015).

The electrical safety analyzer was controlled through a computer-based program provided with the system. An electrical standard was selected in the program, and the device under test was assigned a class and a type. The class (I–III) is defined based on the electrical shock hazard that the device presents for the patient or user. The type, i.e., body (B), body floating (BF) or cardiac floating (CF), is defined based on the insulation of the part of the device that is in direct contact with the patient and the level of protection provided for the patient’s body, particularly the heart, which is the body part most vulnerable to electrical shock. The type of device is commonly stated in the user manual or marked on the device.

A measurement protocol (Fig. 2) and electrical safety analyzer instructions were provided as shared documents for each group. The allowed limits for the mentioned parameters, e.g., resistance, voltage and leakage current, according to the IEC 60601-1 electrical standard for medical devices, were given in the measurement protocol for the corresponding class (I, II) and type (B, BF, CF), and an empty column was allocated for the values to be measured.

Fig. 2

Lab measurement protocol. The allowed thresholds for each device class and type are given as guidance in the table based on the IEC 60601-1 standard
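The pass/fail logic behind such a measurement protocol can be sketched in a few lines of Python. The parameter names and limit values below are illustrative placeholders chosen for the example, not the normative IEC 60601-1 tables, which must be consulted for real tests:

```python
# Sketch of the pass/fail check behind the measurement protocol:
# each measured parameter is compared against an allowed limit.
# The parameter names and limits are illustrative placeholders,
# NOT the normative IEC 60601-1 values.
LIMITS_UA = {
    "earth_leakage": 500.0,    # placeholder limit, microamperes
    "touch_current": 100.0,
    "patient_leakage": 10.0,
}

def check_protocol(measured_ua):
    """Compare measured values (µA) with the limits; return per-parameter rows."""
    return [
        (name, measured_ua[name], limit, measured_ua[name] <= limit)
        for name, limit in LIMITS_UA.items()
    ]

def failed_parameters(measured_ua):
    """Names of parameters that exceed their allowed limit."""
    return [name for name, _, _, ok in check_protocol(measured_ua) if not ok]
```

A device whose patient leakage measured, say, 12 µA against the placeholder 10 µA limit would be flagged here, mirroring how the analyzer marks exceeded limits in its analysis protocol.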

2.5 Graphical digital platform

The video-chat platform Gather© (Gather.Town; Gather Presence, Inc., USA) was used as the digital learning environment, where interaction between individuals, between the teacher and individual groups, and among groups takes place much as in a physical laboratory room. Gather© is a web-based graphical platform where each participant attends the laboratory room as an avatar. There, each participant can share their screen, sound and video via camera and microphone with other participants in the same group, and can use the materials available in the room (Fig. 3a). Participants can also move around to communicate with others, and the teacher can be seen and heard by one or all groups at the same time when positioned in the teacher spots. An example of such a session is shown in Fig. 3b. During the sessions, a joint report could be written in a document shared via Gather© (Fig. 3c). The learning environment was evaluated after the laboratory work through a questionnaire (Table 1) placed in the Gather© classroom (Fig. 3a).

Fig. 3

(a) The Gather© classroom, modified from the original design for the laboratory session. (b) An example of the interaction of the teacher with a group. The teacher is standing in a square and discusses with group #2. The group members and the teacher can see each other via video. (c) Example of a group from a group work (GW) session and the interaction within the group. Each avatar represents one course participant or group member.

Table 1 The van der Laan evaluation questionnaire (Van Der Laan et al., 1997)

2.6 Laboratory assignment definition

The laboratory assignment was defined as performing an electrical safety test on a medical device of choice, filling out the provided measurement protocol (Fig. 2), and interpreting the measurements. Prior to the laboratory session, the teacher provided information regarding the laboratory assignment, including the device connections and measurements. The course participants were in geographically different locations, but everyone had access to their own equipment. The measurements could be performed individually, or in pairs if two course participants were in the same geographical location. Course participants without direct access to such equipment were offered the option of performing the laboratory assignment at a nearby hospital or traveling to the institute where the course was offered.

After the laboratory session, the teacher discussed the measurement protocol with the course participants, and they could ask questions regarding their measurements, for example when their results deviated from the allowed thresholds. The course participants were then asked to submit the completed protocol as a report, which was used for the formal examination of the laboratory module. The scheduling and digital learning environments were first designed and implemented as shown in layout 1 and, based on the evaluation results from layout 1, re-designed and implemented as in layout 2. A two-hour lecture on electrical safety principles was scheduled prior to the laboratory module. Details of the layouts are listed below and in Fig. 4.

2.6.1 Laboratory module layout 1 (year 2021)

The teacher gave instructions on how to perform the laboratory assignment in a session scheduled for two hours, after which one hour was scheduled for the course participants to conduct the measurements on their own, in their hometown or on site at the course institute. Later the same day, the teacher went through the measurement results with the class. The Zoom platform was used for communication between the teacher and the course participants in this layout. The schedule is shown in Fig. 4a.

2.6.2 Laboratory module layout 2 in Gather© (years 2022 and 2023)

The teacher gave instructions on how to take part in the laboratory session and how to form random groups of two or three persons in Gather©. This session was scheduled for two hours, the same as in layout 1. Thereafter, two alternative laboratory sessions were scheduled: (1) a session in the Gather© platform with three hours scheduled for performing the measurements on day 2 of the course, and (2) a session the day after (day 3) for those who registered for the lab on site. Performing the laboratory assignment on site took approximately one hour per group. During the laboratory session in the Gather© platform, each participant, while performing the instrumental experiments at their home site, discussed their measurement results within the group and wrote either an individual or a joint measurement protocol (Fig. 2) in a shared document. In the afternoon of day 3 of the course, the teacher reviewed the measurement results with all of the course participants, both those who performed the laboratory on site and those who did so remotely. The schedule is shown in Fig. 4b.

2.7 Additional collaborative assignments

Two additional group work assignments (GW case A and GW case B), consisting of case discussions with mandatory hand-in reports, were performed using the Gather© digital platform. GW case A was scheduled one week after the laboratory session, and GW case B one day after GW case A (Fig. 4). The participants were instructed to form random groups of four or five persons. The digital learning environment was evaluated after each GW session using a questionnaire.

Fig. 4

Schedule of the electrical safety and laboratory module and the group work assignments for (a) layout 1 and (b) layout 2. The day # is relative to the days within the course. The main changes in the schedule from (a) to (b) are the length of the hands-on laboratory session and the digital platform used for the collaborative sessions

2.8 Evaluation methodologies

The implementations of the laboratory and the GW sessions were evaluated using self-reported questionnaires for three consecutive years, 2021–2023, following the university’s practices for collecting evaluation and participant information. The analyzed data are therefore based on the responses of the course participants who voluntarily took part in the evaluations. The course participants evaluated the overall laboratory module quantitatively and qualitatively through the final course evaluation questionnaire, while they evaluated the digital learning environment only quantitatively, using the van der Laan scale. Three teachers in total participated in the digital platform sessions, two of whom used Gather© in the course for the first time, with minimal instructions. The teachers did not take part in the questionnaire-based evaluations, but their feedback is included in the study in the form of reflections.

2.9 Evaluation of the overall laboratory module (quantitative and qualitative)

The overall laboratory module for the years 2021–2023 was evaluated at the end of the course through the standard anonymous course evaluation questionnaire, distributed to all registered course participants by the university evaluation system. A separate question in the final course evaluation was allocated to the laboratory module, asking participants to rank the usefulness of the lab on a scale of 1–5, with 5 being the highest rank (Fig. 5). In the same question, the participants could give free-text comments for a qualitative assessment of the laboratory.

Fig. 5

Final course evaluation showing (a) number of respondents and (b) % of the total respondents, on the specific evaluation question concerning the laboratory module layout design 1 (average 3.17/5) and layout design 2 (average 4.10/5)

2.10 Evaluation of the digital platform using the van der Laan scale (quantitative)

A link to the questionnaire for the evaluation of the digital learning platform was placed in the Gather© classroom (Fig. 3a). The questionnaire was generated in Google Forms, and the purpose of the evaluation was included in its heading. The participants were asked to answer the questionnaire before exiting the session or before the next scheduled Gather© session. The platform was evaluated after the two group work assignments in the same way as for the laboratory session. The course participants who performed the laboratory exercises on site at the course institute during 2022–2023 (n = 10) did not participate in the evaluation of the digital laboratory platform but were asked to participate in the evaluation of the digital platform for the group work assignments.

The van der Laan acceptance scale (Van Der Laan et al., 1997) was used for the evaluation. The scale, originally adapted from Osgood (Osgood et al., 1957), has been widely used for evaluating user acceptance of interfaces at advanced technological levels, e.g., in surgery, traffic scenarios, driver acceptance and haptic control (Chen et al., 2007; Maintz et al., 2019; Sayer et al., 2007). The questionnaire was provided in English to avoid translation errors. The questionnaire was anonymous, and the participants were instructed that adding their names was voluntary. This evaluation was performed only for the years 2022 and 2023, when the Gather© platform was implemented.

The van der Laan questionnaire has nine items representing either the usefulness subscale (items 1, 3, 5, 7 and 9) or the satisfaction subscale (items 2, 4, 6 and 8) on a 5-point Likert scale. Having several items for each dimension increases the reliability of the questionnaire. The assessment questionnaire is shown in Table 1. Each item in the questionnaire was averaged and then scaled according to the original van der Laan definition, where the Likert scores were scaled from +2 to −2 for items 1, 2, 4, 5, 7 and 9, and from −2 to +2 for items 3, 6 and 8, since these were mirrored. The most positive response was thus always assigned a score of +2. The mean and standard deviation (m ± sd) of the subscales were used for assessment of the overall acceptance of the sessions. Internal consistency was tested using Cronbach’s alpha, calculated using a two-factor analysis of variance (ANOVA) without replication in Excel© (Microsoft Office, Inc.). A Cronbach’s alpha value above 0.65 is considered reliable by van der Laan (Van Der Laan et al., 1997); a higher value indicates higher reliability of the test (Lewis & Sauro, 2021).
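As an illustration, the scoring described above can be sketched as follows. The calculations in this study were done in Excel©; the Python/NumPy formulation and function names here are ours. Cronbach's alpha is computed via the standard item-variance formula, which is algebraically equivalent to the two-factor ANOVA-without-replication approach:

```python
import numpy as np

# Sketch of the van der Laan scoring (our formulation, not the study's
# Excel sheet). `responses` is an (n_respondents x 9) array of raw 1..5
# Likert answers, with items in questionnaire order.
MIRRORED = [2, 5, 7]          # 0-based indices of the mirrored items 3, 6, 8
USEFULNESS = [0, 2, 4, 6, 8]  # items 1, 3, 5, 7, 9
SATISFACTION = [1, 3, 5, 7]   # items 2, 4, 6, 8

def score_items(responses):
    """Recode raw Likert answers to the +2..-2 van der Laan scale."""
    r = np.asarray(responses, dtype=float)
    scores = 3.0 - r                 # maps 1..5 -> +2..-2
    scores[:, MIRRORED] *= -1.0      # mirrored items: 1..5 -> -2..+2
    return scores

def subscales(responses):
    """Per-respondent usefulness and satisfaction means."""
    s = score_items(responses)
    return s[:, USEFULNESS].mean(axis=1), s[:, SATISFACTION].mean(axis=1)

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    return k / (k - 1) * (1.0 - x.var(axis=0, ddof=1).sum()
                          / x.sum(axis=1).var(ddof=1))
```

With this recoding, the most positive answer to every item, mirrored or not, maps to +2, so subscale means land on the [−2, 2] acceptance scale reported in the Results.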

3 Results

3.1 Evaluation of the overall laboratory module

For layout design 1 (year 2021), the overall laboratory module was rated 3.17/5 by the course participants who answered the final course evaluation (n = 35). In the free-text answers, the major comments concerned the shortage of time for the laboratory hands-on session and the need for assistance during the measurements. For layout design 2 (years 2022 and 2023), the participants in the final course evaluation (n = 29) scored the module 4.10/5 (Fig. 5), with overall positive comments on the laboratory hands-on session, including the interaction with peers in Gather Town; time shortage was no longer mentioned as an issue since the laboratory time had been increased (Fig. 4).

3.2 Evaluation of the digital platform

The evaluation of the digital learning environment shows that, overall, the participants found Gather© Town useful and satisfying as a learning environment. Cronbach’s alpha values of 0.75–0.87 were obtained for the three tests (Table 2). The results of the evaluation questionnaire based on the van der Laan scale are presented in Fig. 6 and Table 3. Both subscales, usefulness (0.54 ± 0.67) and satisfaction (0.37 ± 0.60), showed positive user acceptance, i.e., above zero on a scale of [−2, 2], on the first occasion. Comparing the two GW sessions with identical assignments, the user acceptance on the usefulness and satisfaction subscales improved from 0.75 ± 0.70 and 0.67 ± 0.75 to 1.11 ± 0.59 and 1.00 ± 0.71, respectively, after the second GW session. This suggests that training in the environment could improve the metrics and the experience of the participants. The changes in the acceptance items were assessed for the whole population who answered the questionnaire and not for each individual, due to the anonymity of the evaluation questionnaires.

Fig. 6

The outcome of the questionnaire showing user acceptance. Graphs show individual values (blue circles) for each participant and the m ± sd (red squares) for overall satisfaction vs. usefulness for the sessions. (a) Session 1: laboratory, (b) session 2: group work, case A and (c) session 3: group work, case B. Group work cases A and B were held one week after the laboratory module. Some of the subjects had identical scores in each session, which is why the number of markers on the graphs appears lower than the actual number of participants.

Table 2 Cronbach’s alpha showing internal consistency of the subscales for each test
Table 3 The results of the evaluation questionnaire. Each cell shows the mean ± standard deviation of corresponding usefulness or satisfaction for the participants. n: the number of the participants who answered the questionnaire

3.3 Teacher reflections

Overall, the teachers found the platform interesting and experienced that they could easily supervise the groups. Even though the participants had different computer and internet settings, they could easily connect to and use the Gather© platform. Some minor technical issues, mostly concerning the documents, arose during the group work sessions; these could be resolved with alternative solutions. One example was that the position or distance of the avatar relative to the document icon was not always optimal for accessing the document. Most of the groups wrote the report directly in the shared document, while one group shared the document via the screen in the Gather© platform. One issue for the participants was opening two documents, the instructions and the report, simultaneously in the platform. An advantage of Gather-linked document sharing for report writing was that the teacher could independently view the shared documents during the session.

4 Discussion and Conclusion

In this study, an interactive remote laboratory was implemented in a digital learning environment and evaluated. The entire course was converted to a distance variant in 2021 as a consequence of the pandemic, and this was intended to be the future structure of the course. The suggested laboratory layout is sustainable in providing real-life problem-based experiments that could easily be updated to match future upgraded hospital settings, and it is at the same time cost-effective and accessible, for the organizing institution in terms of providing the laboratory equipment and for the course participants in terms of saving on travel costs. The possibility of performing the measurements remotely is also climate-friendly, since it avoids long travel and reduces the use of consumables. Most importantly, the pedagogical structure provides an opportunity for healthcare personnel to participate in the course without needing to take long leaves from work, which is often not feasible, especially in periods of heavy workload at the hospitals.

In evaluating education or learning outcomes, direct assessment of knowledge and skills is a more obvious and controlled instrument, whereas self-reported assessment is a less prioritized practice (Nusche, 2008). On the other hand, usability and user experience tests are commonly implemented within industry and in software development intended for human use (Lewis & Sauro, 2021), which implies the suitability of such tests for assessing online learning platforms rather than directly assessing the learning outcome. Depending on the system under study, the choice of test could be a verbal report or think-aloud approach, a performance assessment, or measurement through standardized questionnaires (Lewis & Sauro, 2021), of which the System Usability Scale (SUS) is the most commonly used (Brooke, 1996; Lewis, 2018; Schaefer, 2016). However, there are complications with the SUS questionnaire (Klug, 2017), including its scaling, which led to its being down-prioritized in this study. Other possible methods are behavior or engagement detection techniques using self-reporting questionnaires or automatic computer vision-based appearance tracking and classification methods (Dewan et al., 2019; Luo et al., 2023).

Several studies report on successful implementation of Gather Town as a learning environment (Lee et al., 2023; Lo & Song, 2023; McClure & Williams, 2021; Tang et al., 2022; Zhao & McClure, 2022). Among these, some have specifically compared Gather Town with Zoom, concluding that students strongly prefer Gather Town due to a higher social connection among peer students, an easier transition between private and public discussions, and a higher sense of presence, similar to being in a classroom (Latulipe & Jaeger, 2022; Lee et al., 2023). Lee et al. (2023) also report an easier exchange of emotions through Gather Town. In a comparison of a VR technology-based platform, Zoom and Gather Town, the VR-based platform received the lowest evaluation and Gather Town the highest, specifically due to the higher engagement of the participants in the latter (Sriworapong et al., 2022).

A limitation of this study is the small number of implementations of the laboratory session, which is too low for statistical significance testing. In addition, a non-anonymous evaluation would have allowed tracking of individual users’ experiences. Training in the digital environment could be improved by using the same digital platform for all of the course sessions, including the lectures, instead of only the collaborative occasions. However, this was limited in the present implementation by the cost of the platform for a large population and by the fact that many lecturers were not familiar with the platform. Ideally, pilot testing should be performed before the main evaluation test, which was difficult to arrange in this study. As a next step, the laboratory design should be improved through iterative adjustments and evaluations. Such a procedure is, however, challenged by technical upgrades of the commercial digital learning platform that are not known in advance to the teachers and course designers.

In conclusion, an interactive remote laboratory module on electrical safety of medical devices in biomedical engineering education was designed, implemented and evaluated considering overall laboratory usefulness and the user acceptance of the digital platform. Some minor technical problems needed to be solved during the online collaborative sessions, mostly concerning the shared documents. All groups were able to carry out the laboratory work efficiently and within the allowed time for the session. The participants’ acceptance of the digital platform improved after they had used it on two identical occasions during the course. The importance of this work is in providing an instance of an interactive remote laboratory for teaching an essential skill within the biomedical engineering profession. The design of this laboratory can be easily implemented in other academic or vocational training centers and is envisioned to contribute to further implementations of sustainable biomedical engineering education.