Abstract
To enable interactive remote education on electrical safety in biomedical engineering, a real-life problem-based laboratory module is proposed, implemented and evaluated. The laboratory module was implemented in a freestanding distance course in hospital safety for three consecutive years and was based on electrical safety for medical devices, where standard equipment existing in most hospitals could be used. The course participants were from a total of 42 geographical locations in or near Sweden. To allow a high level of interaction, especially among peer students, a graphical digital platform (Gather Town) was used. The digital platform was additionally used in two group work sessions. The experience of the participants in terms of usefulness and satisfaction was rated on a scale of [-2, 2] using a van der Laan 5-point Likert-based acceptance scale questionnaire. The laboratory module overall was scored 4.1/5 by the participants (n = 29) in the final course assessments. The evaluation of the digital platform alone showed that in the first usage instance, the participants (n = 21) found the platform to be useful (0.54 ± 0.67) and satisfactory (0.37 ± 0.60). The participants’ experience of the digital platform improved when comparing two identical group work assignments, so that the ratings of usefulness and satisfaction were 1.11 ± 0.59 and 1.00 ± 0.71, respectively, after they had used it in the second group work session (n = 38). This study provides an instance of an interactive remote electrical safety laboratory module that is envisioned to contribute to further implementations of sustainable education in biomedical engineering.
1 Introduction
Innovation in education has been an emerging interest in recent decades, with an initial focus on technologies and a recent added focus on the pedagogical aspects (Jacobson & Reimann, 2010). In the digital era, online education has gained wide acceptance in society, especially after the extensive implementation of emergency remote education during 2020–2021 (Tulaskar & Turunen, 2022), and its continued use can contribute to more sustainable education. Several parameters are identified in the literature as important for a successful remote education practice, among which communication and interaction are highlighted (Tulaskar & Turunen, 2022; Vlachopoulos & Makri, 2019). Experimental and instrumentation laboratory exercises are among the most challenging teaching forms to conduct remotely. Early remote laboratories (Ryan et al., 2013) focused on collaborative exercises among high-resource centers, such as the Virtual Laboratory Project at the National Aeronautics and Space Administration for education using space simulators, or on web-based simulation laboratories for dangerous or expensive experiments, such as the Virtual Microscope (The Open University, 2012).
In more recent implementations of remote laboratories (Potkonjak et al., 2016), simulation laboratories are reported to be appreciated by students when used provisionally to introduce experimental setups, and are suggested as an enrichment to the ordinary engineering curriculum (Kapilan et al., 2021) or as preparatory material for the physical lab (Motejlek & Alpay, 2023). More advanced works have employed virtual reality (VR) or augmented reality (AR) to replace experiments within the natural sciences, e.g., anatomy, chemistry and mathematics, or in engineering education as well as medicine (Bajči et al., 2019; Blum et al., 2012; Bork et al., 2021; Hamilton et al., 2021; Rojas-Sánchez et al., 2023). Web-based AR methods (Garzón, 2021) can be used through widely available smartphones and can be useful for creating a hybrid virtual and physical environment, dynamic or animated content (Hincapie et al., 2021), and for increasing the 3D perception and motivation of students in self-studies and distance learning (Gurevych et al., 2021). However, their application for instrumentation or remote hands-on labs is still limited (Liccardo et al., 2021; Wu et al., 2021). During the pandemic in 2019–2021, when courses had to be swiftly converted to remote variants, numerous innovative approaches were implemented (Kapilan et al., 2021). Among these, a provisional but interesting approach was to distribute material that students could use at home, a so-called home laboratory, and then arrange online discussion groups (Dewilde & Li, 2021; O’Neill, 2021); however, this design is feasible only when the distributed materials are safe, affordable and can be acquired in quantity.
The next level of design in virtual or online learning environments has collaborative, multi-user, team-based and interactive aspects. Interaction in various dimensions, namely learner interaction with tutors, peers and course content, is an important aspect of online education; however, its effective implementation remains a challenge (Lazou & Tsinakos, 2022; Lowenthal, 2023; Vlachopoulos & Makri, 2019). Earlier proposed communication structures were asynchronous, or only partially interactive, providing team-work opportunities through wikis, repositories and the use of textual artifacts to communicate ideas (Jacobson & Reimann, 2010). In recent years, digital platforms more suitable for interactive meetings and group work have been used. Among these, Zoom (Zoom Video Communications, Inc.) gained popularity over similar platforms mainly because it offers breakout rooms. Studies based on Zoom breakout-room teaching (Haase, 2021; O’Neill, 2021; Ong & Yamashiro, 2022) report the appreciation of the students but also describe a slower pace of group work, meaning that less work can be accomplished (O’Neill, 2021). Dewilde has also reported a need for additional hours when the lab was held remotely, but did not mention whether this was specifically due to the Zoom sessions (Dewilde & Li, 2021).
Low interactivity has been identified as a factor that influenced the effectiveness of emergency remote education (Tulaskar & Turunen, 2022). There are few publications on innovative approaches for increasing interaction that impact learned engineering skills. One example is Trace Matrix, which is proposed for improving coordination in remote lab experimental exercises (Ahrar et al., 2021). Multi-dimensionality in terms of providing multiple strategies for communication positively affects interaction (Vlachopoulos & Makri, 2019). For instance, gamification or game-based learning is frequently mentioned as a successful strategy for increasing interaction. Gamification, defined as using game elements in a non-game context, has shown promise for engaging students and increasing their performance in learning (Dahalan et al., 2023; Ekici, 2021). However, it is important to implement gamification in accordance with the intended educational purpose (Dichev & Dicheva, 2017).
At the Department of Biomedical Engineering at Linköping University, only the computer-based laboratories were offered as a remote variant during the 2020–2021 pandemic. According to course evaluations, delivering these laboratories remotely via Zoom worked, but the students’ experience of the remote variants was less positive than that of the previous on-site computer laboratory work. The students experienced the remote laboratory group work as less interactive and found that the educational goals of the laboratory exercises were not fulfilled to the expected extent. The main identified reasons were that: (1) interaction between the teacher and the students, and also among the students, was reduced, and (2) sharing the experimental program and protocols within the groups was not easy.
Considering this feedback, a different laboratory structure was implemented in a freestanding course at Linköping University, ‘Safety in Hospitals’, with the aim of enabling interactive remote education on electrical safety in biomedical engineering education that could contribute to higher sustainability. To fulfill this goal, a real-life problem-based laboratory layout was designed, implemented and evaluated using a user acceptance questionnaire. The main considerations in the laboratory’s design were that it should offer: (1) use of common instrumentation and a common experimental protocol; (2) a collaborative environment for online groups, with enhanced interaction among peer students (given the importance of student peer-learning) and between the teacher and the students; (3) real-life problem-based content; (4) suitability for integrating both internal and external students and non-student participants, in terms of access to digital platforms through the restricted computers in various hospitals; and (5) sustainability and accessibility, in terms of substantially reduced costs for participants and their employers, laboratory material at no expense, the climate benefit of reduced travel, and the possibility of future laboratory updates.
2 Materials and methods
2.1 Design and implementation
The study was performed on an annually offered freestanding course for three consecutive years, 2021–2023. The course structure, the students (here referred to as course participants), the equipment needed for the laboratory, the digital learning environment (Gather Town), and the laboratory and group work assignments are described in detail in the sections below.
2.2 Course
The course title is ‘Safety in Hospitals’, with three higher education credits (ECTS). It is offered once a year as a freestanding course for biomedical engineering program students or freelancers. The course is important in the biomedical and clinical engineering profession and can be accredited for certification in Sweden. The course participants are a mixture of students, and industry and hospital personnel. Well-functioning remote laboratory work is necessary, as the ability to travel to another city to perform an experimental laboratory exercise can at times be limited, especially for healthcare personnel with a heavy burden of duties in their daily occupation. In this course, peer learning can lead to a fruitful exchange of experiences, as many of the participants have expertise from different departments of hospitals across the country.
The laboratory in this course concerns medical device electrical safety. In the previous on-site variant of the laboratory, the exercises included theoretical principles of electrical safety, measurement of skin contact impedance and leakage current, testing of the protective earth, and modeling a 4–5 lead system on a power distribution system simulator that is not widely accessible (Fig. 1a). In the updated, remote design, the laboratory is instead based on a real-life situation: the participants (1) choose a device available at their workplace, (2) perform a safety test on the chosen device, and (3) interpret the results and validate the electrical safety of the device. In practice, the electrical safety of a medical device is checked by running a test based on electrical standards. In such a test, a set of electrical parameters is measured by an electrical safety analyzer and compared to the permitted limits in a specific electrical standard. Any value that exceeds the permitted limits is marked in the analysis protocol of the electrical safety analyzer. Most modern electrical safety analyzers perform this test automatically; however, it is of great importance to choose the settings and connect the electrical safety analyzer and the test object correctly.
2.3 Course participants
Anyone with a high school degree, including a grade in Swedish language skills, could register for the course through the Swedish university admission system. In a separate form, the admitted persons were asked about their profession and city of residence/work; however, answering this form was not obligatory. Based on the information submitted via this form, most of the course participants were biomedical or clinical engineers/technicians, who had graduated either recently or long ago, but some had other educational backgrounds, e.g., students in nursing or cognitive science. According to the registration form, in each year of the course, participants came from 17 to 24 geographical locations, making a total of 42 unique locations (41 in Sweden, 1 in Finland). During 2021, 2022 and 2023, respectively, 35, 27 and 25 persons actively took part in and completed the course. Among the active course participants, one, six and four persons chose to perform the laboratory on site at the course institute for the years 2021–2023, respectively.
2.4 Laboratory equipment and measurement protocol
The laboratory module was focused on electrical safety control of medical devices, where a standard electrical safety analyzer, e.g., ESA612 (Fluke® Biomedical, USA), and a medical device existing in most hospitals could be used (Fig. 1b-c) to measure a number of electrical parameters and compare them with the allowed limits based on an electrical standard. The medical device to test was recommended to be a class II device, e.g., an electrocardiography (ECG) system or a patient monitoring system. However, a wide variety of devices could be used, and the course participants had the possibility to check the suitability of their systems with the teachers before the course started. The electrical safety analyzer was electrically connected to the medical device before the test. An example of the connections of the devices is given in (McMahon, 2015).
The electrical safety analyzer was controlled through a computer-based program provided with the system. An electrical standard was selected in the program, and the device to test was assigned a class and a type. The class (I–III) was defined based on the electrical shock hazard that the device presents for the patient or user. The type, i.e., body (B), body floating (BF), cardiac floating (CF), is defined based on the insulation of the part of the device that is in direct contact with the patient and the level of protection provided for the patient’s body, particularly the heart, which is the body part most vulnerable to electrical shock. The type of a device is commonly stated in the user manual or marked on the device.
A measurement protocol (Fig. 2) and electrical safety analyzer instructions were provided as shared documents for each group. The allowed limits for the relevant parameters, e.g., resistance, voltage and leakage current, according to the IEC 60601-1 electrical standard for medical devices were given in the measurement protocol for the corresponding class (I, II) and type (B, BF, CF), and an empty column was allocated for the values to be measured.
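The pass/fail logic of such a protocol can be sketched as follows. This is a minimal illustration only: the function name, the parameter names and, in particular, the numeric limits are invented placeholders, not actual IEC 60601-1 values, which must be taken from the standard itself for the specific class and type.

```python
# Hypothetical sketch of the analyzer's pass/fail comparison.
# Limit values below are illustrative placeholders, NOT the
# authoritative IEC 60601-1 limits.
LIMITS_UA = {  # allowed leakage currents in microamperes, keyed on (class, type)
    ("II", "BF"): {"earth_leakage": 500, "patient_leakage": 100},
    ("II", "CF"): {"earth_leakage": 500, "patient_leakage": 10},
}

def check_protocol(device_class, device_type, measured_ua):
    """Compare each measured value against the allowed limit for the
    device's class and type, flagging any value that exceeds it,
    as the analyzer does in its analysis protocol."""
    limits = LIMITS_UA[(device_class, device_type)]
    return {param: ("PASS" if value <= limits[param] else "FAIL")
            for param, value in measured_ua.items()}

# Example: a class II, type BF device whose patient leakage exceeds the limit
result = check_protocol("II", "BF",
                        {"earth_leakage": 320, "patient_leakage": 140})
# -> {'earth_leakage': 'PASS', 'patient_leakage': 'FAIL'}
```

In the course, this comparison was performed automatically by the electrical safety analyzer; the sketch only makes the underlying table-lookup-and-compare step explicit.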
2.5 Graphical digital platform
The Gather© (Gather.Town) video chat platform (Gather Presence, Inc., USA) was used as the digital learning environment, where interaction between individuals, between the teacher and individual groups, and among groups takes place much as it would in a physical laboratory room. Gather© uses a web-based graphical platform where each participant attends the laboratory room with an avatar. There, each participant can share the screen, sound and video via a camera and microphone with other participants in the same group, and can use the materials available in the room (Fig. 3a). Participants can also move around to communicate with others, and the teacher can be seen and heard by one or all groups at the same time when positioned in the teacher spots. An example of such a session is shown in Fig. 3b. During the sessions, a joint report could be written in a document shared via Gather© (Fig. 3c). The learning environment was evaluated after the laboratory work through a questionnaire (Table 1) placed in the Gather classroom (Fig. 3a).
2.6 Laboratory assignment definition
The laboratory assignment was defined as performing an electrical safety test on a medical device of choice, filling out the provided measurement protocol (Fig. 2), and interpreting the measurements. Prior to the laboratory session, the teacher provided information regarding the laboratory assignment, including the device connections and measurements. The course participants were in geographically different locations, but everyone had access to his/her own equipment. The measurements could be performed individually, or in pairs if two course participants were in the same geographical location. The course participants without direct access to such equipment were offered the option of performing the laboratory assignment at a nearby hospital or traveling to the institute where the course was offered.
After the laboratory session, the teacher discussed the measurement protocol with the course participants, and they could ask questions regarding their measurements, for example when their results showed deviations from the allowed threshold. The course participants were then asked to submit the completed protocol as a report. The report was used for the formal examination of the laboratory module. The scheduling and digital learning environments were first designed and implemented as shown in layout 1, and based on the evaluation results from layout 1, re-designed and implemented as layout 2. A two-hour lecture on electrical safety principles was scheduled prior to the laboratory module. Details of the layouts are listed below and in Fig. 4.
2.6.1 Laboratory module layout 1 (year 2021)
The teacher gave instructions on how to perform the laboratory assignment, scheduled for two hours, after which one hour was scheduled so that the course participants could conduct the measurements on their own in their hometown or on site at the course institute. The teacher went through the measurement results with the class after this session on the same day. The Zoom platform was used for communication between the teacher and the course participants in this layout. The schedule is shown in Fig. 4a.
2.6.2 Laboratory module layout 2 in Gather© (years 2022 and 2023)
The teacher gave instructions on how to take part in the laboratory session and how to form random groups composed of two or three persons in Gather©. This session was scheduled for two hours, the same as in layout 1. Thereafter, two alternative laboratory sessions were scheduled: (1) a session using the Gather© platform, with three hours scheduled for performing the measurements on day 2 of the course, and (2) a session scheduled the day after (day 3) for those who registered for the lab on site. Performing the laboratory assignment on site took approximately one hour for each group. During the laboratory session in the Gather© platform, each participant, while performing the instrumental experiments at their home site, discussed their measurement results within the group and wrote either an individual or a joint measurement protocol (Fig. 2) in a shared document. In the afternoon of day 3 of the course, the teacher reviewed the measurement results with all of the course participants, both those who performed the laboratory on site and those who performed it remotely. The schedule is shown in Fig. 4b.
2.7 Additional collaborative assignments
Two additional group work assignments (GW case A and GW case B) including case discussions with mandatory hand-in reports were performed using Gather© digital platform. GW case A was scheduled one week after the laboratory session, and GW case B one day after GW case A (Fig. 4). The participants were instructed to form random groups composed of four or five persons. The digital learning environment was evaluated after each GW session using a questionnaire.
2.8 Evaluation methodologies
The implementations of the laboratory and the GW sessions were evaluated using self-reported questionnaires for three consecutive years in 2021–2023, following the university practices in collecting evaluation and participant information. Therefore, the analyzed data is based on the responses of the course participants who voluntarily participated in the evaluations. The course participants evaluated the overall laboratory module quantitatively and qualitatively using the final course evaluation questionnaire while they evaluated the digital learning environment only quantitatively using the van der Laan scale. Three teachers in total participated in the digital platform sessions, and two of them used Gather© with minimal instructions for the first time in the course. The teachers did not take part in the questionnaire-based evaluations but the teachers’ feedback is included in the study in the form of reflections.
2.9 Evaluation of the overall laboratory module (quantitative and qualitative)
The overall laboratory module for years 2021–2023 was evaluated at the end of the course through the standard anonymous course evaluation questionnaire distributed by the university. The evaluation questionnaire was distributed to all of the registered course participants by the university evaluation system. A separate question was allocated to the laboratory module in the final course evaluation for ranking the usefulness of the lab on a scale of 1–5 with 5 being the highest rank (Fig. 5). In the same question, the participants could give free-text comments for qualitative assessment of the laboratory.
2.10 Evaluation of the digital platform using the van der Laan scale (quantitative)
A link to the questionnaire for the evaluation of the digital learning platform was placed in the Gather© classroom (Fig. 3a). The questionnaire was generated in Google Forms, and the purpose of the evaluation was included in its heading. The participants were asked to answer the questionnaire prior to exiting the session or prior to the next scheduled Gather© session. The platform was evaluated after the two group work assignments in the same way as for the laboratory session. The course participants who performed the laboratory exercises on site at the course institute during 2022–2023 (n = 10) did not participate in the evaluation of the digital laboratory platform but were asked to participate in the evaluation of the digital platform for the group work assignments.
A van der Laan acceptance scale (Van Der Laan et al., 1997) was used for the evaluation. The scale, originally adapted from Osgood (Osgood et al., 1957), has been widely used to evaluate user acceptance of advanced technological interfaces, e.g., in surgery, traffic scenarios, driver acceptance and haptic control (Chen et al., 2007; Maintz et al., 2019; Sayer et al., 2007). The questionnaire was provided in English to avoid translation errors. The questionnaire was anonymous, and the participants were instructed that adding their names was voluntary. This evaluation was performed for the years 2022 and 2023, when the Gather© platform was implemented.
The van der Laan questionnaire has nine items representing either the usefulness subscale (items 1, 3, 5, 7 and 9) or the satisfaction subscale (items 2, 4, 6 and 8) on a 5-point Likert scale. Having several items for each dimension increases the reliability of the questionnaire. The assessment questionnaire is shown in Table 1. Each item was scaled according to the original van der Laan definition and the items of each subscale were then averaged: the Likert scores were mapped from +2 to −2 for items 1, 2, 4, 5, 7 and 9, and from −2 to +2 for the mirrored items 3, 6 and 8, so that the most positive response was always assigned a score of +2. The mean and standard deviation (m ± sd) of the subscales were used for assessment of the overall acceptance of the sessions. Internal consistency was tested using Cronbach’s alpha, calculated using a two-factor analysis of variance (ANOVA) without replication in Excel© (Microsoft Office, Inc.). A Cronbach’s alpha value above 0.65 is considered reliable by van der Laan (Van Der Laan et al., 1997); however, a higher value indicates higher reliability of the test (Lewis & Sauro, 2021).
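The scoring procedure above can be sketched in Python. The helper names are illustrative, the response matrix in the comments is invented demonstration data (not the study's answers), and Cronbach's alpha is computed here directly from item and total-score variances, which is equivalent to the two-factor ANOVA route used in Excel:

```python
import numpy as np

# Item groupings as defined for the van der Laan scale.
USEFULNESS_ITEMS = [1, 3, 5, 7, 9]   # 1-based item numbers
SATISFACTION_ITEMS = [2, 4, 6, 8]
MIRRORED_ITEMS = {3, 6, 8}           # scored -2..+2 instead of +2..-2

def rescale(responses):
    """Map raw 1-5 Likert answers (rows = respondents, columns = items 1-9)
    to the van der Laan +2..-2 scale, reversing the mirrored items so that
    +2 is always the most positive pole."""
    scored = np.empty_like(responses, dtype=float)
    for j in range(responses.shape[1]):
        if (j + 1) in MIRRORED_ITEMS:
            scored[:, j] = responses[:, j] - 3   # 1..5 -> -2..+2
        else:
            scored[:, j] = 3 - responses[:, j]   # 1..5 -> +2..-2
    return scored

def subscale(scored, items):
    """Mean and sd of a subscale, averaging the items per respondent first."""
    per_respondent = scored[:, [i - 1 for i in items]].mean(axis=1)
    return per_respondent.mean(), per_respondent.std(ddof=1)

def cronbach_alpha(scored):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / var(total score))."""
    k = scored.shape[1]
    item_var = scored.var(axis=0, ddof=1).sum()
    total_var = scored.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)
```

For example, a respondent answering 1 on every non-mirrored item and 5 on every mirrored item receives +2 on all nine items, the maximally positive score on both subscales.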
3 Results
3.1 Evaluation of the overall laboratory module
For layout design 1 (year 2021), the overall laboratory module was rated 3.17/5 by the course participants who answered the final course evaluation (n = 35). In the free-text answers, the major comments concerned the shortage of time for the laboratory hands-on session and the need for assistance during the measurements. For layout design 2 (years 2022 and 2023), the participants (n = 29) scored the module 4.10/5 in the final course evaluation questionnaire (Fig. 5), with overall positive comments on the laboratory hands-on session, including the interaction with peers in Gather Town; the time shortage was no longer mentioned as an issue, since the laboratory time had been increased (Fig. 4).
3.2 Evaluation of the digital platform
The evaluation of the digital learning environment shows that, overall, the participants found the Gather© platform to be useful and satisfying as a learning environment. A Cronbach’s alpha of 0.75–0.87 was obtained for the three tests (Table 2). The results of the evaluation questionnaire based on the van der Laan scale are presented in Fig. 6 and Table 3. Both subscales, usefulness (0.54 ± 0.67) and satisfaction (0.37 ± 0.60), showed a positive user acceptance, i.e., above zero on a scale of [-2, 2], on the first occasion. Comparing the two GW sessions with identical assignments, the user acceptance on the usefulness and satisfaction subscales improved from 0.75 ± 0.70 and 0.67 ± 0.75 to 1.11 ± 0.59 and 1.00 ± 0.71, respectively, after the second GW session. Thus, training on the environment appeared to improve the participants’ ratings and experience. The changes in the acceptance items were assessed for the whole population who answered the questionnaire and not for each individual, due to the anonymity of the evaluation questionnaires.
3.3 Teacher reflections
Overall, the teachers found the platform interesting and experienced that they could easily supervise the groups. Even though the participants had different computer and internet settings, they could easily connect to and use the Gather© platform. Some minor technical issues, mostly concerning the documents, arose during the group work sessions; these could be resolved with alternative solutions. One example was that the position or distance of the avatar relative to the document icon was not always optimal for accessing the document. Most of the groups wrote the report directly in the shared document, and one group shared the document via screen sharing in the Gather© platform. One issue for the participants was opening two documents, the instructions and the report, simultaneously in the platform. An advantage of Gather-linked document sharing for report writing was that the teacher could independently view the shared documents during the session.
4 Discussion and Conclusion
In this study, an interactive remote laboratory was implemented in a digital learning environment and evaluated. The entire course was converted to a distance variant in 2021 as a consequence of the pandemic, and this was intended to be the future structure of the course. The suggested laboratory layout is sustainable in providing real-life problem-based experiments that can easily be updated as hospital settings are upgraded, and it is at the same time cost-effective and accessible: the organizing institution does not need to provide the laboratory equipment, and the course participants save on travel costs. The possibility of performing the measurements remotely is also climate-friendly, since it avoids long travel and reduces the use of consumables. Most importantly, the pedagogical structure provides an opportunity for healthcare personnel to participate in the course without needing to take long leaves from work, which is not affordable, especially in periods of heavy workload at the hospitals.
In evaluating education or learning outcomes, direct assessment of knowledge and skills is the more obvious and controlled instrument, whereas self-reported assessment is a less prioritized practice (Nusche, 2008). On the other hand, usability and user experience tests are commonly implemented within industry and in software development intended for human use (Lewis & Sauro, 2021), which suggests that such tests are suitable for assessing online learning platforms rather than directly assessing the learning outcome. Depending on the system under study, the choice of test could be a verbal report or think-aloud approach, a performance assessment, or measurement through standardized questionnaires (Lewis & Sauro, 2021), of which the System Usability Scale (SUS) is the most commonly used (Brooke, 1996; Lewis, 2018; Schaefer, 2016). However, there are complications with the SUS questionnaire (Klug, 2017), including its scaling, that down-prioritized its usage in this study. Other possible methods are behavior or engagement detection techniques using self-reporting questionnaires or automatic computer-vision-based appearance tracking and classification methods (Dewan et al., 2019; Luo et al., 2023).
Several studies report on successful implementation of the Gather Town as a learning environment (Lee et al., 2023; Lo & Song, 2023; McClure & Williams, 2021; Tang et al., 2022; Zhao & McClure, 2022). Among these, some studies have specifically compared Gather Town with Zoom, concluding that students have a strong preference for the Gather Town platform due to a higher social connection among the peer students, easier transition between the private and public discussions and a higher sense of presence similar to being in a classroom (Latulipe & Jaeger, 2022; Lee et al., 2023). Lee et al. also report that there is an easier exchange of emotions through Gather Town (Lee et al., 2023). In a comparison of a VR technology-based platform, Zoom and Gather Town, the VR technology-based platform had the lowest evaluation and Gather Town the highest, specifically due to the higher engagement of the participants in the latter (Sriworapong et al., 2022).
A limitation of this study is the small number of implementations of the laboratory session, which is too low for statistical significance testing. In addition, a non-anonymous evaluation would have allowed tracking of individual users’ experiences. Training on the digital environment could be improved by using the same digital platform for all of the course sessions, including lectures, instead of only the collaborative occasions. However, this was limited in the present implementation by the cost of the platform for a large population and by the fact that many lecturers were not familiar with the digital platform. Ideally, pilot testing should be performed before the main evaluation test, which was difficult to conduct in this study. In the next step, the laboratory design should be improved through iterative adjustments and evaluations. Such a procedure is, however, challenged by technical upgrades of the commercial digital learning platform that are not known in advance to the teachers and course designers.
In conclusion, an interactive remote laboratory module on electrical safety of medical devices in biomedical engineering education was designed, implemented and evaluated considering overall laboratory usefulness and the user acceptance of the digital platform. Some minor technical problems needed to be solved during the online collaborative sessions, mostly concerning the shared documents. All groups were able to carry out the laboratory work efficiently and within the allowed time for the session. The participants’ acceptance of the digital platform improved after they had used it on two identical occasions during the course. The importance of this work is in providing an instance of an interactive remote laboratory for teaching an essential skill within the biomedical engineering profession. The design of this laboratory can be easily implemented in other academic or vocational training centers and is envisioned to contribute to further implementations of sustainable biomedical engineering education.
Data availability
The datasets generated during and/or analyzed during the current study are available from the corresponding author upon reasonable request.
References
Ahrar, S., Li, D. J., Towles, J. D., & Venook, R. D. (2021). Trace matrix: A framing tool to improve communication and debugging in remote instrumentation lab courses. Biomedical Engineering Education, 1(1), 195–200. https://doi.org/10.1007/s43683-020-00036-7
Bajči, B., Šešlija, D., Reljić, V., Milenković, I., Dudić, S., & Šulc, J. (2019). Augmented reality as an advanced learning tool for pneumatic control. 2019 5th Experiment International Conference (exp.at'19). https://doi.org/10.1109/EXPAT.2019.8876552
Blum, T., Kleeberger, V., Bichlmeier, C., & Navab, N. (2012). mirracle: An augmented reality magic mirror system for anatomy education. IEEE Virtual Reality Workshops (VRW). https://doi.org/10.1109/VR.2012.6180909
Bork, F., Lehner, A., Eck, U., Navab, N., Waschke, J., & Kugelmann, D. (2021). The effectiveness of collaborative augmented reality in gross anatomy teaching: A quantitative and qualitative pilot study. Anatomical Sciences Education, 14(5), 590–604. https://doi.org/10.1002/ase.2016
Brooke, J. B. (1996). SUS: A 'quick and dirty' usability scale. In P. W. Jordan, B. Thomas, B. A. Weerdmeester, & I. L. McClelland (Eds.), Usability evaluation in industry. Taylor & Francis.
Chen, F., Qvint, G., & Jarlengrip, J. (2007). Listen! There are other road users close to you – improve the traffic awareness of truck drivers. In C. Stephanidis (Ed.), Universal Access in Human-Computer Interaction. Ambient Interaction. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-73281-5_34
Dahalan, F., Alias, N., & Shaharom, M. S. N. (2023). Gamification and game-based learning for vocational education and training: A systematic literature review. Education and Information Technologies, 29, 1279–1317. https://doi.org/10.1007/s10639-022-11548-w
Dewan, M. A. A., Murshed, M., & Lin, F. (2019). Engagement detection in online learning: A review. Smart Learning Environments, 6(1), 1. https://doi.org/10.1186/s40561-018-0080-z
Dewilde, A. H., & Li, Y. (2021). Using Arduinos to Transition a Bioinstrumentation Lab to Remote Learning. Biomedical Engineering Education, 1(2), 313–316. https://doi.org/10.1007/s43683-020-00042-9
Dichev, C., & Dicheva, D. (2017). Gamifying education: What is known, what is believed and what remains uncertain: A critical review. International Journal of Educational Technology in Higher Education, 14(1), 9. https://doi.org/10.1186/s41239-017-0042-5
Ekici, M. (2021). A systematic review of the use of gamification in flipped learning. Education and Information Technologies, 26(3), 3327–3346. https://doi.org/10.1007/s10639-020-10394-y
Garzón, J. (2021). An overview of twenty-five years of augmented reality in Education. Multimodal Technologies and Interaction, 5(7), 37. https://www.mdpi.com/2414-4088/5/7/37
Gurevych, R., Silveistr, A., Mokliuk, M., Shaposhnikova, I., Gordiichuk, G., & Saiapina, S. (2021). Using augmented reality technology in higher education institutions. Postmodern Openings, 12(2), 109–132. https://doi.org/10.18662/po/12.2/299
Haase, E. (2021). Moving a Team-Based Freshmen Biomedical Engineering and Design Course Online. Biomedical Engineering Education, 1(1), 31–36. https://doi.org/10.1007/s43683-020-00005-0
Hamilton, D., McKechnie, J., Edgerton, E., & Wilson, C. (2021). Immersive virtual reality as a pedagogical tool in education: A systematic literature review of quantitative learning outcomes and experimental design. Journal of Computers in Education, 8(1), 1–32. https://doi.org/10.1007/s40692-020-00169-2
Hincapie, M., Diaz, C., Valencia, A., Contero, M., & Güemes-Castorena, D. (2021). Educational applications of augmented reality: A bibliometric study. Computers & Electrical Engineering, 93, 107289. https://doi.org/10.1016/j.compeleceng.2021.107289
Jacobson, M. J., & Reimann, P. (2010). Designs for Learning Environments of the Future: International Perspectives from the Learning Sciences. Springer New York. https://doi.org/10.1007/978-0-387-88279-6
Kapilan, N., Vidhya, P., & Gao, X. Z. (2021). Virtual laboratory: A boon to the mechanical engineering education during COVID-19 pandemic. Higher Education for the Future, 8(1), 31–46. https://doi.org/10.1177/2347631120970757
Klug, B. (2017). An overview of the System Usability Scale in library website and system usability testing. Weave: Journal of Library User Experience, 1(6). https://doi.org/10.3998/weave.12535642.0001.602
Latulipe, C., & Jaeger, A. D. (2022). Comparing student experiences of collaborative learning in synchronous CS1 classes in Gather.Town vs. Zoom. Proceedings of the 53rd ACM Technical Symposium on Computer Science Education. https://doi.org/10.1145/3478431.3499383
Lazou, C., & Tsinakos, A. (2022). Computer-mediated communication for collaborative learning in Distance Education environments. In P. Zaphiris, & A. Ioannou (Eds.), Learning and Collaboration Technologies. Designing the Learner and Teacher Experience. Springer. https://doi.org/10.1007/978-3-031-05657-4_19
Lee, Y., Jung, J. H., Kim, H., Jung, M., & Lee, S. S. (2023). Comparative case study of Teamwork on Zoom and Gather.Town. Sustainability, 15(2), 1629. https://www.mdpi.com/2071-1050/15/2/1629
Lewis, J. R. (2018). The System Usability Scale: Past, Present, and Future. International Journal of Human–Computer Interaction, 34(7), 577–590. https://doi.org/10.1080/10447318.2018.1455307
Lewis, J. R., & Sauro, J. (2021). Usability And User Experience: Design And Evaluation. In Handbook Of Human Factors And Ergonomics (pp. 972–1015). https://doi.org/10.1002/9781119636113.ch38
Liccardo, A., Arpaia, P., Bonavolontà, F., Caputo, E., Pandi, F., Gallicchio, V., Gloria, A., & Moriello, R. S. L. (2021). An augmented reality approach to remote controlling measurement instruments for educational purposes during pandemic restrictions. IEEE Transactions on Instrumentation and Measurement, 70, 1–20. https://doi.org/10.1109/TIM.2021.3101314
Lo, C. K., & Song, Y. (2023). A scoping review of empirical studies in Gather.town. 2023 11th International Conference on Information and Education Technology (ICIET) (pp. 1–5). https://doi.org/10.1109/ICIET56899.2023.10111430
Lowenthal, P. R. (2023). Synchronous Tools for Interaction and Collaboration. In O. Zawacki-Richter & I. Jung (Eds.), Handbook of Open, Distance and Digital Education (pp. 989–1002). Springer Nature Singapore. https://doi.org/10.1007/978-981-19-2080-6_55
Luo, Z., Zheng, C., Gong, J., Chen, S., Luo, Y., & Yi, Y. (2023). 3DLIM: Intelligent analysis of students’ learning interest by using multimodal fusion technology. Education and Information Technologies, 28(7), 7975–7995. https://doi.org/10.1007/s10639-022-11485-8
Maintz, M., Black, D., & Haj-Hosseini, N. (2019). Auditory and visual user interface for Optical Guidance during Stereotactic Brain Tumor biopsies. 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 1981–1984. https://doi.org/10.1109/EMBC.2019.8857310
McClure, C. D., & Williams, P. N. (2021). Gather.town: An opportunity for self-paced learning in a synchronous, distance-learning environment. Compass: Journal of Learning and Teaching, 14(2). https://doi.org/10.21100/compass.v14i2.1232
McMahon, D. J. (2015). Introduction to Electrical Safety: Part II. Fluke Biomedical. https://www.flukebiomedical.com.
Motejlek, J., & Alpay, E. (2023). The retention of information in virtual reality based engineering simulations. European Journal of Engineering Education, 1–20. https://doi.org/10.1080/03043797.2022.2160968
Nusche, D. (2008). Assessment of Learning Outcomes in Higher Education. OECD Education Working Papers, No. 15, OECD Publishing, Paris, https://doi.org/10.1787/244257272573
O’Neill, D. P. (2021). Redesign of a BME lab class to maintain hands-on experimentation despite remote learning constraints. Biomedical Engineering Education, 1(1), 229–235. https://doi.org/10.1007/s43683-020-00039-4
Ong, V., & Yamashiro, S. (2022). Adapting a human physiology teaching laboratory to the at-home education setting. Biomedical Engineering Education, 2(1), 91–97. https://doi.org/10.1007/s43683-021-00055-y
Osgood, C. E., Suci, G. J., & Tannenbaum, P. H. (1957). The measurement of meaning. Board of Trustees of the University of Illinois.
Potkonjak, V., Gardner, M., Callaghan, V., Mattila, P., Guetl, C., Petrović, V. M., & Jovanović, K. (2016). Virtual laboratories for education in science, technology, and engineering: A review. Computers & Education, 95, 309–327. https://doi.org/10.1016/j.compedu.2016.02.002
Rojas-Sánchez, M. A., Palos-Sánchez, P. R., & Folgado-Fernández, J. A. (2023). Systematic literature review and bibliometric analysis on virtual reality and education. Education and Information Technologies, 28(1), 155–192. https://doi.org/10.1007/s10639-022-11167-5
Ryan, S., Scott, B., Freeman, H., & Patel, D. (2013). The virtual University: The internet and resource-based Learning. Taylor & Francis. https://doi.org/10.4324/9781315042022
Sayer, J. R., LeBlanc, D. J., Mefford, M. L., & Devonshire, J. (2007). Field test results of a road departure crash warning system: Driver acceptance, perceived utility and willingness to purchase. Driving Assessment Conference, 4, 246–252. https://doi.org/10.17077/drivingassessment.1244
Schaefer, K. E. (2016). Measuring Trust in Human Robot Interactions: Development of the Trust Perception Scale-HRI. In R. Mittu, D. Sofge, A. Wagner, & W. F. Lawless (Eds.), Robust Intelligence and Trust in Autonomous Systems (pp. 191–218). Springer US. https://doi.org/10.1007/978-1-4899-7668-0_10
Sriworapong, S., Pyae, A., Thirasawasd, A., & Keereewan, W. (2022). Investigating students’ engagement, enjoyment, and sociability in virtual reality-based systems: A comparative usability study of Spatial.io, Gather.town, and Zoom. In H. Li, M. Ghorbanian Zolbin, R. Krimmer, J. Kärkkäinen, C. Li, & R. Suomi (Eds.), Well-Being in the Information Society: When the Mind Breaks. Springer, Cham. https://doi.org/10.1007/978-3-031-14832-3_10
Tang, S., Pang, H., & Fung, F. M. (2022). Designing Gather.town as a learning space in a laboratory module to facilitate social interaction. 2022 IEEE International Conference on Teaching, Assessment and Learning for Engineering (TALE). https://doi.org/10.1109/TALE54877.2022.00114
The Open University (2012). The Virtual Microscope for Earth Sciences. http://www.virtualmicroscope.org/
Tulaskar, R., & Turunen, M. (2022). What students want? Experiences, challenges, and engagement during emergency remote learning amidst COVID-19 crisis. Education and Information Technologies, 27(1), 551–587. https://doi.org/10.1007/s10639-021-10747-1
Van Der Laan, J. D., Heino, A., & De Waard, D. (1997). A simple procedure for the assessment of acceptance of advanced transport telematics. Transportation Research Part C: Emerging Technologies, 5(1), 1–10. https://doi.org/10.1016/S0968-090X(96)00025-3
Vlachopoulos, D., & Makri, A. (2019). Online communication and interaction in distance higher education: A framework study of good practice. International Review of Education, 65(4), 605–632. https://doi.org/10.1007/s11159-019-09792-3
Wu, Y., Zhang, M., Li, X., Gan, Y., & Zhao, C. (2021). Augment reality-based teaching practice. Biomedical Engineering Education, 1(1), 237–241. https://doi.org/10.1007/s43683-020-00040-x
Zhao, X., & McClure, C. D. (2022). Gather.Town: A gamification tool to promote engagement and establish online learning communities for language learners. RELC Journal. Advance online publication. https://doi.org/10.1177/00336882221097216
Acknowledgements
This project was financed by Pedagogiska utvecklingsgruppen (PUG), Faculty of Science and Engineering at Linköping University, 2022.
Funding
Open access funding provided by Linköping University.
Author information
Contributions
Conceptualization: [NH, HJ]; Methodology: [NH, LC]; Formal analysis and investigation: [NH, MS]; Writing - original draft preparation: [NH]; Writing - review and editing: [NH, HJ, MS, LC]; Funding acquisition: [NH]; Resources: [NH].
Ethics declarations
Disclosure statement
The authors report there are no financial or non-financial competing interests to declare.
Ethics approval
No ethical approval was needed for this study.
About this article
Cite this article
Haj-Hosseini, N., Jonasson, H., Stridsman, M. et al. Interactive remote electrical safety laboratory module in biomedical engineering education. Educ Inf Technol (2024). https://doi.org/10.1007/s10639-024-12636-9