Challenge Statement

As is the case for most undergraduate biomedical engineering (BME) and bioengineering programs nationwide,3 our BME curriculum at the University of Virginia requires a hands-on instructional laboratory experience for all undergraduates in the major.1 And as was also the case for essentially all BME programs, we were forced to abruptly adapt the delivery of our instructional labs in Spring 2020 when the COVID-19 pandemic forced in-person instruction to transition to remote learning. Looking ahead, similar course delivery constraints will almost certainly persist into Fall 2020 (and potentially beyond), in which in-person instruction either will be severely limited due to ongoing social distancing guidelines or will be eliminated outright, as occurred in the spring.

The pandemic-induced remote (or hybrid) delivery of what are typically very hands-on laboratory courses presents a unique challenge for biomedical engineering educators. While students can learn about lab concepts and analyze raw data from prior experiments in a format similar to instructional delivery for non-laboratory courses,9 the experience of being physically present in the laboratory environment and acquiring the requisite hands-on and psychomotor skills is far more challenging to replicate remotely. Of the five core laboratory competencies previously identified as specific to best practices in BME labs,6 laboratory technique presents the greatest difficulties in a remote learning environment.

Our instructional labs at the University of Virginia consist of a two-semester sequence split into nine modules, plus a final 4-week research project at the end of the spring semester.1 When our university transitioned to distance learning in Spring 2020, there was only one lab module remaining (a bioinstrumentation module on thermodilution for measuring cardiac output) and the final project. In this paper, we will describe our adaptations this past spring, the lessons learned from this experience, and our proposed course modifications for the fall semester, in which we will supplement last spring’s approach with gamified virtual lab simulations.

Novel Initiative

Bioinstrumentation Lab for Measuring Cardiac Output: COVID-19 Version

The final module of our two-semester integrative laboratory sequence (just prior to the 4-week final projects) was a lab illustrating the thermodilution approach for measuring cardiac output. It is not the details of the procedure that are pertinent to this paper, but rather the modes of instructional delivery. When taught in person, students attend a lecture describing the approach, underlying concepts, design and analysis of the measurement circuit, and possible sources of error. In the lab, students calibrate the measurement system and use an actual catheter to conduct multiple measurements of flow in a loop, analyzing the data outside of lab.
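For orientation only (this is the standard textbook form, not the course's specific protocol; the exact constants depend on the catheter and injectate used), the thermodilution estimate of cardiac output follows the Stewart–Hamilton relation:

$$Q = \frac{V_i \,(T_b - T_i)\, K}{\int_0^{\infty} \Delta T_b(t)\, dt}$$

where $Q$ is cardiac output, $V_i$ the injectate volume, $T_b$ and $T_i$ the baseline blood and injectate temperatures, $K$ a computation constant, and the denominator the area under the measured blood-temperature curve.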

However, with all in-person instruction eliminated just prior to this module, we quickly made the following adaptations: (1) the introductory PowerPoint lecture was recorded using Panopto (a screen capture tool) and posted to our learning management system (LMS); (2) several short (~ 2–3 min) recorded videos of each step of the lab procedure were posted for the students to view in lieu of conducting and observing the experiment in person; (3) during the usual live lecture time, we held a Zoom-based mini-lecture covering commonly misunderstood portions of the procedure and analysis, followed by open Q&A office hours (all of which was recorded and posted); (4) a complete set of raw data was posted for the students to analyze; and (5) multiple Zoom office hour sessions were established (more than was typical during in-person instruction), in addition to extensive use of our LMS’s chat feature, where students posed questions that the teaching team answered (a channel that also saw heavier traffic than before the shift to online learning). The assessments for the module were essentially unchanged from the in-person version: students took the usual pre-lab quiz online at a time of their choosing, and students in their pre-existing groups collaborated (virtually) to analyze the data and prepare a lab report on the experiment. Students were also assessed on the module concepts and practical applications within a take-home, open-book final exam, which also covered material from the first three modules in the course.

Final Research Projects (that Could Be Conducted at Home)

At the end of our two-semester lab sequence, each student team is normally assigned a final independent research project requiring them to apply the skills they have learned throughout both semesters. Project topics vary widely and are randomly assigned by the instructor; they range from instrumentation design and measurement (e.g. EEG, EMG) to molecular biology experiments to biomechanical testing to almost purely analytical projects in which students process, analyze, and interpret real data from one of our faculty labs. One example from 2019, conducted in a faculty lab alongside a senior capstone team, involved bioprinting muscle cells directly onto an ECM scaffold and quantifying cell coverage of the scaffold and viability. Another project involved building an Arduino-based mobile sEMG device and performing experiments to test its utility for biofeedback applications.

Due to the transition to online teaching this spring, all but two of the planned projects had to be redesigned to accommodate remote work; the exceptions were a pair of analytical projects that proceeded unchanged (an arrhythmia detection tool for existing ECG data and an image processing project quantifying microbubble destruction by focused ultrasound). The remote projects fell into one of two categories: (1) experimental design projects, in which students were given a relevant biomedical problem and asked to develop a hypothesis and propose a detailed experimental design, with appropriate statistics, to test that hypothesis; or (2) analysis projects, in which students were provided either with raw data or with a computational model that generated realistic data for them to analyze. For both types of projects, a detailed rubric was provided, and the students submitted a final report at the end of the semester. One example experimental design project involved researching gene delivery treatments for gliomas and proposing an experimental validation plan with hypothesis, experimental methods, controls, expected results, and data analysis. One example analysis project involved an “experimental” determination of capillary permeability using a MATLAB function that simulated realistic data, which students then analyzed (a sketch of this workflow appears below). Assessment of the modified final projects is provided in the “Reflection: Assessment of Spring Semester Transition to Online Instruction” section.
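To make the analysis-project format concrete, here is a minimal Python analogue of that capillary permeability exercise. The single-compartment kinetic model, parameter values, and function names are our own illustrative assumptions, not the course's actual MATLAB materials:

```python
# Illustrative sketch only: a Python analogue of the simulated-data analysis
# projects described above. The single-compartment model and all parameter
# values are hypothetical, not the course's actual MATLAB function.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(seed=0)

P_TRUE = 2.0e-6   # "true" capillary permeability (cm/s), hypothetical
S_OVER_V = 300.0  # capillary surface-area-to-volume ratio (1/cm), hypothetical

def tracer_conc(t, permeability):
    """Normalized tracer washout: C(t) = exp(-P * (S/V) * t)."""
    return np.exp(-permeability * S_OVER_V * t)

# Simulate "experimental" measurements with additive noise
t = np.linspace(0, 3600, 60)  # sampling times over 1 h (s)
measured = tracer_conc(t, P_TRUE) + rng.normal(0.0, 0.02, t.size)

# Students fit the model to the noisy data to estimate permeability
(p_est,), _ = curve_fit(tracer_conc, t, measured, p0=[1.0e-6])
print(f"estimated P = {p_est:.2e} cm/s (simulated truth: {P_TRUE:.2e})")
```

In the actual projects, the data-generating code was hidden from the students; they received only the simulated measurements and were assessed on their analysis and interpretation.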

Reflection: Assessment of Spring Semester Transition to Online Instruction

Student Feedback on the Spring Semester

After the spring semester concluded, students completed an anonymous survey to assess the impact of transitioning from in-person to remote learning. Of the 120 students enrolled, 100 (~ 83%) completed the survey, answering four 5-point Likert questions about the course transition and seven 3-point Likert questions about the impact of specific delivery approaches (Table 1).

Table 1 End-of-semester anonymous survey results (n = 100 out of 120 enrolled).

Most students felt they had adequate access to materials (including computing resources) to succeed in remote learning following the transition, although on average students also found that the shift made learning more challenging and required more effort to engage online. In terms of the impact of specific course components, ratings were generally positive across the board, but the recorded lab demonstration videos were considered especially helpful (2.90 on a 3-point Likert scale), as were the pre-recorded asynchronous lectures, chat feature, and virtual office hours (2.76, 2.77, and 2.77, respectively).

An additional 64% of respondents submitted free responses to a prompt asking them to comment on their experience (both opportunities and challenges) in shifting to remote instruction. We coded the qualitative open-ended responses into the broad categories of “positive,” “mixed,” and “negative,” and we also coded them into more specific categories and tallied responses per code. For example, if a student appreciated the online demonstration videos but found working remotely with their team challenging, the response would be classified as “mixed” in the first scheme, while in the second scheme counts would be added to the tallies for both the “positive: lab instruction delivery” and “negative: groupwork challenging” categories (a sampling of comments is provided in the Supplementary Material).
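The per-code tallying itself is simple bookkeeping; the sketch below illustrates the scheme in Python (the code labels and example entries are hypothetical, not our actual coded data):

```python
# Minimal sketch of the per-code tallying described above; the code labels
# and example responses are hypothetical.
from collections import Counter

# Each free response receives one overall label plus zero or more
# specific codes assigned by the raters.
coded_responses = [
    {"overall": "mixed",
     "codes": ["positive: lab instruction delivery",
               "negative: groupwork challenging"]},
    {"overall": "positive", "codes": ["positive: general"]},
]

overall_tally = Counter(r["overall"] for r in coded_responses)
per_code_tally = Counter(c for r in coded_responses for c in r["codes"])
print(overall_tally)
print(per_code_tally)
```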

In their overall comments, 34% of respondents were entirely positive, 36% were mixed, and 26% mentioned only negative points. Breaking the responses down by code, 51% of respondents wrote broad, generally positive comments about the transition; 19% commented positively on how the lab instruction was adapted in the absence of hands-on labs (e.g. the video demos being helpful); 16% wrote broadly negative comments (without specifics); 29% specifically critiqued missing out on doing in-person labs; 17% mentioned that group work was harder after the shift; and 7% found communication with the teaching team and office hours less effective online.

We were also interested in how students responded specifically to the final research projects, all of which had to be conducted remotely this past spring. At the end of each spring semester, students are asked to rate their agreement on a 5-point Likert scale with the statement, “I was happy with my final project.” Student ratings of their projects actually increased from 3.90 in 2019 (in-person) to 4.15 in 2020 (online), although this change did not reach statistical significance (Welch’s two-tailed t test, p = 0.11). We also compared opinions on the two types of projects offered in 2020 (i.e. experimental design vs. analysis projects, as described in the “Novel Initiative” section), and students exhibited a slight but nonsignificant (p = 0.20) preference for analysis projects (4.25 vs. 4.01). While most students were very happy with the remote final projects, ~ 10% specifically mentioned that they missed the opportunity to generate their own data in the lab.
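For reference, the comparisons reported here and in the next subsection reduce to a Welch's (unequal-variance) t test; a minimal sketch in Python follows, with placeholder rating arrays rather than our actual survey data:

```python
# Sketch of the Welch's t test used for the comparisons in this section;
# the rating arrays below are placeholders, not our actual survey data.
import numpy as np
from scipy.stats import ttest_ind

ratings_2019 = np.array([4, 4, 3, 5, 4, 3, 4, 5, 4, 3])  # hypothetical
ratings_2020 = np.array([5, 4, 4, 5, 4, 4, 5, 3, 4, 5])  # hypothetical

# equal_var=False selects Welch's (unequal-variance) test; the default
# alternative hypothesis is two-sided, matching the tests reported here.
t_stat, p_val = ttest_ind(ratings_2020, ratings_2019, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_val:.3f}")
```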

Assessment of Student Performance in 2020 Relative to 2019

To evaluate student learning outcomes, we compared performance on summative assessments last spring against the same assignments in 2019. Students performed very well on the instrumentation lab report, individual final exam, and final project following the transition to online instruction. In fact, the exam and project scores were significantly higher in 2020 than in the in-person version in 2019 (Welch’s unpaired two-tailed t test, p < 0.0001). However, the two exams are not directly comparable, since the 2020 exam was take-home and open-notes, without the usual 3-h time limit. Those changes likely account for the large difference in mean scores from 2019 to 2020 (the average grade rose from C to A-). Most exam questions did not rely on recall that could simply be looked up; instead they required analysis, troubleshooting, or integrative thinking, so we expect the improvement was largely due to the expanded time. (Many students did not finish the timed exam in 2019.) The instrumentation lab report scores were slightly but significantly lower in 2020 (p < 0.0001); however, the mean was still a low A, and students understood the lab concepts and data analysis despite not having performed the experiments themselves.

Proposal for the Fall

Using Gamified Virtual 3D Simulations to Enhance “Lab” Instruction

This fall our institution (as of the time of this writing) is planning for students to return to campus while severely restricting the number who can work at one time in a classroom or laboratory. The allowable capacity of our teaching lab space will be only 20% of the normal fall enrollment, meaning that only one student per lab station (eight stations total) can be in the lab at any given time. With rare exceptions, our institution is also requiring that all core courses be made available remotely for those students who choose not to return to campus. To meet these constraints, the course will be delivered primarily online.

While the rapid transition to remote instruction this spring was generally viewed as successful by our students (see “Reflection: Assessment of Spring Semester Transition to Online Instruction” section), there are still significant shortcomings to simply providing demonstration videos of key experimental steps, supplying students with raw experimental data, and having them conduct the analysis, all divorced from the experience of being physically present in a lab environment. Our students last spring had the benefit of over a semester and a half of in-lab instruction prior to the shift online, and the final module and projects did not require any new hands-on skills, so all students met the original course objectives. Even so, several students specifically mentioned that while they felt they generally understood the instrumentation lab and enjoyed the topic, they had difficulty motivating themselves to work on the analysis and report because they felt disconnected from the lab itself (on top of having to learn to work together as a team remotely). This coming fall, our students will be taking a BME lab course for the first time, and most of them will never have conducted research in a lab before. We are concerned that student engagement with the material may suffer under a demonstration-and-analysis-only approach, particularly with a new cohort that has not already been exposed to the in-person lab experience.

To address these concerns, we have decided to supplement the approach used this past spring—asynchronous lectures, office hours, demonstration videos, and analysis of raw data—with an online “gamified” virtual lab simulation. Virtual labs of this type have previously been shown to be highly effective for enhancing student engagement and learning (see “Efficacy of a Simulated Virtual Lab Environment: Is It Equivalent to an In-lab Experience?” section). Specifically, students will access virtual labs developed by Labster (Somerville, MA) that consist of 3D simulated environments with guided activities demonstrating lab safety, cell culture, western blotting, gene expression analysis, fluorescence microscopy, and other techniques. Table 2 indicates how specific Labster simulations map to our fall semester labs. The Labster simulation environment is similar to a videogame in which users interact with a realistic 3D onscreen world, and their specific actions within that virtual environment determine what happens next, including whether they ultimately complete the goals of the simulation. Interspersed throughout the activities are explanations of the concepts and rationale behind specific steps, followed by short quizzes to assess student learning. The system integrates with our LMS, allowing the Labster quizzes to be part of graded student feedback. We will use the Labster quizzes as formative assessment, and students will receive credit for completing the simulation and quizzes.

Table 2 Labster simulations mapped to specific labs in the fall semester.

As one example of how the Labster gamification works, within a simulation a student might put on gloves, click on a pipette, click on a box of tips, click to transfer liquid to a specific tube, and then be reminded if they forget to dispose of the used tip before attempting to pipette something else. Rather than passively watching a video, students are thus able to engage with a virtual environment that looks like a (clean, very high-tech) lab. One advantage of online virtual labs like Labster is that when students interact with items and do something incorrectly, they can see what would happen—sometimes in a way that could never be safely replicated in a real lab (e.g. if they make a mistake in the lab safety module). The Labster simulations do not (yet) include exportable simulated or real data, but as a supplement to our own demos and data, we believe they will partially mitigate the lack of a physical in-lab experience (a belief also supported by the literature; see below).

Efficacy of a Simulated Virtual Lab Environment: Is It Equivalent to an In-lab Experience?

The objectives and core competencies of engineering labs,2 and specifically BME labs,6 have been established, and our experience from the spring is that demonstrations and data analysis can address most of these core competencies. However, video demonstrations followed by data analysis are unlikely to engage students in the culture of working in a lab or the free inquiry needed for authentic learning in BME.5 The sort of virtual simulations we are proposing for the fall semester have been shown to support student learning as effectively as in-person labs in the natural sciences,7 and such simulation-based labs have more recently been shown to be equivalent to in-lab experiences for biology-focused labs of the sort we teach in the fall semester.4 This summer (when we teach a compressed version of the first-semester lab to a smaller cohort) and this fall, we will perform pre- and post-assessments of student self-efficacy and learning of core concepts. Informal, unsolicited feedback on the Labster simulations from the nine students in the summer course (currently in progress) has been highly positive, with some students stating that the simulations are one of their favorite aspects of the course thus far. (One current summer student referred to Labster as being “like ‘The Sims’ [a bestselling videogame series], but for credit!”) If the simulated labs prove successful under more formal assessment, we will likely continue to supplement our lab sequence with virtual simulations even after fully in-person instruction eventually resumes.

What About Hands-on Practice and Lab Skills? A Simulation Still Can’t Fully Provide that!

Even with a simulated lab environment coupled with lecture content, demonstrations, and the opportunity to analyze raw data, one key deficiency remains: the development of the psychomotor skills associated with specific tasks (e.g. pipetting, loading a gel, etc.).2, 6 As stated above, our institution is currently planning for limited in-person instruction, provided strict social distancing and safety practices are observed. Based on our current enrollment, each student will have the opportunity to attend the 4-h lab period only twice during the semester, so our plan is to implement what is essentially an extreme version of seat-replacement blended learning.8 Assuming COVID-19 safety regulations have not changed appreciably by this fall, we will offer students the option to participate in two skills-focused workshops where they learn to use micropipettes, load a plate as part of a protein assay, use a balance, operate a microscope, load a gel, use a centrifuge, and practice other essential skills that cannot be replicated without a physical lab component. These skills labs will be optional, since some students may be unable to return to campus or may have underlying health conditions; for those students, we will offer the same workshops at periodic intervals once normal in-person instruction resumes.