Introduction

Many computer science and engineering (CSE) programs have increasingly chosen to offer online programs as a more accessible alternative to traditional on-campus programs. Especially during the COVID-19 pandemic, online courses became the primary option for CSE programs to continue teaching and learning. However, affording an effective collaborative problem solving (CPS) experience in online courses remains an unresolved challenge (Andrews-Todd & Forsyth, 2020). CPS is defined as the process in which a group of people coordinates individual and collaborative commitment to develop a unified problem solution (Graesser et al., 2018). CPS is an essential skill for prospective computer engineers, especially given that, when solving real-world networking problems, computer engineers collaborate with multiple stakeholders to devise optimal solutions under practical constraints (Graesser et al., 2018). Without deliberate practice of CPS, prospective engineers may find it challenging to apply acquired knowledge to authentic problems.

In traditional CSE programs, laboratory experience is integral to fostering students’ CPS (Reeves & Crippen, 2020). Students acquire CPS skills by working together in groups, exchanging ideas and managing conflicts to solve problems (Häkkinen et al., 2017); thus, providing prospective computer engineers with an effective virtual laboratory (VL) experience becomes a priority. However, barriers to supporting real-time online CPS in VLs remain unresolved. For example, students have to manipulate various tools in a simulation-based learning environment, which can result in a higher cognitive load (Liu & Su, 2011; Parong & Mayer, 2018). Cognitive load describes the extent of working memory, the central processor of information in the human mind, required to handle information in a given task (Sweller, 1988). Research has indicated that cognitive load significantly influences the effectiveness of problem-solving, in that reducing novice students’ cognitive load is necessary for them to solve problems in online settings (Larmuseau et al., 2020). Lan et al. (2019) add that online collaboration imposes a higher cognitive load than face-to-face collaboration, as students must expend additional effort to compensate for the lack of rich nonverbal and environmental cues. In addition, efficient CPS requires students not only to fulfill individual commitments but also to regulate the group effort to resolve a shared challenge (Tawfik et al., 2014). For many novice students, fulfilling those expectations imposes a high cognitive load (Zheng & Cook, 2012). Thus, reducing novice students’ cognitive load in CPS is critical for the effectiveness of virtual experimentation activities.

Research on cognitive load in college students’ laboratory activities has yielded plentiful implications, but the existing evidence stems mainly from self-reported data, whose validity for the design of virtual experimentation activities is limited (Andersen & Makransky, 2021). First, CPS involves iterative processes such as individual and team knowledge building (Wiltshire et al., 2018), but self-reported measures overlook the temporal variation of learning and thus fail to determine whether particular CPS processes require a higher level of cognitive load (Kolfschoten et al., 2014). Second, self-reported measures cannot address students’ and groups’ need for real-time feedback in CPS, as their findings mainly come from “post-collaboration analysis” (Goggins et al., 2015). To this end, a granular understanding of novice students’ cognitive load in CPS during virtual experimentation activities is needed.

The purpose of this study was thus to conduct a multimodal analysis of novice college students’ cognitive load in CPS during virtual experimentation activities on computer networking. The study collected electroencephalogram (EEG) data to supplement self-reported measures and provide a granular account of novice students’ cognitive load and CPS within virtual experimentation activities. Specifically, this study sought to identify: (1) novice students’ cognitive load in different CPS tasks; (2) the relationship between students’ cognitive load and their problem-solving performance; (3) the change in students’ conceptual understanding of domain knowledge and their CPS skills after attending virtual experimentation activities; and (4) students’ perceptions of the virtual experimentation activities. The findings provide empirical implications, from a pedagogical perspective, for designing VLs and virtual experimentation activities.

Literature review

Virtual laboratory

A virtual laboratory (VL) is a simulated learning environment that allows students to complete laboratory experiments online and explore concepts and theories without stepping into a physical laboratory (Potkonjak et al., 2016). VLs can provide more instantaneous results at a lower cost and with less setup time than traditional laboratories (Bortnik et al., 2017; De Jong et al., 2013). The advantages of VLs in computer science education have also long been recognized. For example, Xu et al. (2014) report several advantages of VLs in enhancing students’ conceptual understanding of computer networks. Kabiri & Wannous (2017) conclude that the application of VLs in computer science education has increased manageability, scalability and flexibility.

VLs allow students to collaboratively perform experiments in systems via remote access (Faulconer & Gruss, 2018; Wolf, 2009). To date, online collaborative experiences in VLs take place either asynchronously or synchronously (Jara et al., 2009). Asynchronous collaboration via e-mails or forums may provide students with opportunities for online communication, but the lack of instant feedback may result in unequal task allocation (Ranz et al., 2017) and feelings of isolation (Jara et al., 2009; Lim, 2017). As a result, students’ interest, motivation and engagement in online collaboration may be reduced (Boulos et al., 2005). Alternatively, synchronous collaboration in VLs may overcome these problems. Synchronous collaboration afforded by VLs enables e-learning that resembles the traditional classroom, with experiences shared in real time as in face-to-face interaction (Islam, 2019; Jara et al., 2012). For example, Islam (2019) finds that synchronous activities promote interaction and create a sense of connectedness among students. Accordingly, synchronous virtual experimentation activities are more likely to afford an efficient collaboration experience for students.

Collaborative problem solving

Collaborative problem solving (CPS) is a joint and shared activity in which dyads or small groups execute several steps to transform a current problem state into a desired goal state (Hesse et al., 2015). It is considered one of the critical and necessary skills in education and the workforce (Andrews-Todd & Forsyth, 2020). The necessity of empowering college students with CPS skills to prepare them for the 21st-century workforce was highlighted by Graesser et al. (2018). Research on CPS in technology-supported learning environments is well documented. Chang et al. (2017) analyzed how students solved a physics problem using individual-based and collaborative simulations; the results indicated that students using the collaborative simulations demonstrated a higher level of engagement in the CPS activity. Gu & Cai (2019) found that integrating semantic diagram tools in CPS helped students achieve a greater depth of understanding of the domain knowledge. Unal & Cakir (2021) also concluded that CPS supported by Web 2.0 technologies has a positive effect on student achievement.

CPS skills are a precondition for success in many learning contexts (Häkkinen et al., 2017; Lin et al., 2015a, b). However, pedagogical approaches to improving CPS skills remain underexplored (Graesser et al., 2018). Students typically receive indirect training in CPS skills: a variety of pedagogical methods, including collaborative learning and problem-solving learning, are used to develop varied forms of CPS competencies (Barber et al., 2015; Goldstein et al., 2011). One major factor that contributes to the success of CPS is the efficiency of reciprocity and cooperation among students (Fiore & Schooler, 2004; Graesser et al., 2018). Good collaborative practice depends on team members’ proficiency in communication (Care et al., 2016; Dillenbourg & Traum, 2006). Constructive dialogue is the primary resource during the process, allowing divergent understandings, the production of shared knowledge and the resolution of problem-solving impediments (Lin et al., 2016). Problem-solving competency is defined as “an individual’s capacity to engage in cognitive processing to understand and resolve problem situations where the method of solution is not immediately obvious” (Greiff et al., 2013). The ability to conceptualize and solve problems is a highly valued skill in today’s knowledge-based, interdisciplinary and distributed work (Lin et al., 2015a, b). One’s conceptualization of the problem provides the foundation upon which all subsequent problem-solving activity is built (Larson Jr & Christensen, 1993; Newell, 2010; Sengupta-Irving & Agarwal, 2017). Therefore, this study provided a CPS case comprising a problem conceptualization activity and a problem-solving activity.

Cognitive load theory

Cognitive load theory (CLT) was first proposed by John Sweller (Sweller, 1988) and posits that working memory can deal with only a limited amount of information at once. Working memory is limited in capacity and duration, whereas long-term memory can be seen as effectively unlimited (Gathercole et al., 2008). Given that the goal of learning is to move new information from working memory to long-term memory (Ericsson & Kintsch, 1995; Sweller, 2016), CLT suggests that instructional materials and environments should be designed to reduce unnecessary cognitive load. Removing distractions thus enables a more efficient passage of the desired learning from working memory to long-term memory (Paas et al., 2003a, b).

The concept of cognitive load can be applied to learning and training in several ways (de Jong, 2010; Paas, Renkl et al., 2003; Paas et al., 2003b). Cognitive load typically increases when unnecessary demands are imposed on a learner, making the task of processing information overly complex. According to Galy et al. (2012), task completion is prone to excessive cognitive load owing to the task’s difficulty level. When cognitive load is well managed, students can learn new skills more easily than when a high cognitive load interferes with the creation of new memories (Sweller, 2011). Analyzing how instructional strategies and tools affect cognitive load can therefore help learners maintain optimal learning outcomes in various learning contexts.

Research has shown that students may experience more cognitive load in collaborative tasks than in individual tasks (Andrade, 2010; Burgess, 2000; Ophir et al., 2009), as collaborative tasks typically involve multitasking in information processing as well as interaction and communication. Understanding the cognitive load involved in CPS is therefore important for designing, scaffolding and facilitating technology-based collaboration and learning. Studies such as Kolfschoten et al. (2014) and Kolfschoten & Brazier (2013) focused on managing the cognitive load involved in three phases of CPS (brainstorming, convergence and decision-making) based on literature analysis and identified various sources of cognitive load in collaboration. However, those results are largely qualitative, and the evaluation was performed by experts.

Cognitive load has traditionally been measured via subjective questionnaires (Paas et al., 2003a, b). Recently, however, physiological methodologies such as EEG have been suggested as objective measures of cognitive load (Parasuraman et al., 2008). In general, EEG is measured by recording voltages from electrodes placed at assigned positions on the scalp (Nacke et al., 2011). In recent years, with the increasing availability of mobile, portable, fast, inexpensive and noninvasive EEG instruments, research on the continuous measurement of cognitive load has been carried out (Antonenko et al., 2010). Cognitive load is reflected in EEG signals as changes in theta-band power (4–7 Hz) over frontal brain areas and alpha-band power (8–14 Hz) over parieto-occipital brain areas (Berka et al., 2007; Pesonen et al., 2007). Most studies demonstrate that frontal theta power increases and/or parietal alpha power decreases with higher cognitive load (Käthner et al., 2014). Following these findings, we adopted the index of Holm et al. (2009), calculated from theta and alpha oscillations, as a reliable measure of students’ cognitive load.
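As an illustration of this index (a minimal sketch, not the authors’ processing pipeline), the following Python code estimates theta and alpha band power with Welch’s method and takes their ratio; the sampling rate, channel assignment and exact band limits are assumptions.

```python
# Sketch: theta/alpha cognitive load index from raw EEG (illustrative only).
import numpy as np
from scipy.signal import welch

FS = 128  # assumed sampling rate in Hz


def band_power(signal, fs, low, high):
    """Mean power spectral density of `signal` within [low, high] Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()


def cognitive_load_index(frontal, parietal, fs=FS):
    """Frontal theta power (4-7 Hz) divided by parietal alpha power (8-14 Hz)."""
    theta = band_power(frontal, fs, 4, 7)
    alpha = band_power(parietal, fs, 8, 14)
    return theta / alpha


# Synthetic 60-s signals stand in for one frontal and one parietal channel.
rng = np.random.default_rng(0)
print(cognitive_load_index(rng.standard_normal(FS * 60),
                           rng.standard_normal(FS * 60)))
```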

With the current study, we hope to obtain a better understanding of novice students’ CPS performance in a VL setting. EEG data on students’ cognitive load were collected, and a multimodal analysis was conducted to interpret how cognitive load fluctuates during CPS within virtual experimentation activities. A set of interdependent activities was established to promote social contact between students, and a worksheet was designed to support the task-oriented roles within each team.

Specifically, the research questions investigated in this study are as follows:

  • Q1: What is the difference in the level of students’ cognitive load between different CPS tasks?

  • Q2: What is the relationship between students’ cognitive load and their worksheet performance?

  • Q3: What are the predictors of students' level of cognitive load in different CPS tasks?

  • Q4: What is the impact of students’ virtual experimentation experience on their understanding of networking knowledge and CPS skills?

  • Q5: What is students’ perception about their CPS experience in virtual experimentation activities?

Methodology

A concurrent mixed methods study (Creswell & Plano Clark, 2017) was conducted to provide a comprehensive account of novice students’ cognitive load during CPS in computer engineering virtual experimentation activities. Quantitative and qualitative sources of data were collected and analyzed concurrently (Creswell & Plano Clark, 2017). Specifically, quantitative inquiries investigated novice students’ cognitive load, conceptual understanding and performance in problem-solving during the virtual experimentation activities. Qualitative inquiries examined novice students’ perceptions of their experience with virtual experimental activities in their own words (Creswell & Creswell, 2017). In the end, the findings from those two complementary sources of data were converged to develop a comprehensive understanding of students’ CPS experience in virtual experimentation activities.

Participants and contexts

The study was conducted in a required introductory-level course, Computer Networks, for freshman students majoring in data science at a public university in central China. The course lasted 18 weeks, two of which were required course hours for laboratory experience. Due to COVID-19 safety protocols, the course was offered in a hybrid mode during Fall 2020. The course hours for laboratory experience were also switched to virtual experimentation activities, wherein students attended the course remotely and collaborated with assigned teammates to solve a series of authentic networking problems. Each virtual experimentation session lasted 120 min, and this study was conducted during the first session.

A total of 36 freshman students were enrolled in this course. A consent form was sent to each student at the beginning of the semester. All the students agreed to participate in this research voluntarily and were informed of their right to withdraw from the project at any time without any effect on their course grades. None of the participants had prior experience with online experimentation activities or online CPS. Nineteen of the participants were male (53%) and seventeen were female (47%). The average age was 20.5 years (SD = 0.27). All participants were assigned to groups of three to collaborate in the virtual experimentation activities.

Virtual experimentation activities

Technical design

The virtual experimentation activities relied on an experimental tool, Cisco Packet Tracer, and a video conferencing tool, Tencent Meeting. The class convened online via a shared Tencent Meeting link, and each group was then assigned to a separate Tencent Meeting room to work on its project. Tencent Meeting, a cloud-based online video conferencing tool (similar to Zoom), allowed the participants to talk (video/audio), chat (text), share the screen and record the collaboration procedures (see Fig. 1). Cisco Packet Tracer, a simulation program with capabilities for network simulation, visualization and multiuser connections, provided the collaborative affordances for participants to engage in CPS in this study. The multiuser peer-to-peer module of Cisco Packet Tracer enables the collaborative building of virtual networks over a real network (Demeter et al., 2019). Each group appointed one member to share their screen to coordinate and track the conceptualization (on the worksheet for the first task) and the solution (in Packet Tracer for the second task) of the problem.

Fig. 1 One of the teams’ screenshots of the interface for virtual experimentation activities: a problem conceptualization, b problem solving

Pedagogical design

The virtual experimentation activities in this study focused on decomposing a network into smaller subnets. The pedagogical design featured authenticity, process and collaboration. First, participants were expected to complete an authentic task of assembling and decomposing the network for a newly established administrative unit at the institution. This authentic task was directly relevant to the participants’ lived experiences and helped them establish a contextual understanding of the problem and preliminary considerations of practical constraints. Second, the virtual experimentation activities, with a focus on CPS processes, consisted of two sequential tasks, problem conceptualization and problem-solving (Jermann, 2004). Each group conceptualized the problem through observation of phenomena and analytical reasoning before starting the problem-solving task through experiment design and hypothesis verification (Kim et al., 2013). Third, the pedagogical design reinforced the importance of collaboration by making it impossible for one participant alone to solve the problem. Each of the three members was assigned a distinct identity/role representing a specific department in the new unit. In addition, each of the two tasks included a formative assessment whose completion and submission relied heavily on every group member’s contribution.

Specifically, the first task (15 min), problem conceptualization, required the participants to build a hypothesized solution to the problem through group discussion. The formative assessment for this task was a worksheet with five fill-in-the-blank questions about specific IP addresses, based on the IP address block assigned to each group and the number of hosts assigned to each group member. To fill in the worksheet, each of the three members computed a potential solution based on the information associated with their specific role; the group then assembled the three results into a plan that ensured the network worked efficiently.
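For illustration, the sketch below shows the kind of subnetting arithmetic this worksheet asked for, using Python’s ipaddress module; the address block, role names and host counts are hypothetical and not part of the course materials.

```python
# Sketch: allocate one subnet per role from a group's address block (hypothetical values).
import ipaddress
import math

block = ipaddress.ip_network("192.168.10.0/24")                # block assigned to the group
hosts_per_member = {"Role A": 60, "Role B": 25, "Role C": 10}  # hosts per member

next_addr = int(block.network_address)
# Allocating the largest subnets first keeps every subnet aligned on its boundary.
for member, hosts in sorted(hosts_per_member.items(), key=lambda kv: kv[1], reverse=True):
    # Smallest prefix that fits the hosts plus the network and broadcast addresses.
    prefix = 32 - math.ceil(math.log2(hosts + 2))
    subnet = ipaddress.ip_network((next_addr, prefix))
    usable = list(subnet.hosts())
    print(f"{member}: {subnet} (mask {subnet.netmask}), usable {usable[0]} - {usable[-1]}")
    next_addr = int(subnet.broadcast_address) + 1              # next free address
```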

The second task (25 min), problem-solving, required students to reify the conceptualized plan by constructing a network topology in Cisco Packet Tracer. During this step, participants worked together to coordinate the group members’ individual network topologies using multiuser connections. Each group then submitted a report as a formative assessment of their performance in problem-solving.

The study was conducted in five steps (see Fig. 2). Students remotely accessed the experimental platform and were required to wear the EEG headset for brainwave data collection. An orientation on basic concepts of network topology (e.g., Classless Inter-Domain Routing) and on using Packet Tracer to construct network topologies was offered to the participants before the virtual experimentation sessions. The orientation ensured that each participant understood how to communicate between multiple devices in a network topology using (1) the assigned IP addresses and (2) multiuser connections.

Fig. 2 The flow diagram of the study

Data collection

As shown in Fig. 3, multimodal data collected in this study included self-reported data, student artifacts, performance data and EEG data.

Fig. 3 The data collection of the study

Quantitative

Quantitative data collected in this study included knowledge test scores, self-reported measures of CPS, EEG data and problem-solving artifacts. Specifically, a knowledge test (with questions on basic knowledge, near transfer and far transfer) and a CPS instrument (Siu & Shek, 2005) were administered twice (pre- and post-) to gauge participants’ conceptual understanding and CPS skills. EEG data collected through a portable wireless EEG headset were used to determine participants’ cognitive load during CPS. For each task, participants completed a worksheet as an assessment of their problem-solving performance.

EEG

The EEG device used to collect brainwave signals was a non-invasive head-mounted device with a portable brainwave sensor developed by Emotiv Technologies. EEG data were collected from 10 electrodes placed according to the International 10–20 system (Homan et al., 1987). A brain-computer interface system (see Fig. 4) was developed in-house to establish wireless connections to each participant’s EEG device while they attended the course remotely. The system received the brainwave signals and recorded cognitive load values for analysis. Each participant had been given an EEG device in earlier weeks, and before the course started, each participant tested the device and its connection. A baseline brainwave sample was recorded while each participant was in a relaxed state before the virtual experimentation. Each participant then wore the device throughout the virtual experimentation activities (40 min) so that EEG data could be recorded for the two CPS tasks, problem conceptualization and problem-solving.

Fig. 4 The interface of the system to record and visualize students’ brainwaves

The revised Chinese social problem‐solving inventory

CPS follows a social learning process and requires the timely application of social skills. The Revised Chinese Social Problem-Solving Inventory (C-SPSI-R; Siu & Shek, 2005) was adopted in this study to evaluate each participant’s CPS skills. The C-SPSI-R includes 52 questions assessing five dimensions: rational problem solving, positive problem orientation, negative problem orientation, avoidance style and impulsiveness/carelessness style. The validity of this instrument has been confirmed by prior studies in college STEM classes (Gu & Cai, 2019). The internal consistency of the instrument was acceptable, as indicated by the Cronbach’s alpha values for the pre-test (0.813) and the post-test (0.802).
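For reference, Cronbach’s alpha can be computed directly from the item-response matrix. The sketch below assumes responses are stored as a respondents-by-items NumPy array; the simulated data are illustrative and unrelated to the study.

```python
# Sketch: Cronbach's alpha for a respondents x items response matrix.
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array of shape (n_respondents, n_items)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)


# Simulated 36 respondents x 52 items sharing a common latent trait.
rng = np.random.default_rng(1)
trait = rng.normal(size=(36, 1))
responses = trait + rng.normal(size=(36, 52))
print(round(cronbach_alpha(responses), 3))
```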

Knowledge tests

Knowledge tests were developed to assess participants’ conceptual understanding of network topologies. The pre-test included 15 multiple-choice questions, 12 of which focused on basic knowledge (e.g., “From the following list, identify a valid Class B IP address”) and three of which addressed knowledge transfer (e.g., “You are trying to connect to your switch via the console port and are having trouble connecting. You check the settings of your terminal emulation program and find the following; which setting is incorrect and what should it be?”). The post-test retained the same items as the pre-test, but details such as wording and numbers were modified to mitigate the testing effect (Dimitrov & Rumrill Jr, 2003). Each test had a maximum score of 50 points, and each pair of analogous items was worth the same points in both tests. The two tests were validated by two scholars with expertise in computer networking and educational measurement.

Worksheets

Two separate worksheets based on the same authentic case (i.e., building network topologies for a new administrative unit) were used to guide students’ problem conceptualization and problem-solving and to assess their performance in those two tasks. Participants worked on the authentic case in groups and completed fill-in-the-blank questions in the worksheet. The worksheet for problem conceptualization asked each group to identify specific IP addresses (e.g., based on the number of hosts assigned to each group member) through their discussion and ideation. Each group then followed the plan conceptualized in the first task to construct network topologies and verify connectivity using Packet Tracer. Upon completion of problem-solving, participants submitted the worksheet with their constructed network topologies and a screenshot validating connectivity with the ping command as a measure of their problem-solving performance.

Qualitative

An open-ended questionnaire with five questions (see Table 1) was appended to the post-test to inquire about participants’ perceptions of the virtual experimentation activities, focusing on their learning experience as well as the pedagogical and technical design of the virtual laboratory.

Table 1 The open-ended questionnaire of the study

Data analysis

Quantitative

EEG data were processed to represent the level of participants’ cognitive load. For each CPS task, the average oscillation potential for each channel was calculated as the mean over the intervals of the sessions (e.g., Khader et al., 2010). The cognitive load index was calculated from the theta and alpha oscillations (Holm et al., 2009), as shown in Eq. 1.

$$\text{Cognitive load index}=\frac{\text{frontal theta power}}{\text{parietal alpha power}}$$
(1)

To answer Q1, the calculated cognitive load index was first divided into 1-s segments, and statistical features such as the mean and standard deviation were computed for each segment (Belyavin et al., 2002). An analysis of variance (ANOVA) was then conducted to examine the difference in means between the CPS tasks (problem conceptualization and problem-solving). To answer Q2, we conducted a correlation analysis to examine the relationship between novice students’ cognitive load and worksheet performance in the specific CPS tasks; the Pearson product-moment correlation was used because the cognitive load index and worksheet performance are continuous variables (Freedman, 2009). To answer Q3, decision trees generated from the pre-test measures were used to identify factors affecting students’ cognitive load. Finally, to answer Q4, paired-sample t-tests were conducted to examine the impact of the online virtual experimentation on students’ outcomes (e.g., understanding of networking knowledge and CPS skills) within the computer networking CPS activities.
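A simplified sketch of these statistical steps is shown below, using SciPy on placeholder arrays; the segmentation window, array shapes and variable names are assumptions rather than the study’s actual data structures.

```python
# Sketch: segmentation of the cognitive load index and the main statistical tests.
import numpy as np
from scipy import stats

FS = 128  # assumed number of index samples per second


def segment_means(cl_index, fs=FS):
    """Split a continuous cognitive-load-index series into 1-s segments and
    return the mean of each segment."""
    n_segments = len(cl_index) // fs
    return cl_index[: n_segments * fs].reshape(n_segments, fs).mean(axis=1)


rng = np.random.default_rng(2)
# Placeholder per-student means for the two CPS tasks and worksheet scores (n = 36).
cl_conceptualization = rng.normal(5.0, 1.0, 36)
cl_problem_solving = rng.normal(4.8, 1.0, 36)
worksheet_scores = rng.normal(80, 10, 36)

# Q1: ANOVA comparing cognitive load between the two CPS tasks.
f_stat, p_anova = stats.f_oneway(cl_conceptualization, cl_problem_solving)

# Q2: Pearson correlation between cognitive load and worksheet performance.
r, p_corr = stats.pearsonr(cl_conceptualization, worksheet_scores)

# Q4: paired-sample t-test on pre- vs. post-test scores (placeholders).
pre, post = rng.normal(89, 12, 36), rng.normal(95, 12, 36)
t_stat, p_ttest = stats.ttest_rel(pre, post)

print(f_stat, p_anova, r, p_corr, t_stat, p_ttest)
```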

Qualitative

To answer Q5, an inductive coding approach (Saldaña, 2016) was followed to analyze participants’ responses to the open-ended items. Initially, two of the researchers individually familiarized themselves with the data and generated preliminary codes in line with the research questions. The two researchers then met to compare their lists of preliminary codes, collaboratively discuss discrepancies and elicit themes from the codes upon reaching full mutual agreement (Tang et al., 2020). To ensure the trustworthiness and rigor of the findings, member checking (Guba & Lincoln, 1994) was performed by emailing the list of themes and categories to five participants from different groups. All five participants confirmed that the qualitative findings accurately reflected their course experience. In addition, the qualitative findings are presented with quotes from participants’ responses (Tang et al., 2021).

Results

What is the difference in the level of students’ cognitive load between different CPS tasks?

We compared novice students’ cognitive load in the problem conceptualization task and the problem-solving task (see Fig. 5). For most participants (n = 21), cognitive load in problem conceptualization was higher than in problem-solving. The ANOVA result indicated that students’ average level of cognitive load during problem conceptualization (M = 5.17, SD = 0.98) was significantly higher than during problem-solving (M = 4.99, SD = 1.11), F = 0.521, p < 0.05.

Fig. 5 The cognitive load index of each subject (1, 2…36) in specific tasks

What is the relationship between students’ cognitive load and their worksheet performance?

Participants’ worksheet performance was appraised by their grades on both worksheets. The correlation analyses indicated that participants’ cognitive load was negatively correlated with their performance in problem conceptualization (r = −0.35, p < 0.05) and problem-solving (r = −0.38, p < 0.05). That is, novice students with a lower cognitive load tended to outperform their peers.

What are the predictors of students’ level of cognitive load in two CPS tasks?

To reveal the factors behind the difference in students’ cognitive load between the two CPS tasks, two decision trees corresponding to different combinations of the multimodal data were generated. The inputs of the first decision tree were three variables derived from the pre-tests (basic knowledge, transfer knowledge and CPS skills), and the output was students’ cognitive load in problem conceptualization. The second decision tree used the same three variables plus participants’ cognitive load and worksheet performance in problem conceptualization as inputs, and its output was students’ cognitive load in problem-solving. Figure 6 and Table 2 show the inputs and the two most important variables of each tree.

Fig. 6 Decision trees for cognitive load in specific CPS tasks: a decision tree for the cognitive load in problem conceptualization, b decision tree for the cognitive load in problem-solving

Table 2 Description of the generated decision trees
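The paper does not report the software or hyperparameters used to grow these trees; the sketch below shows one way such trees and their variable importances could be produced, assuming scikit-learn regression trees and placeholder data.

```python
# Sketch: fitting decision trees that predict cognitive load from pre-test variables.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
n = 36
# Placeholder pre-test variables: basic knowledge, transfer knowledge, CPS skills.
X1 = np.column_stack([
    rng.normal(40, 5, n),
    rng.normal(10, 3, n),
    rng.normal(3.2, 0.2, n),
])
y1 = rng.normal(5.0, 1.0, n)   # cognitive load in problem conceptualization
tree1 = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X1, y1)

# Second tree: same inputs plus cognitive load and worksheet score in conceptualization.
X2 = np.column_stack([X1, y1, rng.normal(80, 10, n)])
y2 = rng.normal(4.8, 1.0, n)   # cognitive load in problem-solving
tree2 = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X2, y2)

# Feature importances indicate which variables best predict cognitive load.
print(tree1.feature_importances_)
print(tree2.feature_importances_)
```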

What is the impact of students’ virtual experimentation experience on their understanding of networking knowledge and CPS skills?

We found a significant difference (t = 2.992, p < 0.05) between novice students’ post-test and pre-test scores (see Table 3). The scores on the post-knowledge test (M = 95.56, SD = 12.41) were significantly higher than those on the pre-knowledge test (M = 89.44, SD = 12.41). These findings suggest that the experience of CPS within an online virtual lab environment had a significant positive effect on participants’ understanding of networking knowledge.

Table 3 Paired-sample t-test analysis of novice students’ knowledge and CPS skills in the pre-test and post-test

As shown in Table 3, the mean score of all students on the pre-CPS skill test was 3.18 (SD = 0.21), while the mean score on the post-CPS skill test was 3.27 (SD = 0.29). The paired-sample t-test revealed no significant difference between the pre- and post-CPS skill test scores (t = 2.237, p = 0.158), implying that the CPS learning did not significantly influence individual students’ CPS skills within the online virtual lab environment.

What is students’ perception about their CPS experience in virtual experimentation activities?

Student perceptions about their CPS experience in VL activities were gauged by their responses to five open-ended questions. Two themes with four categories emerged from students’ responses (see Table 4).

Table 4 Qualitative codes, categories, and themes

Theme 1: virtual experimentation activities strengthened students’ problem-solving competence

This theme describes participants’ perception that the virtual experimentation activities reinforced their competence to solve problems, including a better understanding of conceptual knowledge and a higher level of problem-solving skills. Most participants thought that the collaboration experience helped them develop a refined understanding of the instructor’s lectures and the textbook content. In addition, participants considered that collaboration in the virtual experimentation activities enhanced their skills and confidence in working with classmates.

All my effort devoted to solving the problem during the virtual experimentation, no matter right or wrong, contributed to a refined understanding of the content discussed in our lectures and textbooks.

The collaborative task requires more coordination between our teammates. It’s difficult to finish it in a short time, and more time is needed in the next similar experiment. However, this experiment did improve my skillset of collaborating with my classmates. I have become more confident in working with them.

The online video conferencing tool is easy to use. I can share my screen to show my content to group members. I realized the importance of communication with group members to finish the tasks.

Theme 2: technical and pedagogical support was essential for an efficient experience with virtual experimentation activities

This theme outlines the challenges participants met when participating in the virtual experimentation activities and their desire for technical and pedagogical support for an efficient CPS experience in the virtual lab. For example, most participants mentioned that communication was less efficient in online settings due to a lack of mutual regulation of each other’s actual progress in problem-solving. Most participants also indicated that they would like to spend more time practicing with the experimental software before the next similar experiment, as it took time to manipulate the relevant tools and functions proficiently.

Compared with online settings, face-to-face is more convenient for communication with group members. In online settings, it is harder to observe other members’ reactions and their actual progress during the collaboration.

This is the first time I have participated in CPS in virtual experimentation activities. If I cannot receive feedback from my teammates, I become even more anxious.

The proficient manipulation of the experimental software is fundamental in this activity; I will practice more with the experimental software before the next similar experiment.

Discussion

The purpose of this study was to investigate the effect of an online VL on novice students’ CPS learning. A multimodal analysis was conducted of students’ learning gains in networking knowledge, their cognitive load in different CPS tasks and their perceptions of the experimental course. The results clearly showed that students’ cognitive load was negatively correlated with their performance. In addition, students’ average level of cognitive load during problem conceptualization was significantly higher than during problem-solving. Finally, the decision tree results identified specific factors affecting students’ cognitive load in the problem conceptualization and problem-solving tasks.

Effects of online VL on students’ CPS learning

The results of the study show that students significantly increased their understanding of networking knowledge. This finding concurs with Xu et al. (2014), who report that groups improve their understanding of computer science after undertaking online VL experiences and CPS activities. However, students did not significantly improve their CPS skills. There may be several reasons. First, the participants were novice computer science students enrolled in an online Computer Networks course for the first time and had no prior online CPS experience. Uncertainty about how much time to devote to the virtual experiments may also have inhibited significant improvement in their CPS skills. Second, a major factor contributing to the success of CPS is team members’ proficiency in communication (Lin et al., 2016). However, the open-ended questionnaire in this study revealed that students had difficulty working collaboratively because of the negative effects of online CPS, such as the lack of face-to-face interaction and unfamiliarity with equipment operation.

With respect to cognitive load, students’ cognitive load in problem conceptualization was significantly higher than in problem-solving. This is in line with the findings of Delahunty et al. (2020), which indicated a significant reliance on memory during the conceptualization of problem-solving tasks. Cognitive load and CPS worksheet performance were strongly negatively correlated among novice students, echoing the findings of previous studies (Nicholson & O'Hare, 2014; Redifer et al., 2021). Furthermore, students’ CPS skills were crucial to their cognitive load in both CPS tasks. This echoes the viewpoint that students with higher CPS skills experience lower cognitive load during CPS (Kalyuga et al., 2010; Sentz & Stefaniak, 2019). The most important variable in the decision tree for students’ cognitive load in problem conceptualization was their prior CPS skill, followed by basic knowledge; when both variables were higher, the cognitive load in problem conceptualization was classified as low. This implies that teachers can help alleviate learners’ cognitive load by giving assignments that strengthen basic knowledge. The two most important variables in the decision tree for students’ cognitive load in problem-solving were their cognitive load in problem conceptualization and their CPS skills. The leaf classifying the largest number of subjects as having low cognitive load in problem-solving (see Fig. 6) corresponds to students with a high pre-test score on CPS skills and low cognitive load in problem conceptualization. Moreover, students’ cognitive load in problem conceptualization is a key factor influencing their cognitive load in problem-solving. This accords with the prior finding of Larson Jr and Christensen (1993) that conceptualizing the problem provides the foundation upon which all subsequent problem-solving activities are built.

Practical implications

This study adds to the evidence that online VLs are effective in enhancing students’ CPS learning in college CSE courses. CLT provides a unique lens for understanding and addressing the challenges that students encounter in virtual experimentation activities. To keep students oriented toward the intended learning outcomes, educational designers should reduce the cognitive load that students experience in virtual labs. Efficient design of learning tasks can help manage cognitive load (Kehrwald & Bentley, 2020). A key consideration when designing CPS learning tasks in a VL is to engage students in problem conceptualization before they solve the problem. In addition, our findings suggest that students’ prior knowledge and CPS skills significantly predicted their cognitive load in CPS tasks. Thus, the design of CPS learning tasks in a VL needs to draw on students’ previous experience and knowledge to reduce their cognitive load and thereby enhance learning.

Limitation and future work

The study has several limitations. Multimodal data were collected to investigate students’ cognitive load in CPS, but the granularity of the data from each source was not well matched. Future research might investigate the inclusion of video recordings of the CPS processes, which could lead to more granular findings on cognitive load in virtual experimentation activities. In addition, the sample size was limited, as only students from one class were recruited. Future research might increase the sample size by recruiting participants from multiple courses to confirm whether the results are valid and generalizable.

Conclusion

Engineering students should develop CPS competence to prepare for a future society in which humans mostly deal with ill-defined tasks. As online education proliferates, virtual labs become a major pathway for affording engineering students an effective CPS experience, but concerns about the effectiveness of virtual labs for fostering students’ CPS competence remain unresolved. This study responded to those concerns by leveraging multimodal analytics to investigate students’ CPS in virtual experimentation activities from the perspective of cognitive load. The findings provide practical implications for course instructors and designers on unpacking the technical affordances of virtual labs and refining the pedagogical design of virtual experimentation activities in order to alleviate students’ cognitive load and develop their CPS appropriately. Further research may consider integrating multimodal analytics into investigations of students’ cognitive, behavioral and motivational patterns in CPS activities in order to identify efficient strategies that help instructors facilitate virtual experimentation activities.