Introduction

While still embryonic, quantum information science and engineering (QISE) is expanding worldwide in both academia and industry. In 2018, the US Congress enacted the National Quantum Initiative to support QISE research and development, spurring educational partnerships between higher education institutions and industry (Fox et al., 2020). Quantum information science (QIS) is the study of nature at its most fundamental level (i.e., atoms) and utilizes the laws of quantum physics to store, transmit, manipulate, and compute or measure information (Asfaw et al., 2022; Economou et al., 2020). Recent advances in QIS algorithms, architectures, and qubit technologies (i.e., atoms, ions, semiconductors, superconductors, and supporting hardware) offer engineering applications new opportunities and advantages over classical systems (Asfaw et al., 2022; Dzurak et al., 2022). These monumental advances in quantum applications have ushered in a second quantum revolution whereby quantum is predicted to be a foundational technology transforming security, communications, computation, and sensing (Asfaw et al., 2022; Fox et al., 2020).

With accelerating technology growth, QISE provides engineers with new ways to address society’s complex problems. However, one of the greatest challenges associated with the expansion of QISE is access to educated, quantum-aware engineers to counteract the shortage of well-trained professionals, also known as the “quantum bottleneck” (Kaur & Venegas-Gomez, 2022). National quantum strategies around the world (i.e., government, industry, and education) recognize the importance of developing a growing QISE pipeline. However, not all students have access to the QISE pipeline (Greenemeier, 2021); many underrepresented students are blocked by systemic barriers. As such, this Design-based Case Study (DbCS) explores the three-year design and development process of a hybrid remote laboratory created to expand the QISE pipeline via partnerships with Historically Black Colleges and Universities (HBCUs) in the US. The hybrid remote laboratory was developed using rapid prototyping (Tripp & Bichelmeyer, 1990). The study describes the design, development, and formative evaluations at each prototype iteration. The outcomes of the formative evaluations propelled the design decisions made by the instructional design (ID) team to ensure instructional alignment, a viable hybrid remote connection, and transferable simulated learning activities that reach underrepresented students at HBCU partner institutions with a fully functioning beta lab.

Access to the QISE Pipeline

A significant rise in the number of diverse and international students taking part in engineering programs within the US necessitates attention, especially in conjunction with access. According to the American Society for Engineering Education (2022), in 2021 international students earned 51.7% of master’s degrees and 58.5% of doctoral degrees in engineering. Thus, over half of students earning graduate engineering degrees in the US are international. Additionally, ethnic diversity increased significantly among students earning graduate degrees. Despite increases in diversity, the distribution of power and access across engineering disciplines lags. With the advent of a QISE expansion, many education institutions aim to train the new QISE pipeline. However, few work to expand access for underrepresented groups in engineering, and broad gaps continue to exist between those with access and those without (Beddoes, 2020; Cross, 2020; Wise, 2010).

Simultaneously, industry fuels the growing pipeline, calling for faster training of entry-level students entering the quantum workforce and for more laboratory and hardware training (Fox et al., 2020; Rainò et al., 2021). Companies also express a pressing need for lab exposure and experience focused on hardware facilitation and development (i.e., lasers, electronics, sensors). Industry suggests that higher education institutions switch to, or add, hardware training tracks emphasizing significant amounts of laboratory time (Fox et al., 2020; Rainò et al., 2021). However, engineering laboratories, while essential, are an expensive part of engineering education pedagogy. They provide a complex application environment for students, a place to “do” quantum engineering, at steep costs (Chan et al., 2021; Seery, 2020). The reality is that many schools, especially systemically underfunded institutions such as HBCUs, lack the resources and access to these innovative laboratories (NASEM, 2011).

Virtual and Remote Labs in Engineering Instruction

As delivery of instruction gradually changes through technological innovation, and subsequently through the impact of COVID-19 (Kapilan et al., 2021), engineering disciplines are capitalizing on new alternatives to traditional on-campus laboratories. New focus is on remote, online, and virtual labs, which increase opportunity and access. In the literature, “remote,” “virtual,” and “online” convey overlapping delivery systems. Remote indicates a synchronous connection in which communication is multi-channeled at a distance, whereas virtual or online indicate asynchronous operation at a distance. Distance education research is relatively new in engineering, specifically related to the development of remote or virtual labs (Esquembre, 2015; Glassey & Magalhães, 2020; Grodotzki et al., 2018; Mejías Borrero & Andújar Márquez, 2012). However, engineering disciplines have a long history of research associated with simulations and modelling of complex problems with information and communications technology.

Studies have determined that remote or virtual labs provide greater access and accessibility, reduce costs, save time, and create safer experimental conditions (Chan et al., 2021; Lynch & Ghergulescu, 2017). Students access lab equipment easily, more often, and at a reduced cost (Ma & Nickerson, 2006). Students are also able to explore dangerous lab experiments otherwise off limits (i.e., nuclear reactors, quantum lasers). Modern engineering education emphasizes inquiry learning, and virtual labs also provide flexible spaces to develop the critical thinking and problem-solving skills highly demanded by industry (Lynch & Ghergulescu, 2017). The most significant advantage, based on student perceptions, is that distance approaches provide flexibility and self-regulated learning opportunities to students without access to traditional campuses. Moreover, virtual and remote labs provide engaging environments for students, which increases motivation, engagement, and self-regulation (Bobbit Nolen & Koretsky, 2018; Chan et al., 2021; Lindsay et al., 2007).

While many benefits are espoused regarding virtual or remote labs, engineering research focuses primarily on media comparison or efficacy studies instead of value-added design research (Chan et al., 2021; Reeves & Crippen, 2021). Unsurprisingly, most studies indicate little to “no significant differences” between in-person labs and remote or virtual labs (Brinson, 2015; Chan et al., 2021; Faulconer & Gruss, 2018; Lockee et al., 1999). Remote or virtual labs hold up to hands-on, in-person labs based on learning outcomes (Brinson, 2015; Faulconer & Gruss, 2018) despite general hesitation in the field and expressed concerns regarding the emphasis on hands-on experience (Hofstein & Lunetta, 1982, 2003). According to Tatli and Ayas (2012) and Chan et al. (2021), studies instead need to focus on the design and development, or instructional design, behind remote or virtual labs and ascertain the effectiveness of the instructional components. As such, much less is known about how to design, develop, and iterate remote or virtual laboratories to ensure goals and learning outcomes are met.

Research Purpose and Questions

This study aimed to provide a new perspective on the design and development process behind a hybrid remote QISE laboratory by asking:

  1. How was a QISE hybrid remote laboratory designed, developed, and formatively evaluated for student access to QISE content and learning activities?

  2. How did formative feedback at each prototype stage inform subsequent prototype iterations and design decisions?

  3. What can be learned from the design, development, and formative feedback cycle related to the instructional design of hybrid remote laboratories?

Methodology

Research Design

This study utilized Deaton and Malloy’s (2018) Design-based Case Study (DbCS) approach, which combines two complementary research approaches: design-based research (DBR; McSweeney & Reeves, 2019) and the case study method (Yin, 2018). First, DBR is:

A methodology designed by and for educators that seeks to increase the impact, transfer, and translation of education research into improved practice. In addition, it stresses the need for theory building and the development of design principles that guide, inform, and improve both practice and research in educational contexts. (Anderson & Shattuck, 2012, p. 16)

DBR focuses on the design process and the “how” of design and development in learning environments (Anderson & Shattuck, 2012). Second, case studies similarly focus on exploring and describing settings or contexts to increase understanding of a “bounded system” (Creswell et al., 2007, p. 245; Cousin, 2005; Yin, 2018). Case studies provide a bounded context to explore or investigate the “how” of design and development.

DbCS allows researchers to test design components in multiple learning contexts via subsequent iterations within a bounded system. For DbCS, a specific context is selected, along with an obtainable or desired goal within the constraints of the initial tools and resources available. Next, researchers create an initial prototype, which serves as a modelled starting point for the learning environment being designed (see Fig. 1). The beginning stages of prototyping include only the necessary design parameters and resources available at the time. Each prototype iteration is considered a unit of analysis (UA). Each prototype iteration is designed and developed, implemented, evaluated for formative feedback, and iterated into the next prototype using the formative feedback results. Researchers modify or adjust each iteration to achieve the next desired outcome(s) based on formative feedback, and each iteration adapts and changes, building towards the overall goal (Deaton & Malloy, 2018). After multiple iterations, DbCS demonstrates “…if, how, when, and why interventions [designed prototypes] were successful in meeting their goals” (Deaton & Malloy, 2018, p. 51), providing opportunities for other researchers and designers to learn from the process and transfer it to their respective settings (i.e., classrooms, labs, and community). This study started with an initial prototype and describes three units of analysis through design, development, and refinement.

Fig. 1 DbCS approach

Overall Design Goal and Context

The hybrid remote laboratory was envisioned in 2020 by several faculty members with burgeoning interests in QISE. Faculty recognized the growing impact of quantum applications and the need for more students, especially those underrepresented at their own institutions, to have access to the laboratory equipment. The primary design goal of the QISE hybrid remote laboratory was to expand access to the laboratory equipment housed on campus to institutional partners. The lab would be hybrid: online asynchronous lab preparation and assignments combined with synchronous remote access for real-time lab experiments. The faculty lead, laboratory equipment, and ID team were housed at a large R1 Southeastern university in the US. The R1 institution (institutions classified as “very high research activity” under the Carnegie Classifications) procured physical lab space to house the equipment and begin on-campus instruction. The hardware was funded by several large grants allowing for purchase of the German-made quTools quantum kits (i.e., high-powered UV diode lasers). The equipment’s cost and maintenance exceed $500,000, often making such equipment inaccessible to partnered HBCUs. The design, development, and program support for the hybrid remote laboratory occurred in partnership with three other institutions: a small Southeastern HBCU and two public R2 (institutions classified as “high research activity” under the Carnegie Classifications) HBCUs in the South-Central US. The ID team procured by the partnership had extensive experience in designing online learning environments for engineering disciplines. The ID team worked with the faculty lead and partnership institutions over three years to transform the on-campus laboratory into iterative prototypes of the hybrid remote laboratory.

Design, Development, and Evaluation of Initial and Subsequent Prototypes

Initial Prototype

The instructional content was initially designed by a lead faculty member and two QISE lab experts for on-campus delivery at the R1 lab. Ten QISE laboratory lessons were developed, focused on laboratory hardware implementation using QISE concepts and applications. The labs were designed in traditional lab report formats, which included instructions on how to perform the experiments and collect data for the post-lab write-up and conclusions. Once developed, the lead faculty member tested all ten labs in the physical on-campus laboratory across a single semester. The lead faculty member then provided the ID team with these ten conceptually functional laboratory exercises. The ID team met with the faculty lead for one semester to verify alignment of each lab’s learning objectives and assessments. The ten labs functioned as the initial prototype for the ID team and would undergo multiple iterations.

First Prototype: Unit of Analysis 1

The first unit of analysis (UA), or first prototype, involved transferring all ten laboratory packages into a Learning Management System (LMS) shell (i.e., Canvas). The ID team determined that placing the laboratory lessons and content into an adaptable instructional sequence in Canvas would allow for instructional alignment and logical sequencing for student learning in the online version. After information gathering with engineering students and faculty members, the ID team decided to adhere to the traditional structure of engineering labs (i.e., introduction, set-up, preparation, lab experiment, lab analysis, and report) to maintain familiarity with lab structures and the epistemic culture of engineering experimentation (Knorr-Cetina, 1999). Design documents were drafted depicting instructional sequencing and LMS layouts for each laboratory within the constraints of the epistemic nature of engineering disciplines. Each laboratory was then placed online as one of ten modular lab units.

The ID team used a modular approach in the LMS while emphasizing Gagné’s Nine Events of Instruction in the sequencing plan (Gagné et al., 2004; see Table 1 and Fig. 2). The lab “introduction, goals, and pre-lab” was designed to include a reiteration of previous concepts, a basic introduction of new concepts, learning goal statements, and an outline of the lab expectations with learning checks (i.e., Gagné’s first four events). The “Laboratory Procedure,” which includes lab experiment preparation, setup, and procedures involving the remote connection, was assigned a placeholder for the initial prototype because this section required a remote connection not yet developed. However, the “Laboratory Procedure” reflected Gagné et al.’s (2004) next steps: provide guidance, elicit performance, and give feedback. It focused on “doing” QISE; performing the labs allowed for a combination of learning guidance through feedback and eliciting performance with the hardware and a live instructor in remote sessions. Last, “Laboratory Analysis,” containing a traditional lab report, provided a formal assessment for students, while the entire real-world lab environment allowed for retention and transfer (Gagné et al., 2004).

Table 1 Gagné’s nine events of instruction
Fig. 2 Example of a single lab instructional sequence
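To make the sequencing plan concrete, the section-to-event mapping described above can be sketched as a simple data structure. This is an illustrative reconstruction in Python, not an artifact produced by the ID team; the section names follow the modular structure outlined above.

```python
# Illustrative sketch (not the ID team's actual design documents): mapping the
# modular lab sections to Gagné's Nine Events, as described in the text.
GAGNE_EVENTS = {
    1: "Gain attention",
    2: "Inform learners of objectives",
    3: "Stimulate recall of prior learning",
    4: "Present the content",
    5: "Provide learning guidance",
    6: "Elicit performance",
    7: "Provide feedback",
    8: "Assess performance",
    9: "Enhance retention and transfer",
}

# Each modular lab unit sequences its sections against the events it serves.
LAB_MODULE_SEQUENCE = [
    ("Introduction, goals, and pre-lab", [1, 2, 3, 4]),    # recap, new concepts, goals, checks
    ("Laboratory procedure (remote session)", [5, 6, 7]),  # guidance, performance, live feedback
    ("Laboratory analysis (lab report)", [8, 9]),          # formal assessment; retention/transfer
]

for section, events in LAB_MODULE_SEQUENCE:
    print(f"{section}: " + "; ".join(GAGNE_EVENTS[e] for e in events))
```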

Simultaneously, two ID team members trained in core accessibility competencies worked to ensure the course met ADA and WCAG 2.0 standards (i.e., PDF tagging, alt text, table markup). Then, the laboratory content was uploaded and placed in each modular lab unit based on the instructional sequencing. For example, “Lab Exercises” contained written step-by-step instructions on how to perform the lab experiment with the hardware during remote sessions. Development of the instructional sequencing and placement of content in the LMS required approximately six months before testing for formative feedback and refining a new prototype.
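The accessibility pass lends itself to partial automation. As a hedged illustration (the team’s actual tooling is not described here), a short standard-library Python script can flag images whose alt text is missing, one of the WCAG 2.0 checks mentioned above; empty alt attributes are flagged for manual review because they are valid only for purely decorative images.

```python
# Minimal sketch of an automated WCAG 2.0 spot-check: flag <img> tags with
# missing or empty alt attributes. Illustrative only; it does not represent
# the ID team's actual workflow (PDF tagging and table markup are not shown).
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            alt = attr_map.get("alt")
            if alt is None or not alt.strip():
                # Empty alt is legitimate for decorative images; flag for review.
                src = attr_map.get("src", "<unknown source>")
                self.issues.append(f"Check alt text for image: {src}")

# Example usage with a hypothetical fragment of lab-module HTML.
sample_html = """
<h2>Lab 3: Single-Photon Interference</h2>
<img src="laser_setup.png" alt="Schematic of the UV diode laser set-up">
<img src="detector.png">
"""
checker = AltTextChecker()
checker.feed(sample_html)
for issue in checker.issues:
    print(issue)
```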

Formative Feedback: Data Collection and Analysis

To obtain feedback on the first prototype’s instructional sequencing and content, the ID team and lead faculty asked six students enrolled in the existing on-campus lab to forgo the traditional paper lab instructions and instead use the LMS-designed instructional sequencing to prepare for and perform the on-campus laboratory experiments each week for an entire semester. The participants were informed at the beginning of the on-campus course that the purpose of the LMS portion of the class was to support the design of a hybrid remote version of the on-campus laboratory for the HBCU partnership. The six participants represented both genders, several countries, and different engineering backgrounds. At semester’s end, the students and instructor were invited to participate in a one-hour focus group (Barbour, 2018) on how to improve the instructional sequencing and content. Two ID team members were present for the focus group and recorded feedback via organized note-taking (Barbour, 2018). Questions focused on what was effective or ineffective in the instructional sequencing and what could be done to support QISE content learning when the lab moved online. The formative feedback was cross-referenced, compared by recurring themes, and assessed for how to feasibly iterate the LMS portion of the laboratory.

Iteration Results

Themed feedback indicated several important changes to consider in two areas: the current content and sequencing, and the translation of the laboratory into an adequate online version. In relation to content and sequencing, participants indicated students required more content examples and learning support built into the introductory concept lessons in order to apply them later in the lab. For example, one student noted that the ID team should provide traditional engineering schematics of the lab set-ups to better associate concepts in remote sessions. The student explained that most underrepresented students, including themself, have never viewed the equipment before and do not have pre-existing mental models to assimilate the information. Likewise, participants appreciated the instructional sequencing, especially its adherence to the formalisms or epistemic culture of engineering laboratories. However, a pressing concern for most of the students was that the written laboratory experiment instructions within the sequencing would not translate for remote users: the technical writing for the lab instructions relied too heavily on an in-person perspective. Students argued that this could make performing the labs extremely difficult and prove isolating for online learners, ultimately discouraging engagement. Therefore, the ID team approached the next iteration aiming to remedy these challenges by introducing gradual adjustments to the initial prototype.

Second Prototype: Unit of Analysis 2

The second prototype, or Unit of Analysis 2 (UA2), was developed in response to UA1 feedback. The ID team decided to develop the remote connection first so that better lab experiment instructions could be written for remote sessions. Based on feedback from UA1, the instructions for the live lab experiments would need to be adapted for remote connection and performance. Therefore, the ID team established a live camera feed connected to and broadcast from the physical on-campus lab, along with a headset microphone for the lab instructor present in the same lab. After testing various platforms, Zoom was used to connect the on-campus lab, the lab instructor, and students. Zoom provided split viewing screens between the on-campus lab broadcast and the VNC viewer (i.e., remote lab control) on the remote student’s computer screen. The remote student accesses the quTools equipment on campus via the VNC viewer to control the lab hardware and lasers remotely (see Fig. 3). For four months, the ID team rigorously remote-tested all ten labs via live role-play to rewrite the lab instructions specifically for online performance and to test the live connection’s robustness. Rewriting focused on anticipating what students would need to know or be able to do as they operated the lab remotely from the VNC viewer. Additional testing occurred between participating HBCUs to establish the robustness of the remote connection. Ultimately, the second prototype was tested and delivered in a week-long quantum bootcamp in partnership with faculty and students at participating HBCUs in summer 2021.
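The study does not detail how connection robustness was measured, but the kind of low-bandwidth check such testing implies can be sketched as a basic latency probe of the lab host, shown below using only Python’s standard library. The host name is a hypothetical placeholder, and the VNC port is an assumption based on the protocol’s default.

```python
# Hypothetical sketch of a connection-robustness probe: measure round-trip TCP
# connect latency to the on-campus lab host several times and summarize. The
# host is a placeholder; the actual testing procedure is not documented.
import socket
import statistics
import time

LAB_HOST = "remote-lab.example.edu"  # placeholder for the on-campus lab gateway
VNC_PORT = 5900                      # default VNC port (assumption)

def probe_latency(host: str, port: int, attempts: int = 10) -> list:
    """Return round-trip connect times (ms) for successful attempts."""
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=5):
                timings.append((time.perf_counter() - start) * 1000)
        except OSError:
            pass  # dropped attempt; robustness means high success, low jitter
        time.sleep(0.5)
    return timings

if __name__ == "__main__":
    times = probe_latency(LAB_HOST, VNC_PORT)
    if times:
        print(f"{len(times)}/10 successful connects, "
              f"median {statistics.median(times):.1f} ms, "
              f"stdev {statistics.pstdev(times):.1f} ms")
    else:
        print("No successful connections: check firewall/VPN to the lab host.")
```

A probe like this could be run before a live session to spot unreliable links between a partner campus and the lab host.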

Fig. 3 Remote connection and operation of UV diode laser

Formative Feedback: Data Collection and Analysis

UA2 was tested at a week-long quantum bootcamp in partnership with participating HBCUs. The ID team’s goal was to confirm the robustness of the remote connection and, more significantly, to gather feedback on the lab experiment instructions for the remote setting. Two full-scale lab walkthroughs were demonstrated with partnership faculty and students and recorded, and feedback was elicited through an exit survey. The exit survey aimed to determine perceptions of the remote laboratory’s operation: both the connection and the lab experiment instructions. The survey, delivered in QuestionPro, contained open-ended demographic questions, Likert-scale perception questions, and open-ended feedback (Creswell & Creswell, 2018). Over ten faculty experts and several students from HBCU partner institutions, with an average of 11.9 years of experience in QISE, participated in the lab walkthrough and answered the survey. Analysis of the survey data included basic descriptive statistics and thematic coding of open-ended responses (Saldaña, 2016). The researchers also reviewed the video recordings of the bootcamp demonstrations, including Q&A periods, to ascertain any concerns not revealed in the exit survey. One researcher reviewed the video recordings for hedging, uncertainty, and body language during the live sessions and questioning (Gee, 2014).
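As an illustration of the descriptive analysis described above (the study’s raw survey data are not reproduced here), summary statistics and a frequency breakdown for a five-point Likert item take only a few lines of standard-library Python; the response values below are invented placeholders.

```python
# Illustrative sketch of the survey analysis: descriptive statistics for a
# five-point Likert item. Responses are invented placeholders, not the study's
# data (the paper reports only aggregate percentages).
from collections import Counter
import statistics

# 1 = strongly disagree ... 5 = strongly agree
responses = [5, 4, 5, 5, 4, 3, 5, 4, 5]

counts = Counter(responses)
n = len(responses)
print(f"n = {n}, mean = {statistics.mean(responses):.2f}, "
      f"median = {statistics.median(responses)}")
for value in range(5, 0, -1):
    pct = 100 * counts.get(value, 0) / n
    print(f"  rating {value}: {counts.get(value, 0)} responses ({pct:.1f}%)")
```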

Iteration Results

The ID team used the survey and recordings to ascertain whether the remote lab connection and instructions were effective and supported by students. Based on survey feedback from open-ended questions, the laboratory was perceived by participating HBCUs as effective and beneficial for underrepresented students in QISE. Of the respondents, 77.8% viewed the experiential laboratory as promising for QISE learning, while 22.2% somewhat agreed, and no participants disagreed. Faculty and students were surprised by the basic remote capabilities and ease of access across relatively low bandwidth. One faculty member explained how access to these “[…] lab demonstrations illuminated how these lab activities could be incorporated into [other] future courses.” Another respondent applauded the lab for demonstrating the importance of access to “concepts of various fields of Quantum and their applications” for students with limited access. Most faculty members saw the possibilities a working remote connection could provide their students. Moreover, faculty confirmed the functioning of the remote lab as a second prototype and, likewise, perceived its overall value to the partnership.

While the survey validated the remote laboratory’s effectiveness, the ID team still required feedback specifically regarding the laboratory instructions. Using the video recordings of the demonstrations, the researchers were able to identify uncertainties, hesitancies, and misconceptions regarding the online remote demonstration of the lab experiments. Analysis of the recordings revealed two major concerns regarding the remote lab:

  1. Faculty expressed concerns about simulations of the hardware and whether simulations transfer just as effectively in instruction as face-to-face delivery. Many remained skeptical.

  2. Faculty expressed concerns with wrap-around supports for students enrolled in an online course, and with faculty training for instructing remote labs or designing remote lab lessons (i.e., the overall system the course would be embedded in).

Faculty concerns were not with how the instructions were written but rather with how the remote laboratory as instruction would transfer to real settings and subsequently be supported. The feedback prompted meetings between the lead faculty and the ID team in which decisions focused not only on instruction but also on supporting the instruction in context. The next iteration required further consideration by the faculty lead and administration regarding how the instruction could be supported in several institutional contexts, while also considering systemic barriers around the course. The ID team turned to developing fully simulated parts of the online lab hardware to address the question of learning transfer.

Third Prototype: Unit of Analysis 3

The third prototype, or UA3, involved the design and development of hardware simulations for lab set-ups to assuage skepticism that remote learning experiences lack transfer or engagement compared to in-person labs. After establishing an effective instructional sequence, incorporating appropriate content, and operationalizing a remote connection, the ID team researched and tested the feasibility of various options for laboratory simulations. The design goal was to simulate the hardware manipulation occurring in the on-campus lab to maximize transfer for online learners. For the simulations, high-fidelity options such as VR/AR strongly appealed to the ID team. However, the availability of resources, time, and required skillsets posed major constraints. Instead, the ID team decided to begin with high-impact, low-fidelity hardware simulations in Articulate Storyline 360.

The Articulate 360 simulations for each lab unit emulated lab set-up and preparation by giving students the opportunity to work through a walkthrough of the laboratory space, learn and manipulate hardware, and actively set up the lab configurations virtually before starting the remote-access portions of the lab exercises (see Fig. 4). Students would be asked to identify, place, or adjust the lab hardware to resemble the appropriate lab schematics, i.e., virtually configure the lab for the remote experiment. The ID team initially storyboarded each simulation before development and then spent four weeks professionally photographing all lab hardware and configurations in the on-campus physical lab. The simulations were then designed over fourteen months, and development was partially contracted to additional instructional designers to meet deadlines. Once completed, a full usability test and focus group were conducted on several lab units.

Fig. 4 Examples of Articulate lab prep and set-up

Formative Feedback: Data Collection and Analysis

To determine the transfer potential and engagement of the simulations, a usability test was conducted on three labs: the virtual hardware introduction and two virtual lab unit set-ups. The usability test consisted of an online survey integrating Articulate Review 360. The survey and links to Articulate Review 360 were delivered via an interactive survey in QuestionPro. Each survey section referred to a single lab unit being tested and the associated Articulate 360 simulation. Each survey section contained an interactive link and identical Likert statements (e.g., directions were easy to understand; objectives/expectations were clear; navigation was easy to understand) rated on a five-point scale from “strongly disagree” to “strongly agree.” Participants were also asked to leave comments about any concerns with the module in an open-ended response question. Additionally, when participants accessed each lab unit in Articulate Review 360, they were instructed to directly flag operational issues in the simulation (e.g., a faulty button). Six engineering students with quantum experience, previously enrolled in the on-campus lab, were asked to participate in the usability test. Four participants completed survey responses and recorded Articulate Review 360 comments.

Additionally, a one-hour focus group was conducted to gather student feedback supporting the usability test (Barbour, 2018). The focus group included several usability test participants and several students previously enrolled in the on-campus version of the lab. Questions for the focus group targeted effectiveness, innovativeness, self-efficacy, accessibility, and basic usability related to the lower-scoring Likert responses from the usability test. Additional forecasting questions related to possible changes for future iterations and ideas for enhancing student access. The focus group was led by the lead faculty member and recorded, and one researcher attended as a note-taker. The notes were cross-referenced with the audio transcript and assessed for overarching themes directly related to low-scoring survey questions and improving the course.

Iteration Results

Participants of the usability test primarily indicated they strongly agreed or somewhat agreed with statements such as: navigation in the module was easy to understand; objectives/expectations were easy to understand; I was able to adequately practice what I learned; I felt motivated to participate and learn about the hardware; visual aids/examples helped with my understanding; and the module was accessible (see Fig. 5). However, the most valuable feedback derived from focusing on “neutral” indicators in usability responses which were cross referenced with the open-ended questions, Articulate Review 360 comments (see Fig. 6), and focus group.

Fig. 5 Example of usability test responses

Fig. 6 Example of Articulate Review 360

The ID team attempted to understand neutral responses from the survey by triangulating open-ended comments, Articulate Review 360 comments, and the focus group. For example, the hardware introduction received a neutral response related to “visual aids/examples helped with my understanding.” Open-ended survey comments suggested that students required more “repetitive questions/examples to help remember lab parts,” similar to UA1 feedback. Additionally, the focus group discussion indicated that students wanted to see more visual examples because the hardware is very advanced and unfamiliar. Moreover, a neutral response was indicated for “the module was accessible.” Open-ended comments suggested that the modules were still “buggy” and that “the only issue is that the hearing accessibility (speaking of the AI) would sometimes play two voice tracks at once if you try to move faster than the bot is speaking.” Articulate Review 360 feedback directly noted the number of “buggy” parts in the lab units. No students took issue with, or expressed skepticism about, being able to learn QISE from the simulations. Instead, several students excitedly offered new ways to improve the course. They encouraged more instructor presence via videos to support learning a new area of QISE, and additional “playtime” built into the course on the remote connection; i.e., one student mentioned being able to fail productively while experimenting.
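The neutral-response screen described above is simple to express in code. The sketch below is a hypothetical illustration of prioritizing items for triangulation; the item labels and ratings are invented, not the study’s data.

```python
# Hypothetical sketch of the neutral-response screen used to prioritize survey
# items for triangulation; item labels and ratings are invented placeholders.
NEUTRAL = 3            # midpoint of the five-point Likert scale
FLAG_THRESHOLD = 0.25  # flag items where more than 25% of responses are neutral

item_responses = {
    "Navigation in the module was easy to understand":   [5, 5, 4, 5],
    "Visual aids/examples helped with my understanding": [5, 3, 4, 3],
    "The module was accessible":                         [4, 3, 5, 3],
}

for item, ratings in item_responses.items():
    neutral_share = ratings.count(NEUTRAL) / len(ratings)
    if neutral_share > FLAG_THRESHOLD:
        # Items flagged here would be cross-referenced with open-ended
        # comments, Articulate Review 360 notes, and the focus group.
        print(f"Flag for triangulation: '{item}' ({neutral_share:.0%} neutral)")
```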

Discussion: Design and Development Challenges and Decisions

A DbCS approach concentrates on the process of design in context towards an overall goal. Nelson (2013) states that design research encompasses “research during design, research about design, and research through design” (p. 4, emphasis in original). While prototype refinement led to a completed and effective course beta, the ID team also built valuable knowledge related to designing in this specific bounded context. This study focuses heavily on research during design, which encompasses any research activity that occurs during the process of design. The ID team designed in phases, each involving feedback or information-gathering activities (Nelson, 2013) in which instructional design decision-making played a central role in prototype outcomes (Stefaniak & Tracey, 2014). These analyses were conducted with “the intent to gather information, identify opportunities and constraints, understand the context, and make design, production, and evaluation decisions” (Nelson, 2013, p. 6), and, upon reflection, valuable data were collected to inform practice. Via the iterative prototyping process used, the ID team learned much about the “how” and “why” behind design processes and design decisions in practice.

Unit of Analysis 1

For the ID team, much of the design for UA1 revolved around emphasizing good design principles: working from instructional design theory into contextualized practice. The ID team struggled with deciding which principles, models, and strategies should guide the design toward the goal of access. Current literature is deeply critical of the datedness or intended use of instructional design models and strategies, and debate exists around whether older models such as Dick, Carey, and Carey (Branch & Dousay, 2015; Dick et al., 2015) or strategies such as Gagné’s Nine Events of Instruction still hold relevance in designing for innovative twenty-first-century learning environments. While the researchers agree that many of “the design models we have are not the design models we need” (Moore, 2021, p. 1), the issue is far more complex and centers on “use.” For innovative designs, many designers in turn reach for more innovative design processes. However, Branch and Kopcha (2015) imply that models are guides and that, while models have not significantly changed, how they are used and implemented varies. Most models are not intended to be overtly prescriptive in nature but are used inflexibly in practice. Models are tools which produce different outcomes based on how they are ultimately used. Thus, the ID team decided to use foundational models to adhere to the basic tenets of good design. However, they learned that by clearly defining the goal for instruction (i.e., access) and maintaining flexibility with models, strategies, and sequencing toward that end, “archaic” ID models can be used effectively in new, innovative, and inclusive designs.

Arguably, Gagné’s Nine Events proved suitable in the context of QISE’s epistemic and formulaic culture for creating innovative learning with familiar structure and sequencing. Feedback in UA1 emphasized including more examples of hardware and hardware configurations. Many designers have a thorough understanding of Gagné’s Nine Events of Instruction because of its simplicity (Gagné et al., 2004). However, the ID team reviewed the original principles and research on the distinct events and learned that the fifth event, “provide learning guidance,” is often underperformed in design; in the context of an intellectual skill or cognitive strategy, it requires examples. QISE is complex and involves overlapping learning outcomes emphasizing intellectual skills, cognitive strategies, verbal information, and some motor skills. QISE learning requires many examples to build and expand existing or non-existent mental models (Jonassen, 2007). Many students taking the lab, more than likely underrepresented students, will never have interacted with or viewed the lab equipment before. As such, the design team realized that the entire online laboratory had to have rich visual examples (2D and 3D simulated). The design basics encompassed in Gagné’s Nine Events aided the designers in iteratively improving instruction from UA1 to UA2 while maintaining the overall goal of access.

Unit of Analysis 2

UA2 proved the most challenging in relation to the overall goal of access. The entire project relied on establishing a viable and robust connection for lower bandwidth while also editing the written lab instructions to be more accessible for the remote connection’s learning context. Unfortunately, the ID team discovered very little literature addressing “how” to develop robust remote connections for online STEM labs. Much literature exists pertaining to simulations and innovative lab settings (see large-scale reviews: Brinson, 2015, 2017; Tho et al., 2017), but studies often focus on content selection and learning outcomes, not specifically on designing content, choosing types of equipment, assembling the equipment, and using it to support learning; there is a significant lack of understanding surrounding design and development beyond simply measuring learning outcomes (Faulconer & Gruss, 2018). The ID team recognized the technical challenges of developing a viable remote connection and decided to utilize the design-thinking processes inherent in engineering and instructional design to develop a working connection (Howard et al., 2008). The ID team defined the problem, conducted ample research (i.e., the faculty lead visited campuses with similar projects), conceptualized a theoretical model, and created the first version of the remote connection. The ID team repeated this process to test and refine the remote connection while conducting role-plays of the ten laboratories. After ten iterations, the ID team had a well-established remote connection for the remote hybrid lab. More significantly, the design process was documented and established as a process for developing highly innovative and technical laboratory connections (i.e., the process is currently being used and retested to develop AI-driven online labs in chemistry at the same institution).

Likewise, the technical writing for the lab experiments was cited as a primary problem in the feedback. The ID team reviewed the instructions for conducting lab experiments remotely and realized the instructions required rewriting for the remote context, i.e., the message design in the lab instructions was not adapted appropriately for online remote learning. The ID team turned to foundational literature in message design (Fleming & Levie, 1993) and multimedia learning (Mayer, 2008, 2009, 2014). Indeed, Bishop (2014) argues that instructional designers can view “teaching–learning problems as communication problems” (p. 374) via message design theories. The ID team decided to approach the lab instructions from a communication theory perspective. While the entire course was learner-centered, the existing lab instructions emphasized a transmission version of communication. For the remote lab, the lab instructions needed to be clear and precise so students could conduct the lab successfully without taxing cognitive load (Mayer, 2014; Mayer et al., 2001), especially for underrepresented students who often carry higher cognitive loads due to low economic status, racism, food insecurity, homelessness, and social marginalization (Verschelden, 2017). Therefore, the ID team removed extraneous material while emphasizing clear and precise actions in the instructions (Mayer, 2014). Simultaneously, the instructions integrated pictorial aids to support learning by doing in the remote setting (Mayer et al., 2002). The ID team relied on message design principles to integrate strong technical writing in the instructions with multimedia videos and images to relay the desired actions of the lab experiments.

Unit of Analysis 3

The iterative design of UA3 underscored the challenge of learning transfer (i.e., the skepticism offered in feedback). While supportive of the remote lab, many faculty and students remained skeptical of the remote hybrid laboratory’s ability to provide a learning experience comparable to physically studying in an on-campus laboratory. The ID team recognized the common misconceptions related to comparison studies and the misleading nature of comparing online learning to in-class learning. Clark (1983) argued that comparing instructional technologies and delivery systems to each other would always result in “no significant difference” because they are “mere vehicles that deliver instruction but do not influence student achievement any more than the truck that delivers our groceries causes changes in our nutrition” (p. 445). Clark’s (1983) bold statements highlighted the importance of instructional design methods as the key influencing factors in learning with instructional technology. Lockee et al. (1999) found similar issues in comparison studies between online learning and classroom learning, i.e., no significant differences. Online learning and associated approaches proved to be no better and no worse than traditional in-person learning in engineering laboratory studies as well (Brinson, 2015; Chan et al., 2021). Learning outcomes were more significantly impacted by the effective or ineffective instructional design behind online or in-person learning. Too often, delivery systems or contexts confound results.

As a result, the ID team chose to move the design of the simulations as far away from comparison with in-person laboratories as possible. The ID team realized that, to avoid comparisons in the final beta, the simulations would need to emulate the physical lab space less while still focusing on the same learning outcomes. In Articulate Storyline 360, the ID team opted not to design a traditional lab-setting simulation. Instead, the remote learner’s lab would always be the laboratory hardware situated abstractly in the institutional LMS; after all, these students would never set foot in the on-campus lab. As such, learning would focus less on place and more on concepts and skills ready for transfer to the multiple lab settings (all developed and set up differently) of future employment, allowing development of the flexible, transferable skillsets desired by the QISE industry (Fox et al., 2020; Rainò et al., 2021). The ID team designed for the learning objectives and simulated the hardware pieces emphasizing key intellectual skills, cognitive strategies, and verbal knowledge. For example, the remote lab’s introductory lab focused solely on hardware component identification and purposes in QISE, a cognitive skill acquired in in-person labs by interacting with the equipment regularly. Additionally, labs focused on the intellectual skills required to discriminate between hardware components and classify them based on purpose/use. Last, the simulations had students manipulate and execute motor-skill tasks on simulated lab schematics to learn how to set up and adjust lab equipment in situ. Based on instructional design theory behind defining performance objectives (Gagné et al., 2004), students learn foundational lab equipment components and manipulations.

One of the last challenges determined from feedback emphasized the reality that ID models do not, as Moore (2021) explained, “frame the problems to include social, economic, and political realities” (p. 17). While useful for instruction, the models provided no guidance on key issues of access and wrap-around supports when designing: access to the internet, entry access to the course through administrative systems, access to the course by achievement (i.e., prerequisites), and systemic supports and barriers within the system the course exists in, all key access barriers feeding the “quantum bottleneck” (Kaur & Venegas-Gomez, 2022). Once situated in the bootcamp setting and exposed to partnership feedback, the ID team recognized the extent of the systemic barriers for the targeted audience. They realized that a holistic approach to developing, not just designing, the course was required to ensure the success of instruction (Lockee & Clark-Stallkamp, 2022). As such, the ID team advocated for systemic supports because, according to the literature, many engineering studies do not specify these types of supports or resources (Brinson, 2015; Chan et al., 2021; Reeves & Crippen, 2021). While the ID team did not have administrative power, they raised these issues in all upper-level meetings for the course design. Administration worked, and will continue to work, closely with the lead faculty at the institutions to understand the systemic issues and help support the course. Additionally, design decisions were made in conjunction with not only the course design but also the system surrounding the course (Lockee & Clark-Stallkamp, 2022). The ID team attempted to build online resources and wrap-around support where available from participating institutions.

Conclusion

A functional beta version of the course is now available for comprehensive pilot testing in Spring 2024, and as QISE and its myriad applications continue to advance in industry, the course will help fulfill the demand for QISE-aware and capable engineers. The course will provide access to cutting-edge QISE training for those seeking positions in the growing field, counteracting the “quantum bottleneck” (Kaur & Venegas-Gomez, 2022). More significantly, HBCUs and underrepresented students at those partner institutions rarely have access to cutting-edge and innovative educational environments, especially in rapidly advancing fields. As such, the development of a remote quantum laboratory focused on providing access to underrepresented learners functions as an exemplar for both future course development and partnership building. One of the most valuable outcomes of the beta version of the lab lies in its ability to function as a reproducible model for participating HBCUs. Institutions can work on many levels (i.e., administrative, teaching, and course development) to reproduce similar learning environments and systemic supports. For example, the current partners in this project will now collaborate to develop QISE lessons that fit all participating institutions’ curricula and offer niche areas of training for each. Moreover, lead faculty will replicate the granting process to obtain funding to build similar in-person hardware labs on HBCU campuses that act as localized hubs for larger enrolment in remote laboratories for QISE. A far-reaching goal is to develop an operationalized network of remote laboratories and support systems while expanding partnership member institutions.

Ultimately, this DbCS describes how the remote laboratory was developed, both instructionally and technically, over three years by an ID team with the HBCU partners using prototyping cycles of design, development, and formative feedback (Tripp & Bichelmeyer, 1990). Through feedback, the ID team iterated the design and development of the laboratory to meet partner and learner needs. The design and development process presented problems requiring designer decision-making, which yielded lessons on the design decisions made related to the following: chosen instructional design models and strategies, areas of technical design not heavily traversed in the literature, and learning transfer in simulated learning environments.