Online learning and learning tools that employ technology were a part of college science classes well before the COVID-19 pandemic (Almarzooq et al., 2020). Yet, web-enhanced tools like learning management system platforms, digital textbook companion sites, open educational resources, and social media have proliferated and are increasingly used to supplement interactions in face-to-face (F2F) courses. Much of the research about online learning compares online to F2F formats, considering the efficacy of the online environment in supporting measurable learning outcomes as evidenced by test scores, course retention, or student satisfaction (see Bernard et al., 2009; Sun & Chen, 2016). A growing body of literature also notes the importance of interaction (see also Kebritchi et al., 2017; Vonderwell & Zachariah, 2005). To that end, Dobbs et al. (2009) and Dutton and Dutton (2002) stress that, while many courses are available in fully online, web-enhanced, or F2F formats, students' enrollment patterns may be driven by considerations such as scheduling or transportation, rather than their learning preferences (see also O'Neill & Sai [2014], who noted that commuting times and logistics for F2F classes can be substantial). This is especially true for working or non-traditional-aged students.

In online classes, due to their particular constraints and affordances (Sjølie et al., 2022), the ways students behave, act, and interact with one another might differ from those in F2F environments. Since the early 2000s, the field has used the term "digital natives" (see Prensky, 2010) to describe new generations of students who regularly use technology for communication, exchanging ideas, and learning new things. Yet, research highlights that "digital natives" still need instructor support and scaffolding regarding the expectations and culture for interactive and productive learning in online courses. This support includes learning to navigate online participation norms, practices, and interactions (Cho & Cho, 2014; Herrenkohl et al., 2011).

In this descriptive analysis, we consider the experiences of college students who were averse to taking online science courses but, due to campus closures during the COVID-19 pandemic, had no other option than to enroll in a general education online introductory biology class. They resigned themselves to the asynchronous online format, despite a stated dislike for online learning and/or a fear that a science class would be particularly challenging in the online environment. We refer to these students as "involuntary online learners," and our analysis considers how they engaged with content, peers, technology, and instructors. Recognizing that, outside of a global pandemic, students may also find themselves involuntarily in online science courses due to personal circumstances or logistical reasons, we consider how this cohort may offer insight into other students' needs and experiences. We use thick description (Geertz, 1973) to explore this phenomenon within a single case example. The heuristic objectives, particularistic epistemology, and descriptive nature of a single case approach allow for greater attention to participant experiences within their learning context (Merriam, 1998). In this analysis, focused on the experiences of 12 students, we seek a deeper understanding of "why people responded as they did, the context in which they responded, and their deeper thoughts and behaviors that governed their responses" (Creswell, 2007, p. 40).

Conceptual Framework

Student engagement emerged as a field of study in the 1930s and has evolved from simplistic measures of time-on-task to more holistic views of learning that include behavioral, cognitive, and emotional dimensions (see Groccia, 2018; Hu & Li, 2017). Student engagement is associated with improved persistence and learning outcomes, but rather than measure student engagement, we were interested in describing the different mechanisms available to students and how students interacted with them. For this analysis, we looked to frameworks that describe the structures and infrastructures that support engagement in online learning environments.

Moore (1989) conceptualized distance learning engagement as a three-pronged construct: student engagement with content, student engagement with instructors, and student engagement with other learners (peers). Student-content interaction is defined as the “process of intellectually interact[ing] with content that results in changes in learner’s understanding, perspectives, and cognitive structure” (p. 2). Student-instructor interaction pertains to connections between students and their instructors. That is, instructors stimulate and maintain student interest in learning by tailoring supports to individual students’ needs, and concomitantly, students reach out to instructors for further enrichment or feedback. Student-peer interaction refers to the work “between one [student] and other learners, alone or in group settings, with or without the real-time presence of an instructor” (p. 3). Adding to Moore’s work, Hillman et al. (1994) incorporated the student-interface component, highlighting the additional interaction between learners and technologies used for instructional activities. Several decades later, we still find this framework to be relevant, and we used it to guide our analysis.

Science Engagement

The National Research Council (2012) describes an engaged student as one who has a sense of purpose for learning and feels motivated to utilize prior knowledge to understand scientific materials, tools, and practices. Science engagement is correlated with desirable learning habits, including motivation and curiosity (Hampden-Thompson & Bennett, 2013; Wu & Wu, 2020). It is also associated with improved learning and mastery of concepts (Grabau & Ma, 2017; Lin et al., 2013; Polman, 2012; Pugh et al., 2010; Sinatra et al., 2015), persistence in STEM degree programs, and entry into STEM careers (Watkins & Mazur, 2013). Science engagement has both individual and social dimensions (Kelly, 2013; Koballa & Glynn, 2013; Martin et al., 2021; Schmidt et al., 2018). In other words, it varies depending on students’ aspirations and dispositions (Mujtaba et al., 2018; Olitsky & Milne, 2012; Schmidt et al., 2018). However, though online learning is increasingly used to deliver science courses, most studies of science engagement are conducted in F2F learning contexts.

Engagement in Distance Learning

While distance learning literature has benefited from Moore's (1989) and Hillman et al.'s (1994) frameworks, recent empirical studies – especially ones informed by constructivist and social perspective approaches – have advanced and complicated their original conceptualizations. Prior to the COVID-19 pandemic, engagement in online learning had emerged as a distinct subfield in education research (Allen & Seaman, 2011; Meyer, 2014; Yang et al., 2018). As in the science engagement literature, engagement in distance learning is associated with positive learning outcomes, including satisfaction and retention (Chen et al., 2018; Cheng & Chau, 2016; Green et al., 2018; Hegeman, 2015; Jaggers & Xu, 2016; Sun & Chen, 2016).

Scholarship documents that much of the online learning experience is contextual, tied to course characteristics or instructional approaches (Vonderwell & Zachariah, 2005), and to student backgrounds and dispositions (Cheng & Chau, 2016; Paulsen & McCormick, 2020). However, as it relates to engagement, learning is maximized when students exhibit interest, self-efficacy, and self-regulation (Sun & Rueda, 2012), and when the course is characterized by a strong instructor presence and community of support (Chen et al., 2018; Ma et al., 2015; O'Shea et al., 2015). Responsively, today's online learning platforms and tools offer increased opportunities for learners to interact – with content (e.g., interactive learning modules), with instructors (e.g., real-time chats, video and voice recordings), and with other learners (e.g., digital whiteboards, real-time chat features, and asynchronous discussions). However, we wondered how this took shape when the students in the online environment were accustomed to – and preferred – F2F learning environments.

Setting and Context for Our Study

In the middle of the spring 2020 semester, at least 1300 US institutions canceled in-person classes or shifted to online-only instruction amidst the global pandemic (National Center for Education Statistics [NCES], 2021). Though college enrollments fell in the fall of 2020 and the fall of 2021 (Weissman, 2021), enrollments in distance learning increased 93% between 2019 and 2020 (Lederman, 2021). The University of Alaska Anchorage (UAA), where our data were collected, followed these national trends (UAA Office of Institutional Research, 2021). However, prior to the pandemic, fewer than a third of undergraduate credits at UAA were delivered using fully online or hybrid (i.e., a combination of F2F and online delivery) models.

In March of 2020, the main and community campuses were closed, and classes were shifted to online delivery. Soon after summer and fall registration opened, campus leadership determined all summer 2020 and nearly all fall 2020 classes would be delivered online. The decision that spring 2021 classes would be mostly online was made before spring registration opened. For undergraduate introductory biology for non-majors, the focal course for our analysis, F2F classes were unavailable during the academic year. Almost all students – with the exception of very early registrants for the summer and fall 2020 semesters – knew that the course would be delivered online when they made the decision to enroll.

UAA is a comprehensive, open-enrollment state institution offering certificates, 2-year, 4-year, and graduate degrees. Its main campus is situated in the metropolitan area of Anchorage, Alaska. Enrollment is just over 10,000 undergraduate students; 61% attend part-time and 51% identify as students of color (NCES, 2022). Twenty-nine percent of beginning students receive Pell grants, and 44% of students are under the age of 25 (NCES, 2022).

For our analysis, we were particularly interested in the experiences of students who were not majoring in biology or taking it as a prerequisite for their major. That is, most of these students take the course to satisfy a general education requirement. Fundamentals of Biology at UAA is a 4-credit experience with 3 credits of lecture and a 1-credit complementary lab. The lecture courses are taught by tenured or term faculty in the biology department, and the curriculum was designed to align to the concepts and best practice recommendations for inquiry-based learning as outlined in Vision and Change (Brewer & Smith, 2011). Prior to the pandemic, most sections of the lecture course were offered in F2F format with web-based tools accessible through the textbook website and BlackBoard LMS. During the pandemic, UAA faculty received support to transition their instructional content to online delivery, but this was not previously a common format for most biology faculty teaching at the introductory levels. During the period of analysis for this study, classes were delivered asynchronously via BlackBoard, and students also had opportunities for synchronous virtual meetings with instructors in Zoom-based office hours. Though each instructor organized their course differently, the online biology class generally included instructor-recorded lectures, quizzes, instructor-designed individual assignments, threaded discussions, and a group project.

Participant Context

We arrived at the term "involuntary online learners" to describe our participants. As this term is not described in the academic literature, we situate our analysis in our participants' stated preferences for F2F courses. The majority of our participants took the online biology class because it filled a general education requirement (GER). Several expressed their determination to finish their degrees online rather than wait for the pandemic-related building closures to lift. As one participant said, "I didn't want to delay it just because it's online. I need to get all my GERs out of the way so I can get to the next thing. … This is the only way to get it done this school year."

Several of our participants said they preferred F2F classes in general, and some had unfavorable experiences in previous online courses. One participant said, “I’ve tried to avoid online classes as much as possible until this last semester.” Other students, particularly those who took the class in the spring 2021 semester, indicated that they postponed enrolling in online math or science classes specifically. Instead, they enrolled in humanities or social science classes in the fall of 2020 with the hope that spring 2021 courses would be available F2F. Many students expressed concern and confusion around how a science course could be delivered online, recalling their high school science courses as “hands on” experiences with demonstration-style instructional guidance about how to perform experiments in science labs. A student said, “I was hoping everything would be [F2F] because online is so much harder, especially with these biology … labs and these science classes.”

With regard to choosing biology over other available sciences (i.e., chemistry, physics, or astronomy, which also satisfy GERs at UAA), most of our participants took biology because they felt it would be the "easiest" of the available options, noting that they "didn't want to jump into something that would be too hard." However, they were also "curious" about the subject of biology, saying that it was "important … since it has to do with life." We characterize our student participants as somewhat reluctant and nervous upon enrolling in the course. They were extrinsically motivated to take the class online to fill graduation requirements, but despite their trepidation over the course format, they were moderately interested in the subject of biology.

Recruitment

Near the end of the fall 2020 and spring 2021 semesters, we obtained a list of all students enrolled in Fundamentals of Biology. We removed all students under the age of 18 (as they could not independently consent to participate in research) and sent email invitations to 122 and 137 students, respectively, who were enrolled in 6 different sections of online biology. Students who did not respond to email invitations within 2 days received one follow-up phone call invitation, and students who agreed to participate received a reminder email or text. Ultimately, 18 signed up for the study, but upon receiving the reminder notification, six indicated their schedules had changed and they could no longer participate, resulting in 12 students who represent approximately 5% of the total enrollment. Table 1 depicts demographic information for the class and the study participants.

Table 1 Class and participant features

Informal instructor surveys administered in some of the target classes indicated that approximately two thirds of enrolled students would have preferred an F2F class; thus, we estimate that our participants represent approximately 8–10% of the enrolled students who would have preferred an F2F class. It is important to note that our approach is qualitative and exploratory; that is, rather than aiming to meet a threshold for representativeness, at this stage we are working to develop an understanding of a phenomenon that has not yet been well described in the literature. The research served as an opportunity to engage with a rich corpus of data and to describe students' experiences within a theoretical framework of engagement.

Data Collection

We opted for focus groups because, fundamentally, our research questions were exploratory and focused on students' perceptions, thoughts, and feelings about their learning experience; focus groups allowed for "deep and detailed discussion" (Stewart, 2018, p. 689) through which we could understand how students constructed meaning (Liamputtong, 2011). In particular, we were interested in how participants responded to one another and how they co-constructed explanations of their course experience (Ivanoff & Hultberg, 2006) through dialogue. In addition, though the interviewers are not affiliated with the biology department at UAA, we are faculty and staff members in relative positions of authority, and we hoped the group structure would help participants feel more comfortable, as focus groups are a method used to mitigate power imbalances between researchers and participants (Liamputtong, 2011).

We conducted four focus groups over Zoom, held after final grades had been submitted; they lasted an average of 85 min, with an average of three students in each. We adapted Puchta and Potter's (2004) focus group techniques to the online platform. Each interview included brief introductions, followed by a facilitated discussion based on an interview guide (Liamputtong, 2011) covering five topics: students' expectations for the course, course format and delivery, course assessment and assignments, students' overall evaluation of the course, and key takeaways.

Data Analysis

Our exploratory analysis took place in five distinct phases (see Fig. 1). First, transcribed interviews were culled for significant statements (Riemen, 1986). Specifically, we identified all statements that aligned to the research question, covering different aspects of students' expectations, preferences, and experiences in the course. Once the corpus of relevant data was identified, in the second round of analysis we followed Hillman et al.'s (1994) and Moore's (1989) online engagement frameworks to assign provisional codes to each statement (Saldaña, 2015). That is, we labeled each statement to indicate whether students were describing their experiences with their instructor, other learners, content, or technology.

Fig. 1 Our five-step coding process. Note. Our five-step coding process allowed us to take relevant data from the raw interview transcripts, code it to categories aligned to our theoretical framework, and describe the student experience in themes

The initial provisional coding identified a significant overlap of codes. As Moore (1989) noted, the instructor can have a prominent role in engaging a student with content and other learners. As such, with each unique student interaction as the unit of analysis, it was not always possible for us to code learner-instructor categories that were exclusive of other forms of engagement. Students rarely discussed engaging with just one category at a time. For example, they may have engaged with another learner as they discussed the content or worked with the instructor to navigate the technology. Thus, our preliminary provisional categories were expanded in a third round of coding to include intersections or joint activities.

Third-round codes included the categories of instructor, instructor-technology, instructor-learner, instructor-content, content, content-technology, content-learner, technology, technology-learner, and learner. We realized that these interactions were too complex to support an in-depth analysis of all of the interrelationships. However, the instructor role featured prominently in the codes (appearing in nearly half of the significant statements across all focus groups), so we focused our analysis on students' perceptions of the instructor role. Thus, in our fourth round of analysis, we used open coding to assign meaning units (Kvale & Brinkmann, 2009) to each significant statement that included the instructor. Figure 2 depicts the process of distilling significant statements into meaning units and themes for the instructor-content category.

Fig. 2 Themes developed from raw data. Note. This figure includes some example statements from the instructor-content category and depicts how we organized them in our analysis. Raw data (significant statements) were coded into meaning units that captured their fundamental connotation. These meaning units were then grouped into themes that captured the essence of the students' statements in each category of our framework

To ensure reliability of codes (Cho, 2008), at the round 3 stage, Dayna Jean DeFeo and Leah Mason coded one transcript together in real time using the screenshare feature of Zoom and developed a codebook with operational definitions for categories and meaning units. They each independently coded a second transcript and then compared codes, achieving 82% agreement on the first pass (see Roaché, 2017, for an overview of percent agreement in qualitative coding). They negotiated areas of discrepancy and further refined the codebook (see MacQueen et al., 1998) until they achieved agreement on all codes. Leah Mason then coded all remaining transcripts using that codebook, refining it continually and consulting with Dayna Jean DeFeo when codes could not be readily applied. When all significant statements had been assigned meaning units, Dayna Jean DeFeo conducted the fifth round of coding, sorting the codes in each category using the constant comparative method (Glaser & Strauss, 1967) iteratively and in discussion with Leah Mason, Trang Tran, and Sarah Gerken until they were distilled into themes.
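To make the intercoder agreement figure reported above concrete, the sketch below illustrates one common way to compute simple percent agreement between two coders. It is a minimal, hypothetical example: the statements, labels, and counts are invented for illustration and do not reflect the authors' actual codebook or data.

```python
# Illustrative sketch only (not the authors' procedure): simple percent agreement
# between two coders who labeled the same set of significant statements.

def percent_agreement(codes_a, codes_b):
    """Share of coded units to which both coders assigned the same label."""
    assert len(codes_a) == len(codes_b), "coders must label the same units"
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100 * matches / len(codes_a)

# Hypothetical labels assigned by two coders to four statements
coder_1 = ["instructor-content", "instructor-technology", "learner", "instructor-content"]
coder_2 = ["instructor-content", "instructor-learner", "learner", "instructor-content"]

print(f"Percent agreement: {percent_agreement(coder_1, coder_2):.0f}%")  # -> 75%
```

Percent agreement of this kind is a simple reliability check; as Roaché (2017) notes, it does not correct for chance agreement, which is one reason the authors' negotiated codebook refinement matters.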

Findings

Our participants regarded the instructor as the “hub” of their engagement with content, technology, and peers. Though Moore’s (1989) and Hillman et al.’s (1994) frameworks position the instructor as a facilitator of student engagement, our participants positioned the instructor as more of a director of their learning experience.

Student-Instructor

We first explore how students characterized their experiences with the instructor directly, outside of exchanges that involved content, technology, or peers. Our participants reported wanting to have personal connections with their instructors. As Owen said, “I think it is nice to know more about our own professor, and be able to actually … talk to her, even though it is in the Zoom.” At the same time, our participants did not think this kind of connection was likely or possible in an online course, “because it’s online, it’s kind of hard to feel [supported].” Carla said, “with online classes, I feel like it kind of … dehumanizes people.”

Though classes were delivered asynchronously, Diane recalled that she and her peers were encouraged to go to their instructors’ virtual office hours. She said, “[My instructor] always made it abundantly clear if we needed him for anything. Every single video lecture at the end, he said, ‘Remember, this is my office hours. If you have any questions, reach out to me.’” Yet, most participants felt that email was the most appropriate way to communicate. Catherine said, “I figured email will be the best because all the professors were really good at sending out announcements through emails.”

While crediting the convenience of emails, our participants simultaneously said that they actively limited their email interactions. Across multiple focus groups, our participants used the word “burden” to characterize email exchanges with their instructors. Owen said, “most of my work is the daytime, so I think once I finished work it’s kind of late, so I feel bad to actually send her an email at night. … I feel like I might be burdening her.” Specifically referencing email, Carla said, “Our professors, it feels like you’re burdening them when you email them, especially since [we] don’t have that relationship with them.” Another student, Dunia, said,

With this online learning, it kind of made it so you didn’t want to email the teachers. … Because you don’t have that relationship built. They don’t get to see you. It’s [about] something very human. I feel like if you don’t get to see someone, you don’t have that relationship, that connection.

Our participants reported that, outside of a relationship with their professor, they found it difficult to initiate email communication. As a result, they conceived of email exchanges with instructors as task-oriented and transactional, focusing on completing assignments to earn course credit. For example, Kelsi said she would email only when she had "a specific question about something." Catherine would email when she "couldn't find an assignment," and Andrea would email if she "accidentally turned an assignment in late. … That's pretty much it." Dunia said, "There were moments where it was kind of nerve wracking to even want to email them because you weren't quite sure if they were going to give you what you needed anyways [emphasis added]." Owen said, "I feel like it would be more fun if you actually can communicate with the professor rather than just 'This is homework. This is what you want to watch. And I want you to do this.'"

While dissatisfied with the curt and impersonal nature of email, our participants generally characterized their instructors as responsive. Andrea said that her instructor, “would always get back to me within like the next day or the next couple days.” Furthermore, our participants were not critical of their instructors. Melinda perceived that interactions through “email and … BlackBoard and whatnot, made [her instructor] seem very welcoming and totally there to help.” Kelsi said she felt cared for when her instructor “checked up” on her, saying that her instructor,

realized that we were real people. … All the emails that she would send out and she would give us extra time for things, when she would get overwhelmed with stuff, [as she] recognized that [we’re] probably overwhelmed too, and would push back the deadline, sometimes. And … to give some more time to breathe and work through stuff. … Just very sweet and kind, and … flexible and accommodating.

This distinction is important: while students were quick to note that their instructors were caring and responsive, they differentiated this attribute from having a relationship with the instructor.

Our participants had empathy for their instructors and what they perceived to be a sub-optimal learning situation marked by stresses associated with social distancing and the global pandemic. Catherine said, “I know that everybody’s going through a huge change. … The professor’s trying. The students are trying. We’re kind of pretty much in the same boat, and just have to be respectful and mindful of everybody’s situation.” Dunia said that the pandemic circumstances were, “hard for everyone,” and, “we were all having issues.” John noted that just as he was working hard and feeling isolated, his instructor probably was too. He perceived that instructors also relied on student engagement, noting that in F2F classes professors could interact with students “knowing that all the students are going to be there and it would validate their time a lot more.”

In their comments about how they had engaged and wanted to engage with their instructors, our participants drew from their prior experiences in more traditional settings. While participants recognized online learning as a "two-way street," at once social and intellectual, they recounted numerous examples in the online class where they did not feel they could achieve the level of interpersonal interaction with instructors that they had hoped for. As we discuss in our next sets of findings, students' experiences were characterized by a sense of struggle to interact optimally with their instructors, and by expectations that their instructors would help them engage with content, technology, and their peers (see Fig. 3).

Fig. 3 Involuntary online learners' engagement practices. Note. Moore's and Hillman et al.'s online learning engagement frameworks center the student, who is meant to interact with instructor, content, technology, and peers in an inclusive learning environment. When our participants described their processes of engaging with the course, they centered the instructor as the "hub" of their interactions

Student-Instructor-Content

After the learner-instructor category, the majority of instructor codes overlapped with content. Our participants expressed that they wanted their instructors to curate materials, to monitor their comprehension and progress, and to answer questions.

Curate Materials

Our participants said they wanted their instructors to make the biology content relatable and relevant. Catherine described that before the COVID-19 pandemic, most of her learning was from instructors’ lectures. She said, “if the teacher could kind of just use their own resources to teach, that might be a better option.” In this vein, some students wanted their instructors to develop their own materials, such as lectures where instructors “actually record themselves.” Other students felt that even if their instructors did not develop their own materials, they should curate helpful resources, like high-quality YouTube videos, lamenting that “we have to research, and look for the information [ourselves]” or “we actually have to go on YouTube and look it up.” Even among students who expected their instructor to rely primarily on a textbook, there was a strong preference for the instructor to curate other resources and draw from their own expertise to supplement what they were reading. Carla said, “There was so much information. But the way that he presented it made it so much easier to understand. … He had really added to the information that we had in our textbook.”

Our participants agreed that when instructors presented and interpreted the material, the content seemed more manageable, personable, and understandable. Part of this preference may have come from instructors’ enthusiasm for the subject, which our participants found to be palpable, even online. Catherine said that the course changed the way that she felt about science, which she used to think was “boring.” She said that her instructor,

always showed a lot of passion for [science] … Obviously, because their careers are in the sciences, so they show a lot of passion for it … It’s kind of nice to see that. … Even though that’s not my career path, or what I’m going towards in life it’s always nice to see where the professor is … excited about it. Where it’s like ‘Hey, this is what we teach, and this is why we love it.’

Monitor Comprehension and Progress

Beyond presenting the material, our participants articulated a desire for instructors to build mechanisms into the course that would monitor students’ progress and comprehension around the content, including correcting misunderstandings. Kelsi said that in a F2F environment, “the teacher can see when we’re confused or if we don’t get something. It’s easy for her to kind of read if we’re understanding the material and whatnot.” Our participants said that structured assignments and interim assessments in the course helped them to stay on track with learning content, as opposed to “let[ting] us just coast until mid-terms and then coast until finals.” They were concerned that a lack of frequent convening left them vulnerable to misunderstanding. Tani said, “I sometimes don’t know where I’m struggling and I need my professor to point that out to me.” These comments illustrate how our participants framed their own engagement with content as a function of the instructors’ evaluations, rather than a self-directed or co-constructed process of planning, monitoring, and assessing knowledge and understanding.

Answer Questions

Our participants placed high value on their instructors answering their questions about the content, stressing the benefit of immediacy and real-time communication. In contrast to the desire to have their comprehension and progress monitored (without students asking for it), the desire for direct question-answering included the expectation that instructors would explain content directly, walk them through content, or help them find the correct answers.

When commenting about her experience with the discussion board, designed for students to ask questions and respond to one another, Melinda voiced trepidation around using such a forum, saying,

[The instructor] had a discussion board out but … I’m not one that wants [that approach]. Because what if it’s a stupid question? You know? That normal student fear that every student has, you know? … So I want to contact her personally, and see if she could help me.

This aversion and approach positioned the instructor as the conduit for content. Paradoxically, it limited Melinda’s effective use of the technology (the discussion boards) as well as her opportunities to engage with her peers. Furthermore, while students regarded email to be their primary means of communication, they found it challenging to strike up rich conversations, ask complex questions, and elicit instructor feedback using this tool.

Overwhelmingly, when it came to content-related questions, our participants stated a preference for real-time communication. They not only liked the immediacy of real-time responses but also perceived these interactions to be of higher quality. John said, "If I can have a conversation with somebody, it's much easier for me to understand what they're trying to say. … It's easier just to call her. See them face to face" [emphasis added]. When prompted to talk more about productive ways to learn content, our participants described question-and-answer exchanges, with students posing questions and the instructor providing definitive explanations. Melinda, for example, said of F2F exchanges, "I do like having explanations on things, and being able to be there in that moment, while it's being explained, to ask questions. Have that time to ask questions, for [my instructor] to answer my question [emphasis added]." Other participants said their desire to ask questions in an F2F class was to ascertain what the instructor wanted them to know. Here, we noted that the language our participants used to describe their relationship to the content focused on remembering, understanding, and applying rather than analyzing, evaluating, or creating new knowledge. When reflecting on their ideal learning scenario, their comments reflected limited autonomy and agency in their own engagement and learning.

Student-Instructor-Technology

Our participants also said they needed their instructors to help them navigate technological tools and platforms. Interestingly, though students had access to additional resources (YouTube videos, other students, Google searches, UAA's IT support, and textbook publishers'/software applications' tech support), they did not mention using these, and instead deferred questions to their instructors. Participants identified challenges with initial setup of the technology, and they also wanted their instructors to both ensure that it functioned properly and troubleshoot glitches. These expectations preceded their use of embedded technology to learn content and communicate with faculty.

Set Up and Ensure Functionality

The students were using new platforms (like publisher-based websites) and new applications (like VoiceThread, a tool similar to a discussion board that incorporates multimedia forms of expression) that would eventually allow them to share and engage with multimedia content. However, there was a learning curve en route to optimally using these tools. Dunia suggested that her instructor should regularly,

Check in and make sure that our platforms we were using were working properly. … And having them help us figure it out. Maybe even creating a video on how to use certain things or answering common questions would help maybe a little bit.

She elaborated, “This is something that we’re all struggling with. We’re all trying to figure out, maneuver through. … Maybe they don’t have all the answers, but no one really checked in to make sure that this was working.” Carla, who was in a different online section, appreciated that her instructor provided this help by making weekly videos with “tours of BlackBoard” and other tech applications.

While Dunia and Carla wanted their instructors to proactively reach out and monitor that the tech was working, John was quick to seek help when he thought he needed it. He preferred more personalized tech support and recalled multiple sessions with his instructor to figure out how to use the applications. He said,

I had to call or video conference with her during the office hours [to set up and learn to use] several of the [applications]. I couldn’t figure out how to use the media. It didn’t make any sense. Like, I couldn’t figure out on my own. It didn’t come with an instruction book. It just had the assumption that you knew where to go to put things. … She was more than helpful … but [the application] was not self-explanatory at all.

Like John, other participants found their instructors to be accessible when they asked for help with technology applications. Diane said, “I only had the need to do that, like two or three times, but anytime I reached out to him via email he is – BOOM – right there.” In these stories, students described their instructors as providers of just-in-time or on-call tech support.

Troubleshoot Glitches

Though setting up the technology may have created some connections to instructors, our participants spoke most frequently about troubleshooting tech applications and their need for instructors to provide ongoing tech support. Some of the tech problems were fairly simple system glitches; for example, Diane alerted the instructor when the publisher-based quiz did not properly record her score:

I had a computer glitch. And I know for a fact I answered every question, except maybe one because I might have run out of time. … But [I emailed my instructor and said that] technology fails, and if we’re going to be doing this all based on technology, there has to be a [workaround when technology fails]. And he was awesome. He’s like, “I reset it. Go take it again.”

While some technology issues could be quickly resolved, when the participants saw the instructors as their only source of support, this also had the potential to become a source of tension. For example, when Dunia encountered a tech problem and had questions, she found it "a little annoying" that she needed to send "a message [to her professor] saying, 'Hey, I can't upload this. I've tried multiple different files, so I'm going to email it to you. Maybe you could give me an answer for that.'" When her instructor was unable to help her troubleshoot, she said, "I was like, 'Okay. Well, I don't know how to do this then.'"

Our data demonstrate a tension around the instructor-technology relationship. On the one hand, students did not want to “burden” their instructors with too many emails, especially outside of a personal relationship with that instructor. Yet, they still positioned their instructors as gatekeepers in their ability to use technology. We also note that “troubleshooting” interactions, while important to students, are a long way from engaging optimally with the technology available in online courses, or with their instructor.

Student-Instructor-Peers

In all of our data, we had the fewest codes in the peers category. Many of our participants regarded online learning as a solitary experience, one they characterized as being “on my own.” The available codes in our data indicated that our participants’ connections with their peers were frequently mediated or initiated by the instructor through assigned group work. In addition, participants expressed a desire for their instructors to facilitate small group connections.

Assign Group Work

Shari identified positive engagement with other learners, spurred by a project that the instructor required. Shari said that her instructor,

Assigned us a project where we had to have a partner from class, which was really nice … to have that feeling of working together with someone and kind of have more interaction and able to bounce ideas off of each other and share information that the other one may not have learned.

Coordinate Small Group Meetings

Participants indicated that they preferred the instructor to take a more hands-on role in coordinating informal peer interactions, one that mimicked pair and group work in an F2F class. For example, Kelsi said that in F2F classes, "it can be easier to just have that kind of interaction with the teacher and with the students. And it can be easier to just facilitate relationships in groups." In online classes where they did not occupy the same physical space as their classmates, our participants perceived that they needed instructors to facilitate peer connections. Like Kelsi, Catherine also appreciated the intimacy of a small group and felt that this could be best coordinated by the instructor. Commenting on instructor-led Zoom sessions where students could drop in outside of class time, Catherine suggested,

Maybe in small groups, maybe, no more than five? ’Cause I feel like – let’s say we were picturing this during the pandemic – having 10–20 people on a Zoom could kind of get a lot when people are trying to ask different questions at the same time. So, maybe [the instructor could] minimize it to about five people per tutor session.

Catherine also indicated that she often felt alone in her online learning and deferred to the instructor to facilitate informal check-ins. She suggested,

I feel like, … with COVID and people not being interactive, it might be a bit harder on some students to … interact with their classmates … so we could maybe have like a … weekly [session for students to connect with their peers] and be like “Hey, just check in … how are you guys feeling?”

Student-peer exchanges are a valuable form of engagement. Although our participants spoke infrequently about experiencing these interactions, they did indicate that they appreciated the interactions they did have in the online environment. However, while they could identify strategies and schedules for what would benefit their learning, they suggested that these should be organized by the instructor. As with other forms of engagement, our participants centered the instructor as the convener of their engagement – even with peers.

Discussion

As we explored student perspectives on remote learning, participants frequently compared their online class to prior F2F learning experiences. We refrain from comparisons that position F2F as the curricular ideal or standard. Not only do our data not accommodate this comparison, but many F2F learning contexts struggle to engage students as well. Instead, we present the online experience as it was experienced by our participants, noting that different learning environments and student needs present unique challenges, and no model is perfect.

Our work responds to the call for research centering the voices of undergraduate students taking online classes (O’Shea et al., 2015; Wiggins et al., 2017), especially in the context of “emergency remote learning” (see Khlaif et al., 2021) during the COVID-19 pandemic (Anderson, 2020; Neuwirth et al., 2021). Our participants preferred real-time interactions and F2F classes – and in most cases were quite hesitant to take online classes, particularly in STEM disciplines. We explore this reluctant or involuntary relationship and discuss ways that students who resist online learning are positioned and position themselves in the remote learning environment.

Because the students centered their instructors as directors of their learning, they largely evaluated their learning experiences based on the perceived quality of learner–instructor interactions. Our participants positively characterized their instructors' pedagogical choices, caring attitudes, and timely communication. Most expressed that the instructor interaction they experienced in their online class was "beyond expectation." Though they had low expectations of their instructors prior to the start of the semester – drawing from their previous experiences with online learning (see also Glazier & Harris, 2021; O'Shea et al., 2015) – our participants were pleasantly surprised and pleased with instructor responsiveness. However, we note that satisfaction is not synonymous with engagement, and even as they placed many expectations on their instructors, they expressly noted a lack of relationship with them. In our discussion, we connect our findings from the COVID-19 pandemic to the broader literature about learning in uncertain and disruptive contexts, designing for autonomy and agency in online engagement, and identifying opportunities to optimize learning and engagement through technology.

Student Engagement in Uncertain and Unfamiliar Learning Contexts

Our participants described a strong preference for meaningful interactions, yet they largely positioned themselves as passive recipients in learning and neglected to consider their own authority in engendering engagement. Research particular to the COVID-19 global pandemic documented instances in which students found ways to use technology that fostered deeper engagement with algebraic learning, thereby developing agency and transforming the learning process (Decker-Woodrow et al., 2023). Dissimilarly, our participants expressed a preference for more didactic or teacher-centered practices (Perets et al., 2020; Wurdinger & Allison, 2017). Our findings are much more aligned with prior research that documented student preference for certainty and stability when learning environments become unpredictable (Conrad, 2010). Specifically, our participants wanted instructors to present content in a clear manner so students could follow along; to provide clear instructions to show students what they would be expected to do and would be graded on; to curate more platforms to accommodate students' schedules and preferences; and to provide more out-of-class opportunities that would enrich their learning.

This desire for structure and predictability contrasted sharply with the uncharted context in which students were living in the midst of the global pandemic. Across disciplines, prior research has documented that a desire for control and predictability often emerges when environments are characterized by uncertainty (see Stamatis et al., 2023; Anderson et al., 2021; Ezarik, 2021; Martinez & Broemmel, 2021; Saxena & Khamis, 2021). Our study aligns with and adds to this growing literature, and our data suggest that, while the possibilities for critical reflection and student agency in response to uncertainty may be encouraging for some learners (see Decker-Woodrow et al., 2023), involuntary online learners may have different inclinations, and thus need different supports.

Social Expectations for Student Engagement in Online Contexts

Across all of our categories, the student experience was characterized by a desire for meaningful interaction, but students were uncertain about how to initiate or sustain that engagement with instructors – or with peers – in an online environment. Thus, we consulted the literature that explores social and cultural aspects of learning to interpret our participants' experiences (see Lee et al., 2020). The online environment requires students and faculty to adjust existing conceptualizations and expectations for engagement that they have formed from in-person learning experiences (Anderson, 2020; Damary et al., 2017). This means that instructors must use different pedagogical strategies that engender student autonomy and engagement in the online context (Benedict-Chambers, 2016; Bond & Bedenlier, 2019; Stone, 2017). It also requires different skills from students, including navigating cultural practices, norms, and values that are particular to online learning environments (Anderson, 2020; Lee et al., 2020; Nasir et al., 2020).

Our data suggest that involuntary online learners’ conceptualizations of how to engage in an online environment may be underdeveloped, and some of their reliance on instructors may be attributable to a lack of know-how (see also O’Shea et al., 2015; Richardson & Newby, 2006). In open-enrollment institutions like UAA, learners come in with different levels of familiarity with practices and discourses related to self-directed learning (Cornelius et al., 2013). Specific to STEM learning environments and particularly for freshmen who may still be developing familiarity with college expectations, our findings align to recommendations that self-direction be explicitly discussed at the beginning of a course and reinforced throughout (DeFeo et al., 2021). This established need seems to be especially important when students are new to online learning (see also Prince et al., 2020; Xerri et al., 2018).

Optimizing Engagement with Technology

When starting a new introductory-level course, students encounter many unfamiliar elements: new instructors, new peers, new content, and probably new ways of learning. When beginning our analysis, we had expected students' familiarity and comfort with technology to be the most stable relationship in the online learning environment. However, though BlackBoard and Zoom were both used institution-wide by the time we collected our data, our students did not seem to immediately feel comfortable or familiar with these technological applications. Our findings align with emerging literature that identifies how technology can scaffold student-to-student interactions and facilitate high-quality learning experiences (Bickle & Rucker, 2018; Elumalai et al., 2021), but these opportunities can be lost or overshadowed when students experience technical problems (Dhawan, 2020; Hagedorn et al., 2022). Even before the COVID-19 pandemic, Henderson et al. (2017) noted the contradiction between the potential for digital technology to support and transform learning and the realities experienced by undergraduate students. Considering the growing interest in technology-enhanced learning (see Henderson et al., 2017), our study adds nuance to the picture of a student population that is popularly regarded as "digital natives" (Prensky, 2010). Our findings suggest that, even for students who are tech savvy outside of the classroom, their ability to engage with technology may require as much nurturing as their connections to instructors, content, and peers.

Implications and Recommendations

Overall, we are struck by the volume of expectations that students had for their instructors and by how these expectations seem to be in tension with one another. Aligned with a general theme in distance education literature, our participants wanted to engage with the course, and with their instructors in particular (Martin & Bolliger, 2018; Tanis, 2020). However, we are concerned about missed opportunities for meaningful engagement and learning when students defer so much to their instructors, and sobered by the wide range of tasks that instructors are asked to navigate and address that are not always recognized in their workloads (see also Kulikowski et al., 2022; Stone, 2017).

While our participants centered the instructor in their learning experience, we want to center students in our recommendations, with considerations around how to promote student agency. Drawing from James and Pollard's (2011) conceptualization of effective pedagogies, we laud efforts to promote student engagement and self-directed learning, but students will need to be introduced to and prepared for these expectations (see Perets et al., 2020; Tabak & Kyza, 2018). Specifically, expectations for self-directed learning tend to push students out of their comfort zones and, as such, may be met with student resistance (DeFeo et al., 2017, 2021). Thus, faculty will also need to be supported as they hold students to new and unfamiliar ways of engaging with learning.

More broadly, we note that student expectations and behaviors in learning environments are conditioned (Pope, 2008). As online learning becomes increasingly common – either for entire courses or as supplemental learning to complement F2F experiences – students will need to learn the culture of online learning exchanges and develop the competencies to engage meaningfully in online learning environments. This is also an opportunity for education systems working with youth prior to college enrollment.

Empirically, we note that prior scholarship on online learning engagement in the cognitive and information science fields has been highly quantitative in nature. This scholarship has focused on observed behaviors in online interactions (e.g., the number of posts in a forum or time spent actively clicking in the learning management system). Our descriptive study provides qualitative data as a complement to these quantitative findings. Our participants certainly engaged in many interactions with their instructors; however, these exchanges did not constitute meaningful or productive engagement. Our data suggest that future analyses should attend to the nature and/or quality of these interactions, in addition to their quantity, across a variety of learning contexts.

Limitations and Opportunities for Further Research

First, though the COVID-19 pandemic school closures created an opportunity for us to study involuntary online learners, most academic professionals and students were unprepared for a rapid pedagogical pivot. The immediate response to the COVID-19 pandemic involved "making do" with "what's possible" as UAA and institutions across the USA shifted F2F classes to online platforms in March of 2020 with virtually no warning (see Anderson et al., 2021). As such, we did not use data from spring 2020 and instead conducted our focus groups in fall 2020 and spring 2021, when both students and instructors knew months in advance that they would be teaching and learning online. Student participants in our study and their instructors universally agreed that pandemic teaching and learning circumstances were suboptimal, no matter how skilled instructors were in online pedagogies. While we offer an extreme case example (Mills et al., 2010), there is an opportunity to explore more typical involuntarily online contexts.

Our data represent a single university, with students who were resistant to online learning, and particularly averse to online STEM courses. We were not able to complement our interview data with classroom observations or artifact analysis, which limited our opportunities to capture alternative forms of engagement. Since engagement and learning are iterative and nonlinear, students may have engaged with their content or peers (or even their instructors) in other informal ways that our method did not capture. Additional work that attends to other subjects and learning contexts is warranted to further probe the transferability of our findings.

We also note limitations around our analytical framework. Though theoretical frameworks can illuminate new perspectives, they concomitantly conceal alternative interpretations (Fowler, 2006). While our choice of framework allowed us to look at roles and structures, it did not specifically attend to concepts and patterns that illuminate "agentic learning," such as agency, collaboration, and ingenuity.

A final limitation of our study lies in our participant identification strategy. We invited participation from students who “would have preferred to take their class F2F.” While we are pleased with our own coined term, involuntary online learners, its definition is underdeveloped. If the concept of involuntary online learners holds, there is an opportunity to further explore and better operationalize this phenomenon in future work, such as exploring how prior experiences with online courses and specific course attributes (e.g., subject, course length, instructor characteristics, and application of technological tools) shape their experiences and perceptions regarding online learning.

Conclusion

As online and hybrid learning models become more ubiquitous, as technology continues to advance, and as new generations of students engage in digital learning environments, our analysis suggests that even among students who value the access provided by online learning, enrollments may not reflect a preference or readiness for this learning modality. As contexts continue to change, instructors, technologists, curriculum developers, and researchers will need to keep pace to ensure that online environments are personally, intellectually, and socially fulfilling both for learners who prefer F2F learning environments and those who prefer to take classes online.