Introduction

The emergence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) in late 2019 transformed our world. Neither individuals nor institutions, including those of higher education, were immune. By examining institutional cases across Asia, Australia, Europe, and North America, we assess how teaching and learning were rapidly refashioned in the face of COVID-19-induced public health edicts. We focus primarily on experiences from February to May 2020, as reported by faculty and students.

The breakthrough contributions of biomedical science to COVID-19 vaccines and treatments, which were built on decades of fundamental university research, cast higher education in a powerful light. Wisdom’s workshops — James Axtell’s (2016) poignant phrase for modern universities — served humanity well. Higher education was a key contributor to the biomedical innovations integral to COVID-19 interventions.

COVID-19 also generated a set of non-medical impacts through public health measures. Precipitated by the pandemic, two step-change events transformed teaching and learning. First, limitations on the size of social gatherings curtailed face-to-face teaching and learning. “Teaching with technology” was the common response. Second, public health restrictions mandated a “retreat to the household,” with sheltering-in-place lockdowns suddenly reshaping the learning environments of students.

These two phrases — “teaching with technology” and a “retreat to the household” — capture two key features of “emergency remote teaching,” a phrase often used within higher education to describe the COVID-19 pivot (Hodges et al. 2020). The household became the “remote” location of study, with technology as the means of instructional delivery. Early descriptions characterized the COVID-19 pivot as a shift to “online learning.” It was, however, quickly apparent that few faculty were able to transition instruction to anything remotely akin to robust online teaching and learning (Dhawan 2020). Most higher education institutions had little pre-existing capacity in fully online instruction, even at the level of the individual course (see, e.g., U-Multirank.org for one overview of online instruction capacity; also Qayyum & Zawacki-Richter, 2018). Beyond the challenges stemming from a lack of institutional capacity, the knowledge that teaching and learning were proceeding under the threat of an unseen, freely circulating virus was hardly conducive to the cognitive enrichment of students. We explore how, in the eyes of students and faculty, remote instruction was undertaken and especially how it impacted teaching and learning.

Perspectives on the COVID-19-induced pivot vary, depending upon whom you ask (e.g., faculty, students) and what you ask. Crawford et al. (2020) provide an early audit of what universities, as organizational units, attempted. Others have examined how students responded, often with a focus on well-being (Tasso et al., 2021), with a general overview of country-level pandemic responses (Bozkurt et al. 2020 report on 31 case studies), or with reactions from academic staff (Watermeyer et al. 2020 for the UK). We focus on two main groups — faculty and students — and explore what transpired in individual courses of instruction from the vantage point of both the teacher and the learner. We show that prior proficiency with technology was important for both faculty and students, but with less impact on course transitions than we anticipated. For students, it was a range of immediate challenges within their households that had the greatest impact on their learning confidence, although some evidence of a digital divide among students was apparent. The level of support faculty provided to students was also significant in promoting greater confidence among students.

Contributions and research focus

Our contributions come from asking three broad, interrelated research questions. We start with the recognition that the transition to emergency remote instruction, while sudden and unexpected, was nevertheless accomplished in the context of an increasingly digitally aware higher education system. Educational technology, from basic learning management systems to adaptive e-learning platforms, has flourished in recent years, at least in terms of its availability (see Brown et al. 2020; Kukulska-Hulme et al. 2021). But, as Selwyn (2014) cautions, a noticeable gap persists between the celebratory rhetoric surrounding learning technology and its adoption. This “digital disconnect,” as he calls it, continues. Liu et al.’s (2020) recent review of learning technology in higher education notes that while its presence has become “widespread,” it remains “underused” (see also Ali, 2020; and Shelton, 2017 on reasons for abandoning learning technology).

The extent of a “digital disconnect” prior to the pandemic could have hindered the sudden but necessary conversion to remote teaching and learning. We understand the “digital disconnect” as a gap between the availability and the adoption of learning technology. In contrast, the “digital divide” highlights the distribution of adopters and non-adopters, implying that access to learning technology follows a patterned or structured “hierarchy of access” (Cotton & Jelenewicz, 2006: 497). While clearly related, the disconnect focuses on the gap between presence and use, while the divide highlights how adoption levels are distributed among users. Variation in adoption rates has been linked to socially patterned inequalities in society, highlighting the potentially uneven division of e-learning experience and proficiency (Helsper, 2021).

Using the idea of the “digital disconnect,” we explore issues of prior experience. We ask, more specifically, how many students and faculty instructors had made use of learning technology before the pandemic. A greater disconnect would make the rapid reframing of courses more challenging. We then investigate, again for both groups, whether or not this prior connection with digital technology made any difference — did it have any impact on their teaching and learning? The concept of the “digital divide” implies a slightly different focus. Here we attend to how both the adoption of, and proficiency with, e-learning approaches might have been structured, divided, or socially organized (e.g., by gender or discipline). That is, we explore how digital experience and expertise were distributed among users, both teachers and learners, to see if any specific groups were more or less advantaged by the processes of “teaching with technology.”

Our second question, where we focus on students, asks what “the retreat to the household” entailed in relation to learning. Among a widespread set of public policy responses enacted to curtail the spread of SARS-CoV-2 (see the Oxford Policy Tracker), strategies of containment and closure were central. Restrictions on social gatherings (e.g., closing schools, workplaces, restaurants), travel constraints (e.g., banning non-essential trips), and especially stay-at-home requirements (e.g., sheltering-in-place) made the household an unprecedented focal space for everyday life. The radical curtailment of human interaction made household bubbles one of the more prevalent COVID-19 intervention strategies.

Household bubbles came in different shapes and sizes (Ammar et al. 2020; Okabe-Miyamoto et al. 2021). Size, or density, was one dimension: for some who sheltered alone, isolation and loneliness were challenges, while for others, households became more crowded as other venues shuttered or were rationed (e.g., schools, worksites, leisure destinations). Composition was a second dimension: household bubbles could consist of single occupants, multiple family members, or several unrelated housemates. Furthermore, some bubbles were simply extensions of pre-COVID-19 living arrangements, while others were newly formed as people altered living situations (e.g., students returning home).

In addition to people and their connectedness, or lack thereof, household spaces presented other challenges. The first, linked to the digital divide, relates to information and communication technology within households, and especially its variable, and sometimes shared, quality and quantity. Second, students also experienced households differently with respect to dimensions such as crowdedness, privacy, noise, and responsibilities for others. Abruptly, and certainly in unplanned ways, the household became a focal space that had a much deeper reach than normal in influencing individual welfare, and for students, their capacity to study and learn.

The “retreat to the household” was variable, as living arrangements differed in size, composition, formation, and the challenges they presented. We work to tease apart the different ways in which, after the pivot, students’ living arrangements and associated challenges impacted their confidence in learning. For example, Husky et al. (2020) report that students who did not relocate to their parents’ home as lockdowns were mandated experienced greater psychological distress, a condition that might well have undermined their confidence in learning.

As a third focus, we also investigate how household arrangements compared with digital access and adoption in shaping the ways students experienced their studies under pandemic conditions. Here we ask whether the “retreat to the household” or “teaching with technology” had more influence on a student’s confidence in learning. We also anticipated that these twin challenges (i.e., household confinement and technology usage) might prompt responses from instructors to bolster the capacity and confidence of students to succeed in these newly transitioned virtual courses. To assess this, we include a measure of student perceptions of faculty support for virtual learning challenges.

Research questions:

1. How did the digital disconnect and the digital divide influence faculty and student navigation of the COVID-19-induced pivot to remote teaching and learning?

2. What impact did household confinement have on students’ perceived ability to learn with confidence subsequent to the COVID-19 pivot?

3. For student learning, were digital factors or household differences more influential, after accounting for instructor efforts to support remote learning?

Research design and measurement

A partnership of nine institutions formed after colleagues from the University of British Columbia distributed an invitation to participate in a project entitled “COVID-19 and the Transition to Online Teaching and Learning.” The invitation was disseminated through international virtual networks of professional associations and listservs. The partner institutions, which vary in size, teaching breadth, research focus, and composition of the student body, are located in six countries across four continents (see Table 6 for thumbnail sketches of each institution).

We began by selecting individual courses of instruction and, by extension, the faculty members who taught them. At each institution, we attempted to include an array of courses that varied by content delivery method, size, year level, student specialization, and so forth. Our goal was to identify what transpired in specific courses as faculty members pivoted them to remote instruction, and how students adapted to those changes.

At all nine institutions, we invited students to complete a self-administered questionnaire (in most instances, in the same courses taught by the participating faculty). In six institutional cases, we interviewed instructors (in two other instances, questionnaires were used, and in one case, only student data were collected). We targeted five disciplines common to most institutions but diverse in their intellectual focus: chemistry, civil engineering, history, political science, and psychology (this varied somewhat depending upon offerings and naming conventions).

Faculty participation rates ranged from over 60% to around 15% (few academic staff refused outright, with most non-participants simply not responding to repeated recruitment messages). Student response rates varied from a high of 46% to under 10%. In several places, we were able to use random sampling, but in some institutions, we had to rely on quasi-random samples where representativeness was hard to establish. We discuss how we dealt with sampling differences under analytic strategy, below.

Where possible, we report faculty data for eight institutions and student data for nine. Not all partners were able to ask all of the questions. At one institution, every student was granted a passing grade in mid-March, but most instructors continued teaching through to mid-April in order to cover the course material necessary for student advancement. At other institutions, a handful of courses were not transitioned for a variety of idiosyncratic reasons, principally class size, but sometimes course design (e.g., the latter half of the course was already online). We include only courses where faculty or students reported that “they [or their instructor] transitioned [the specific course name] to remote instruction.”

Measurement

To examine “teaching with technology,” we asked faculty interviewees three questions about prior learning technology use. First, we asked whether or not instructors used their institution’s learning management system (LMS) prior to the pandemic’s onset. The question first asked about broad use (i.e., have you “used the institution’s LMS”) and then enquired about specific uses: have you “posted materials (e.g., course outline, etc.),” “posted PowerPoint slides” (or equivalent), “posted videos, web links,” or “used specialized online teaching tools.” A second question asked whether instructors had taught “online or web-enabled courses,” including courses that were “blended/hybrid,” “flipped,” “fully online distance,” or had “online modules” (e.g., short components of the course online). Finally, interviewees were asked a third question about proficiency with e-learning tools: “how much prior experience would you say you had with web-enabled or technology-mediated course instruction,” with responses arrayed on a five-point scale from “no prior experience” to “I would consider myself proficient.”

For students, we asked the same question about proficiency as described above. Second, we asked students a question about their prior online experience, similar to faculty (i.e., had they taken a course that was “blended/hybrid,” “flipped,” “fully online distance,” or included “online modules”). We also measured the virtual learning support students felt they received by asking them to rate whether their instructor “assumed students would navigate online learning on their own,” coded 0, to “students were given careful, explicit instructions about how to navigate online teaching and learning,” coded 10.

To explore how digital influences might have impacted teaching, we created a scale for faculty instructors that examined their sense of success in having rapidly shifted to remote instruction. This scale of “course transition” is based on responses to the following five Likert-based items:

I feel that overall, I handled the course transition well.

I personally felt overwhelmed by the transition to online learning (reverse coded).

It was more difficult to teach [after the transition] (reverse coded).

I was able to stay true to my original teaching goals and objectives.

My students received a lower-quality teaching experience (reverse coded).

The scale, with a mean of zero and a standard deviation of one, had a Cronbach’s alpha of .678. We used a principal component analysis to weight each item based on an item’s contribution to the explained variance of the first component. Higher values represent instructors who felt they handled the course transition well.
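To make the weighting procedure concrete, the sketch below shows one common way to construct a PCA-weighted composite of standardized items, consistent with the description above. It is an illustrative sketch only: the DataFrame `items` and its columns are hypothetical placeholders, not the project’s actual data or code.

```python
# A minimal sketch of PCA-based item weighting, assuming the five Likert
# items (already reverse-coded where needed) sit in a pandas DataFrame.
import pandas as pd
from sklearn.decomposition import PCA

def pca_weighted_scale(items: pd.DataFrame) -> pd.Series:
    """Weight standardized items by their first-component loadings,
    then re-standardize the composite to mean 0, SD 1."""
    z = (items - items.mean()) / items.std(ddof=0)  # standardize each item
    loadings = PCA(n_components=1).fit(z.values).components_[0]
    composite = z.values @ loadings                 # weighted sum of items
    composite = (composite - composite.mean()) / composite.std()
    return pd.Series(composite, index=items.index, name="course_transition")
```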

For students, we also explored their “retreat to the household” using several different measures. Most broadly, we used a question that asked students whether or not “any of the following situations where [they] were living made it difficult to complete the online portion of the class?” The situations that could be checked off were as follows: no internet access, slow/limited internet access, lack of adequate hardware/devices, too much noise, too many people, no dedicated study space, food insecurity, and living with relatives and/or children who required care. We also constructed three household measures from questions asking students about their living arrangements immediately prior to, and immediately after, their institution had shifted to remote instruction. We asked about their pre- and post-pivot living arrangements, on and off campus, using the following categories: living on my own, living with peers/roommates, living with family/relatives, and other (please specify). One of our measures captures whether or not students had to transition from one living arrangement to another as a consequence of the pandemic (i.e., were they movers or stayers with respect to housing?). Two other measures focus on whether students lived alone, or lived with family/relatives, after their course was transitioned.
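As an illustration of how the mover/stayer and composition measures might be derived from the pre- and post-pivot living-arrangement questions, consider the hedged sketch below. The column names (`living_pre`, `living_post`) and response strings are illustrative assumptions, not the survey’s actual variable names.

```python
# Hypothetical derivation of the mover/stayer and composition measures.
import pandas as pd

def household_measures(df: pd.DataFrame) -> pd.DataFrame:
    out = pd.DataFrame(index=df.index)
    # 1 = student changed living arrangements after the pivot (a "mover")
    out["moved"] = (df["living_pre"] != df["living_post"]).astype(int)
    # Post-pivot composition indicators
    out["lives_alone"] = (df["living_post"] == "living on my own").astype(int)
    out["lives_with_family"] = (
        df["living_post"] == "living with family/relatives"
    ).astype(int)
    return out
```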

We created a “confidence in learning” scale using four Likert-type items. Students were asked to “indicate [their] level of agreement with the following statements about the transition to remote instruction” using a seven-point scale from strongly agree to strongly disagree.

I was confident in my abilities to learn well in a remote online course (reverse coded).

I personally felt overwhelmed by the transition to online learning.

I found it was more difficult to learn.

I felt the quality of my work declined.

A principal component analysis was used to obtain factor scores for weighting each item. The unweighted scale has a Cronbach’s alpha of .790. See Table 1 for a summary of key measures.
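For reference, Cronbach’s alpha for an item set such as this one can be computed as in the following sketch; `items` is again a hypothetical DataFrame holding the four (appropriately reverse-coded) items.

```python
# Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / variance of sum).
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```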

Table 1 Explanatory variables

Analytic strategy

We begin with descriptive details to set the context, first for issues related to “teaching with technology,” and then factors associated with the “retreat to the household.” As the analysis progresses, we introduce multivariable analyses, using ordinary least squares regression, where we include dummy variables for both course discipline and institution. These dummy variables help to ensure that unobservable differences across disciplines and institutions, including sampling differences by institution, are not influencing our results. For each of our multivariable tables, we use a series of models in order to compare what happens to the dependent variable as additional independent variables are introduced (e.g., Model 1, Model 2, etc.). We conducted, but do not report, a multivariable analysis using logistic regression. The overall results are similar.
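The nested-model strategy can be sketched as follows using statsmodels; the outcome and predictor names are hypothetical stand-ins for the measures described above, and `C()` introduces the discipline and institution dummy variables.

```python
# Illustrative nested OLS models with discipline/institution dummies.
import statsmodels.formula.api as smf

MODEL_SPECS = [
    # Model 1: individual attributes and instructor support
    "confidence ~ gender + age + year_of_study + instructor_support",
    # Model 2: add the digital measures
    "confidence ~ gender + age + year_of_study + instructor_support"
    " + proficiency + prior_online",
    # Model 3: add discipline and institution dummy variables
    "confidence ~ gender + age + year_of_study + instructor_support"
    " + proficiency + prior_online + C(discipline) + C(institution)",
]

def fit_nested_models(df):
    """Fit each specification in turn, tracking R-squared as blocks enter."""
    for i, spec in enumerate(MODEL_SPECS, start=1):
        result = smf.ols(spec, data=df).fit()
        print(f"Model {i}: R^2 = {result.rsquared:.3f}")
```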

Findings

Our initial focus is upon how prepared faculty and students were for the sudden pivot to remote instruction. An unintended byproduct of our research was to reveal the degree of adoption of learning technology across institutions prior to the pandemic’s onset (Liu et al., 2020; Sinclair & Aho 2018). The rate at which academic staff utilized learning technology resources signals the gap between availability and adoption, or the digital disconnect.

A rudimentary, baseline indicator of the digital disconnect in teaching and learning comes from evidence of LMS usage (Dahlstrom et al., 2014). An LMS was available in all institutional cases, and 95.4% of course instructors reported using it prior to the pandemic, with 70.5% reporting that they posted resources for students, including videos and web links (see Table 2). The use of more specialized online tools (e.g., Piazza), not necessarily related to an LMS, was much lower, at 23.5% (with a range from 6.5 to 31.8% across institutional cases). These levels of prior usage are consistent with the idea of a digital disconnect, but after the rapid refashioning of teaching and learning, LMS usage was at almost 100% in all courses.

Table 2 Learning management system usage prior to COVID-19 pivot (in %)

A stronger measure of the digital disconnect in teaching and learning comes from assessing how many instructors reported using learning technology, beyond an LMS, in any of their classes prior to the pandemic. Figure 1 shows that a minority of instructors (47.3%) told interviewers that they had such previous experience (the lowest institutional percentage is 25.9%, the highest is 100%). Restricting the focus to fully online course instruction, the average level of experience drops to 24.6% (with a low at one institution of 4% and a high of 63.6%). When asked to rate their proficiency with web-enabled or technology-mediated instruction, 29.6% of instructors said they had “some expertise” or considered themselves “proficient.” Although all of our institutional cases provided opportunities for instructors to employ e-learning tools prior to the pandemic, the adoption rates shown in Table 2, on several different measures, reveal the extent of the digital disconnect between what institutions offered and what instructors made use of in teaching.

Fig. 1 Prior e-learning experience and “proficiency” (in %). NB: For faculty, eight institutional cases for “experience” and seven for “proficiency”; for students, nine institutional cases for “experience” and eight for “proficiency”.

For students, 55.4% reported some previous e-learning experience (Fig. 1), although when we asked about prior experience with fully online courses, this percentage drops to 31.7%. This difference between some e-learning experience (55.4%) and fully online experience (31.7%) gives an indirect indicator of the degree to which learning technology has been integrated into face-to-face instruction in higher education (the contrast for faculty was 46.8% versus 29.3%). For students, 31.9% rated themselves as having either “some expertise” or being “proficient” with learning technology. This context of prior learning technology usage suggests that the transition to emergency remote instruction was far from a complete step-change from zero to 100; in fact, for many students and faculty, it was a change with which they had some experience and some self-rated sense of proficiency.

Figure 1 reports central tendency, or average, levels of experience and ratings of proficiency. While just under one-third of faculty and students reported some expertise or proficiency with learning technology, we were curious about how this proportion would be distributed among different individuals and disciplines, a measure of the digital divide. Did, for example, men have more or less self-confidence in their use of learning technology, either as teachers or learners? Using ordinary least squares (OLS) regression, we explore how self-rated proficiency varied by individuals, disciplines, and institutions. In Table 3, we use two sets of models, one set for faculty members and another for students. For both sets, we first introduce, in Model 1, the attributes of individuals (e.g., gender), and then in Models 2 and 3, we add first disciplines and then institutions. This allows us to parse out effects at each different level — individual, disciplinary, and institutional.

Table 3 OLS regression of self-rated proficiency on individual attributes, course discipline, and institutions for faculty and students

For faculty members, Models 1 and 2 of Table 3 (leftmost data columns) suggest that neither personal characteristics nor course discipline are related to an instructor’s sense of online proficiency. Male instructors were no different from their female colleagues in their level of reported proficiency. There is a modest effect for “other” disciplines, but this vanishes in the final model. There is no evidence of a digital divide, understood as a differential distribution either of adoption rates or proficiency, among faculty based on individual attributes or across those course disciplines we canvassed.

Model 3 indicates that differences between institutions, rather than differences within them, matter. One of our institutions stands out for its reputation in distance learning, and this is reflected in Model 3 (β = .443). This institution is not an outlier, however, since several other institutions also have statistically significant coefficients (the percent of faculty reporting “some expertise” or “proficiency” with e-learning varied from a low of 16.2% to a high of 60.8% among our institutional cases, a variation reflected in the Table 3 results). In short, there are institutional differences related to online learning traditions, but a history of distance education does not capture the entire institutional effect. There is little evidence of any “digital divide” when it comes to faculty proficiency.

Among students, the factors influencing their self-ratings of e-learning proficiency are quite different and considerably more complicated. Differences among individuals do appear, especially for year of study, where students with longer university pedigrees report feeling more proficient with virtual learning (Model 1 for students; rightmost data columns). Once discipline and institution are both controlled (Model 3), women report lower levels of proficiency than do men (β = − .034), while the year of study effect remains significant and much stronger than the gender effect. There are strong disciplinary effects, with psychology students (the reference category) rating themselves among the more proficient, net of other factors. Engineering students (or more broadly, applied science students) rate their proficiency with e-learning technology lower than students in psychology, as do students in both history and political science. Again, there are significant institutional effects, which are more divergent than among faculty. The explained variance of all the models for students is relatively low, although statistically significant.

We take several different points from the analysis to this juncture. First, roughly half of students and faculty reported experience with some form of virtual learning prior to the onset of the pandemic. This level of prior learning technology involvement gives important context for how higher education was able to cope with pandemic public health mandates. Second, while self-rated expertise was lower than actual experience, for academic staff this proficiency was spread relatively evenly among individuals and disciplines. This diffusion, however, was more even among faculty than it was among students. As Table 3 revealed, for faculty, institutional differences are strongest, whereas for students a host of factors across all three levels (individual, discipline, and institution) are related to self-rated virtual learning proficiency. Evidence of a digital divide is apparent for students but not faculty. This suggests that more attention to student e-learning skill levels might be wise, since there is clearly a high level of variation in reported proficiency (and, as we show below, instructor support for e-learning was important for student learning).

One key question remains unanswered by the above. Did digital experience or proficiency have any impact on how academic staff felt they handled the emergency restructuring of their courses? The answer is no (Table 4). Indeed, faculty ratings of their level of success in making the course transition were not systematically related to measures of prior digital experience or proficiency, nor to personal attributes of instructors, nor to either the disciplines of courses or the institutions in which faculty members taught. In one of our interviews, a faculty member summed up the impact of the COVID-19 pivot on teaching and learning as: “It was chaos.” The results in Table 4 support that conclusion — random differences prevailed, a conclusion in line with Watermeyer et al. (2020), who report “significant variation” among their UK respondents when asked to reflect on “preparedness and confidence for total online migration.”

Table 4 OLS regression of faculty “course transition” rating on digital E-learning measures and individual attributes, with course discipline and institutional variable controls

The suddenly essential role of learning technology was not the only major pandemic disruption in college and university life. Student learning, in particular, was also affected by containment and closure provisions. Not only did university campuses close, but so did most non-university public spaces such as libraries, community centers, and cafes — all places students could be found studying pre-pandemic. Before the pandemic, 16.1% of students in our cases lived on campus, a percentage that dropped to 4.6% immediately after the March 2020 pivot. As a result of the pandemic, 35.2% of students reported having to move (including those leaving campus residences). Finally, after the pandemic’s onset, 70.3% of students reported facing difficult living situations. Living arrangements among students were clearly disrupted.

How much did issues associated with the disruption of housing, and differences among students in e-learning proficiency, matter for student learning? When asked on a seven-point Likert scale to rate how “confident [they were in their] abilities to learn well in a remote online course,” over half of all students (60.9%) reported feeling at least somewhat confident (consistent with Gonzalez et al. 2020). We were curious as to how this level of confidence might be affected by both “the retreat to the household” and “teaching with technology.” To examine these issues, we relied on the confidence in learning scale described above.

Table 5 displays results for five separate regression models. In the first model, we include individual attributes of students as well as their instructors’ level of support for virtual learning. Looking first at student attributes, gender and age are both influential, although only the age effect is consistent across all five models. Not surprisingly, older students were more confident in learning than their younger peers. Women appeared to be less confident than men in the first few models, but this effect weakens across models and disappears once we account for institutional differences. We also included a measure of instructor support in navigating online learning, and this effect is strong across all models.

Table 5 OLS regression of student’s confidence in learning on individual attributes, instructor support, digital capacity, and household measures (with controls for discipline/institution)

Model 2 introduces digital divide measures. The self-rated proficiency measure we explored above has a positive influence on student learning, an influence that is consistent across all the models. This effect is also net of the support for online learning that instructors provided, and slightly weaker than that association. Prior experience with e-learning has a negative effect on learning confidence in the early models, but this influence disappears in Models 4 and 5.

In Models 3 and 4, we introduce our household measures. Net of other factors, living on one’s own or with one’s family is unrelated to a student’s confidence in learning. Model 3 implies that students who had to move saw their learning confidence undermined, but this effect does not hold in Models 4 and 5. It is in Model 4 where the major effect appears. Adding the self-reported living challenges that students faced, we find both the largest standardized regression coefficient (β = − .322) and the largest increase in explained variance (.098). Although the effect is reduced mildly in Model 5, it nevertheless remains substantial. Open-ended responses from students, when asked about challenges, supported this finding. A view held by many is captured in the following: “not [being] able to interact with peers, study with other students” and “[I] don’t have an appropriate space to learn (learning on my bed is not ideal).” We take our quantitative findings, and the many qualitative comments that support them, as evidence that students’ experiences within household bubbles were more important than either the size or composition of the bubble, or whether the bubble was or was not newly formed.

Restricted to their immediate households by COVID-19 constraints, students were joined by others who also faced limitations in workplaces, shops, and elsewhere. Removed from physical academic settings, as almost all students in our sample were, students faced a new set of difficulties that were less influential prior to the pandemic’s onset. Before the COVID-19 pandemic, students had access to campus resources that aided their learning — library and study space, technology infrastructure, academic and counseling support services, health services, and perhaps most importantly, face-to-face peer group support. After mid-March, access to these supports was diminished. The differences students experienced as they settled into exclusively non-university routines, save for e-learning and some virtual support services, which institutions expanded, had major impacts on their confidence in learning. This is a key finding and one reflected in qualitative comments as well. When asked about challenges to learning, one student summed up the reactions of many: “not being able to see classmates, felt super disconnected” and we were all “lacking [a] sense of community.”

Notice also, however, that the effects of learning technology proficiency and instructor support for online learning, both factors under some control in higher education, remained impactful across all models, as did a student’s age. In total, this evidence suggests that it was extramural issues that especially undermined students’ confidence in learning, and to a slightly lesser extent, things that were done within or between institutions (although “teaching with technology” clearly mattered, it just did not matter quite as much as the “retreat to the household”). Our measure of difficult living situations did include three indicators related to the digital divide (e.g., internet access), but separating out these three indicators has only a modest effect on the findings shown in Table 5.

Our interpretation about extramural effects is consistent with the positive ratings that both faculty and students gave the transition. When students were asked whether they agreed or disagreed that their “instructor handled the course transition well,” over 70% responded positively (with another 15% being neutral). Furthermore, those who responded positively on this question were also very likely to have felt their instructor gave them good online learning support (gamma = .633; p < .001). On a parallel question, where academic staff were asked whether or not they felt that they “handled the course transition well,” 84% agreed (with 11% neutral). The latter item could be seen as self-serving since faculty were asked to rate themselves, but the student ratings confirm the positive sentiment about course transitions. Overall, this implies relative satisfaction with how higher education personnel responded, at least on course transitioning, consistent with the idea that forces outside the academy had a stronger bearing on undermining the learning confidence of students than did factors inside the academy.
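The gamma reported here is the Goodman–Kruskal gamma, an ordinal measure of association based on concordant and discordant pairs of observations. A plain-Python sketch, offered purely for reference rather than as the authors’ code, is:

```python
# Goodman-Kruskal gamma: (concordant - discordant) / (concordant + discordant),
# computed over all pairs of observations, ignoring tied pairs.
from itertools import combinations

def goodman_kruskal_gamma(x, y):
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    if concordant + discordant == 0:  # all pairs tied
        return float("nan")
    return (concordant - discordant) / (concordant + discordant)
```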

Discussion and conclusion

Prior to vaccinations, physical distancing restrictions were a major tool used to combat the spread of the SARS-CoV-2 virus. Closure and confinement measures had two major impacts on teaching and learning in higher education. First, with all instruction moving online, learning technology was essential. Second, lockdowns made households the prime learning space for students. Our focus was on how these two disruptive forces influenced teaching and learning.

We employed two conceptual lenses to examine issues related to learning technology — the digital disconnect and the digital divide. Both terms focus, although in slightly different ways, on the use of learning technology. As e-learning became the default medium for instruction, concern centered on teachers’ and learners’ prior experience with this technology, and whether that experience, and proficiency with it, might be distributed unevenly among instructors and students.

For instructors, there was evidence mainly of a prior digital disconnect. Before the pandemic, learning technology usage rates still showed a gap, sometimes sizeable, between availability and adoption (Fig. 1 above). While the use of LMSs, as platforms on which to post basic material such as course announcements and lecture slides, was growing, few teachers in higher education made much use of resources such as simulations, adaptive learning packages, and student response systems. Furthermore, among academic staff, there was little evidence of any differential distribution of learning technology use or proficiency, as the digital divide would imply.

What we found was some variation in teachers’ proficiency between institutions. More broadly, there was consistent evidence that the effects of the public health edicts were so sudden, and so disruptive, that digital experience and proficiency had little impact on the success instructors felt in rapidly refashioning their courses (Bartolic et al. 2021). This represents a substantial difference between using technologies as a complement to most pre-COVID-19 course delivery and using technologies as the sole medium of teaching and learning, as was necessary post-pivot. Little in the pre-COVID-19 period prepared instructors for what they were required to do after public health measures transformed teaching and learning.

Most instructors, nevertheless, reported that they were satisfied with how they transitioned their course. “Transitioning well” and providing a vibrant teaching and learning environment are two different things, however. In many cases, success simply meant a continuity of teaching. Faculty members in “wisdom’s workshop” are good autonomous learners, even in difficult circumstances. Despite most having little preparation, they managed to carry on teaching. It is striking that most students agreed with the judgment of “transitioning well.” This latter finding was bolstered especially in those instances where students felt their instructor provided strong navigational support for online learning.

For students, there was some evidence of a digital divide, mainly around length of time within college or university, but as the low explained variance implied, this was a weak effect. For students, households became their mandated learning spaces. We found little consistent evidence that living arrangements, including size, composition, or formation, had any systematic bearing on students’ confidence in learning. However, the challenges students reported experiencing within their households were strongly associated with their learning confidence. Students who reported the most difficult challenges in their household surroundings, especially lack of dedicated study space and too much noise, also reported lower levels of confidence in learning. Difficulties with internet access were also related to students’ lower confidence in learning, although in general, issues related to the digital divide did not show effects as large as the aforementioned household circumstances.

Several caveats come with our research. One worry is self-selection bias at both the institutional and individual levels. Institutionally, our nine cases tilt toward medium and large institutions with a strong pre-pandemic presence in online/distance learning and learning technology infrastructure. Despite coping with emergency remote teaching and learning, these nine institutions had academic staff able to step up and conduct this research. Among our respondents, individuals too had to agree to participate despite coping with a multitude of other demands. Where we were able to assess standards of representativeness, we are confident that our samples were relatively strong, but this assessment was not possible in all cases. Of course, we are unable to conclude anything about country differences, as our institutions were not chosen to be representative of their countries.

Another caveat comes from how we treat the pivot to remote teaching. Effectively we, like most other COVID-19 studies, understand the transition in a before-and-after fashion. That is, we explore in detail what transpired after the sudden switch to remote teaching and learning, but we have less precision on exactly what was happening before the transition. While we asked questions about pre-pandemic behavior, we did this in the midst of the post-pivot turmoil. How these subsequent events shaped people’s memories of what they were doing previously is not something we measured. Furthermore, our focus on students’ confidence in learning is not equivalent to measuring what students actually learned. Gonzalez et al. (2020) show that under conditions of autonomous learning, motivated students actually improved their academic performance.

In the face of the pandemic, the faculty instructors in wisdom’s workshop confronted serious challenges in fostering both knowledge growth and reasoning skills among students. Especially acute effects that undermined student learning came from the containment and closure orders that refashioned household bubbles. Household lockdowns often meant sheltering in settings that were more congested and cramped, as others too followed public health confinement orders and retreated to the household. For students, the “retreat to the household” resulting from public health edicts was more consequential for their learning than was their experience of “teaching with technology.” As we move toward higher education in a post-pandemic world, we should attend not only to access to and familiarity with technology, but also to the environments in which teaching and learning take place.