The Coronavirus and Higher Education: Faculty Viewpoints about Universities Moving Online during a Worldwide Pandemic

Abstract

The purpose of this study is to examine faculty’s views about the move from face-to-face (F2F) to online instruction due to the COVID-19 pandemic. The researcher used Q methodology [Q], a mixed method, to determine and describe faculty views about this situation. The participants sorted 36 statements to reveal and describe their subjective viewpoints. In Q, similar sorts are grouped together mathematically into factors, each representing a unique viewpoint. The Q-sorting process is reflective and self-referent. The operation of sorting items allows participants to provide their internal viewpoint. This is different from responding to a Likert-type survey. Additionally, the analyses allow for differentiation of views rather than an aggregate of views. Each unique viewpoint is described by a representative sort, distinguishing statements, and participants’ post-sort responses. Three views emerged: Techies who like to teach (the view most positive in relation to teaching online); Overwhelmed as human beings (populated by caregivers); and It’s about what cannot be done online (those who are focused on the limitations of technology and abilities for online instruction). Generally, faculty felt frustrated with their ability to best support their students within the online format. Administrators should consider the results of this study to better understand the instructional and mental-health needs of faculty, especially in an emergency situation such as COVID-19. The findings indicate that creating the best learning situations for students is not one-size-fits-all and that there are discipline and pedagogical issues to consider when moving F2F courses online that are not fixed simply with technology.

Introduction

The World Health Organization [WHO] declared the COVID-19 virus a worldwide pandemic on March 11, 2020. In the United States, as part of a move to promote social distancing, colleges and universities moved their face-to-face [F2F] classes online. These were primarily decisions made by administrators, trustees, and/or state governments meant to promote social distancing and flatten the curve regarding the spread of the virus (NPR, 2020). With the March alert about the pandemic, many colleges and universities extended spring breaks to allow faculty to prepare to move their courses online. Hodges, Moore, Lockee, Trust, and Bond (2020) refer to this online move as emergency remote instruction. Although we use the term online instruction throughout this manuscript, the context is a move that represents emergency remote instruction. Universities’ design and development staff worked overtime to train faculty for the move to online classes including creating videos of lectures, using synchronous teaching software such as Zoom and WebEx, and using various features of online learning management systems. In light of these new demands, faculty began the Pandemic Pedagogy group and The Chronicle of Higher Education started the Higher Ed and Corona group on Facebook. On these social media pages, faculty vented their frustrations with various related situations, looked for answers to questions, and shared ideas. Within these self-expressions on social media, there was a sense that different faculty were developing different views about their new experience with moving F2F classes online.

Purpose of the Study

Within this study, we seek to study scientifically the subjective experiences of college and university faculty within this COVID-19 response in a way that allows us to differentiate viewpoints rather than compile an aggregate view. Likert-scale surveys, for instance, result in a loss of meaning: findings are aggregated, do not differentiate the viewpoints that exist, and lack the kind of descriptions found in Q methodology [referred to as Q throughout this manuscript] (McKeown, 2001). Instead, Q allowed us to examine and describe the divergent faculty viewpoints about moving their teaching online during the pandemic using empirical evidence.

Q is a unique 80-year-old inherently mixed methodology designed to study subjective viewpoints (Ramlo, 2020b). Initially, William Stephenson developed Q for use in psychology (Brown, 1980). However, Q’s use has expanded into many disciplines including education. Previously, the author used Q to study faculty viewpoints in other settings including professional development (Ramlo, 2011) and speech on campus (Ramlo, 2020a, c). Other studies using Q to study higher education include Godor (2016), Deignan (2009), and Kopcha, Rieber, and Walker (2016). The author also used Q to investigate students’ viewpoints about their COVID-19 pandemic university experiences (Ramlo, in review).

Faculty and Online Instruction

Frazer, Sullivan, Weatherspoon, and Hussey (2017) stress the importance of faculty competence and skills for teaching online. Within their study, the instructors’ perceptions of best practices were in alignment with the current literature on effective online teaching. These effective practices include collaborative student activities, the sense of instructor presence, and using a variety of methods of instruction. Previous studies found that online teaching is more demanding for an instructor when compared to F2F teaching (Andersen & Avery, 2008; Cavanaugh, 2005; Tomei, 2006). Cavanaugh (2005) used time logs for faculty and found that, as the enrollment increased, the instructor had to commit increased time for teaching their online course. Likewise, there are practical limits on the number of students within an online class. Tomei and Nelson’s (2019) analyses compared faculty workload for online versus F2F teaching to select the best class size for each.

Tomei and Nelson (2019) stress that effective online teaching takes more time across what they call the three key components of instruction. These components consist of instructional content (including the actual delivery of instruction), advisement (office hours, counseling), and assessment. Bettinger and Loeb (2017) believe that the biggest contrast between online and F2F classes is the mode of communication. Thus, in online classes, student-student and student-professor interactions are virtual and, often, asynchronous. However, as Bettinger and Loeb (2017) found, students’ learning and persistence are worse when they take online courses compared to taking those same courses F2F, and online courses are most difficult for those students who are the least prepared. Their findings provide evidence that students in online courses perform substantially worse than students in traditional F2F courses. This situation affects not only the performance of these students in their current classes but also their performance in future classes; the likelihood of dropping out of college thus increases for students who take online courses.

For online and F2F courses, learning management systems (LMS) house course elements within an online platform. Common LMSs include Blackboard and Desire2Learn. Within F2F courses prior to the pandemic, faculty and students were experienced in using an LMS. In this way, most students and faculty had at least some experience navigating online course elements prior to the pandemic (Hachey, Conway, & Wladis, 2013; Harrison et al., 2017). However, Harrison et al. (2017) found that although faculty were experienced and comfortable with LMSs, they were less experienced with online conferencing, such as Zoom, and other aspects of effective online instruction. Additionally, these researchers found that the attitudes and experiences of teaching online varied across a range of disciplines, indicating that some disciplines may be more easily adapted to online instruction.

Within this context, as well as within fully online courses, some researchers stress that effective online learning requires careful instructional design and planning (Branch & Dousay, 2015). Yet researchers such as Harrison et al. (2017) do not address faculty perceptions of instructional design within online instruction. Shea (2007) found that lack of training in instructional design can be a demotivator for teaching online. Additionally, within Shea’s study, faculty cited missing F2F interactions with students when teaching online.

The motivators and demotivators for faculty teaching online are in contrast with the motivators and demotivators for administrators. Typically, higher education administrators view distance-learning courses as a way to generate sizeable tuition revenue increases (Tomei & Nelson, 2019). The administrators’ thinking is generally that, if room-space limitations cap an F2F course at 25 students, then an online course could easily contain 100 students (Tomei, 2006). Although there are inconsistencies regarding faculty’s beliefs in the effectiveness of online teaching (Harrison et al., 2017; Shea, 2007), administrators at institutions of higher education consistently believe that online courses are as good as or better than F2F courses (Allen & Seaman, 2016).

Allen and Seaman (2016) reported that only 29.1% of chief academic officers / provosts believe their faculty accept the value and legitimacy of online education. Allen and Seaman (2016) suggest that administrators and others have failed to convince higher education faculty, who they deem as their most important audience, that online education is a worthy means of providing teaching and learning. Shea (2007) identified faculty acceptance of online teaching as critical to grow online education and to ensure its quality. Frazer et al. (2017) investigated faculty’s views about the effectiveness of online teaching including indicators for quality instruction. Their semi-structured interview study revealed specific qualities of effective online instruction. These effective instructors are strong communicators with students while they are also respectful, encouraging, and positive. Effective instructors work toward the goals of student reflection and construction of meaning according to the findings of Frazer et al. (2017).

Similarly, Martin, Budhrani, and Wang (2019) examined faculty perceptions about their readiness to teach online. This readiness consisted of two parts: (1) faculty perceptions related to their ability to teach online, (2) faculty attitude about the importance of online teaching. Martin et al. suggest that faculty competencies to teach online require faculty to adjust their attitudes towards technology and teaching. Additionally, they found that faculty attitudes and perceptions played a major role in how faculty approach online teaching goals, tasks, and challenges. Similarly, Shea (2007) found that as online teaching experience increased, faculty’s confidence to teach online effectively also increased. Shea also found that faculty believed a flexible work schedule was the biggest motivator for teaching online. The biggest detractor for teaching online was the increased workload. It is worth noting that Allen and Seaman (2016), Martin et al. (2019), and Shea (2007) used surveys to collect data that were then analyzed quantitatively. Yet surveys such as these do not allow for the differentiation of viewpoints. Instead, surveys such as these submerge the divergent perspectives of participants into categorical averaging. Thus, the researcher selected Q, rather than survey research, so that these different perspectives could be studied, as suggested by Brown (2008).

Method

Q was created specifically as a scientific way to study subjectivity (Brown, 1980; Stephenson, 1953). Q is a complete methodology in that it consists of a specific set of procedures, theory, and philosophy of science (Stephenson, 1953). Q is unique with its inherently mixed data collection, analyses, and interpretations such that each of these stages offers a qualitative-quantitative hybridity (Ramlo, 2020b).

Before attending to the details, a brief overview of a typical Q study is provided here. Every Q study commences with the researcher collecting communications (statements) on the subject (the concourse). From this collection, a sample of the statements (the Q-sample) is chosen such that the diversity of communications represented within the concourse is maintained. Participants then place these statements into a grid, with the statements most salient to their view on the ends (e.g. +5, −5) and more neutral statements in the center. Thus, these Q-sorts represent each participant’s internal view of the topic (Brown, 1980). The collection of Q-sorts is then analyzed statistically such that similar sorts are identified with factors (groups of similar sorts). Each factor (group) represents a unique viewpoint. The analyses also include the creation of tables that describe and differentiate each of these divergent views. These tables, in conjunction with post-sort interviews or written comments, provide the means of interpreting each viewpoint (Brown, 1980; McKeown & Thomas, 2013; Newman & Ramlo, 2010).

Q-Sample

In this study, the Q-sample was adapted from an earlier study that investigated students’ views of the move from F2F to online due to the COVID-19 pandemic (Ramlo, in review). In that study, there were 51 items in the concourse. A theme analysis of the concourse resulted in the identification of seven themes: Fear, worldview / big picture, difficulty / challenge of online classes, F2F instruction, positive aspects of online classes, and regarding faculty. The researcher then used Fisher’s Design of Experiments, as recommended by Brown (1980), to select the representative Q-sample from this concourse. Using Fisher’s Design of Experiments allows the researcher to use variance design to ensure that this subset of items is representative of the different types of communications within the concourse. In this way, statements were selected across the seven themes found within the larger concourse to provide the Q-sample. The result was a Q-sample containing 36 items.

For this study, 20 of those 36 items remained unchanged (e.g. #23: ‘I didn’t realize how important F2F classes were to me until now’). Ten of the 36 items had slight changes, usually replacing student with instructor (e.g. #6 was ‘I miss the interactions with my peers / other students’ and is now ‘I miss the interactions with my peers / other instructors’). The remaining six statements were completely revised in content but not in feeling or theme (e.g. #3 was ‘I am afraid of failing my classes this semester’ and is now ‘I am afraid of how the coronavirus situation will affect the financial health of my university’). Thus, revised statements maintained their categorization (e.g., fear remained the categorization for statement #3).

Q-Sort

Because of the campus closures and social distancing during this semester, the Q study was set up for online Q sorting. The researcher used HtmlQ, which uses freeware database hosting coordinated with a cloud computing company that offers hosting for web applications. The HtmlQ files were downloaded (https://github.com/shawnbanasick/easy-htmlq/archive/master.zip) and the settings files adapted for this study. Within this web application, participants see the individual statements (presented in random order) and initially sort them into one of three piles: Agree, Neutral, Disagree.

Participants were then offered each pile of statements to distribute into the sorting grid. To begin, the participant selected the two statements from the Agree pile that were most salient to them and placed those at +5. Subsequently, they found the next two most salient statements and placed them at +4, and so on. They repeated this process for the Disagree pile, placing two statements at −5, two at −4, and so on. Finally, each participant distributed the statements from the Neutral pile into the grid. Before submitting their sort, participants were asked to review the placement of the items to ensure that the placements best reflected their view about transitioning from F2F and, subsequently, teaching online during the pandemic. Each sort provides a snapshot of the participant’s viewpoint and, therefore, there are no right or wrong answers or arrangements of the items. Thus, within Q, the sorting process is reflective and self-referent (Brown, 1980; Stephenson, 1953, 1986). In other words, the operation of sorting items into the grid allows each participant to provide their internal viewpoint via their sort.
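The forced-distribution sorting described above can be sketched as a simple validation check on a completed sort. This is a minimal illustration, not part of the study’s actual HtmlQ implementation; only the counts at the extremes (two items at ±5 and ±4) are stated above, so the example grid used below is hypothetical.

```python
from collections import Counter

def validate_sort(placements: dict, grid: dict) -> bool:
    """Check a completed Q sort against a forced distribution.

    placements: {item_id: grid column}, e.g. {23: 5, 6: -4, ...}
    grid: {grid column: required item count}, e.g. {5: 2, 4: 2, ...}
    Returns True only if every column holds exactly its required count.
    """
    counts = Counter(placements.values())
    return all(counts.get(col, 0) == n for col, n in grid.items())

# Hypothetical three-column mini-grid for illustration only.
mini_grid = {1: 1, 0: 1, -1: 1}
print(validate_sort({1: 1, 2: 0, 3: -1}, mini_grid))  # → True
```

In the actual study, the same kind of constraint is enforced interactively by the HtmlQ interface rather than checked after the fact.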

P-Set

Next, the researcher reached out to higher education faculty to participate in the study. Snowball sampling (where the researcher contacts one participant who then recruits others to participate) was one of the methods used to recruit a diverse set of faculty for the study. Additionally, the researcher posted a request for participants on the Facebook pages for Pandemic Pedagogy and Higher Ed and the Coronavirus. The idea was to get a wide range of types of faculty from various types of institutions and within a variety of disciplines; discipline was considered the more relevant characteristic and was collected, whereas institution type (e.g. private or public) was not.

The ability to provide the sorting process online facilitated recruiting participants. However, it is worth mentioning that in Q, the sample size is the number of items in the Q-sample, not the number of participants. Typically, Q studies involve relatively small numbers of participants especially compared to large quantitative studies that use random sampling of participants. In recruiting participants for a Q study, the goal is to diversify the set of participants (P-set) as much as possible to assist in discovering the divergent viewpoints that exist about the topic.

Some demographic information was collected about participants through a questionnaire that was part of the online Q sort. Participants provided their sex, faculty type (tenure track, non-tenure track, part-time, other), and discipline. These discipline categories included STEM (science, technology, engineering, and mathematics), Fine Arts, Humanities, and Health Professions. Institution details (size, private versus public, etc.) were not collected and this is addressed within the study’s limitations section.

Analyses

Q analyses include correlation and factor analysis. Similar sorts are grouped together via factor analysis such that each factor represents a unique viewpoint. Factors are dependent on the Q sorters’ item placements. Thus, each of these factors is grounded in concrete behavior. Yet, factors are reliable and easily replicated (Brown, 1980). The researcher performed all analyses within KenQ (Banasick, 2019). This specialized software is web-based, allows for typical choices for factor extraction and factor rotation to group similar sorts into factors (viewpoints), and provides the detailed tables necessary to describe the divergent viewpoints that emerge (Newman & Ramlo, 2010).

The factor analytic stage within Q methodology is inherently mixed, qualitative and quantitative, rather than purely quantitative (Newman & Ramlo, 2010). This inherently mixed approach is represented within the factor analytic goal to find a best theoretical solution rather than a best statistical solution (Brown, 1980; Brown & Robyn, 2004; Stephenson, 1953). The researcher explored various factor solutions, comparing the various factor structures and factor descriptions. The various combinations included centroid with hand-rotation (focusing on exemplars among the participants) and PCA with varimax. However, in this study, a three-factor solution, with centroid extraction and varimax rotation, provided the best theoretical solution relative to the post-sort comments and the exemplars among the sorters.
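As a rough numerical illustration of the extraction-and-rotation step described above, the sketch below correlates the sorts, extracts unrotated loadings, and applies a varimax rotation. It is a generic sketch, not the KenQ implementation, and it substitutes principal-components extraction for the centroid extraction ultimately used in the study.

```python
import numpy as np

def varimax(loadings: np.ndarray, max_iter: int = 100, tol: float = 1e-6) -> np.ndarray:
    """Orthogonal varimax rotation (standard SVD-based algorithm)."""
    L = loadings.copy()
    n, k = L.shape
    R = np.eye(k)
    prev = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        # Gradient of the varimax criterion, solved via SVD.
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - (1.0 / n) * Lr @ np.diag((Lr ** 2).sum(axis=0))))
        R = u @ vt
        if s.sum() < prev * (1 + tol):
            break
        prev = s.sum()
    return L @ R

def q_loadings(sorts: np.ndarray, n_factors: int) -> np.ndarray:
    """Rotated sorter loadings from a (n_items, n_sorters) array of grid values.

    Sorters (not items) are the variables, as is standard in Q.
    Principal-components extraction is used here for simplicity.
    """
    corr = np.corrcoef(sorts, rowvar=False)          # sorter-by-sorter correlations
    vals, vecs = np.linalg.eigh(corr)
    idx = np.argsort(vals)[::-1][:n_factors]         # top eigenvalues
    unrotated = vecs[:, idx] * np.sqrt(vals[idx])    # unrotated loadings
    return varimax(unrotated)
```

Sorters whose rotated loading is high on exactly one factor become that factor’s defining sorts; the exploratory comparison of solutions in the study corresponds to varying `n_factors`, the extraction method, and the rotation.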

The sorts identified on a factor are used to create the factor array for that factor. The factor array represents the sort for that factor (viewpoint) and is determined using z-scores. Those z-scores are ranked to determine the grid placements for the items in the factor array. Differences in z-scores are used to determine the items that are distinguishing for a factor, compared to the other factors, as well as consensus statements among the factors (views). P-values (less than 0.05 versus 0.01) for distinguishing statements are also determined from the z-scores (Newman & Ramlo, 2010). See Brown (1980) for technical notes on these calculations.
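A minimal sketch of the factor-array calculation follows, using Brown’s (1980) weighting of defining sorts by f / (1 − f²): the weighted composite of a factor’s defining sorts is standardized into item z-scores, which are then ranked to fill the grid. The function below is illustrative, not the KenQ code.

```python
import numpy as np

def factor_z_scores(sorts: np.ndarray, loadings: np.ndarray) -> np.ndarray:
    """Composite item z-scores for one factor.

    sorts: (n_items, n_defining_sorts) array of grid values (-5..+5)
    loadings: (n_defining_sorts,) factor loadings of those defining sorters
    """
    w = loadings / (1 - loadings ** 2)        # Brown's (1980) sort weights
    composite = sorts @ w                     # weighted item scores
    return (composite - composite.mean()) / composite.std()

# Items ranked by z-score fill the factor array from +5 down to -5;
# z-score differences between factors flag distinguishing statements,
# and items with similar z-scores across all factors are consensus items.
```

Higher-loading sorters thus contribute more to the composite, and the resulting ranked z-scores are what determine the grid placements in the factor array.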

Results

Within this study, 78 faculty completed the Q-sort. Thirty-two percent were male, 67% were female, and one participant declined to answer this question. The mean age of the participants was 48 years with a standard deviation of 12 years. A plurality of participants (44%) were tenured faculty. Approximately a quarter of the participants identified themselves with STEM (science, technology, engineering, and mathematics), another quarter with humanities, and yet another quarter with social sciences (not including education). Table 1 details the demographic characteristics of the participants.

Table 1 Study demographics

The analyses revealed three factors (viewpoints). The numbers of defining sorts (people) on these factors are 21, 9, and 27, respectively. Other sorts were not defined by a single factor based on factor loadings, which represent correlations with the factor (Newman & Ramlo, 2010). Composite reliabilities were calculated within KenQ. The composite reliability is the reliability of the factor array and is dependent upon the number of sorts identified on a factor. The composite reliability for each factor is as follows: 0.988, 0.973, and 0.991 (respectively, with a maximum value of 1.00). See Brown (1980) for how composite reliability is calculated.
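The composite reliabilities above follow the Spearman-Brown form given by Brown (1980), under the conventional assumption that an individual sort has a reliability of 0.80: with p defining sorts, rel = 0.80p / (1 + (p − 1)0.80). A short check reproduces the three reported values:

```python
def composite_reliability(n_defining_sorts: int, r: float = 0.80) -> float:
    """Reliability of a factor array defined by p sorts (Brown, 1980).

    r is the assumed reliability of a single sort (conventionally 0.80).
    """
    p = n_defining_sorts
    return (r * p) / (1 + (p - 1) * r)

# The three factors were defined by 21, 9, and 27 sorts.
print([round(composite_reliability(p), 3) for p in (21, 9, 27)])
# → [0.988, 0.973, 0.991]
```

Note how quickly the reliability approaches 1.00 as the number of defining sorts grows, which is why even the nine-sort Factor 2 has a composite reliability above 0.97.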

Table 2 contains the factor array with distinguishing statements and consensus statements noted. Distinguishing statements help differentiate the viewpoints (factors) that emerged from the analyses. Alternatively, consensus statements represent similarities between the different viewpoints (factors). Both distinguishing and consensus statements are determined empirically based on the factor arrays.

Table 2 Factor array for each of the three factors

The factor array for each factor, especially the items most salient to each view, and the distinguishing statements are key components for interpreting the viewpoints the factors represent. Helpfully, KenQ provides visual, composite sorts based on the factor arrays for each factor that also indicate distinguishing statements. These visuals assist in interpretation. Additionally, comments made by those participants who define a factor are important for interpretation. For instance, sorters were asked to explain their most salient statement placements (+5 and −5) and these explanations often clarify the participant’s thinking and interpretation of the statements.

Factor 1: Techies Who Like to Teach

Figure 1 provides a visual theoretical (composite) sort for this factor. This sort is based on the factor arrays in Table 2. The faculty represented by this factor are not just familiar with technology; they enjoy technology. These faculty recognize that they are technologically savvy (statement #31 at −5, distinguishing). Factor 1 faculty were ready to teach their F2F classes online as soon as the announcement was made at their university (statement #14 at +5, distinguishing; statement #10 at −5, distinguishing). They had already either taught online or had extensively used the LMS at their university (per written comments). These faculty enjoy teaching online (#18 at +4, distinguishing). However, these faculty are neutral about feeling rejuvenated as they adapt their classes to online instruction (#26 at 0, distinguishing) and about online classes making their lives easier (#16 at 0, distinguishing). Factor 1 faculty miss being in a physical classroom interacting with their students (#32 at +3). It is interesting that statement #1 (I am worried about my F2F classes being held online) is located at −1 (distinguishing), which is in the neutral area of the grid. These faculty do, however, agree with the concern about specific aspects of instruction, including laboratories, clinical situations, and art studios (#17 at +2).

Fig. 1 Composite Q sort for Factor 1

Participants were asked to comment on their most salient item placements at +5 and −5. Those comments provide insights related to further understanding each factor. For instance, Participant #3 stated, in regard to placing item #10 at −5, “I have much training and experience in online learning (QM certified, Master Reviewer...).” Regarding their placement of #31 at −5, this same participant commented, “I like tech, I use tech, I know tech.” Similarly, Participant #7 stated that they placed statement #9 at +5 because, “I’ve taught online classes many times, and all of my current classes were hybrid until the crisis. Moving to all online is no big deal.”

However, this does not mean that those on Factor 1 see technology as a panacea for improved instruction. For instance, Participant #1, in her written comments, expressed concern about these more hands-on aspects of teaching and learning. Her comments help further clarify how she perceives the importance of hands-on experiences for students. She explained placing #17 at +5:

Simulations and other types of laboratory experiences do not force students to challenge their misconceptions or allow students to develop good lab techniques. Watching faculty or graduate students perform laboratories and then giving students data is simply a waste of time for everyone involved. Students need to experience hands-on, authentic, inquiry-based laboratory experiences. Similarly, clinicals and art studios cannot be replicated online.

Participant #1 also stated, ‘I didn’t realize how untechy students and faculty are in general. The idea of the digital native is a ridiculous and false conceptualization.’

Participant #7 makes it clear that keeping up with technology should be an expectation of all faculty. He stated (about his placement of statement #21 at +5), ‘Given the panic that was expressed by those who’ve resisted online teaching for years, it was clear that they neglected their professional obligation to keep up with technology.’ It is clear from the written comments that those on this factor embrace technology and keep up to date with it, but also enjoy teaching and interacting with their students. Participant #5 stated (about her placement of #5 at +5), ‘In-person interaction is a huge driver of my passion for teaching.’

Although one might postulate that those on this factor would be younger faculty, this is not the case. The mean age is 49 years with a standard deviation of 10 years. Additionally, the diversity and distribution across disciplines reflect that of the larger P-set. The distribution across the types of faculty (tenured, tenure-track, etc.) also reflects that of the larger P-set. There are 6 males and 15 females represented by this view, which was named Techies who like to teach.

Factor 2: Overwhelmed as Human Beings

The faculty on Factor 2 are overwhelmed but it is not the result of being unfamiliar with technology. Figure 2 provides a visual distribution of the Q-sort that represents this viewpoint. Statement 30 (I feel completely overwhelmed) is at +5 and is distinguishing for this factor / viewpoint. Those represented by this view are focused on just getting through this semester (statement #27 at +5, distinguishing). Yet those on this factor are neutral about feeling isolated (#36 at 0, distinguishing). They feel as if their world has been turned upside down (#22 at +4, distinguishing) and are doing the best they can (#2 at +4, distinguishing). However, the sense of being overwhelmed is not about being unprepared for the transition from F2F to online (#10 at +5, distinguishing). Instead, the sense of being overwhelmed is the result of caregiving (self, children, parents, students, and colleagues) on top of moving F2F classes online and a worldwide pandemic (Fig. 3).

Fig. 2 Composite Q sort for Factor 2

Fig. 3 Composite Q sort for Factor 3

The written comments by several participants represented by this viewpoint best clarify the humanity involved with the sense of being overwhelmed. Participant #21 makes it clear that his sense of being overwhelmed has nothing to do with being capable of using technology. Instead, the sense of being overwhelmed is directly related to dealing with a global pandemic, and there is a sense that this faculty member needs mental-health counseling. He wrote the following in relation to his most salient statement placements (+5 and −5, respectively), where Canvas is an online learning management system (LMS):

I've been using Canvas for years, so all my course materials were already online. Designing online activities in place of in-person meetings to meet learning objectives is not difficult, and my institution has provided us with ample resources. I'm shocked by how much my colleagues are struggling. I need to process my feelings from this crisis, I feel like I'm in the middle of a nervous breakdown, but I don't have time to process any of that. I just want this semester to be over so I can stop moving and take care of myself. I don't have time to process the grief of watching a global pandemic because of how much time my teaching requires. But I need to process these feelings, and keeping them bottled up for, now, 6 weeks has taken a huge toll on my mental health. This has reduced my productivity, creating a feedback loop. I feel paralyzed and don't know how I'm going to finish grading by the end of the semester. I need time to be a human and to panic, but I don't have it. I've had technological difficulties from learning new software, but I solved them easily. All the tools we use (Canvas, Zoom, etc.) are intuitive and the internet is full of tutorials to answer questions when I have them.

Participant #36 describes a different situation that creates a sense of being overwhelmed. Yet, like Participant #21, her situation reminds us that faculty are humans first and have lives and feelings not just duties related to teaching. Additionally, her technology problems are related to the computer supplied by her university, not her knowledge of technology. The impact of being a caregiver exacerbates her issues with aging technology supplied by her university as well as her home internet’s capabilities. She wrote:

My life sucks right now. This is not the job I was meant to have. I hate being at home all the time. I feel isolated and unsupported. I'm not learning, I'm treading water and trying to get my students over the finish line. I don't have childcare. I don't have a workspace. Working at home is causing pain in my shoulders, hands, and back. I'm overwhelmed and in pain, and I'm just trying to survive. There's not enough information available, and I feel like I can't help with things. I know there are issues the university needs to confront, and I hate not being involved in helping. I haven't had any meetings since this started, and I feel so out of the loop. We had one university-wide meeting, but like there is nothing happening. It feels like a disastrous holding pattern. I haven't had significant problems with tech. The ones I am having are caused by the fact that my university expects me to teach online classes on a 5-year-old laptop that's running out of memory and doesn't have a working microphone. Also my home internet isn't robust enough to support multiple people doing complex online tasks.

Similarly, Participant #71 explained that she has a great deal of experience with classroom management systems and presenting online courses. However, her life is not just about being a tenured faculty member. She wrote:

The move to on-line instruction has coincided with problems I'm dealing with regarding my son and his father as well as with my parents, and has compounded the difficulties in coping with those issues, so that while online teaching alone does not make me feel completely overwhelmed (I've done it before, as well as substantial amounts of hybrid teaching), life as a whole has pushed me to what seems like the absolute edge of my coping skills. I have dabbled in digital humanities and I have come to what I think is pretty accurate self awareness about my interest in and ability to engage with technology. I'm aware of my limitations in terms of dealing with technology, as well as of my strengths.

Interestingly, all three of these quoted participants are from the Humanities, and two are females who are caring for children. Eight of the nine participants on this factor are female. Research has shown that female faculty tend to be the primary caregivers of family members as well as the primary caregivers within their academic family (Guarino & Borden, 2017). This caregiving appears to contribute to these faculty members' sense of being overwhelmed. Participant #41 is a non-tenure-track (NTT), full-time female STEM faculty member. She wrote:

Yeah no, this is a myth (regarding online classes making life easier). I am doing the best that I can. Online classes are not convenient. I am overwhelmed trying to meet my students’ needs.

Thus, the post-sort comments are especially meaningful in interpreting this viewpoint. The reasons for these Factor 2 faculty being overwhelmed have little to do with having classes move online and have a great deal to do with juggling trying to do their jobs well while dealing with a worldwide pandemic and its various consequences outside of the workplace. The mean age of these Factor 2 participants is 45 years with a standard deviation of 12 years. The type of faculty representation reflects that of the larger P-set. The disciplines represented are STEM (3), Education (1), Social Science (1), and Humanities (4).

Factor 3: It’s about What Cannot Be Done Online

The Factor 3 faculty do not believe having their classes online will make their lives easier (#16 at −5, distinguishing). They do not feel rejuvenated as they play the dual roles of instructor and learner within the online environment (#26 at −5, distinguishing). These faculty do not enjoy teaching online (#18 at −4, distinguishing) and feel neutral about getting along with online teaching (#11 at 0, distinguishing). Instead, the faculty represented by Factor 3 miss the interactions with students (#5 at +5, distinguishing), including being in a physical classroom with students (#32 at +5, distinguishing). Factor 3 faculty are neutral about their ability to handle teaching online (#14 at −1, distinguishing) and about being technologically savvy (#31 at −1, distinguishing). Yet these neutral positions may be connected to their concern with the types of learning environments that do not transfer well to online settings, including instructional laboratories, clinical experiences, and art studios (#17 at +4, distinguishing). They also miss interacting with colleagues (#6 at +4, distinguishing). Overall, Factor 3 is the most focused on hands-on and authentic learning experiences for students.

Participant #2 is a male, STEM, part-time instructor. He wrote:

If it were easy to create a substitute for laboratory experience, laboratory experience would not be required in the first place. Labs without manipulation of equipment, taking data, organizing that data and making sense of it is not a lab. Simulations can't satisfy all the parts. I teach Labs. Labs require interaction - if not then why have them. Difficult to replace hands on experience of labs.

Similarly, Participant #8 is a male, tenured, Humanities faculty member. He wrote (where the first sentence pertains to his placement of statement 17, replicating certain types of learning experiences, at +5):

This is self-explanatory - there is no online substitute. I don't think online courses are a good idea for undergrads. Most of our undergrads first need to learn to be students and that is best done in face-to-face environment. I really value getting to know my students and also value their interactions with each other. Actually, because students think that you are on-call all the time, online courses become very inconvenient invading all sections of your life. Also, setting up each course is subject to far more review than in-person classes, review that begins to cross the line into academic freedom issues.

Participant #6 is in the Fine Arts and concerned with replicating experiences online. She wrote:

I'm doing the best I can. I think I've done pretty ok. I changed assignments and approaches to the class to fit the new technology. It's better than nothing. But it's not why I signed up to be a professor, it's not anywhere near as good, and none of us like it. I'm in Theatre. I always knew and valued the importance of being in a room with other people. I work in theatre, a collaborative art form. We form close relationships with our students, most of whom acknowledge they learn better face to face. I have chosen my whole life to be in theatre, and though it requires perhaps many more contact hours with students (normally) than other disciplines, there's a reason I chose it. I'm doing the best I can but not to be able to make art with students? To hold in person discussions? To get up and move? Everything I love about what I teach is gone. My colleagues in my department and I aren't just faculty, we are also a functioning production company. We are all better for the exchange of ideas, the problem solving, the art we make together. Some people dread faculty meetings, but our weekly meetings are productive and effective. We still are getting work done virtually, but it is not the same.

Finally, Participant #35 is a tenured female faculty member who teaches physics. She wrote:

I hate teaching online classes. I really hate having to suddenly try to make physics labs be online when the WHOLE POINT is to get them familiar with equipment. I hate students not being able to just drop by my office whenever and ask a quick question in person and me being able to easily point at something in their work or draw a quick sketch or whatever they need. I hate trying to write long equations in emails. I hate not having the classroom environment where everybody's able to easily ask questions and discuss. I hate that some of my students have just vanished and I don't know what happened to them or how to help them. I hate it all. I am literally counting the days. I just have to get done. I just gave an online test, timed to an hour, tried my best to find a balance of security vs reasonable and still had people getting the questions up onto Chegg.¹ God grant me strength to get through this semester. And there is just no comparison with paper - I cannot fully express how strongly I believe that the ability to write out problems and easily show work for a physics test is crucial. There's no way to "manage" this. I worked 12 hours today and should still be working now.

This sense of just wanting the semester to be over, expressed by Participant #35, is also represented by the placement of statement 27 at +3, distinguishing. Yet those on this view felt prepared for the transition from F2F to online (#10 at −3, distinguishing). Perhaps this is also why those on this view feel neutral about having had an extended spring break to get ready for the emergency transition to online teaching (#15 at 0, distinguishing).

The age of those represented by this factor is similar to that of the other factors and of the P-set overall: the mean age is 48 years with a standard deviation of 12 years. There are 13 males and 14 females on this viewpoint, giving it a larger male representation, percentage-wise, than the overall P-set and the other two factors. Two-thirds of the Fine Arts faculty in the P-set are represented by this viewpoint. The distribution across disciplines is as follows: STEM (8), Social Science not including Education (6), Humanities (8), Health Professions (1), Fine Arts (2), and Other (2).

Consensus

Although Q allows the researcher to describe multiple, divergent viewpoints about a topic, Q also provides consensus statements. Within this study, there are three consensus statements. There is general agreement that the faculty want to be able to meet F2F with at least some students (#25 at +1, +1, and +2, respectively). Faculty slightly disagree that they did not realize how important F2F classes were to them until now (#23 at −1, −2, and −2). Most concerning, however, is the agreement across all three factors that these faculty are afraid of how the coronavirus situation will affect the financial health of their university (#3 at +3 for each factor). No doubt, fear of the implications of financial problems at faculty’s institutions adds to an already stressful situation for faculty.
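
The study's consensus statements were identified with dedicated Q software (Ken-Q Analysis; Banasick, 2019), which applies significance tests to factor z-scores. Purely as an illustration of the underlying idea, the sketch below flags statements whose placements are similar across all factor arrays; the `max_spread` threshold of 1 is an illustrative simplification, not the statistical criterion the software uses.

```python
def consensus_statements(factor_scores, max_spread=1):
    """Return statement numbers whose array placements differ by at most
    `max_spread` across all factors -- a rough consensus heuristic, not
    the z-score significance test used in actual Q analysis."""
    consensus = []
    for stmt, scores in factor_scores.items():
        if max(scores) - min(scores) <= max_spread:
            consensus.append(stmt)
    return consensus

# Placements (-5 to +5) for the three factors, using the values
# reported for this study's consensus statements; #16 is included
# as a contrasting example of a statement that distinguishes factors.
scores = {
    25: (1, 1, 2),     # want to meet F2F with at least some students
    23: (-1, -2, -2),  # did not realize importance of F2F until now
    3:  (3, 3, 3),     # fear for the university's financial health
    16: (2, -1, -5),   # online will make life easier (not consensus)
}
print(consensus_statements(scores))  # → [25, 23, 3]
```

Statement #16 is excluded because its placements span seven scale points across the factors, whereas the three consensus statements vary by at most one point.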

Discussion

In March of 2020, the COVID-19 pandemic led colleges and universities to move F2F courses online in a hurried way to help prevent the spread of the virus. The need to implement social distancing meant that faculty left their campuses to teach remotely from home and students had to leave campus to learn remotely. Because the switch from F2F to online occurred at about mid-semester, it created a unique opportunity to investigate certain assumptions about online courses and faculty: the situation allowed faculty to compare their F2F experience directly with their online experience. However, it is worth noting that a worldwide pandemic simultaneously affected faculty’s life situations.

Three divergent viewpoints emerged in this study. Although none of these views rejected technology or even online courses in general, the Factor 1 view, Techies who like to teach, especially embraced technology. The Factor 3 view, It’s about what cannot be done online, is focused on the interpersonal aspects of teaching as well as the types of student learning situations that cannot be replicated online because they require specialized equipment, hands-on experiences, collaborative arts, etc. Factor 3 participants miss their F2F interactions, in alignment with findings by Shea (2007). Those on Factor 3 also struggle with the difference in the mode of communication between themselves and their students, as suggested by the Bettinger and Loeb (2017) study. This view is also concerned with student cheating, as F2F assessments help prevent the use of online answer services (e.g., Chegg), unwanted collaboration among students, and other opportunities for academic dishonesty.

Faculty represented by Factor 2, Overwhelmed as human beings, are trying to balance their teaching duties with stressful life situations that have intensified due to the pandemic, including the sudden need to homeschool children, care for family members, care for students, care for colleagues, and care for themselves. Recall that studies indicate that online teaching is more demanding for an instructor than F2F teaching (Andersen & Avery, 2008; Cavanaugh, 2005; Tomei, 2006) and that, of course, this teaching was accompanied by a worldwide pandemic. This view is predominantly populated by women, demonstrating that within the pandemic situation female faculty are more likely to be providing caregiving to others, which is not unusual according to Guarino and Borden (2017). This viewpoint especially indicates that university administrators need to consider faculty’s mental health during the pandemic.

Working from home created issues for numerous faculty in this study, not just those on Factor 2; however, these issues were mostly general technology problems. Some faculty experienced difficulty getting technical support from the university and faced the limitations of home internet service, especially compared to what is available on campus. Additionally, caregivers had the added problems of shared internet access, homeschooling children, and other issues not associated with working on campus. Generally, faculty also missed their F2F interactions with students. There is a general sense of frustration among the faculty, especially for the Factor 3 view (It’s about what cannot be done online). The Factor 3 faculty reject any conceptualization that technology and online instruction can replace all F2F learning experiences. Those on this view also miss direct interactions with students as well as colleagues.

Faculty in this study consistently commented on their ability to draw on their prior online teaching experience. As suggested by other research (Hachey et al., 2013; Harrison et al., 2017), these faculty were able to draw on their LMS experience as they transitioned to 100% online instruction. This is an important distinction because these faculty already possessed competence and skills for teaching online, which Frazer et al. (2017) found important for successful online instruction. These faculty were familiar with the technologies required for online teaching. Yet they also acknowledged that teaching online is time consuming, as suggested by previous studies (Andersen & Avery, 2008; Cavanaugh, 2005; Tomei, 2006; Tomei & Nelson, 2019). Overall, faculty remained focused on creating the best teaching and learning experience possible for their students. The general sense is that faculty felt frustrated with their ability to best support their students within the new normal of teaching online from home.

Limitations

This study has several limitations. For instance, unlike Martin et al. (2019) and Shea (2007), this study did not collect participants’ institution type (e.g., public versus private). Other institutional characteristics, such as size (small, medium, large) and focus (instruction versus research), were also not collected because no prior research indicated that this information was relevant for differentiating viewpoints. These are nonetheless limitations, and future studies should address institution type in relation to faculty viewpoints about emergency remote instruction.

Conclusion

Overall, online teaching experience during the pandemic did not alter faculty skepticism about teaching online, in contrast to previous research. However, faculty remained focused on students and their learning during the pandemic and this move to emergency remote instruction. This study demonstrates that faculty possess the specific competence and skills for teaching online that Frazer et al. (2017) stressed as key for online teaching. In other words, this study’s findings indicate that creating the best learning situations for students is not one-size-fits-all, especially when courses or disciplines are laboratory based. Frustrations felt by this study’s faculty participants tend to focus on determining new meaning within the faculty-student relationship after the move online. Feeling connected to students was often more difficult for the faculty because of the distance inherent in the online environment. Despite the ongoing pandemic, faculty were focused on doing the best possible job teaching their students. Yet the findings also imply that administrators should provide mental health support for faculty within this type of unique situation.

Administrators should treat the move online due to the COVID-19 pandemic as a way to understand the needs of faculty. The Factor 1 view especially enjoys using technology within their courses. Additionally, neither the Factor 2 nor the Factor 3 view rejects technology. Instead, there is a sense of the limitations of teaching online and of instructional technologies. Based on the findings of Shea (2007) and of Allen and Seaman (2016), it is likely that administrators think the pandemic situation in higher education will lead to greater faculty acceptance of teaching online. However, this study’s results indicate that administrators would be mistaken to assume this situation would transform faculty and decrease their skepticism about teaching online. Without a doubt, faculty are on the frontlines whether teaching F2F or online. Faculty are also in direct contact with students, often daily, and are the content experts. These are important aspects of academic expertise.

In this study, faculty did not struggle with knowing how to use technology, although sometimes the technology was uncooperative or out of date. However, providing outdated technology to full-time faculty is costly in human capital: it does not allow faculty to do their best for students and, therefore, affects students’ learning and success. Participants’ comments indicate that technical support for faculty, as well as for students, was spread too thin in many cases during the move to online instruction and in the months that followed. Additionally, like much of the literature about teaching online, university development and design services may focus too much on the design aspects of online instruction without an understanding of the course content and disciplinary nuances, especially when it comes to teaching laboratory-based courses. Certainly, this study identifies that the design and implementation of online courses, even when done in a rush, should be differentiated based on discipline, course level, and instructor pedagogy. In other words, online instruction is not a one-size-fits-all model. Thus, the results of this study indicate that faculty should be the judges of online instruction and its limitations based on their students, the learning environment, and the discipline. In closing, we want to stress the need for administrators and staff to understand that technology has limitations, that faculty are the content experts, and that administrators need to ensure that faculty have the necessary equipment to use technology in all of their courses, not just those online.

Data Availability

None.

Notes

  1. Chegg has been accused of institutionalizing cheating in higher education by providing textbook solutions and solutions to course assessments such as test questions (McKenzie, 2019).

References

  1. Allen, I. E., & Seaman, J. (2016). Online report card: Tracking online education in the United States. Babson Park, MA: Babson Survey Research Group and Quahog Research Group. http://onlinelearningsurvey.com/reports/onlinereportcard.pdf


  2. Andersen, K. M. & Avery, M. D. (2008). Faculty teaching time: A time comparison of web-based and face-to-face graduate nursing programs. International Journal of Nursing Education Scholarship, 5(1), 1–12. https://doi.org/10.2202/1548-923X.1539


  3. Banasick, S. (2019). Ken-Q analysis (Version 1.0.6) [Software]. Retrieved from https://shawnbanasick.github.io/ken-q-analysis/. https://doi.org/10.5281/zenodo.1300201

  4. Bettinger, E. & Loeb, S. (2017). Promises and pitfalls of online education. Economic Studies at Brookings: Evidence Speaks Reports, 2(15), 1–4. Retrieved from https://www.brookings.edu/research/promises-and-pitfalls-of-online-education/


  5. Branch, R. M. & Dousay, T. A. (2015). Survey of instructional design models, Association for Educational Communications and Technology, 1-106. Retrieved from https://aect.org/survey_of_instructional_design.php

  6. Brown, S. R. (1980). Political subjectivity: Applications of Q methodology in political science. Yale University Press.

  7. Brown, S. R. (2008). Q methodology. In L. M. Given (Ed.), The sage encyclopedia of qualitative research methods. (pp. 700–704). Sage.

  8. Brown, S. R. & Robyn, R. (2004). Reserving a key place for reality: Philosophical foundations of theoretical rotation. Operant Subjectivity, 27(3), 104–124. https://doi.org/10.15133/j.os.2004.004


  9. Cavanaugh, J. (2005). Teaching online-A time comparison. Online Journal of Distance Learning Administration Content, 8(1). Retrieved from https://www.westga.edu/~distance/ojdla/spring81/cavanaugh81.htm.

  10. Deignan, T. (2009). Enquiry-based learning: Perspectives on practice. Teaching in Higher Education, 14(1), 13–28. https://doi.org/10.1080/13562510802602467


  11. Frazer, C., Sullivan, D. H., Weatherspoon, D., & Hussey, L. (2017). Faculty perceptions of online teaching effectiveness and indicators of quality. Nursing Research & Practice, 1–6. https://doi.org/10.1155/2017/9374189

  12. Godor, B. P. (2016) Moving beyond the deep and surface dichotomy; using Q Methodology to explore students’ approaches to studying. Teaching in Higher Education, 21(2), 207-218. https://doi.org/10.1080/13562517.2015.1136275


  13. Guarino, C. M., & Borden, V. M. H. (2017). Faculty service loads and gender: Are women taking care of the academic family? Research in Higher Education, 58, 672–694. https://doi.org/10.1007/s11162-017-9454-2


  14. Hachey, A. C., Conway, K. M., & Wladis, C. W. (2013). Community colleges and underappreciated assets: Using institutional data to promote success in online learning. Online Journal of Distance Learning Administration, 15(1). Retrieved from http://www.westga.edu/~distance/ojdla/spring161/hachey_wladis.html

  15. Harrison, R., Hutt, I., Thomas-Varcoe, C., Motteram, G., Else, K., Rawlings, B., & Gemmell, I. (2017). A cross-sectional study to describe academics' confidence, attitudes, and experience of online distance learning in higher education. Journal of Educators Online, 14(2), Retrieved from https://files.eric.ed.gov/fulltext/EJ1150576.pdf.

  16. Hodges, C., Moore, S., Lockee, B., Trust, T., & Bond, A. (May 27, 2020). The difference between emergency remote teaching and online learning. Educause Review. Retrieved from https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning

  17. Kopcha, T. J., Rieber, L. P., & Walker, B. B. (2016). Understanding university faculty perceptions about innovation in teaching and technology. British Journal of Educational Technology, 47(5), 945–957. https://doi.org/10.1111/bjet.12361


  18. Martin, F., Budhrani, K., & Wang, C. (2019). Examining faculty perception of their readiness to teach online. Online Learning, 23(3). https://doi.org/10.24059/olj.v23i3.1555

  19. McKenzie, L. (March 12, 2019). The wrong partnership? Purdue professors say it is wrong for the university to work with Chegg, whose services they see as helping students cheat. Inside Higher Education. Retrieved from https://www.insidehighered.com/news/2019/03/12/purdue-professors-criticize-writing-partnership-chegg

  20. McKeown, B. (2001). Loss of meaning in Likert scaling: A note on the Q methodological alternative. Operant Subjectivity, 24, 201–206.


  21. McKeown, B., & Thomas, D. (2013). Q methodology. Sage.

  22. Newman, I. & Ramlo, S. (2010). Using Q methodology and Q factor analysis in mixed methods research. In A. Tashakkori, & C. Teddlie (Eds.), Handbook of mixed methods in social & behavioral research (Second ed., pp. 505–530). Sage.

  23. NPR. (March 11, 2020). Public health experts encourage social distancing to 'flatten the curve' of infection. All Things Considered. [Radio broadcast episode] https://www.npr.org/2020/03/11/814603316/public-health-experts-encourage-social-distancing-to-flatten-the-curve-of-infect

  24. Ramlo, S. (2011). Facilitating a faculty learning community: Determining consensus using Q methodology. Mid-Western Educational Researcher 24(1), 30–38.


  25. Ramlo, S. (2020a) Free speech on US university campuses: Differentiating perspectives using Q methodology, Studies in Higher Education, 45(7), 1488–1506. https://doi.org/10.1080/03075079.2018.1555700


  26. Ramlo, S. (in press / 2020b). Q methodology as mixed Analysis. In A. Onwuegbuzie & B. Johnson (Eds.), The Routledge handbook reviewer’s guide for mixed methods research analysis. Routledge.

  27. Ramlo, S. (2020c). Subjectivity as subjective-science: Campus freedom of speech as an intensive single-case study. International Journal of Research & Method in Education. https://doi.org/10.1080/1743727X.2020.1735336

  28. Ramlo, S. (in review). COVID-19 response: Divergent student views about moving face-to-face classes online. Manuscript submitted for publication.

  29. Shea, P. (2007). Bridges and barriers to teaching online college courses: A study of experienced faculty in thirty-six colleges. Journal of Asynchronous Learning Networks, 11(2), 73–128. https://doi.org/10.24059/olj.v11i2.1728


  30. Stephenson, W. (1953). The study of behavior: Q-technique and its methodology. University of Chicago Press.

  31. Stephenson, W. (1986) William James, Niels Bohr, and complementarity: II-Pragmatics of a thought. Psychological Record, 36, 529–543. https://doi.org/10.1007/BF03394971


  32. Tomei, L. A. (2006). The impact of online teaching on faculty load: Computing the ideal class size for online courses. Journal of Technology and Teacher Education, 14(3), 531–41.


  33. Tomei, L. A. & Nelson, D. (2019). The impact of online teaching on faculty load revisited: Computing the ideal class size for traditional, online, and hybrid courses, International Journal of Online Pedagogy and Course Design (IJOPCD), IGI Global, 9(3), 1–12. https://doi.org/10.4018/IJOPCD.2019070101


  34. World Health Organization Director-General (March 11, 2020). COVID-19 media briefing. Retrieved from https://www.who.int/director-general/speeches/detail/who-director-general-s-opening-remarks-at-the-media-briefing-on-covid-19-11-march-2020


Acknowledgments

There are no acknowledgements; no funding; no grants associated with this manuscript.

Author information


Corresponding author

Correspondence to Susan Ramlo.

Ethics declarations

Conflicts of Interest/Competing Interests

None.

Ethical Approval

This study was approved by The University of Akron’s Institutional Research Board for research with human subjects. This approval includes language related to:

˗ Consent to participate

˗ Consent for publication

Code Availability

N/A

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Web of Science Researcher ID: AAH-7052-2019


About this article


Cite this article

Ramlo, S. The Coronavirus and Higher Education: Faculty Viewpoints about Universities Moving Online during a Worldwide Pandemic. Innov High Educ (2021). https://doi.org/10.1007/s10755-020-09532-8


Keywords

  • Pandemic
  • Coronavirus
  • Faculty
  • Higher education
  • Q methodology