Introduction

Team-Based Learning (TBL) is one specific course design framework used to promote active, collaborative learning (Michaelsen 1983; Michaelsen and Black 1994; Michaelsen et al. 2004; Sibley and Ostafichuk 2014). Combining individual accountability with team problem solving, TBL improves students’ academic performance and self-efficacy (Haidet et al. 2014; Liu and Beaujean 2017). Sweet (2010) described TBL as:

A special form of small group learning using a specific sequence of individual work, group work, and immediate feedback to create a motivational framework in which students increasingly hold each other accountable for coming to class prepared and contributing to discussion. (as cited in Sibley and Ostafichuk 2014, p. 6)

Within TBL, course content is divided into modules of specific and sequenced elements. For the duration of the course, students are assigned to permanent, diverse teams of 5–7 students (Fink 2002); thus, the number of teams is contingent upon the number of students in the course. During the Preparation Phase, students independently prepare by reading, viewing videos, or completing activities. The Readiness Assurance Process (RAP) includes an assessment—aligned with the preparation material—first taken as an individual (individual Readiness Assurance Test [iRAT]), then repeated as a team (team Readiness Assurance Test [tRAT]), followed by immediate feedback, appeals, and a mini-lecture. At this point in the module, students have the foundational knowledge needed to engage with content through Application Activities. Application Activities follow a 4S structure; that is, Application Activities are centered on a significant problem, all teams receive the same problem, teams must make a specific choice to solve the problem, and teams’ responses are revealed simultaneously.

While proven effective in higher education (Haidet et al. 2014; Liu and Beaujean 2017; Swanson et al. 2019), TBL is primarily delivered in face-to-face settings. The number of undergraduate and graduate students enrolled in at least one online course is steadily increasing (Allen and Seaman 2017), with most, if not all, students now having some portion of their coursework delivered online amidst the coronavirus pandemic. Although a large number of students are taking online courses, many challenges remain, with specific concerns about limited interactions with classmates and instructors (McBrien et al. 2009), a lower sense of belonging (Peterson et al. 2018), the inability to gauge social cues through text-based communication (Vu and Fadde 2013), delays in instructor feedback (Lowenthal et al. 2017), and difficulty using technology (Martin et al. 2012). Faculty have also reported challenges in teaching online, including a lack of proficiency with the needed technology and the large time commitment required to ensure all course components are accessible to and understood by students (Ward et al. 2010). In many cases, faculty believe typical online instruction—asynchronous and instructor-centered (Taylor and Maor 2000)—has less value than teaching face-to-face (McQuiggan 2012; Wingo et al. 2017).

Adapting TBL to online instruction may provide students the same learning opportunities and benefits associated with TBL in face-to-face settings. Although efforts to adapt TBL within fully asynchronous and fully synchronous courses have been shared (Clark et al. 2018; Franklin et al. 2016; Samuel and Hinson 2010; Palsolé and Awalt 2008), a model that leverages both modes of engagement might be helpful for particular courses and/or instructors. As such, we designed the Integrated Online—Team-Based Learning model (IO-TBL) to combine the flexibility of asynchronous engagement and the connectedness offered through synchronous meetings. The purposes of this paper are two-fold: to describe IO-TBL and to detail students’ perceptions of IO-TBL using the Community of Inquiry framework.

Review of Literature

Few studies and reports have detailed the adaptation of TBL to fully asynchronous and fully synchronous courses (Clark et al. 2018; Franklin et al. 2016; Palsolé and Awalt 2008; Samuel and Hinson 2010). The affordances, challenges, and adaptation of TBL to asynchronous and synchronous online instruction are discussed below.

Asynchronous Online Instruction

The Online Learning Consortium (OLC 2015) defined asynchronous online courses as those that do not require face-to-face meetings or on-campus activity; hence, all coursework can be completed online. General affordances of asynchronous environments include the ability to work anywhere at any time, the elimination of travel to and from campus, and flexible access to instruction (Cook et al. 2011; Martin et al. 2012; Romero-Hall and Vicentini 2017; Yamagata-Lynch 2014).

Despite these benefits, there are challenges with asynchronous learning. Wegerif (1998) found that students felt that collaborative activities within the course were not effective and that it was challenging to develop a sense of community with their classmates through online discussion posts. Peterson et al. (2018) also found that students communicating asynchronously had lower perceptions of belonging, which suggested negative effects on cooperative learning. Rourke and Kanuka (2007) examined the barriers students face when using online discussion forums and found that little time was devoted to discussion within the forum as the pacing of the course pressed students to continue with the content; students desired more time to actually discuss the content.

TBL

When adapting TBL to an asynchronous course, many of the TBL elements typically completed in a single face-to-face class session are adjusted to span multiple days. Palsolé and Awalt (2008) had students complete the iRAT (1–2 days), allowed teams time to discuss their individual responses, and required one team member to compile the responses and retake the tRAT (2–3 days). They described this entire process as taking about 3–5 days. A similar procedure was used for Application Activities; all teams were provided the same problem and given 5 or 6 days to engage with and submit the activity. Although TBL within an asynchronous course requires activities that might be completed in a single class session to span multiple days, Samuel and Hinson (2010) identified multiple benefits of asynchronous engagement. In a typical face-to-face course, not all teams have the opportunity to share their reasoning due to time constraints (Samuel and Hinson 2010). In an asynchronous setting, all teams’ justifications and reasoning—for team assessments and Application Activities—are available to both the instructor and the other teams (Samuel and Hinson 2010).

When examining the effectiveness of TBL within an asynchronous course, Palsolé and Awalt (2008) concluded that TBL improved both student performance and retention compared to previous iterations of the course. While all but two teams performed well in the asynchronous TBL course, the authors were able to predict student performance from the level of team engagement (i.e., the frequency and quantity of discussion board posts). They likewise reported increased student retention (about a 90% retention rate) in comparison to semesters of the course prior to implementing TBL. Overall, Palsolé and Awalt attributed the success of the adaptation to the efforts given to building team cohesion through introductory exercises and frequent opportunities for peer evaluation. They also reported that while the initial adaptation was time consuming, the student outcomes made their efforts worthwhile.

Synchronous Online Instruction

With recent developments in technology, online learning environments allow students and faculty to meet virtually from any location at any time. These offerings are referred to as synchronous distributed courses and are defined as those that extend classroom exercises in real time (OLC 2015). Synchronous online education has been shown to create a sense of connectedness between students and their instructor (Martin et al. 2012; Peterson et al. 2018). Peterson et al. (2018) found that students communicating synchronously engaged in analytical thinking and were more likely to take academic risks, express authoritativeness, share ideas, and encourage their group. Similarly, Martin et al. (2012) found that synchronous sessions had a positive impact on students’ interaction with their instructor, their classmates, and the course content. Students appreciated the real-time conversations with their classmates and instructor, immediate feedback from the instructor, the ability to “see” the instructor to ask questions, and the ability to hear other students’ comments.

While many studies report positive findings in synchronous learning environments, there are still areas for improvement. Martin et al. (2012) found that students expressed challenges regarding the technology aspect of the course. Students noted problems with their webcams, internet connectivity, microphones, and audio delays when talking at the same time as others (Martin et al. 2012). Similarly, Romero-Hall and Vicentini (2017) found that because of technology issues, students often misunderstood or could not hear the professor, which resulted in more time spent reviewing class materials. These technology issues also made it challenging to participate in class. McBrien et al. (2009) found that within synchronous meetings, students often felt confused due to the large number of simultaneous interactions; in many cases, different individuals would try to speak at the same time, which made it difficult to focus on a specific comment. Students also reported that the absence of non-verbal communication detracted from the learning environment (McBrien et al. 2009).

In synchronous online courses, students felt it was important for instructors to provide checklists for each module/unit that include assignment due dates; students noted that this helped them organize their time and prepare for the deadlines of major assignments (Bolliger and Martin 2018; Tanis 2020). Students also believed it was critically important to receive prompt and constructive feedback, noting that a “lack of feedback” was “detrimental to their online learning experience” (Tanis 2020, p. 9). Additionally, students believed it was important for course instructors to incorporate strategies to engage them with course materials and with their peers in the online learning environment (Bolliger and Martin 2018), which creates opportunities to build community and make sense of what they are learning (Mehall 2020). However, during synchronous sessions, students felt it was not important for course instructors to incorporate various interactive teaching platforms (e.g., Pear Deck, Kahoot) to interact with students (Bolliger and Martin 2018).

TBL

The adaptation of TBL to a synchronous course remains similar to that of TBL in a face-to-face course, but with student and instructor interactions occurring simultaneously in a virtual classroom. Franklin et al. (2016) adapted and then examined TBL within a synchronous setting in the context of a pharmacy program. To determine whether TBL was comparable across face-to-face and synchronous online settings, one module was implemented with 70 students in a face-to-face setting and 222 students in a synchronous online setting. Online students were provided 11 different day and time options for when they would engage with the module. The module included an iRAT, tRAT, and an Application Activity. In comparing the two settings, performance on the iRAT was similar across both groups; however, students who engaged with TBL online performed significantly better on the tRAT. While students participating in synchronous TBL were able to perform as well as or better than students in face-to-face TBL, they rated teamwork and team interdependence significantly lower than those who participated in face-to-face TBL. In comparing module preparation time, face-to-face instructors spent 1.5 h preparing, while online instructors spent 5 h preparing. Students who participated in the online module also attended 0.5–1 h of technical training. Franklin et al. (2016) concluded that online TBL is a viable alternative to face-to-face TBL, although the adaptation is time intensive, particularly in learning the technology.

The Integrated Online—Team-Based Learning Model

The goal of IO-TBL is to implement TBL online by leveraging the benefits of both asynchronous and synchronous instruction while, in turn, mitigating many of the challenges associated with each. Namely, IO-TBL was designed to combine the flexibility of asynchronous engagement with the connectedness offered through synchronous meetings.

Prior to the start of each module, preparation materials and preparation objectives are provided to students through the learning management system (LMS). Each module begins with a synchronous meeting in which students start by completing the readiness assurance test. In developing this assessment, questions should align with the preparation materials and preparation objectives and are intended to determine whether students understand the key ideas from the preparation materials (Michaelsen and Sweet 2009). As such, the questions should avoid picky details and instead focus on foundational concepts, while also being difficult enough to prompt discussion within the teams (Michaelsen and Sweet 2009). RATs are multiple-choice assessments that may range from five to twenty questions. Students first take the iRAT, followed by teams collaboratively reworking the tRAT in breakout rooms (a feature of web conferencing systems such as Zoom) and then submitting each answer for immediate feedback. Based on team performance, the instructor provides a clarifying lecture, and teams are given approximately one week to submit any appeals through an online survey.

The remainder of the synchronous session is used to engage teams in Application Activities, much like in a face-to-face course. An Application Activity is introduced to the whole class, and teams are sent to breakout rooms to collaborate and select an answer. Once all teams return from their breakout rooms, answers are simultaneously reported, and the instructor facilitates cross-team discussions.

At the conclusion of the synchronous session, the instructor gives an overview of the remaining team Application Activities for the module. Teams typically work on more than one Application Activity during the out-of-class portion of the module. These out-of-class Application Activities include the 4S components, as teams work on the same significant problem, select an answer, report answers simultaneously, and then engage in cross-team discussions. These Application Activities are consistently designed in one of two ways. Design 1 requires students to complete the Application Activity twice, first individually and then again as a team. Teams’ choices are then simultaneously reported, and students independently provide justifications for their individual and team decisions. Design 2 requires teams to create a product or deliverable. Teams then view the other teams’ products, leave feedback, and make a specific choice. Teams’ choices are again simultaneously reported, and students independently respond to the feedback received and justify their team’s decision. At both mid-semester and end-of-semester, students are provided opportunities to asynchronously complete peer evaluations. An overview of an IO-TBL module is shown in Fig. 1. For a more thorough description of the model, and to view a sample IO-TBL module, see Parrish et al. (in press a; in press b).

Fig. 1 Overview of the Integrated Online—Team-Based Learning model

TBL has been accepted as an effective framework for promoting collaborative learning in higher education, primarily within the context of face-to-face settings (Haidet et al. 2014; Liu and Beaujean 2017; Swanson et al. 2019). Given both the increase in online course enrollment and the emerging need to translate active learning strategies into online courses that address the challenges of online learning, adapting TBL to online instruction may provide many of the same benefits associated with TBL in face-to-face settings. While initial efforts have been made to implement TBL in fully asynchronous and fully synchronous courses, both models have limitations. In asynchronous TBL, course activities typically completed within a single class session are extended over multiple days (Palsolé and Awalt 2008), which may require almost daily engagement with the course site and be perceived as cumbersome and less favorable by some students. Synchronous TBL has only been examined within the context of a single module, and participating students perceived teamwork and team interdependence as significantly lower than students participating in face-to-face TBL (Franklin et al. 2016). A study examining synchronous TBL beyond a single module is needed to determine whether prolonged team engagement would improve students’ perceptions of team dynamics. The current study examines students’ perceptions of IO-TBL, a model that combines asynchronous and synchronous instruction, during and at the conclusion of a semester-long course. The Community of Inquiry framework serves as the lens through which students’ perceptions were examined.

Theoretical Framework

Student perceptions from the first two implementations of IO-TBL were examined using the Community of Inquiry (COI) theoretical model. COI provides a practical way to evaluate online learning experiences with its three-part focus on cognitive presence, social presence, and teaching presence (Garrison et al. 2000). Each “presence” of the model frames aspects of the learning environment that capture interactions among students, instructors, and content.

Cognitive presence is “the extent to which the participants in any particular configuration of a community of inquiry are able to construct meaning through sustained communication” (Garrison et al. 2000, p. 89). Cognitive presence is further conceptualized through the practical inquiry model: a triggering event, exploration, integration, and resolution (Garrison et al. 2000). Through well-designed tasks, students are exposed to a triggering event in which they experience a state of dissonance or unease. Through exploration, students search for information, knowledge, or a solution to make sense of the experience. Students then integrate this new information and knowledge to develop a concept or idea. Lastly, students reach resolution as they apply the new concept or idea to the original task or challenge.

Social presence is “the ability of participants in a community of inquiry to project themselves socially and emotionally, as ‘real’ people (i.e., their full personality), through the medium of communication being used” (Garrison et al. 2000, p. 94). Collaboration among students and instructors in the expression of emotion, open communication, and group cohesion is essential for social presence, particularly as it supports cognitive presence (Garrison et al. 2000). Expression of emotion is “the ability and confidence to express feelings related to the educational experience” (Garrison et al. 2000, p. 99); open communication is characterized by “reciprocal and respectful exchanges” (Garrison et al. 2000, p. 100). Explicitly demonstrating mutual awareness and recognizing others’ ideas were identified as examples of open communication; these behaviors are of particular importance within an online medium because opportunities to build social presence through smiles, eye contact, and other nonverbal mechanisms might not be available. Group cohesion includes opportunities for focused collaboration, as critical inquiry and discourse are improved when students view themselves as part of a group.

Teaching presence is defined by two functions: the design and the facilitation of the educational experience (Garrison et al. 2000). The design, primarily completed by the course instructor, includes “the selection, organization, and primary presentation of course content, as well as the design and development of learning activities and assessments” (Garrison et al. 2000, pp. 89–90). In contrast, the facilitation of the learning environment might be shared by both the course instructor and students. Teaching presence includes three categories: instructional management, building understanding, and direct instruction. Instructional management relates to the planning and implementation of educational experiences: “setting curriculum; designing methods and assessment, establishing time parameters, and utilizing the medium” (Garrison et al. 2000, p. 101). Building understanding is focused on learners’ knowledge acquisition: “through active intervention, the teacher draws in less active participants, acknowledges individual contributions, reinforces appropriate contributions, focuses discussion, and generally facilitates an educational transaction” (Garrison et al. 2000, p. 101). Direct instruction reflects “teaching” in the truest sense: presenting content and confirming student understanding through the design and implementation of assessment (Garrison et al. 2000).

Methods

As one purpose of this study was to detail students’ perceptions of IO-TBL, a qualitative research design was used to capture and analyze students’ feelings and ideas about the model (Strauss and Corbin 1998). IO-TBL was implemented across two semesters in a course with a total of 28 graduate students seeking initial secondary (i.e., grades 6–12) teacher certification. In Spring 2018, the course was taught in two sections: one with 15 students and one with 6 students. In Summer 2018, 7 students were enrolled in the course. All enrolled students consented to participate in the study, and 64% of the participants were female.

Data

Course Evaluations

At the conclusion of each semester, all enrolled students were provided an opportunity to complete the university-administered, online course evaluation. Two questions on the evaluation were analyzed for this study: (a) Please list up to three things that you liked about this course, and (b) Please list up to three things that you would change about this course. Since students often listed multiple items in each response, comments were disaggregated by topic change. Across all sections, all but one student completed the evaluation.

Small Group Instructional Feedback

Small Group Instructional Feedback (SGIF) sessions were conducted twice each semester, at mid-semester and at end-of-semester. SGIFs are a formative evaluation process to gather information from students on their learning experiences in the course (Crow et al. 2008; Diamond 2004). The SGIF session was conducted by a third-party individual to foster open communication between the students and the facilitator to determine the students’ perceptions of the course (Diamond 2004). Students were asked three questions: (1) what is going well; (2) what suggestions do you have for improvement; and (3) what other comments does your group have about the learning environment. In their teams, students discussed answers to these questions and reported back to the SGIF facilitator. Following the SGIF session, the facilitator prepared an SGIF report that organized team responses by question.

Combining Data

All units of analysis were organized into spreadsheets by topic. The first spreadsheet was identified as “Going Well” and included students’ responses to question (a) on the course evaluation (Please list up to three things that you liked about this course) and the first question on the SGIF (What is going well?). A second spreadsheet was identified as “Needing Improvement” and included students’ responses to question (b) on the course evaluation (Please list up to three things that you would change about this course) and the second question on the SGIF (What suggestions do you have for improvement?). For responses to the third SGIF question (What other comments does your group have about the learning environment?), two researchers examined each comment by topic and assigned it to either the “Going Well” spreadsheet or the “Needing Improvement” spreadsheet. Note that the course also included field observations of practicing teachers as a program requirement; these observations were not connected to the IO-TBL structure of the course. We suspect required field observations might be similar to the clinical or practicum components included as part of courses in other disciplines, which are likewise not connected to the structure of the course.

Analysis

For each spreadsheet, two researchers (Authors 1 and 3) coded the responses pairwise for similarity; that is, each pair of responses was marked as either similar or dissimilar. This produced a concept map in which vertices represented student responses and similar responses were connected by an edge. We then applied the cluster analysis technique described by Balan et al. (2015) and Kane and Trochim (2007) to group the statements into clusters. However, that method only allows for a single coder; as in Lewis and Estis (2020), we therefore applied the more robust Louvain cluster detection algorithm. The Louvain algorithm maximizes modularity on weighted graphs, allowing us to use multiple coders; further technical details can be found in Blondel et al. (2008). Once the clusters were identified, three researchers (Authors 1, 2, and 3) collectively assigned each cluster to one of the three components of the COI model by examining how the student comments aligned with cognitive presence, teaching presence, and social presence, as defined by Garrison et al. (2000).
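To make the clustering step concrete, the sketch below illustrates the general approach under stated assumptions: two coders’ pairwise similarity judgments become edge weights on a graph of student responses, and the Louvain algorithm partitions the graph into clusters. This is not the authors’ code; the responses, the judgments, and the use of networkx’s louvain_communities (available in networkx 2.8 and later) are illustrative stand-ins for the tools actually used in the study.

# A minimal sketch (not the authors' code) of the clustering step described above.
# Two coders' pairwise similarity judgments are combined into edge weights on a
# graph of student responses, and Louvain community detection groups the
# responses into clusters. Assumes networkx >= 2.8; data are illustrative.

import itertools
import networkx as nx
from networkx.algorithms.community import louvain_communities

responses = [
    "The group activities have been the greatest strength of the course.",
    "Being able to discuss things as a team enriches the class experience.",
    "The readings and textbooks are excellent.",
    "The textbook is a practical resource.",
]

# coder_x[(i, j)] == 1 means that coder judged responses i and j as similar.
coder_a = {(0, 1): 1, (2, 3): 1, (0, 2): 0, (0, 3): 0, (1, 2): 0, (1, 3): 0}
coder_b = {(0, 1): 1, (2, 3): 1, (0, 2): 0, (0, 3): 0, (1, 2): 1, (1, 3): 0}

G = nx.Graph()
G.add_nodes_from(range(len(responses)))

# Edge weight = number of coders who marked the pair as similar,
# so agreement between coders strengthens the tie between two responses.
for i, j in itertools.combinations(range(len(responses)), 2):
    weight = coder_a.get((i, j), 0) + coder_b.get((i, j), 0)
    if weight > 0:
        G.add_edge(i, j, weight=weight)

# Louvain community detection maximizes modularity on the weighted graph.
clusters = louvain_communities(G, weight="weight", seed=42)
for k, cluster in enumerate(clusters):
    print(f"Cluster {k}:")
    for idx in sorted(cluster):
        print(f"  - {responses[idx]}")

Running this sketch on the illustrative data groups the two teamwork comments into one cluster and the two textbook comments into another, mirroring how agreed-upon similarities pull responses together.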

Researcher Role

While the instructor of the course was a researcher in this study, care was given to minimize potential bias in both data collection and analysis. As noted above, the SGIF sessions were conducted by a third-party individual to foster open communication. Similarly, students completed the course evaluations anonymously, and results were not released to the course instructor until final grades were posted. A second researcher independently coded the data alongside the course instructor. The process of assigning each cluster to one of the three COI presences was completed by the instructor and two additional researchers. In all cases, the data were anonymous and did not include any identifying information.

Results

When students were asked what was going well in the course, seven clusters emerged: observations, RATs, course content, synchronous online meetings, learning, teamwork, and instructor. When students were asked what specific suggestions they had for improving the course, nine clusters emerged: online tools, RATs, observations, logistics of observation, peer evaluation, workload, more lecture, team composition, and time commitment. Table 1 displays the cluster name, the associated COI presence, the response category (Going Well or Suggestion), a representative student quote, and the number of responses. Note that because data were collected at various time points across each semester and were anonymized, the number of responses (n) in Table 1 indicates only the number of responses per cluster, not the number of respondents.

Table 1 Cluster overview

For each spreadsheet, a Fisher-Freeman-Halton test was used to determine whether the size of each cluster depended on the semester. For both Going Well and Suggestions, there was no significant difference in cluster membership between the two semesters (p = 0.135 and p = 0.356, respectively).
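For readers who wish to run a similar check, the sketch below approximates the Fisher-Freeman-Halton exact test with a Monte Carlo permutation test that conditions on the same row and column totals. The cluster-by-semester counts are illustrative rather than the study’s data, and the exact p-values reported above would typically come from statistical software implementing the exact test (e.g., R’s fisher.test).

# A minimal sketch (illustrative counts, not the study's data) of testing whether
# cluster membership depends on semester. A Monte Carlo permutation test that
# holds both margins fixed approximates the Fisher-Freeman-Halton exact test,
# using the chi-square statistic to compare permuted tables to the observed one.

import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)

# Rows: clusters; columns: Spring 2018, Summer 2018 (hypothetical counts).
observed = np.array([
    [20, 11],   # learning
    [18,  9],   # synchronous online meetings
    [22, 11],   # teamwork
    [14,  7],   # instructor
])

def chi_sq(table):
    # Test statistic only; the p-value comes from the permutation distribution.
    return chi2_contingency(table, correction=False)[0]

obs_stat = chi_sq(observed)

# Rebuild the raw (cluster, semester) labels implied by the table, then
# repeatedly shuffle semester labels while holding cluster labels fixed.
clusters = np.repeat(np.arange(observed.shape[0]), observed.sum(axis=1))
semesters = np.concatenate([np.repeat([0, 1], row) for row in observed])

n_sims, exceed = 10_000, 0
for _ in range(n_sims):
    perm = rng.permutation(semesters)
    table = np.zeros_like(observed)
    np.add.at(table, (clusters, perm), 1)
    if chi_sq(table) >= obs_stat:
        exceed += 1

print(f"Approximate permutation p-value: {exceed / n_sims:.3f}")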

Cognitive Presence

Cognitive presence included one cluster, related to learning (n = 31), which was identified as an aspect of the course going well (see Table 1). Comments that specified why students felt their learning had improved focused on the structure, format, or environment of the course. More specifically, students attributed their learning to the structure and predictability of the course, in that they knew what to expect and how to prepare, as well as to the accountability inherent within TBL: “Although this class was hard, the team-based approach definitely caused greater accountability and I learned much more than in a regular online class.” The remaining comments (n = 11) did not specify why students felt their learning had improved, but noted that the class was informative and a valuable learning experience. One student stated, “I’ve learned the most from this class out of all the ones that I have taken in the program.” Although this student recognized that learning occurred, the specific part of the course that led to that learning was not described. Similar comments requested that other courses in the department adopt a similar teaching approach because of perceived learning, but again lacked specificity about which aspects of the course led to this learning.

Social Presence

Social presence included five clusters (see Table 1). The first three clusters (teamwork, synchronous online meetings, and instructor) were all identified as aspects of the course going well. The peer evaluation and team composition clusters were both identified as aspects of the course needing improvement.

Going Well

Within the teamwork cluster (n = 33), the opportunity to work in teams was identified both as a strength of the course and as a means of improved accountability and collaboration. Generally, students felt teamwork was enjoyable and positive, something they liked: “The group activities have been the greatest strength of the course.” Other comments provided additional depth about the benefits of teamwork and described the value of equal team member contributions, opportunities to have topics clarified, and improved learning: “I love that the class has the individual and team components. Being able to discuss things as a team really enriches the class experience, and we are able to swap ideas and experiences that are beneficial to one another.” Other students described how the team component provided an additional source of accountability in their learning: “The class has much higher accountability and feedback for an online class when compared to classes that just have forums.” Some comments (n = 4) were also specific to students believing they had a great group: “Team stuff is good but got lucky with a good team.”

The synchronous online meetings cluster (n = 27) was identified as a net positive in the course, with specific comments that the meetings provided an opportunity for increased connection and accountability. Students commented that the synchronous meetings were beneficial or enjoyable: “This is the only class I have the video conferencing component for, and I really enjoy that.” Just over half of the comments (n = 15) were specific to the connectedness the synchronous meetings provided within the course. Some students attributed this connectedness to an increased sense of learning: “Being put into smaller groups really helped with understanding the material and my group is awesome.” Other students appreciated the personal connection that synchronous meetings prompted, as they were able to talk with their instructor and classmates. Consider two students’ comments: “Zoom web conferencing [video conferencing software] is very beneficial for online courses—liked the personalization the technology brought;” “The video conferencing does help give me some sense that I am actually talking to my professor and classmates.” Additionally, students enjoyed the ability to meet with their team in a breakout room during the scheduled synchronous meeting, or while completing Application Activities outside of the scheduled synchronous meeting. One student described the benefits of engaging with their team in a breakout room: “...then breaking us up into ‘Breakout Groups’ of four members is great because it lets us get to know 3 other people in the class pretty well.”

Within the instructor cluster (n = 21), students recognized that the instructor genuinely cared about their success, noting his approachability, helpfulness, and communicative nature. Other students appreciated the ability to easily communicate with the instructor, specifically noting his timely responses. Students stated, “The instructor is great about responding to emails or any questions we have” and “is very accessible and emails back promptly with helpful answers.”

Suggestions

The peer evaluation cluster (n = 7) focused on the requirement to rank classmates after building camaraderie. The Michaelsen Method of peer evaluation, which requires students to distribute a fixed number of points unevenly across their team members, was used (Sibley and Ostafichuk 2014). For example, on a team of 5 students, each student has 40 points to assign across their 4 teammates (excluding themselves) and cannot give every teammate exactly 10 points. This method forces discrimination in ratings of team member performance. Consider how one student described this conflict: “I didn’t like the way you ranked your teammates. I liked my teammates and felt they all did their duties to the team, but one had to take the fall for no real reason.”
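As a concrete illustration of this constraint, the short sketch below checks a single rater’s allocation against the two rules described above: the points must total 10 times the number of teammates, and the scores cannot all be identical. It is illustrative only; the names, the function, and the exact rules enforced are assumptions, and the TBL management software used in the course may implement additional rules.

# A minimal sketch (hypothetical, not the course's actual tool) of the
# point-allocation constraint in the Michaelsen Method described above.

def validate_michaelsen_scores(scores):
    """Return a list of rule violations for one rater's allocation."""
    problems = []
    expected_total = 10 * len(scores)  # e.g., 40 points for 4 teammates
    total = sum(scores.values())
    if total != expected_total:
        problems.append(f"points must sum to {expected_total}, got {total}")
    if len(set(scores.values())) == 1:
        problems.append("scores must differentiate: teammates may not all receive the same score")
    return problems

# Example: a team of five, so each rater allocates 40 points across 4 teammates.
allocation = {"Avery": 11, "Blake": 10, "Casey": 10, "Drew": 9}
print(validate_michaelsen_scores(allocation) or "allocation is valid")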

Team composition (n = 3) was specific to who was included on the team, team size, or having to change teams. While in some cases it was possible to create teams by students’ content focus (e.g., mathematics, social studies, science), other times the teams were interdisciplinary: “I wish I could have been in a group with others in my focus. Between English, History, and ESL [English as a Second Language], there were quite a few idiosyncrasies that made group lesson plans difficult.”

Teaching Presence

Ten clusters reflected teaching presence. Three clusters were identified as going well and included course content, observations, and RAT. Seven clusters were identified as needing improvement and included workload, time commitment, more lecture, online tools, observations, logistics of observation, and RAT.

Going Well

In the course content cluster (n = 9), students appreciated the course materials, especially the readings. One student stated, “The readings and textbooks are excellent - helps us see strategies to use in real classes.” Another student stated, “The textbook is comprehensive and a practical resource.” Students also appreciated the technology resources shared in the course: “[The professor] introduced us to what I feel was a wide variety of technical resources that we will need to keep pace with the direction education has been going.”

Within the observation cluster (n = 7), students valued the opportunity to observe teachers and felt as if they benefited from the experience. One student stated, “...it helps me prepare for what to really expect in the career” and another student stated, “the observations offered beneficial insights into secondary education.”

In the RAT cluster (n = 5), students appreciated the alignment between the readings and the RAT, stating that the preparation learning objectives guided their reading, which was more helpful than just reading and picking out key points themselves. Other students specified that they appreciated the tRAT because it allowed them an opportunity to hear their teammates’ reasoning. Another comment expressed that the option to appeal missed questions was fair.

Suggestions

In the workload cluster (n = 24), several students (n = 13) stated that the course was time consuming for a three credit-hour course. For example, one student had trouble managing the workload while also working a full-time job. Additionally, although students appreciated the team aspect of the course, some felt that meeting with their team outside of scheduled synchronous sessions was time consuming and difficult. One student stated, “Group work was very intense, it was very beneficial, but we had to meet outside the classroom a lot.” Other students noted the difficulty of coordinating with their teammates’ schedules, stating, “The group work was excessive at times. It was difficult to schedule a time to meet that would include the whole group.” One student suggested that working in a group only during the synchronous meetings would have been more advantageous.

Within the time commitment cluster (n = 22), students expressed a desire to have synchronous meeting and field observation expectations communicated during course registration. Students registered for the class under the impression it was strictly online; they were unaware of the field observations and the synchronous class meetings. As a result, some students (n = 5) did not feel they had the time to plan accordingly for their job and/or childcare responsibilities. Several students (n = 9) also mentioned reducing the number of observation hours, especially for students who are already full-time teachers, stating that the observations were burdensome for adults with children and full-time jobs.

In the more lecture cluster (n = 17), students wanted more direct instruction to clarify difficult concepts and procedures, such as writing a lesson plan: “I wish the class would have met every week, just so [instructor name] could have been around to help clarify difficult concepts.” Students also felt the assignment instructions could have been clearer: “more explicit instruction in assignments would have been helpful.”

In the online tools cluster (n = 14), students expressed a desire to have the various online tools (e.g., the LMS, the TBL management system, Google Docs, Google Sheets) streamlined into one platform. Students felt the learning environment was challenging because there were many online tools to manage, or because of technical challenges with online tools, such as out-of-class video-conferencing software crashing. Students also reported that each of their online instructors organized their course sites within the LMS differently and that this made finding information difficult; at a minimum, students requested that the calendar tool be utilized for assignment and module deadlines.

In the observation cluster (n = 7), students requested specialized topics for the observations to help them focus on specific aspects of the observation. At the time of the study, students were told to write a general reflection on the observed instruction for each of their observations. In the logistics of observation cluster (n = 7), students suggested there needed to be more communication between the university and cooperating teachers; students reported that their cooperating teacher did not always know they were coming.

In the RAT cluster (n = 6), students did not appreciate having to take an assessment before a clarifying lecture; one student stated it felt like they were pre-testing for a grade. Other comments (n = 2) noted that some of the questions were a bit confusing or felt tricky.

Discussion

Clusters of student perceptions related to IO-TBL were associated with each presence of the Community of Inquiry framework (Garrison et al. 2000). Most commonly, clusters were specific to teaching presence, followed by social presence, and then cognitive presence. In examining cluster topics, students most frequently commented on teamwork (n = 33), perceived learning (n = 31), synchronous online meetings (n = 27), workload (n = 24), and instructor (n = 21), all of which, except for workload, were identified as aspects of the course going well.

Cognitive Presence

One of the largest clusters across the dataset was specific to students’ perceived learning within the course. When students did specify which aspects of the course supported their learning, they mentioned course structure, format, and environment, all of which are reflective of the IO-TBL model. Furthermore, within COI, both social presence and teaching presence have been identified as “supports” for cognitive presence (Garrison et al. 2000). Many of the students’ comments regarding IO-TBL match this premise. Within this cluster, some of the responses were vague and did not specify which aspects of the course led to perceived learning. Rather, when students commented on specific aspects of IO-TBL that led to an increase in learning, the contributing aspect was often identified as its own cluster within either social presence or teaching presence. For example, the teamwork cluster within social presence included the following quote: “Although this class was hard, the team based approach definitely caused greater accountability.” Learning, in this case, is attributed to team learning and was therefore assigned to the teamwork cluster (in social presence), not the learning cluster (in cognitive presence). Because the contents of the clusters are mutually exclusive, students’ comments that attributed learning to a specific element of IO-TBL were not also included within the learning cluster.

Social Presence

Teamwork was frequently included in student comments as they described the benefits of completing the course alongside a team. Students identified the synchronous component of IO-TBL as an opportunity to get to know one another, just as in other studies examining the role of online synchronous engagement (Martin et al. 2012; Peterson et al. 2018). This is in contrast to one study reporting low rankings of team interdependence among online student groups (Franklin et al. 2016). One possible explanation is that Franklin et al. (2016) implemented only one module of TBL online, whereas students in this study completed an entire semester of TBL online. In alignment with the essential elements of social presence defined by Garrison et al. (2000), students explicitly expressed their feelings about IO-TBL through an appreciation of the synchronous meetings, as these meetings promoted a sense of learning and connectedness.

Peer evaluation was reported to detract from the social presence within the course, specifically group cohesion. The Michaelsen Method was selected for this course because it was the method embedded within the TBL management software utilized by the instructor and thus allowed for efficient management of peer evaluation responses. Levine (2008) reported that instructors who use the Michaelsen Method often receive pushback from higher-functioning teams, as those teams believe all team members contributed equally. An inability to develop group cohesion, and/or a disruption to group cohesion, was identified as a point for improvement in the course. For one section of the secondary education course, grouping students by content area was not feasible given the small number of students.

Teaching Presence

Many clusters of comments were specific to the design of the educational experience, a defining function of teaching presence (Garrison et al. 2000). Students were complimentary of the course design, including the selection of relevant course materials, the RATs, and the opportunities for observations.

Students felt the workload and time commitment required by the course were in need of improvement, both of which relate to instructional management (Garrison et al. 2000). At the time of registration, students were often unaware of the seven required synchronous meetings and 60 h of required observations. In addition, students reported spending a significant amount of time completing assignments with their team beyond the scheduled whole-class synchronous sessions. Although IO-TBL allows teams both synchronous and asynchronous options to complete the out-of-class Application Activities, it appeared most teams preferred meeting synchronously and thus incurred scheduling conflicts. The content of the course was redesigned the same semester in which IO-TBL was implemented, and since the time of this study, the number of observation hours has been significantly reduced. Because of this, some comments about workload were likely specific to the course assignments and not associated with the IO-TBL model. Additional suggestions for improving instructional management related to the number of platforms required within a single module, with some students adding that having to utilize so many different platforms made the course feel cumbersome. One area that was improved after reviewing student feedback was the observation reflections: instead of writing a general reflection for each observation, students are now provided observation prompts that align with the current module. Students also expressed a desire for more lecture and for clarity around procedures and assignments. To clarify assignments in future semesters, we applied the Transparency in Learning and Teaching (TILT) framework (Winkelmes et al. 2016). Each course assignment now specifies the purpose of the assignment, the task to be performed by students, and the criteria of the finished product, communicated either through a sample product or a rubric.

Implications

This study provides several practical guidelines for the implementation of IO-TBL within an online course. To start, early and frequent communication with students about the required synchronous meetings is important, both for clarity in course expectations and to provide students with an opportunity to make job or childcare arrangements. Instructors should also work to maintain the structure and predictability inherent within the IO-TBL modules, particularly the consistent deadlines for out-of-class application activities, as students identified this as key in supporting their learning. Likewise, although completing activities as a team was identified as an overwhelming benefit of the IO-TBL model, finding common times for teams to meet was reported by many as a challenge. Instructors should work to find a balance between the quantity of out-of-class application activities and the time required for students to complete them. The IO-TBL model specifies that both asynchronous and synchronous means of completing out-of-class application activities are available, but teams may choose to complete some of these activities using synchronous technology. Because of this, instructors should ensure students have access to quality synchronous software, embedded within the LMS if possible. Following the completion of application activities, including inter-team discussions, students reported a strong desire for direct instruction or feedback from the instructor. If team discussions were satisfactory, the instructor might affirm a previously stated student or team comment. Instructors should also carefully consider which peer evaluation method is used within the course. Students found it problematic to be forced to rank or deduct points from a teammate for no reason other than the design of the implemented peer evaluation method. See Sibley and Ostafichuk (2014) and Szatkowski and Brannan (2019) for alternative peer evaluation methods.

Conclusion

The goal of this study was not only to detail IO-TBL but also to examine students’ perceptions of IO-TBL through the Community of Inquiry framework. By combining asynchronous and synchronous modes of engagement in a structured TBL format, IO-TBL is capable of fostering cognitive, social, and teaching presence. This model provides one mechanism for implementing TBL online and serves as a practical strategy for educational developers and faculty to focus on active learning in the online environment.

While it might be possible to implement various components of IO-TBL within an online course, the findings highlight the interconnectedness of the course design framework. While there are clear benefits to online synchronous meetings even absent IO-TBL, the combination of synchronous meetings and assigned teams proved to be important. The largest cluster across the data set was specific to the value of learning alongside team members. We likewise posit that the sense of connectedness reported within these teams would not have been possible without synchronous web-conferencing.

A cluster specific to technology issues did not emerge within the findings, although technology challenges are commonly reported in the literature (Martin et al. 2012; Romero-Hall and Vicentini 2017). Within IO-TBL, the first synchronous course meeting is designated as an orientation that not only provides students with an overview of course expectations but also provides an opportunity to test and remedy students’ technology issues.

Limitations

The primary limitation is that there were only 28 students, with no section larger than 15 students. With relatively small course sections, it was easy to train students in the required technology and course structure; this would increase in complexity, for both the students and the instructor, in larger courses. Because all participants were from the same teacher education course, further examination of IO-TBL in larger courses, as well as in courses outside of teacher education, is warranted.

Specific to the current study were student reports of difficulties scheduling team meetings outside of synchronous class sessions. Although the IO-TBL model offers teams both asynchronous and synchronous modes of engagement for such activities, it appears teams preferred to complete many of these activities using synchronous technologies. The results of the study might not reflect an implementation of the model in which many of the teams complete the out-of-class application activities solely through asynchronous means of engagement.