Introduction

Asynchronous online learning affords students distinct opportunities for self-direction and reflective communication (Garrison, 2003). However, the absence of visual and auditory cues in online courses that rely solely or predominantly on asynchronous text-based communication can foster feelings of social isolation and a lack of community among students (Borup et al., 2013; Garrison et al., 2000; Graham, 2006; Lowenthal et al., 2021). Research suggests that one way to address social isolation in online courses is through effective communication and feedback (Young, 2006) as well as the development of social presence (Bolliger & Martin, 2018; Boston et al., 2011). However, instructors and students can struggle to communicate effectively using predominantly text-based asynchronous communication in online courses (Aloni & Harrington, 2018; Lowenthal & Dunlap, 2020). Given this, online instructors are increasingly turning to synchronous and asynchronous video to improve communication in online courses (see Belt & Lowenthal, 2021; Lowenthal & Mulder, 2017). Technological advancements have made the use of asynchronous video, in particular, much easier than in the past (see Lowenthal et al., 2020; West & Borup, 2021). While asynchronous video communication lacks the immediacy of live face-to-face communication (whether in person or online), it can provide non-verbal, affective, and visual cues that can improve online communication (West & Borup, 2021).

Online instructors have experimented with different ways to use asynchronous video in online courses. For instance, online instructors have used video lectures to provide direct instruction or to demonstrate topics to support student learning; they have used video orientations and announcements to clarify points or provide instructor commentary, support, and guidance as students progress through online courses; and they have used video-based blogs, assignments, and discussions to help students both create and interact with video-based content and/or communication (Belt & Lowenthal, 2021). Research has shown not only that students like it when their instructors use video in these different ways but also that these uses have the potential to improve student learning (Belt & Lowenthal, 2021; Lowenthal, under review). A less used, though growing, area of research and practice is the use of video feedback (Bahula & Kay, 2020; Mahoney et al., 2019).

Research on video feedback is growing due in part to the affordances video communication technology provides teachers and students. Previous research has shown learners expect feedback from instructors that is “timely, personal, explicable, criteria-referenced, objective, and useful for improvement” (Bahula & Kay, 2020, p. 6536). Among other things, video feedback holds promise to support affective communication and relationship building between instructors and students in online courses. Unfortunately, research to date has found mixed results on how much students value asynchronous video feedback (see Borup et al., 2015; Lowenthal, 2021). Further, little-to-no research has focused on student perceptions of giving and receiving peer video feedback. Therefore, understanding the role and value of asynchronous video feedback to learners is timely, especially given the COVID-19 pandemic and the need for instructors to teach in blended, remote, or fully online courses. In this paper, we present the results of an investigation of student perceptions of the screencasting style of video feedback in online courses. We conclude with areas for future research and implications for practice.

Background

Feedback from instructors to students has a rich history in education. In fact, Gould and Day (2013) claim that the single most important element in student learning and course satisfaction is feedback. Despite the abundance of feedback research (Mory, 2004) and posited principles of good feedback (see Nicol & Macfarlane-Dick, 2006), much of this research focuses on oral or written feedback. In asynchronous online courses, text-based communication is common (Garrison et al., 2000), as is text-based feedback (Ryan et al., 2019). However, text-based feedback has limitations (Borup et al., 2015). Specifically, research has found that text-based feedback can cause confusion, lack breadth and depth, and evoke negative emotional responses among students (Borup et al., 2015; Grigoryan, 2017; Ryan et al., 2019). These and other limitations of text-based feedback have prompted researchers to explore multimodal forms of feedback (e.g., audio, audio-visual, and video feedback; see Grigoryan, 2017; Ryan et al., 2019).

Early adopters of multimodal forms of feedback usually began by providing students with some type of audio feedback (Ice et al., 2007; Oomen-Early et al., 2008; Wilson, 2009). For instance, Ice et al. (2007) and Oomen-Early et al. (2008) both found in separate studies that students thought audio feedback was more effective than text feedback. They also found that audio feedback increased student engagement and recollection of the feedback and established a stronger connection between students and their instructor (Ice et al., 2007; Oomen-Early et al., 2008).

With technological advances, instructors started experimenting with providing asynchronous video feedback in different ways (Brick & Holmes, 2008; Lowenthal & Dunlap, 2011; Stannard, 2008). Mahoney et al. (2019) identified three different ways instructors give asynchronous video feedback: talking head video feedback (i.e., an instructor records themselves on a webcam as they give feedback), screencast video feedback (i.e., an instructor records their screen, showing the student’s work as they provide feedback), and screencast video feedback with a webcam (i.e., an instructor combines a screencast with a webcam recording of themselves; see Fig. 1).

Fig. 1 Common Ways to Give Video Feedback

Prior research suggests that asynchronous video feedback differs from written feedback by providing learners with richer information and helping instructors move beyond “surface-level mechanics” (Mahoney et al., 2019, p. 161; see also Borup et al., 2015). For example, research suggests that when utilizing video feedback, instructors are more likely to elaborate on specific details, helping them provide more constructive conceptual feedback (e.g., on arguments, analysis, synthesis, and judgments; Lamey, 2015; Mahoney et al., 2019; Parton et al., 2010; Thomas et al., 2017). As feedback can differ across disciplines, instructors, and institutions, it is not surprising that some research has found varied student perceptions of video feedback (see Bahula & Kay, 2020; Mahoney et al., 2019). These variations create challenges in synthesizing effective uses of video feedback. However, a small pilot study has shown some positive student perceptions of the screencasting style of video feedback in online courses (see Thompson & Lee, 2012).

Researchers have been drawn to video feedback in part due to the possibility of improving affective communication and what is sometimes broadly referred to as instructor presence. The Community of Inquiry (CoI) framework, in particular, has served as a lens for researchers to think about affective communication and instructor presence. The CoI framework posits that a meaningful educational experience consists of teaching presence, social presence, and cognitive presence (Garrison et al., 2000). Teaching presence refers to how instructional design, direct instruction, and facilitating discourse help develop social and cognitive presence (Anderson et al., 2001), whereas social presence refers to how students use affective expression, open communication, and group cohesion to establish themselves as “real” and present (Rourke et al., 1999). Some have argued that a neglected element in the CoI framework is instructor social presence (Pollard et al., 2014; Richardson & Lowenthal, 2017); instructor social presence emerges through the overlap of teaching and social presence and “is more likely to be manifested in the ‘live’ part of courses—as they are being implemented—as opposed to during the course design process” (Richardson et al., 2015, p. 259).

Many believe that video feedback has the potential not only to improve student feedback but also to improve each type of presence in online courses; however, additional research is needed to confirm this. In one study, Borup et al. (2015) found that video feedback had an impact on students’ perceptions of their instructors’ affective feedback. Students reported that video feedback helped them develop an emotional connection to their instructor, getting to know their instructor as a real person who was interested in helping them learn (Borup et al., 2015). In a follow-up study, Thomas et al. (2017) analyzed the content of instructor text-based and video-based feedback for indicators of social presence in blended courses. While they found indicators of social presence in both forms of feedback (slightly more in video-based than in text-based feedback), they ultimately found no significant differences in student preferences between the two forms. Borup and his colleagues, though, also found that despite the potential to improve affective communication and instructor presence, students and instructors preferred the efficiency of text-based feedback (Borup et al., 2015; Thomas et al., 2017). Lowenthal (2021), though, questioned, as did the authors themselves, the degree to which the results of studies like Borup et al. (2014, 2015) and Thomas et al. (2017) could have been influenced by the blended nature of the courses and the fact that the students regularly had opportunities to meet, interact, and get a sense of their instructors’ presence during in-person class sessions.

While research on video feedback is still nascent, research, theory, and practice all suggest that video feedback is an instructional strategy online instructors can use to improve instructor presence (Fiock, 2020; Fiock & Garcia, 2019; Lowenthal & Dunlap, 2018) and possibly student learning (Denton, 2014). However, more research is needed to investigate whether students in fully online courses find it helpful to both receive and give video feedback, as well as to identify more effective ways to provide video feedback in online courses. As such, the purpose of this study was to explore student perceptions of screencast style video feedback in a completely asynchronous online course.

Methodology

Given the aforementioned problem, coupled with the lack of research on students actually giving other students video feedback, we investigated students’ perceptions of video feedback in a completely asynchronous online course in which students received video feedback from their instructor and both gave and received video feedback from their peers. The following research questions guided our exploratory study:

  1. Are students satisfied with video feedback? [satisfaction]

  2. Do students think video feedback influences their learning? [perceived learning]

  3. Do students think video feedback influences their perceptions of social presence? [social presence]

Context

This study was conducted at a western university in the United States, in a fully online Master of Educational Technology program. Graduate students who took part in the study were enrolled in either a 7-week or a 15-week version of a course called “Online Course Design.” In this course, students design a fully online course and then develop a course prototype in an authoring tool of their choice. The instructor provided each student screencasting style video feedback, using TechSmith Camtasia, on a rough draft of their final course prototype. Students then, without receiving any prior training, peer reviewed one another’s final course prototypes and provided each other screencasting style video feedback using a screencasting tool of their choice. The instructor chose to provide screencasting style video feedback halfway through the course and had students provide the same style of video feedback to their peers later in the course, rather than using the video feedback tool available in the learning management system (which produces the talking head type of video feedback described earlier), because screencast feedback shows learners their work as the feedback is provided.

Data Collection

Data for this study were collected in an end-of-course survey that included, among other questions not related to this study, specific questions aligned to the research questions guiding this study (see Table 1) as well as the nine social presence questions from the Community of Inquiry Questionnaire (CoIQ; Arbaugh et al., 2008). The survey was administered to students enrolled in one of five sections of the same course, taught by the same instructor over a three-year period. A total of 84 students completed the course and the survey between 2018 and 2020.

Table 1 Survey Created to Investigate Student Perceptions of Video Feedback

Data Analysis

We downloaded the survey results into a spreadsheet and calculated descriptive statistics and frequencies for the Likert-style questions. We also reviewed the open-ended questions; however, because none of the additional comments focused on video feedback, they were not analyzed or reported in this study.
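For readers who wish to replicate this kind of analysis, a minimal sketch follows. It is an illustration rather than the actual analysis script used in the study; the file and column names are hypothetical, and it assumes 1–5 Likert responses exported to CSV with one item per column.

```python
# Minimal sketch of the descriptive analysis (hypothetical file/column names).
# Assumes 1-5 Likert responses exported to a CSV, one survey item per column.
import pandas as pd

df = pd.read_csv("survey_results.csv")  # hypothetical export file

likert_items = [
    "liked_instructor_video_feedback",       # hypothetical item names
    "liked_receiving_peer_video_feedback",
    "liked_giving_peer_video_feedback",
]

for item in likert_items:
    responses = df[item].dropna()
    mean = responses.mean()
    # Share of students who agreed (4) or strongly agreed (5)
    pct_agree = (responses >= 4).mean() * 100
    # Frequency of each response option, 1 (strongly disagree) to 5 (strongly agree)
    freqs = responses.value_counts().sort_index()
    print(f"{item}: M = {mean:.2f}; agree/strongly agree = {pct_agree:.0f}%")
    print(freqs.to_string())
```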

Results

RQ1. Are students satisfied with video feedback?

Given some of the inconsistent results in past research on students’ satisfaction with video feedback, we were first interested in whether students liked receiving instructor video feedback on assignments. Students reported that they liked it when their instructor provided individual video feedback (M = 4.29), with almost 80% either agreeing or strongly agreeing. Almost 62% either agreed or strongly agreed that they liked it when their peers gave them video feedback on their assignments (M = 3.80). However, the results suggested that students did not like giving video feedback to their peers (M = 3.35; see Table 2) as much as they liked receiving it.

Table 2 Student Satisfaction with Video Feedback

RQ2. Do students think video feedback influences their learning?

Some of our previous research suggested that students might not watch video feedback provided to them late in the semester or on assignments or projects where improvements could not later be made to improve their work and/or their final grades (e.g., final course papers; see Lowenthal, 2014). We wanted to better understand how students thought video feedback influenced their learning when given on projects where the feedback could later be applied (e.g., rough drafts of papers that can later be revised and improved). Additionally, we were interested in whether or not students reported watching the video feedback from beginning to end. About 92% agreed or strongly agreed that they did in fact watch the entire feedback video (see Table 3). It is important to note that due to student privacy concerns the first author hosted the video feedback on Google Drive; by doing so, however, we did not have any analytics on viewership or accessibility features such as auto captioning (which has since led the first author to host video feedback in Panopto). We also wanted to know whether students revised their assignments based on the video feedback they received. Almost 85% agreed or strongly agreed that they did in fact revise and update assignments based on the video feedback, and 69% agreed or strongly agreed that video feedback improved their ability to meet the course outcomes in ways that text feedback could not. However, due to the nature of this study, we were unable to verify the extent to which this is true; future studies using video feedback as an intervention could verify the degree to which students revise assignments based on video feedback.

Table 3 Video Feedback and Perceived Learning Survey Results

Lastly, we were interested in students’ perceived learning from video feedback (i.e., did video feedback contribute to their learning, and to what degree did students think they learned more from both receiving and giving asynchronous video feedback?). Students reported that they thought video feedback contributed to their learning (M = 3.90). However, they reported that instructor video feedback helped them more than either receiving or giving peer feedback (see Table 3).

RQ3. Do students think video feedback influences their perceptions of social presence?

As mentioned earlier, research suggests that video in general, and video feedback in particular, can improve social presence (Borup et al., 2015). Using the social presence questions from the CoIQ, we found that students in this study reported an average social presence response of 3.95 out of 5.00, or an average total social presence score of 35.72 out of 45. While researchers have yet to (and might never) identify an optimal level of social presence for all students or for communities of inquiry to emerge, other studies using the same questions have reported average social presence responses of 2.96 (Kilgore & Lowenthal, 2015), 3.18 (Swan et al., 2008), and 2.85 (Lowenthal & Dunlap, 2018), respectively, using a 0–4 scale compared to our 1–5 scale; this suggests that an average social presence score of 3.95 is comparable to those reported in some other studies. Specifically, when looking at affective expression, participants in this study gave the highest rating to being able to form distinct impressions of some course participants (M = 4.01). However, additional research is needed to identify whether the video feedback is associated with or responsible for these social presence scores (Table 4).
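To see why these averages are comparable despite the different response scales, note that both instruments span four scale points, so a simple linear rescaling (our own back-of-the-envelope conversion, not one reported in the cited studies) shifts a 0–4 score up by one:

```latex
x_{1\text{--}5} = 1 + \frac{x_{0\text{--}4} - 0}{4 - 0}\,(5 - 1) = x_{0\text{--}4} + 1
```

Under this mapping, the reported averages of 2.96, 3.18, and 2.85 correspond to roughly 3.96, 4.18, and 3.85 on a 1–5 scale, which bracket the 3.95 observed here.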

Table 4 Students’ Overall Perceptions of Social Presence from the Community of Inquiry Questionnaire

When asked more specifically about video feedback and social presence, students reported that they thought video feedback improved their instructor’s social presence (M = 4.12), helped them get to know their instructor better (M = 3.95), and helped them feel more connected to their instructor (M = 3.98). And while video feedback did not help them feel as connected to their peers (M = 3.70) as to their instructor, 58% either agreed or strongly agreed that it did help them feel more connected to their peers (see Table 6).

Discussion

Given the conflicting results in the literature about students’ perceptions of video feedback, coupled with the lack of research on students’ perceptions of giving and receiving video feedback, we set out in this exploratory study to investigate students’ perceptions of receiving and giving video feedback. We found that students reported satisfaction with receiving video feedback from both their instructor (M = 4.29) and their peers (M = 3.80). This finding aligns with prior research that found positive student perceptions of video feedback (Borup et al., 2015; Henderson & Phillips, 2015; Thompson & Lee, 2012), as well as a student preference for instructor-created feedback over student-created feedback (Donkin et al., 2019; Ertmer et al., 2007). Students reported lower satisfaction with providing their peers with video feedback (M = 3.35). One possible explanation for this finding is that the learners, in general, were not yet comfortable with the process of peer review. For example, Nicol et al. (2013) found that students perceived peer review positively but also found it labor-intensive and approached it with some hesitancy. A second possible explanation for the lower satisfaction with giving peer video feedback could be that students were not comfortable “grading” peers (Liu & Carless, 2006). Liu and Carless (2006) posited that resistance to peer grading may be attributed to a perceived lack of expertise, a shift in power dynamics, and a lack of time and reliability among students. Furthermore, students have previously reported negative emotional reactions to peer feedback (e.g., due to a perceived lack of expertise or accessibility problems; Bahula & Kay, 2020). A third explanation could simply be the added work involved in giving video feedback.

Given this, it may be beneficial for instructors to first highlight to students the importance of giving and receiving feedback in general, as this skill may be applicable to students’ lives in other contexts (e.g., at work or home). Additionally, instructors should consider teaching learners how to provide meaningful critique and feedback to each other, especially since the process is “not a naturally acquired skill” (Ertmer et al., 2010, p. 82). Furthermore, students may benefit from direct training and guidance on effective ways to provide video feedback specifically, as there is often an associated learning curve when adopting unfamiliar technologies in practice (Banerjee et al., 2020).

Research has shown that students value feedback and that it can enhance their overall achievement (Hattie, 2009). Therefore, it is not surprising that the students in this study reported that instructor video feedback contributed to their learning (see Table 3). One possible explanation for this finding is that students were able to enhance their comprehension by both seeing and hearing the instructor’s comments (Grigoryan, 2017). However, students in this study reported lower perceived learning from peer video feedback (M = 3.45) as well as from providing peer feedback (M = 3.30). While some studies have shown that student peer review can be as meaningful as instructor feedback for student performance (Cho & Schunn, 2007; Gielen et al., 2010), other studies have shown that students value peer feedback less than instructor feedback (Ertmer et al., 2007; Gielen et al., 2010).

In this study, perceptions of social presence were comparable to previous research across all social presence indicators. This finding aligns with previous research that found asynchronous video feedback can increase social presence among participants (Borup et al., 2015; Lowenthal, 2014). The instructor in this study only provided video feedback on the rough draft of learners’ final course project. The students were able to see exactly where the instructor was providing notes on screen (i.e., visual screen disclosure or screencasting), which, in turn, may have helped students see their projects in different ways. Lowenthal (2021) has previously suggested that instructors should be selective and strategic about utilizing video feedback, choosing projects that are “visual in nature (e.g., a multimedia presentation or website), dense or nonlinear (e.g., a spreadsheet), and/or assignments where formative feedback can be provided earlier in a course (e.g., on a rough draft of a paper)” (p. 128). Moreover, Grigoryan (2017) found that both text- and video-based feedback may help improve students’ work, and multimedia-rich projects may require equally rich forms of feedback. Thus, instructors “should prioritize the delivery of effective and affective forms of feedback” (Howard, 2021, p. 124), even though the full extent of such richness is not yet known.

There were also differences in how video feedback helped students feel connected to their instructor and their peers. While students reported that they enjoyed receiving video feedback, they reported feeling more connected to their instructor than to their peers (see Tables 5, 6). For example, students in our study reported being able to form distinct impressions of some peers (affective expression, M = 4.01) but reported less comfort disagreeing with peers while maintaining a sense of trust (group cohesion, M = 3.77). This finding suggests that students may be less interested in forming connections with peers in online settings, may fear that their contributions will disrupt other students’ experiences (or the community) in negative ways, or may be apprehensive about how they appear to other students in online settings (i.e., lack of anonymity or student social presence). Van der Pol et al. (2008) suggested that how feedback is received by students may influence subsequent behavior and that asking students to serve as experts in peer review creates challenges. Other studies have shown that positive and negative feedback can elicit different emotions (see Belschak & Den Hartog, 2009) and that students’ emotions should be acknowledged and supported with learning activities that incorporate feedback (see Värlander, 2008). Thus, peer feedback may shift the relational dynamics between students in online learning environments, and further research is needed to explore how student-created or peer video feedback influences students’ sense of community and connectedness online.

Table 5 Frequency and Descriptive Results of Students’ Perceptions of Social Presence
Table 6 How Video Feedback Helped Improve Social Presence and Connectedness

Conclusion

Most online courses rely predominantly on asynchronous text-based communication, whether through course announcements, online discussions, emails, or even feedback on assignments. However, text-based communication has some inherent constraints (e.g., it can be too brief, misunderstood by students, or lacking in quality and quantity; Grigoryan, 2017) that can lead to misunderstanding, especially when it comes to text-based feedback. Feedback in general, though, is a powerful and necessary “component of the learning process” (Nicol et al., 2013, p. 102). Therefore, it is not surprising to find instructors increasingly experimenting with ways to use video feedback in online courses. However, despite this increased use, questions remain about whether students even value this type of feedback, whether it improves their learning, and whether it improves social presence in online courses.

Limitations

There are a few limitations of this research. First, participants in this study came from one course in a fully online master’s program in educational technology. Therefore, the findings from this study should not be generalized to all online learners in all online courses. Second, as students in an educational technology graduate program, these students likely had more familiarity and comfort with creating video screencasts than students in other fields. However, these students did not receive any training on effective ways to provide peer feedback, the importance of peer feedback (both as graduate students and as future educational technology leaders), or, specifically, how to effectively provide asynchronous screencasting style video feedback. Further, as an exploratory study focused on students’ perceptions, it is possible that students reported what they thought were socially acceptable answers. It is also possible that the novelty of video feedback might wear off over time or depend on the context, although we tried to mitigate this limitation by surveying students over a span of three years. And finally, the research purpose and design focused on exploring perceptions of a single intervention without a comparison intervention.

Future Research

While we explored students’ perceptions of video feedback in terms of learner satisfaction, perceived learning, and social presence, additional research is needed on video feedback’s influence on student outcomes. Additional areas of future research include comparisons of the affordances of giving live video feedback versus asynchronous video feedback, optimal formats for providing video feedback (e.g., webcam, screencasting), and different ways to improve peer video feedback. However, requiring students to record their faces to engage in video feedback with peers creates ethical concerns in online courses. The talking head and screencast-with-webcam styles of video feedback, in particular, heighten these concerns. Indeed, such ethical concerns are evident in other uses of webcams and video communication technology (e.g., student privacy in online proctoring; Selwyn et al., 2021). Compared to the talking head or screencast-with-webcam styles, the screencasting style could arguably be the most appropriate or ethical way to provide video feedback. Yet this style has its own limitations (e.g., nonverbal cues such as facial expression, gesture, body language, and eye contact are lost) that detract from many of the perceived benefits of the other styles. These trade-offs create a conundrum, which needs further investigation, as to which affordance or combination of affordances students value most (e.g., the instructor’s voice, the movements that appear on screen, or some combination thereof).

Implications for Practice

Video feedback is likely not going away. In fact, learning management systems (LMSs; e.g., Canvas, Blackboard) increasingly have a video feedback tool built into the gradebook. However, both faculty and students need more experience, guidance, and training on how to effectively give video feedback (Grigoryan, 2017). Moreover, tools within LMSs are often limited to one type of video feedback (e.g., talking head), often do not afford other types of video feedback (e.g., screencast style video recordings, screencast video with webcam recordings), and might be accessible only to the instructor. While third-party tools may address some of the shortcomings of the built-in tools in some LMSs, we caution against adding too many third-party tools, particularly if they are going to be used for students to give other students feedback, as each new tool requires additional effort and training to use effectively.

We have identified a few strategies, informed by the literature and our own experience giving video feedback, that instructors may use when providing screencast style video feedback to students in online settings:

  • Keep video recordings short. Spend a few minutes writing down the main points that you plan to focus on. Consider using the stopwatch on your phone to keep the feedback to five minutes or less. This can help combat some of the efficiency issues instructors and students have found with creating and watching video feedback (see Borup et al., 2015; Lowenthal, 2021; West et al., 2017).

  • Highlight the areas on the screen that you are specifically providing feedback on and explain why you chose to focus on them. Research suggests that even when it lacks certain detail, being able to both “see” and “hear” instructor feedback can make comments come across more positively (Henderson & Phillips, 2015). Further, video and screencast feedback can increase social presence in the course because it is personalized and individualized to the student (Borup et al., 2012; Fiock, 2020). This becomes increasingly important in content areas where written text alone cannot convey the nuance of the feedback (Borup et al., 2015; Fiock & Garcia, 2019).

  • Use recording technologies that are easy to learn, easy to use, and ideally free or inexpensive, at least initially (West et al., 2017). In our experience, not every instructor has the disposition to effectively provide video feedback. Once a tool is selected, we recommend practicing with it to increase your comfort level, minimize possible technical issues, and ultimately improve how and when you use it (Lowenthal, 2021). Over time, as you use video feedback more, you might consider purchasing a more feature-rich screencasting application.

  • Remember that video introduces additional variables into any course. Make sure the recording technology you use works for all of your students in terms of accessibility, file size, and time constraints.

  • When having students give each other feedback, provide students with guidance on how to record screencast videos and on effective forms of peer review or student–student feedback (see Banerjee et al., 2020). We suggest providing students with screencast video guidance through the use of job aids and/or tutorials, as well as guidance on effective methods for critiquing their peers’ work (Ertmer et al., 2010). This training becomes especially important when peer-to-peer video feedback is a required element of course activities or assignments.