
Student Perceptions of Screencast Feedback on Mathematics Assessment


Although feedback is a very important component of assessment in higher education, there is substantial evidence that students view traditional methods of feedback as deficient in a number of respects. In this paper we explore how students perceive generic feedback on a mathematics assignment provided via screencasts. Our study is based on a Differential Equations module taught to first and second year students at a United Kingdom university. Our analysis of a student survey of this novel approach to feedback indicates that some students prefer screencast feedback to written feedback for a number of reasons: it is perceived to be more personal, it provides a richer experience than handwritten comments, it can be accessed anytime and replayed and paused as needed, it assists with learning how to communicate mathematics, and it helps develop mathematical thinking skills. In fact, we show that this form of feedback is effective according to Sadler’s (Instructional Science 18:119–144, 1989) definition of effective feedback.


Feedback to students on their work is an important component of their education. It encompasses information about how students perform, how their performance compares to other students and to certain benchmarks such as a correct solution, and also how students can improve. Traditionally, feedback in undergraduate mathematics courses consists of ticks or crosses with brief comments provided by a marker on assignments or exam papers, short or completely worked solutions in written form, and sometimes the working through of questions step by step by a lecturer in class.

Hattie and Timperley (2007) go as far as saying that good quality feedback is the single most powerful influence on student achievement in higher education. Indeed, Black and Wiliam (1998) reviewed around 250 studies and showed that formative feedback had a positive benefit on student learning in almost all circumstances, across a range of subjects, levels, situations and student abilities. Whilst few might doubt the above, there is much less certainty about the effectiveness of feedback actually provided in universities. In the United Kingdom, attention has been focussed on this in recent years following the introduction of an annual National Student Survey (NSS) of final year students, conducted by the Higher Education Funding Council for England (The National Student Survey 2012). In the 2012 survey, at least two of the three questions relating to feedback were among the three questions that received the lowest levels of agreement from students in over three-quarters of the undergraduate mathematics courses, with similar results reported in 2013. The year-long project Improving Feedback in Higher Education Mathematical Courses (Robinson 2015), funded by the UK’s National Higher Education STEM Programme, was established as a direct consequence of the NSS scores on feedback and the students’ responses on feedback for the More Maths Grads (MMG) study (Robinson et al. 2010). The project involved staff from engineering, mathematics, physics and chemistry departments across the UK who teach mathematics, and who were invited to complete an online survey in which they described their typical feedback practice, any atypical activities, and their opinions about the merits of each. A group of seven departments then worked with the project to evaluate either a current or innovative approach to providing feedback.
The evaluation was twofold: discussion with the relevant staff members (via phone or email) about their experience of the feedback, and an online or paper survey of students. The survey aimed primarily for open-ended qualitative responses with the intention of seeing (a) what aspects of the provision students considered as ‘feedback’, (b) which of these they found helpful, and to what extent, and (c) what action, if any, students undertook to use the feedback (Robinson 2015).

This paper investigates student perception of screencast feedback implemented as one of the innovative approaches of the above mentioned overarching project in a Differential Equations module at a research-intensive UK university. For the purposes of this paper we define a screencast as a video recording of an explanation of a mathematical concept or a worked solution to a mathematical problem recorded by a lecturer using screen video capture software and accompanied by an audio commentary. Links to the screencasts that are discussed in this paper can be found in a later section. After a written coursework was marked, feedback was recorded as a detailed screencast by the lecturer who worked through the solutions to the questions, but also pointed out the common errors students in this class had made. Once the marked coursework had been returned to the students, the link to the screencast was made available to them. This screencast feedback replaced the feedback that previously had been given during lecture time. We particularly investigate the four research questions:

  1. How does student use of screencast feedback compare to the use of other forms of feedback?

  2. Is screencast feedback effective feedback?

     a. Does screencast feedback enable students to understand what was required?

     b. Does it allow them to make a comparison between their own work and what was required?

     c. And does it prompt action which helps students close the gap?

  3. What are the additional gains if feedback is given in screencast form?

  4. How do students rate screencast feedback?


In this literature review, we will first look at feedback in higher education in general, then focus on feedback in mathematics education, followed by student views on feedback provided, particularly in mathematics education. We then move to literature on learning mathematics from screencasts, and finally summarise research on screencast feedback. Note that we do not review the development of research on technology use in mathematics education, as there is a scarcity of research beyond practitioners’ reports relating to tertiary education (Lavicza 2010), with the majority of research studies focusing on schools (e.g., see reviews in the special issues ZDM Mathematics Education (2010) 42(1); ZDM Mathematics Education (2010) 42(7); and ESM (2014) 85(3)).


In this paper, we will use Sadler’s (1989) definition of effective feedback: for feedback to be effective, it must encompass three components. First, it must enable a student to understand what is, or was, required. Second, it must enable them to make an accurate comparison between the required work and their own performance. Finally, and most importantly, it must prompt some action which will help the student to close the gap between their work and the expected standard. Hattie and Timperley (2007) suggest that effective feedback must answer three questions: 1) Where am I going? 2) How am I going? and 3) Where to next? Effective feedback also provides cues and is in line with goals and objectives. Gibbs and Simpson (2004) place emphasis on the timing, content and quality of feedback, and on student engagement with it, as important conditions under which assessment supports learning.

Feedback in Mathematics Education

Whilst feedback which is available to students informally in classes is undoubtedly important, the word ‘feedback’ more often prompts consideration of what is provided to students in relation to their assessed coursework. Here, we believe that the following description by Robinson (2015) is typical of many practitioners:

Feedback on formal written assessed coursework is given in multiple ways for any one assignment. Generally all or most of the following are used for any one piece of work: (i) short comments on scripts (ii) model answers (iii) review of common errors in class (iv) written summary of common errors (v) follow up one-to-one discussion in practical classes following the return of work (p.163).

With reference to Sadler’s tripartite description of effective feedback, we have observed that it is common in many university departments for tutors to focus on the first two components: students may gain knowledge of what was expected, usually through the provision of model solutions, and they may identify their work’s shortcomings through a variety of written tutor comments, discussion of common errors, and comparison with the model solutions. The third component, student engagement with the feedback, that is, the subsequent action a student needs to undertake to close the gap between the two, is often left to undirected, independent study. However, as Thompson and Lee (2012) comment, “the problem with traditional margin comments isn’t necessarily in the marks themselves, but in the disconnect between what teachers communicate and how students interpret that feedback” (p. 19). If, on top of this, students sometimes cannot read a marker’s handwriting (Crook et al. 2006, p.108), the usefulness of the feedback given is limited.

To what extent do model solutions provide an effective means for students to understand what was really required? Well-written model solutions can certainly show clearly what the tutor was hoping the student would submit. As such, where they are provided they are often highly valued by students (Robinson 2015). However, this is not always the whole story. Students may not always understand the important differences between their own work and the model solutions; for example, if the tutor places significant value on the development of a logical argument, and on a well-written explanation of this, a student whose work contains a correct general method but is poorly written or lacks a logical structure might not see clearly how they need to improve. Perhaps more crucially, students see little, if any, of the process of producing the final model solutions. The thinking behind why a particular approach is used, or any initial exploratory work, is usually omitted from the finished ‘product’ of solutions to an exercise. There is a fundamental issue here about the nature of mathematics and the desired student learning; namely, that although much of the content of mathematics courses relates to specific mathematical topics (“solve this type of equation”), inherent in this are other ideas about the key mathematical skills and attitudes that a student ought to develop (for example, analytical thinking, logic, creativity, an ability to verify solutions, etc.). A set of model solutions to exercises will reflect these ways of thinking, but they may be hidden in the final product, especially to an inexperienced learner, rather than being explicitly discussed.

Perhaps a more fundamental question is: does this typical feedback model lead to student engagement by provoking suitable further work by the students to close the gap between the required standard and their current performance? Gibbs and Simpson (2004) identify that feedback needs to be received, taken notice of, and acted upon to have any impact. Whilst one might hope for this in motivated students – indeed, believe that these are key skills of an independent learner – many staff report (for example to the MMG study (Robinson et al. 2010), and to the Improving Feedback project (Robinson 2015)) that some students never collect the feedback on paper-based assignments, whilst others collect it but “only look at the mark”. Indeed, “if it is collected, marked work often goes into a drawer or is otherwise misplaced, such that the student can’t find the work when preparing for a subsequent essay” (McLaughlin et al. 2007, p.330).

Student Views on Feedback Provided

In the UK National Student Survey, students consistently rate both the quality and timeliness of the feedback they receive poorly, compared to other aspects of their student experience. The survey asks them to what extent they agree with a series of 22 statements, three of which relate to feedback. The statements on feedback, and the percentages of students who definitely or mostly agreed with each in 2012, are:

  • Feedback on my work has been prompt (65 % of students)

  • I have received detailed comments on my work (68 % of students)

  • Feedback on my work has helped me clarify things I did not understand (63 % of students) (The National Student Survey 2012).

In comparison, the average percentage of all other survey questions is 81 %, and the three feedback questions have the lowest percentages of the 22 questions.

Similarly, as part of the MMG project, students who were asked in open-ended questions to identify the least satisfactory aspect of their course cited issues around coursework and feedback more often than any other (Robinson et al. 2010). The same study showed that teaching staff also often recognise problems with feedback, focussing primarily on three aspects: the staff time taken to provide effective feedback, the poor quality of some of the feedback given, whether because of inexperience or lack of time, and whether students engage with the feedback. Indeed, Kerr and McLaughlin (2009) question whether the form of feedback itself, usually written, could be part of the problem, and report that students rated the overall quality of feedback more highly if it were in video form.

Learning Mathematics from Screencasts

Educational psychology research by Atkinson (2002) and Mayer (2003) shows that learning from a video with animation and verbal commentary is more effective than learning from on-screen text, narration or animation alone. It therefore comes as no surprise that students have reacted very positively to screencasts of mathematical content to support their learning (Loch et al. 2012). It has also been shown that student performance on mathematics problems may improve once they have watched revision screencasts (Loch et al. 2014). Students have said they appreciate being able to replay, fast-forward and pause videos when they study for assignments or exams (Loch et al. 2012). However, research also shows that students do not want their lectures replaced by screencasts (Mullamphy et al. 2010). Yoon and Sneddon (2011) investigated how recorded lectures were used by students in two large undergraduate mathematics courses. Based on student feedback on online surveys, they report that the availability of lecture recordings can have a detrimental effect on the grades of some student groups: students who did not attend lectures as they knew the recordings were available, and who “intended to watch more recorded lectures than they actually did achieved significantly lower grades” (Yoon and Sneddon 2011, p.425) than students who were exposed to the whole lecture series. While screencasts have been criticised as too passive and as unable to challenge student misconceptions (Muller et al. 2008), it has also been argued that “there is a place for screencasts to supplement learning, particularly when previous alternatives for revision have been the study of text books” (Loch et al. 2014, p.266). This indicates that screencasts could be an effective mode of providing feedback to students, maybe even more effective than learning from written comments and solutions.

Screencast Feedback

The non-mathematics education literature contains several studies on student perception of screencasts for feedback on assessment, mostly in disciplines such as language education and creative writing, but also in chemistry education. For example, Ghosn-Chelala and Al-Chibani (2013), in English language classes, trialled individual feedback on assignment drafts. Students appreciated the clarity of feedback in the video, not least because the audio narration helped decipher the instructor’s handwriting and editing symbols. Earlier, again in English language teaching, Stannard (2008) reported on two case studies: in the first, individual videos were provided; in the second, one generic video was produced for all students, highlighting common errors. While recording individual videos was seen as time consuming, tutors did not think the time commitment to record the generic video was onerous, and they were able to use the video as a future reference on the issues that had occurred in that year. Students reacted very positively to the videos provided, and reported revisiting the generic video several times.

In chemistry education, both final year project students and first year students regarded screencast feedback on their submitted work as “effective and highly personal”, commenting that it is easier to understand the marker’s reasoning when an audio-visual explanation is given compared to written comments (O’Malley 2011, p.27). On the other hand, the time commitment required by tutors to create feedback screencasts was comparable to more traditional forms of providing feedback. O’Malley suggests providing more generic feedback to a whole cohort of students in addition to the more personal approach he describes. Haxton and McGarvey (2011), in contrast, found that production of generic screencast feedback for chemistry assessment was more time consuming than typing model solutions. While students commented that it took longer to identify specific areas of interest in the video and some students preferred written feedback, the videos were well received by students, particularly since they addressed common mistakes.

Finally, in the context of the teaching of writing, Thompson and Lee (2012) found that screencast feedback on essays (which they name veedback), containing no written comments, creates “a sense of availability” (p.11), that is, an impression that staff are accessible and willing to talk to students, and is better suited for in-depth explanations that create “rapport and a sense of support for the writer than traditional written comments” (p.1). Negative feedback came from students averse to a change in feedback approach, and from those who struggled to play back the videos. Silva (2012), in trials using individual videos to provide feedback on writing assignments, also suggests that “students may feel more of a social connection” (p.14), as listening to the teacher’s voice results in teacher presence being felt inside and outside the classroom. Vincelette and Bostic (2013) investigated what students in composition classes thought about screencast feedback on assessment and confirm that students prefer video feedback to traditional feedback. Edwards et al. (2012) found that it takes considerably less time to produce screencast feedback than to type comments for Master’s level essays in communication; they also concluded that screencasts are perceived as more personal, and better, than typed comments.

Although Thompson and Lee (2012) highlight that students engage actively in learning when they write their own comments while interpreting video feedback, one question remains unanswered: whether video feedback is more effective in improving student performance. Brick and Holmes (2008) suggest the need for more extensive trials of video feedback, to establish “whether all learners respond equally well, irrespective of individual learning style or other factors” (p.339), but also to investigate in more depth tutors’ acceptance of this type of feedback provision, and to establish a clear methodology.

We identify the following encouraging themes repeatedly coming through from these previous studies. Screencast feedback is seen as:

  • More personal, creating a teacher presence as students review feedback (Edwards et al. 2012; O’Malley 2011; Silva 2012; Thompson and Lee 2012; Vincelette and Bostic 2013)

  • Easier to understand than traditional handwritten or typed comments (Edwards et al. 2012; Ghosn-Chelala and Al-Chibani 2013; Haxton and McGarvey 2011; O’Malley 2011; Stannard 2008; Thompson and Lee 2012; Vincelette and Bostic 2013)

  • Not necessarily more time consuming than providing traditional feedback (Edwards et al. 2012; O’Malley 2011; Silva 2012; Stannard 2008)

  • Better than traditional feedback according to student views (Edwards et al. 2012; Ghosn-Chelala and Al-Chibani 2013; O’Malley 2011; Stannard 2008; Vincelette and Bostic 2013)

Returning to mathematics, students appear to access feedback when it is provided online: Stoneham and Prichard (2013) found that three quarters of students in computing and mathematics courses accessed feedback provided online, most of them within a day of release. No information is provided on the year level of these students, nor on the total number of students who were part of this study; however, around 2000 feedback files were released to students after monitoring had commenced. Stoneham and Prichard call for more research on the provision of (online) feedback to students to establish best practice, so that staff time is put to best use. We argue that there is an urgent need to investigate the role screencasts may play in this arena, since there is a dearth of studies in the mathematics education literature on providing feedback to students via screencasts. The very positive student perception reported in the non-mathematical literature mostly concerns essay feedback, which is naturally different in nature from assessment feedback in mathematics.

Context of the Study

This study was situated in a research-intensive UK university mathematics department, one of the seven departments involved in the Improving Feedback project (Robinson 2015). Students on the single honours B.Sc. Mathematics and the four-year Master of Mathematics (MMath) study the module Differential Equations in the second semester of Year 1. Some students taking joint-honours courses (e.g., B.Sc. Mathematics and Economics) can choose to take the module in their second year. The Differential Equations module consisted of two 50-min lectures and one 50-min tutorial in each of 12 weeks. For the tutorials the students were provided with a set of unassessed exercises, mostly with answers but not worked solutions. Whether the students completed the exercises was entirely up to them; students were expected to monitor their own progress by checking their answers with their peers and by consulting the answers on the exercise sheets. They could seek help in the tutorials as necessary. This formative self-assessment has been supplemented in recent years by the preparation of screencasts for some of the exercises (Loch 2012; Loch et al. 2012).

The module is formally assessed by a final examination (70 %), an in-class test (10 %), and four courseworks (5 % each), two of which are assessed by computer and two of which are written pieces. The marking of the two written pieces of work is performed by both the lecturer and a postgraduate assistant. Due to the size of the cohort (220 students in this study) and the need for a rapid turnaround of marked scripts, comments written on the scripts are minimal and traditionally have been restricted to indicating correct/incorrect answers or steps in a calculation. In the past, further feedback has been given in written form by supplying worked solutions and by pointing out some common errors during a lecture. This last mechanism for providing feedback is less than ideal for at least three reasons: (1) some students may be missing from the lecture on the day common errors are reviewed; (2) the review of common errors may be irrelevant for some students; and (3) by the time a review of common errors takes place, the module has moved on and both students and lecturer are considering other topics during the lecture.

As part of the Improving Feedback project (Robinson 2015), the module leader of the Differential Equations module agreed to implement a novel form of feedback for one of the written courseworks for the 2011/12 cohort. This took the form of the preparation of two screencasts in which two coursework questions would be worked through by the lecturer in detail and common errors made by the students would be pointed out. Freely-available software for PDF annotation was acquired and used on a tablet PC. Proprietary software for recording on-screen activity and audio was used to capture the lecturer’s working and audio commentary. Preparation and recording of each screencast typically took two hours, allowing for setting up equipment, editing to incorporate re-recording of slips of the tongue or pen, and generating the final files for upload. The two resulting mp4 files were approximately 24 min and 10 min long respectively. These were made available on the module pages of the university’s virtual learning environment, Moodle. The two questions on the coursework included in the analysis for the research study focused on the solution of first order differential equations. Figure 1 shows an early frame of the first screencast in which the problem as posed was discussed and salient features pointed out. Figure 2 shows a later frame taken with the worked solution in progress. The two screencast feedback videos discussed in this paper may be accessed online as electronic supplementary material (ESM 1 and 2).

Fig. 1
figure 1

An early frame from the screencast showing the problem as posed to the students

Fig. 2
figure 2

A frame from the screencast showing the worked solution in progress

Once the coursework had been marked and returned, the students were emailed by the lecturer, who invited them to watch the screencast and then follow a link to the survey questionnaire. About two weeks later a reminder email was sent. The survey – and hence the results presented below – did not distinguish between the screencasts available for formative assessment (those related to the tutorial sheet exercises) and the two screencasts for summative assessment (the two courseworks).

The Research Questions

As stated in the introduction, we address four main research questions in this paper. Note that question two contains three sub-questions that correspond to Sadler’s (1989) definition of effective feedback. We reiterate the research questions:

  1. How does student use of screencast feedback compare to the use of other forms of feedback?

  2. Is screencast feedback effective feedback?

     a. Does screencast feedback enable students to understand what was required?

     b. Does it allow them to make a comparison between their own work and what was required?

     c. And does it prompt action which helps students close the gap?

  3. What are the additional gains if feedback is given in screencast form?

  4. How do students rate screencast feedback?


We analyse student responses on their views, together with access records from the Virtual Learning Environment, to answer the above four research questions in the context of the screencast feedback provided as part of the Differential Equations module. The online survey was advertised to students via email, as stated earlier, and contained open-ended questions amenable to qualitative data analysis as well as questions for which a quantitative analysis was performed (see Appendix 1 for the survey questions). The open-ended questions directly related to research questions 2, 3 and 4; the remaining questions related primarily to research question 1. In total, 34 of the 220 students taking the module participated in the survey. As the survey responses were anonymous, we have no information regarding the grades the students received in this module, their prior experience or their background.

Qualitative Analysis of Open Responses

A grounded theory approach (Charmaz 2006) was used to analyse the qualitative data. Guided but not restricted by the research questions, the open responses were read, re-read and discussed by the authoring team. Each sentence of the 34 students’ responses was scrutinised and coded. For example, some students referred to the way their study habits would change as a consequence of watching the screencasts – coded as “changing study habits”. Others referred to how useful it was simply to hear about the mistakes made by other students – coded as “description of mistakes made by others”. Some went further, noting how they would engage with this information to ensure that they didn’t make the same mistakes – “finding out where other people went wrong (because that could potentially be something I might do in the future by accident)”. The full set of codes is listed in Table 1, together with the number of comments made and the number of individual students making such comments. A relatively small number of comments were disregarded because they were of no relevance to our screencast research (e.g., “our on-line test did not have good feedback”, “written comments on scripts hard to understand”). Further discussion amongst the authoring team resulted in the combination of these codes into three primary codes: process, engagement and richness, as set out in Table 1.

Table 1 Codes used in the analysis of the qualitative data

We then used these three primary codes as principal analytic themes as we explored the links between our data and Sadler’s definition of effective feedback.

  1. The Process Involved in Doing Mathematics

     This theme is concerned with mathematical communication, the way mathematical solutions should be set out, learning to think like a mathematician, thinking mathematically, and mathematical and more general skills development. In total, 12 distinct students made 20 comments that were recorded under this theme.

  2. Student Engagement

     This theme is concerned with ways in which the screencasts encouraged interaction with the mathematics or with others; more and deeper learning; reflection; self-awareness; and independent learning. In total, 20 distinct students made 35 comments that were recorded under this theme.

  3. Richness of Video Screencasts as a Form of Feedback

     This theme is concerned with the ways in which screencasts provided an enhanced learning experience, and the ways in which they complemented, supplemented and encouraged the combination of existing forms of feedback. In total, 29 distinct students made 77 comments that were recorded under this theme.

These three analytic themes, Process, Engagement and Richness, correspond naturally to elements of Sadler’s (1989) definition of effective feedback. Thus, process is associated with understanding what is required in a piece of mathematics and how a mathematician would set out his or her arguments in presenting a solution. Engagement is associated with the ways in which the feedback has prompted action by the student, and particularly action that will bring about development and improvement. Richness is a broad category but one that includes having the resources to be able to make a comparison between one’s own work and required standards of performance and the performance of other students. There is of course also some overlap between these themes and Sadler’s three roles of feedback as we will indicate in the results section.


How Does Student use of Screencast Feedback Compare to the use of Other Forms of Feedback?

Online access statistics show that out of the 220 enrolled students, 153 accessed the screencast on the first question, and 47 accessed the screencast on the second question. It should be noted that the first question was regarded as the more complicated of the two.

To enable a comparison between different types of feedback, students were asked in the survey about the extent to which they had used not just the screencast videos, but all types of feedback available on this module – see Fig. 3. We note that comments written on coursework scripts were the least used form of feedback overall, but this is partly because a substantial proportion of the students (over one-third) reported that these were unavailable to them. This is not surprising, since comments written on scripts were limited to indicating correct/incorrect answers or steps, as described earlier. The same is not true for the opportunity to talk to the tutor: all but two students acknowledged that this was a possibility, yet this was the type of feedback students were least likely to use. Of the other four forms of feedback, all were used to some extent by the vast majority of students, but the striking difference is in the degree to which screencasts were used compared with the other three; around 70 % of students reported that they used screencasts “extensively”, and none reported that they used them “a bit”. In other words, among the students who responded to the survey, this suggests higher levels of access of video feedback than of other types of feedback.

Fig. 3
figure 3

The extent to which students said they used different types of feedback (number of students, excluding those who did not answer and those who said this feedback was unavailable)

Unsurprisingly then, asked to identify the single most used type of feedback, most identified the screencast videos, as shown in Fig. 4.

Fig. 4
figure 4

The most used type of feedback as identified by the students

In summary, our analysis of the quantitative data shows that the students who responded to the survey watched the screencasts more than they accessed other forms of feedback.

Is Screencast Feedback Effective Feedback?

In this section, and guided by Sadler’s three components, we give a selection of student comments and directly relate them to the three roles of effective feedback.

Does Screencast Feedback Enable Students to Understand What Was Required?

Here we draw largely on comments attributed to process. Several students described how the screencasts showed them ways to improve their communication of mathematics, demonstrating that they had gained an understanding of what was required by the task. The screencast feedback showed them “the ideal way to set out and answer our coursework”, and “how to lay out answers in future coursework/exams.” Other students commented on the screencasts:

…they show how the questions need to be answered rather than just being given a solution.

It helped me to improve my writing of the answers by setting it out very neatly so you could [see] the answer very easily.

Some comments referred to ways in which the screencasts provided insight into the lecturer’s way of thinking, ways which students could then learn from and emulate:

…I got to see how the lecturer would answer the questions….

And seeing how someone moves from one step to another…

Many other comments indicated that students had understood what was required (see Table 1).

Does Screencast Feedback Allow Students to Make a Comparison Between Their Own Work and What Was Required?

Here we draw largely on comments attributed to the richness of the screencast feedback and also on those attributed to engagement. Several students said they had used the screencasts to compare their work to the solution provided, and also said they were learning from this for the future. Seeing and hearing about common mistakes was particularly helpful:

Yes, the videos were very beneficial as they allowed you to see exactly where you had gone wrong with verbal explanations of why certain calculations were carried out, and common mistakes were vocalised.

… the lecturer is guiding you through the question and then you can see why you were wrong….

This means, of course, that these students have not only identified the gap between their work and what was required, but they have also commenced action to close the gap.

Does Screencast Feedback Prompt Action Which Helps Students Close the Gap?

We remind the reader that prompting action is the third role of effective feedback. Here we draw largely on comments attributed to engagement. Rather than receiving a passive form of instruction, students were able to interact with the screencast in a way that would not be possible in a traditional lecture. In particular, students were able to intersperse the lecturer’s presentation with time to think and actively engage with the content:

Videos were helpful … if you needed to stop to think.

I could pause the video when I wanted to write things or to think things through.

The potential for promoting reflection, deeper learning and time-on-task came to the fore. Two students in particular seem to have been inspired to devote a great deal of time and energy to engage with the screencasts:

Before watching the videos I read over my coursework again and then whilst watching the video compared it to how I had done the question and where it was similar or where I had gone wrong. I then watched the video again and wrote down notes of how to do the questions and additional comments and tips. After reading my coursework and watching the videos I discussed with friends to talk about where I had gone wrong and where they had and to talk about the correct way that the question should have been completed.

I worked through the question again where I had lost the marks and recapped the topics covered using the lecture notes.

The ways in which this feedback can encourage students to become independent learners were also evident in their comments:

I watched the video numerous times, and used it not only to understand any problems or mistakes I made but also as an aid to tackle other problems of a similar nature…..

It allowed me to make my own notes from the feedback and to use this for future reference and revision purposes.

It also meant that instead of staring at my notes I could recap in a different way in order to further enhance my understanding.

More than one student commented that there had been more feedback in this module than in any other they had taken.

What Are the Additional Gains if Feedback Is Given in Screencast Form?

Students described ways in which the provision of feedback through the screencasts led to an enhanced learning experience, and were particularly keen to emphasise the additional gains for them. For example, screencasts gave staff an opportunity to provide richer detail than might be covered within the constraints of a lecture or in standard written solutions. Many students commented that they appreciated being able to pause the video. For instance, one student commented:

…. Because sometimes you don’t always gain every single bit of information from a lecture and having a video which goes through slowly, step by step, I find really useful because you can take it in at your own pace.

Students drew comparisons between the feedback from screencasts and the minimal written feedback they had traditionally received, and were continuing to receive in other modules. They stated that it was particularly beneficial to hear a lecturer talk through the solution of a problem and the mistakes commonly made, and that providing feedback in this way added clarity as to why a solution might be incorrect. They especially valued seeing all the mathematical steps in a solution, noting the level of detail and depth provided in the screencast feedback. In particular, one student wrote:

… really helpful as sometimes in written solutions I can struggle to follow where a step has come from. With a commentary alongside the workings it is clear exactly what is going on.

For students who have been largely successful in the coursework, but who need help with isolated parts of the solution, the screencast can be an efficient way of providing this. These students do not need to sit through parts of a feedback session when the lecturer is going over material that they could already do. Instead they can scroll ahead to problematic parts:

It was helpful to see the video as you could just look at the parts you struggled with and got wrong instead of listening to feedback on the whole coursework as other lecturers would do in a lecture, and modify the pace accordingly

The above quotes highlight ways in which a screencast can be better than a traditional feedback lecture or feedback provided in writing on a script. For others, the richness came from feeling they were personally addressed by the lecturer, and from being able to choose how, when and where they worked:

It makes you feel like the lecturer is explaining everything to you personally.

The videos were particularly useful as it gave an environment to learn and understand in the comfort of my own home. Being comfortable and being at my own pace is very important for learning.

As an enhanced form of feedback, the screencasts were particularly popular:

I had made some errors in the first question on the coursework and on the marked script it was highlighted where the error was but it didn’t explain why it was an error… with the video commentary the lecturer could talk you through the problem and could point out where most students made mistakes and perhaps even say it was a common mistake but then explain why it’s a mistake…

The vast majority of responses were positive. This is not altogether surprising, given that the students who responded to the survey were most likely those who had chosen to watch the video feedback. However, some of the students who watched the videos found that they did not live up to expectations or were unhelpful. With specific reference to the screencasts, dissenting comments, or comments suggesting room for improvement, seemed to come from students who prefer to read through solutions on paper, those who did not think they needed to see all the steps, and others who do not always study in front of a computer. Specific comments included:

I would have preferred to have a marking scheme with answers and workings for the questions; this would allow me to look through the coursework at my own pace. It does not help to have the answers written before me on a screen whilst listening to the narrative. I am quite capable of following decent working on a sheet of paper (an on-line pdf).

The video system was useful although I would have liked a printable Word-like document as well as then I could read ahead if [I] needed to, or had a hard copy when a computer wasn’t available.

We note that the requests for a marking scheme and a printable document are not criticisms of the screencasts per se, but we will take them as suggestions for the future. The question of whether there was too much detail is an individual and personal one. We contend that too much detail is better than too little, as students can always fast-forward through detail they are already confident with.

How Do Students Rate Screencast Feedback?

Students were asked to rate the quality of each type of feedback which they said they had received as Excellent, Above Average, Below Average, or Poor (Fig. 5), and to identify the single best type of feedback for them (Fig. 6). To avoid prejudging what counted as high quality, the survey included the statement:

Fig. 5
figure 5

The proportion of students (n = 34) who rated feedback from excellent to poor (excluding those students who said this feedback was unavailable, or who responded “don’t know”)

Fig. 6
figure 6

The number of students who identified each type of feedback as the best

Whether you like or dislike particular sorts of feedback might be for a variety of reasons, which will depend on your opinions.

Figure 5 shows that over 80% of the students rated the quality of the video feedback as “excellent”, more than double the proportion who said the same about any other type of feedback; Fig. 6 shows that the vast majority of students felt that it was the best feedback they received.

In summary, our analysis of the quantitative data shows that the students who responded to the survey watched the screencasts more than they accessed other forms of feedback, and rated screencasts more highly than other forms of feedback.

Overall, our results indicate that students found this method particularly helpful, as exemplified by this student comment:

It is by far the most useful way of providing feedback and help.

Discussion and Conclusions

We acknowledge that one limitation of our study is the number of students who completed the survey relative to the total enrolment: 34 of the 220 students responded. However, since students were only prompted to respond to the survey after watching the feedback screencasts, our target population is not the entire class cohort but the students who had decided to watch the videos. It is likely that those who responded were feeling more positive about the screencasts than other students in the class, although the comments suggesting improvements show that it was not only students in favour of screencast feedback who voiced their opinions. While we can only report in this paper on the sample of the class that responded, our outcomes nonetheless have implications for the larger body of students.

Our study confirms that the students who responded to the survey think screencast feedback is more personal and easier to understand – a finding consistent with other studies reported in the literature (Edwards et al. 2012; Ghosn-Chelala and Al-Chibani 2013; Haxton and McGarvey 2011; O’Malley 2011; Silva 2012; Stannard 2008; Thompson and Lee 2012; Vincelette and Bostic 2013). The students regarded screencasts as an appropriate and helpful form of feedback.

How Does Student Use of Screencast Feedback Compare to the Use of Other Forms of Feedback?

Screencast feedback was preferred to other types of feedback by the students who responded to the survey. Nearly all students who responded said that the screencasts were the type of feedback they had used most.

Is Screencast Feedback Effective Feedback?

We have demonstrated that all three of Sadler’s components of effective feedback are evident in the students’ responses. Students engaged with the videos, controlling the playback as they paused, revisited and thought about an answer. Some said they reflected on their own work and on other solution methods, and transferred what they had learnt from the video to other examples. This is the type of engagement we would expect to see from effective feedback.

What Are the Additional Gains if Feedback Is Given in Screencast Form?

It appears that screencast feedback adds another dimension to feedback in mathematics. Many students commented on the detail in the screencasts and on the fact that all the stages were shown without shortcuts. Perhaps a message here is that depth is very important when producing screencast feedback, as students will appreciate and benefit from it. Students also commented that the screencasts helped them not just to develop skills, but also to learn to communicate mathematics like a mathematician. Students felt that the feedback screencasts gave them more than the short screencasts they had previously seen to help with exercises, because the feedback screencasts were of direct relevance to the questions on which they had just been marked. The commonly made mistakes were seen as feedback on how the whole cohort was doing.

How Do Students Rate Screencast Feedback?

The students who responded to the survey rated screencast feedback as the best feedback they had received.

In summary, we believe that screencast feedback has become a learning tool that students use actively to improve their understanding, as it goes beyond dissemination of what is correct or incorrect and allows students to close the gap between the work that they submitted and what was required of them. Looking ahead, we believe there is a need for further studies into screencast feedback on mathematics assessment, particularly to establish whether students indeed learn from this type of feedback and whether it is more effective in improving student performance on tasks. It would also be of interest to contrast the effectiveness of various uses of screencast feedback. For example, instead of simply presenting and then discussing a model solution, it would be possible to use a screencast to show specific examples of high-quality student work, pointing out why the students who produced such work achieved high marks. A further use would be to provide screencast feedback on full examination papers, which could then be made available to future cohorts of students to help them better prepare for examinations.

Taking into account some of the negative comments, a combination of feedback types may be particularly effective for mathematics coursework: highlighting where an individual student has gone wrong, for instance on the actual script; giving fully worked solutions in PDF form; and also giving a detailed solution as a screencast, with additional commentary on common mistakes, why these are mistakes, and on other misconceptions. This should please the students who commented that they were more comfortable reading through a solution than watching its development. We also suggest that future research should include the tutor/lecturer perspective, since the production of a video resource is time consuming and editing requires a certain level of technological expertise.


  1. There were 22 questions in total. Students are asked to respond on a 5-point Likert scale to the three questions on feedback:

    Feedback on my work has been prompt.

    I have received detailed comments on my work.

    Feedback on my work has helped me clarify things I did not understand.

  2. More Maths Grads was a three-year project funded by the Higher Education Funding Council for England to develop, trial and evaluate means of increasing the number of students studying mathematics and encouraging participation from groups of learners who have not traditionally been well represented in higher education. See

  3. Full time students from England


  5. Also known as a Learning Management System in some countries


  • Atkinson, R. K. (2002). Optimizing learning from examples using animated pedagogical agents. Journal of Educational Psychology, 94(2), 416–427.

  • Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy and Practice, 5(1), 7–74.

  • Brick, B., & Holmes, J. (2008). Using screen capture software for student feedback. In Kinshuk, Sampson, D. G., Spector, J. M., Isaias, P., & Ifenthaler, D. (Eds.), Cognition and exploratory learning in digital age: Proceedings of the IADIS CELDA 2008 conference (pp. 339–342). IADIS.

  • Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative analysis. London: Sage Publications.

  • Crook, C., Gross, H., & Dymott, T. (2006). Assessment relationships in higher education: the tension of process and practice. British Educational Research Journal, 32, 95–114.

  • Edwards, K., Dujardin, A.-F., & Williams, N. (2012). Screencast Feedback for Essays on a Distance Learning MA in Professional Communication. Journal of Academic Writing, 2(1), 95–126

  • Ghosn-Chelala, M., & Al-Chibani, W. (2013). Screen-capture and audio recording as an alternative feedback approach in freshman writing classes. In L. Morris & C. Tsolakidis (Eds.), The International Conference on Information Communication Technologies in Education: ICICTE (pp. 267–273). Crete, Greece.

  • Gibbs, G., & Simpson, C. (2004). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1(1), 3–31.

  • Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112.

  • Haxton, K. J., & McGarvey, D. J. (2011). Screencasting as a means of providing timely, general feedback on assessment. New Directions, 7, 18–21. doi:10.11120/ndir.2011.00070018

  • Kerr, W., & McLaughlin, P. (2009). The benefit of screen recorded summaries in feedback for work submitted electronically. Paper presented at the Representatives’ Forum 2009, Birmingham, UK. Abstract retrieved from

  • Lavicza, Z. (2010). Integrating technology into mathematics teaching at the university level. ZDM Mathematics Education, 42, 105–119.

  • Loch, B. (2012). Screencasting for mathematics online learning – a case study of a first year Operations Research course at a dual delivery mode Australian university. In A.A. Juan (Ed.), Teaching Mathematics Online: Emergent Technologies and Methodologies.

  • Loch, B., Gill, O., & Croft, T. (2012). Complementing mathematics support with online MathsCasts. ANZIAM Journal, 53, C561–C575.

  • Loch, B., Jordan, C., Lowe, T., & Mestel, B. (2014). Do screencasts help to revise prerequisite mathematics? An investigation of student performance and perception. International Journal of Mathematical Education in Science and Technology, 45(2), 256–268.

  • Mayer, R. E. (2003). Elements of a science of eLearning. Journal of Educational Computing Research, 29(3), 297–313.

  • McLaughlin, P., Kerr, K., & Howie, K. (2007). Fuller, richer feedback, more easily delivered, using tablet PCs. Proceedings for the 11th International Conference on Computer Aided Assessment (pp. 327–340). UK: Loughborough.

  • Mullamphy, D., Higgins, P., Belward, S., & Ward, L. (2010). To screencast or not to screencast. ANZIAM J., 51 (EMAC2009), C446–C460.

  • Muller, D., Bewes, J., Sharma, M., & Reimann, P. (2008). Saying the wrong thing: improving learning with multimedia by including misconceptions. Journal of Computer Assisted Learning, 24, 144–155.

  • National HE STEM Programme. (2013). Available at: [Accessed 30 06 2013].

  • O’Malley, P. J. (2011). Combining screencasting and a tablet PC to deliver personalised student feedback. New Directions, 7, 27–30. doi: 10.11120/ndir.2011.00070027

  • Robinson, M. (2015). Providing effective feedback. In T. Croft, M. Grove, J. Kyle, & D. Lawson (Eds.), Transitions in undergraduate mathematics education. Birmingham, UK: Higher Education Academy, University of Birmingham.

  • Robinson, M., Thomlinson, M., & Challis, N. (2010). From the horse’s mouth 2: The best of times, the worst of times.... In Robinson, Challis, & Thomlinson (Eds.), Maths at university: Reflections on experience, practice and provision (pp. 58–64). Birmingham, UK: More Maths Grads Project, University of Birmingham.

  • Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119–144.

  • Silva, M. L. (2012). Camtasia in the classroom: student attitudes and preferences for video commentary or microsoft word comments during the revision process. Computers and Composition, 29, 1–22.

  • Stannard, R. (2008). Screen capture software for feedback in language education. Paper presented at the Second International Wireless Ready Symposium Interactivity, collaboration and feedback in language learning technologies, 29 March 2008, NUCB Graduate School, Japan

  • Stoneham, R., & Prichard, M. (2013). Look, listen and learn! Do students actually look at and/or listen to online feedback? Compass (the Teaching and Learning Journal of the University of Greenwich), 7.

  • The National Student Survey. (2012). Retrieved from

  • Thompson, R., & Lee, M.J. (2012). Talking with Students through Screencasting: Experimentations with Video Feedback to Improve Student Learning. The Journal of Interactive Technology & Pedagogy. Retrieved from

  • Vincelette, E. J., & Bostic, T. (2013). Show and tell: student and instructor perceptions of screencast assessment. Assessing Writing, 18, 257–277.

  • Yoon, C., & Sneddon, J. (2011). Student perceptions of effective use of tablet PC recorded lectures in undergraduate mathematics courses. International Journal of Mathematical Education in Science and Technology, 42(4), 425–445.


Author information


Corresponding author

Correspondence to Birgit Loch.

Electronic Supplementary Material


(MP4 41166 kb)


(MP4 16066 kb)



Survey Questions

This survey is mainly about individual coursework questions on your Differential Equations module/unit/course and in particular the use of screencasts as a way of giving feedback.

Part 1: (Open-Ended Initial Question)

  1. Please briefly describe all the feedback that was available to you during your degree course.

  2. Was any feedback particularly helpful or useful? (Please give brief reasons: what was helpful or useful to you, and why?)

  3. Was any feedback particularly unhelpful?

  4. Did you feel you received enough feedback?

Part 2: (Multiple Choice Questions and Open Ended Questions on the use of Feedback)

  5. Listed below are some forms of feedback that might have been available on your degree course.

In each case please indicate how much you looked at it/ read it/ used it in any way.

If the feedback was not available, or you didn’t know that it was available, then please click that option.

  • Mark on your coursework script [Used extensively; Used more; Used a bit; Not at all; Not available]

  • Comments written on your coursework script [Used extensively; Used more; Used a bit; Not at all; Not available]

  • Opportunity to talk to tutor/lecturer about the work [Used extensively; Used more; Used a bit; Not at all; Not available]

  • Screencast (video) of worked solution with commentary [Used extensively; Used more; Used a bit; Not at all; Not available]

  • Screencast (video) with commentary about common errors and misconceptions [Used extensively; Used more; Used a bit; Not at all; Not available]

  • Comments from the tutor/lecturer in class (lecture or tutorial) [Used extensively; Used more; Used a bit; Not at all; Not available]

  • Informal discussion between students. [Used extensively; Used more; Used a bit; Not at all; Not available]

  • Which of these types of feedback have you used most? [drop down list of all 7 types from above]

  • What other means of feedback was available to you?

  • You have said that you used X the most. Please tell us what you did with the feedback.

    For example, you might: Look at the mark; file it; read some comments; read all comments; listen to audio files; watch videos; compare the model solutions to your own; re-read lecture notes to clear up misunderstanding; work through questions again; talk to your tutor about difficulties; make notes when the tutor talks about it in class; listen to what the tutor says in class; email the tutor to ask questions; talk to your friends or colleagues. Or you might do something completely different not mentioned here.

Part 3: (Multiple-Choice and Open Ended Questions on the Quality of the Feedback)

  6. For each of the following items which you said were available, please rate the feedback.

(Whether you like or dislike particular sorts of feedback might be for a variety of reasons which will depend upon your opinions. We will ask you about what counts as “good” for you in the next section).

  • Mark on your coursework script [Poor; Below average; Above average; Excellent; Don’t know]

  • Comments written on your coursework script [Poor; Below average; Above average; Excellent; Don’t know]

  • Opportunity to talk to tutor/lecturer about the work [Poor; Below average; Above average; Excellent; Don’t know]

  • Screencast (video) of worked solution with commentary [Poor; Below average; Above average; Excellent; Don’t know]

  • Screencast (video) with commentary about common errors and misconceptions [Poor; Below average; Above average; Excellent; Don’t know]

  • Comments from the tutor/lecturer in class (lecture or tutorial) [Poor; Below average; Above average; Excellent; Don’t know]

  • Informal discussion between students. [Poor; Below average; Above average; Excellent; Don’t know]

  • Please select the item which you think was the BEST feedback. (If you rate two or more things equally just pick one of them. If you think most of the feedback was poor, we’d still like you to pick the one that was the “least bad”). [all 7 types of feedback listed in drop down]

  • You have said that you rated X most highly (or least badly!). What was it about this feedback that made it your highest rated? Why was it the best? What was good about it?

Part 4 Miscellany

  7. Please tell us roughly what marks you got for the coursework.

  8. Thinking about the screencast (video) in particular: did watching it help you to understand what was required? Did it help you to understand where you went wrong? Would you do anything differently in the future (exam or future coursework) as a result of watching it?

  9. Finally, are there any other comments you would like to make about the feedback for your degree course?

Cite this article

Robinson, M., Loch, B. & Croft, T. Student Perceptions of Screencast Feedback on Mathematics Assessment. Int. J. Res. Undergrad. Math. Ed. 1, 363–385 (2015).

Keywords

  • Feedback
  • Screencast
  • Solution
  • Commonly made mistakes