Introduction

Many health sciences students use free online videos to supplement their learning (Fu et al., 2022; Burns et al., 2020). One specific type of video that has gained popularity among students is the animated video, a format that students find visually appealing and engaging (Stadlinger et al., 2021). Advocates believe that the animation format is especially beneficial for illustrating complex concepts in health sciences that might otherwise be difficult to visualize with static graphics (O'Day, 2006; Tackett et al., 2021). Because animations can depict the changes involved in a dynamic system more explicitly, there is a widely held assumption that they provide advantages over static graphics for learning.

Despite their popularity and the claimed benefits, the actual impact of the animated format remains unclear, as prior research has reported mixed results (Brown et al., 2020; Daly et al., 2016). Another issue related to students’ use of animated videos, specifically concerning free online videos, is the lack of quality control, which poses the risk of spreading misleading information (Aydın & Yılmaz, 2021). In this context, many institutions have invested in commercially developed videos that are produced by following a clearly defined production process. This has brought up several other issues, one of which is whether students actually use the resources. Ellaway et al. (2014) stated that faculty often overestimate students’ adoption and use of new educational technology. Given the substantial investments in commercial learning resources, it is critical for institutions to monitor students’ use of these resources. Another issue is that while commercial digital resources are gaining an increasingly important place in health professions education, research on their implementation is limited (Tackett et al., 2021).

Plass et al. (2009) asserted that research should ask questions specific to the type of animation being studied. A unique and contemporary animation style that has been increasingly embraced by health professions educators and students is whiteboard animation. By presenting content through real-time, hand-drawn illustrations on a whiteboard background, these animations provide step-by-step guidance to learners, often with the support of audio narration (Türkay, 2016). Unlike other video or animation styles, whiteboard animations reveal the content as it is drawn on the screen. They can also use entertaining line sketches to elicit learners’ positive emotions, exemplifying emotional design in multimedia learning (Türkay, 2016; Um et al., 2012). Whiteboard animations can also foster an immersive learning experience by including a hand or pen in the drawing process, creating a sense of real-time engagement for learners. A similar animation format that uses hand-drawn illustration is that of Khan Academy videos, which have been found to be more engaging than slideshow presentations (Guo et al., 2014).

The distinctive features of whiteboard animations make them particularly advantageous in health professions education. While traditionally filmed videos can be effective for demonstrating clinical procedures, they may be less effective in presenting complex concepts, as they often lack the dynamic visualizations that can significantly simplify such content. Filming a live clinical procedure also requires sophisticated equipment and must account for patient privacy and ethical compliance, which can be time-consuming and challenging in a fast-paced health professions educational setting. Whiteboard animations offer a more convenient alternative, as they can be efficiently created using a digital whiteboard and video editing software.

Despite their widespread use, whiteboard animations' effectiveness in health professions education is understudied. Given the increasing enthusiasm for whiteboard animations, it is imperative to conduct in-depth research to better understand their impact and ensure optimal implementation. The current study aims to contribute to this important area of inquiry. We have been providing predoctoral dental students with complimentary access to commercially developed, whiteboard animated videos on a web-based learning platform called Osmosis. Two studies have documented medical students’ experience with the videos on this platform (Hudder et al., 2019; Tackett et al., 2021). To the best of our knowledge, there are no existing studies that report the use of these videos with dental students. Therefore, this exploratory study evaluated first-year dental students’ use of these videos, and the videos’ effectiveness, as a supplementary tool for learning basic sciences. The research questions are as follows.

  1. What was students’ perceived effectiveness of the animated videos?
  2. What were students’ video usage behaviors and patterns?
  3. How did watching the animated videos correlate with students’ exam performance in basic sciences?

Literature Review

Benefits and Drawbacks of Animations

Well-designed animations could have positive affective and cognitive impacts on learners (Lowe, 2004). It is argued that the animated format may attract learners’ attention and motivate them to learn the material. The animated format may also facilitate learning by explicitly visualizing complex concepts, helping learners create a mental model of abstract objects (Lowe, 2004). However, Weiss et al. (2002) suggested that once the novelty of the animated format wears off, the animation may not continue to attract learners’ attention. The researchers also proposed that, while the animated format is especially helpful for presenting abstract concepts and procedures that are otherwise not visible to the naked eye, the use of animations may not enhance student learning if the concept or procedure does not reach a certain level of complexity (Weiss et al., 2002).

Cognitive Load Perspective

Researchers have often cited cognitive load theory (Mayer, 2002) to understand the impact of multimedia learning. While the animated format might intuitively seem more effective than static graphics, researchers cautioned that it also has drawbacks. Firstly, the animated format could be distracting and difficult for novice learners to comprehend due to the transient nature of the information (Tversky et al., 2002). According to Scheiter et al. (2006), some learners have difficulty identifying and selecting the most relevant information in the animation and may be easily distracted by irrelevant details on the screen.

Additionally, the dynamic visualizations in animations usually flow forward frame by frame quickly, and information is only displayed for a limited amount of time. The continuous flow of information may cause learners to miss important information before it vanishes from the screen (Stebner et al., 2017). Learners also must remember previous information in order to integrate it with new information in subsequent frames. The large amount of information presented in a short period of time and the need to process information across different frames may overwhelm novice learners and impose an unnecessary extraneous load, potentially impeding learning (Brame, 2016; Lowe, 2004; O'Day, 2006; Ruiz et al., 2009; Spanjers et al., 2011). In contrast, the permanent information in static graphics allows novice learners to revisit previously seen information as frequently as needed, with less need to retain information in working memory. As such, novices often learn more from static graphics than from animated instruction (Kalyuga, 2011).

In addition to being overwhelming, Lowe (2004) stated that the animated format could also be underwhelming, leading to oversimplification of learning. He explained that because animations provide direct depiction of a complex system, learners only need to watch these dynamics being portrayed on the screen. There is no need for them to make substantial cognitive effort to construct a mental model of their own. As a result, it may not induce the desirable germane load, another type of cognitive load devoted to integrating knowledge and constructing schemas that is beneficial for deeper understanding (Sweller et al., 1998). In this case, learners may run the risk of superficial, passive processing, and the explicit depiction provided by the animation may give them a false impression of comprehension (Lowe, 2004).

Research on Effectiveness of Animations

Research evidence is mixed and inconclusive regarding the actual educational impact of animations (Brown et al., 2020; Daly et al., 2016). Some studies in disciplines outside of health professions education have reported the positive impact of animations, while others have indicated the opposite. Rosen (2009) reported that animation-based teaching improved students’ knowledge transfer, motivation, and attitude towards science and technology learning. Supporting this finding, Végh (2016) found that the animated teaching method had a positive effect on learning programming. A meta-analysis by Berney and Bétrancourt (2016) concluded that animations are more effective for learning than static graphics.

In contrast, an earlier literature review by Tversky et al. (2002) failed to find clear evidence of the advantages of the animated format over still graphics. The researchers noted that there was often more information presented in the animations than in the static version. They concluded that any observed advantage of animations was due to the extra information presented rather than style of presentation. Several empirical studies also reported that animated videos are not always more effective than static graphics (Hegarty et al., 2003; Liu et al., 2020; Scheiter et al., 2006; Tunuguntla et al., 2008). Scheiter et al. (2006) examined using animations to visualize mathematical solution procedures and discovered that frequent use of animations resulted in inferior performance as compared to learning from traditional text.

Animations in Health Professions Education

A growing body of studies has reported health professions students' positive experiences with instructional animations (Adam et al., 2017; Cooper et al., 2019; Hwang et al., 2012; Liu et al., 2020; Tackett et al., 2021; Yellepeddi & Roberson, 2016). However, in terms of the impact on learning, the literature review by Ruiz et al. (2009) found that existing comparative studies on computer animations in medical education have reported mixed results. Yellepeddi & Roberson (2016) revealed that using animated videos in pharmaceutics improved student learning. Thatcher (2006) found that medical students who used animations to study molecular and cellular biology performed significantly better than those who used traditional text and that the majority of students preferred animations. The study by Marsh et al. (2008) also demonstrated that using animations after students already had some familiarity with the content (embryonic development) improved their learning and long-term retention.

On the other hand, there are several studies that showed no advantage of the animated format as compared with traditional static formats. Liu et al. (2020) compared using animated videos and recorded PowerPoint lectures to teach pathophysiology and found no difference in students’ test performance between the two delivery formats. Tunuguntla et al. (2008) evaluated animations for learning home safety assessment by medical students and observed no significant difference in students’ assessment performance between the group that used animations and the control group that used static graphics. The researchers concluded that the much cheaper static graphics were as effective as animations that are more expensive and time consuming to develop. Similarly, Daly et al. (2016) reported that using animated images increased physiology and pharmacology students’ satisfaction, but there was no strong evidence in favor of the animated format over still images for content learning.

Research on Whiteboard Animations

The current body of research on whiteboard animations in higher education is limited (Schneider et al., 2023; Türkay, 2016). Nevertheless, existing studies have presented initial findings suggesting that whiteboard animations may be an effective educational tool across different disciplines, including science (e.g., Li et al., 2019; Türkay, 2016), social science (e.g., Turkay, 2022; Turkay & Mouton, 2016), business/marketing (e.g., Lento, 2017), and language arts (e.g., Syafrizal et al., 2021). Fiorella et al. (2019) found that students learned about the human kidney more effectively by watching the instructor draw illustrations in real time during an instructional video, a distinctive feature of whiteboard animations, rather than viewing already-drawn illustrations. In another study, Fiorella and Mayer (2016) found that learners who observed the instructor draw diagrams with their hand visible in the video achieved significantly better performance in the transfer test compared to the control group who watched videos with static images.

In Türkay's (2016) widely cited research, the author compared four instructional approaches: whiteboard animations, slideshow presentations, audio recordings, and text-based materials. The study found that whiteboard animations were more effective than the other methods in enhancing knowledge retention and engagement among physics students, although the author noted that this could be due to the novelty effect. In a general education science course that utilized a flipped classroom approach, Li et al. (2019) observed that students who watched whiteboard animations achieved higher quiz scores than their peers who watched conventional lecture videos. Students also reported that whiteboard animations were more engaging than lecture videos. Similarly, Turkay (2022) used whiteboard animations to teach social sciences in an online setting and found that they were superior to traditional lectures and narrated slides in promoting learning and engagement. In another study by Bradford and Bharadwaj (2015), whiteboard animated videos were used as a digital storytelling tool by health researchers and were perceived to be emotionally appealing.

Research on Whiteboard Animations in Health Professions Education

There is a scarcity of research on whiteboard animations in health professions education. Hudder et al. (2019) and Tackett et al. (2021) reported medical students’ positive learning experiences with commercially developed whiteboard animated videos on an interactive learning platform, but did not provide data on their impact on learning outcomes. Larnard et al. (2020) conducted a study in which whiteboard animations were used as supplementary resources to help medical students learn the decision-making process of antibiotic selection. The study concluded that whiteboard animations could provide an effective way to teach complex concepts in medical education. This finding was supported by Thomson et al. (2016), who reported that whiteboard animated videos were effective in improving medical students’ knowledge about infertility. Whiteboard animations have also been reported to be well received when used for patient education (Occa & Morgan, 2022).

Only one published report was found on the use of whiteboard animation in dental education (Sharmin et al., 2023). The study described the development of whiteboard animation videos to teach histology to first-year dental students. While the videos were well received, the study did not report their impact on learning outcomes. Other existing studies on educational animations in dental education were on other types of animation formats such as 3D animations (Dhulipalla et al., 2015) and cartoon animations (Fa et al., 2020; Lone et al., 2018).

Gaps in Literature and Significance

The contradictory results and the lack of consensus among researchers reveal the need for more research to validate the effectiveness of the animated format. In particular, given the considerable investments necessary to develop whiteboard animations, it is surprising that few of these innovative educational resources have been evaluated for their effectiveness. It is imperative for researchers to systematically evaluate their educational impact, not only to justify the cost but also to inform best practices. Our study represents an essential step in this direction, particularly within the understudied context of health professions education. Our experience could also help other schools identify and implement animated videos to augment student learning.

Methods

Study Context

The study was approved by the university institutional review board (# IRB2020-37). It was conducted at a dental school in the U.S. which offers a three-year accelerated Doctor of Dental Surgery (DDS) program. The first-year curriculum has a heavy focus on basic sciences. In the second and third academic years, stand-alone didactic instruction in basic sciences is significantly reduced as the curriculum becomes increasingly applied and integrated. Because the animated videos on Osmosis are primarily on basic sciences, this present study examined only the basic sciences courses offered in the first academic year. Didactic teaching in basic sciences normally consists of lectures and case-based discussion. The participants of this study were 143 students from the DDS2022 cohort. Data was collected from them at the conclusion of their first academic year (July 2019 to June 2020).

Whiteboard Animated Videos

The majority of the videos on the Osmosis platform cover basic sciences and are organized by content areas and organ systems. They were developed by the Osmosis video production team in collaboration with clinicians and health professions faculty from different institutions. The videos use the hand-drawn whiteboard animation style, which visualizes a concept with step-by-step drawing on a whiteboard background (Fig. 1) accompanied by audio narration. The videos are on average five to ten minutes long to reduce cognitive overload and keep students engaged (Mayer, 2002). A sample video on the respiratory system can be found here (https://youtu.be/0fVoz4V75_E). The video control options available to students include adding videos to playlists, turning onscreen captions on or off, changing the video playback speed, and taking time-stamped notes directly on the platform. The videos were originally developed for medical students but have since been adopted by students in other health professions disciplines. In the context of this present study, the videos were provided to our dental students as supplementary learning resources.

Fig. 1

Screenshot of hand-drawn whiteboard animation

Data Collection and Analysis

Student Survey

Data on students’ perceived value of the animated videos was collected with a survey developed by the authors and tailored to the specific learning platform under investigation (Appendix A). The survey was reviewed by co-authors who are experienced in educational research to ensure that the questions were relevant to the purpose of the study. It was also reviewed by two student representatives, which allowed us to gather feedback on question clarity. Cronbach’s alpha for the survey was 0.85, indicating high reliability.

The survey included eight Likert-scale questions that asked students to report their overall perceived value of the learning platform and each of its learning features such as the animated videos, 3D objects, flashcards, and others. The questions used a 4-point scale ranging from “Not Valuable at All (1 point)” to “Highly Valuable (4 points)”. Two additional open-ended questions at the end of the survey allowed students to comment on what they liked most about the platform and to provide suggestions for improvement. For the purpose of this present study, reporting of the survey data was focused on those questions pertaining to the animated videos.

The survey was administered via Qualtrics in February 2020. In the introduction section of the survey, students were informed that taking the survey was voluntary and that by continuing with the survey they indicated consent to participate in the study.

Platform Analytics

The platform analytics captured students’ video watching patterns such as the number of videos watched at the class and individual student level, as well as the videos that were most frequently watched. Note that the number of videos watched is not the number of unique videos watched. For example, if a student watched a video twice, it counted as two video views. Similarly, if a video was watched by two different students, it counted as two video views. Simple regression analysis was performed to examine whether high- or low-performing students, defined by students’ year-end class rank, were more likely to watch the videos. A class rank of “1” represents the highest performing student in the class.

Course Exam Scores

Students’ course exam scores in basic sciences were retrieved from ExamSoft, the school’s secure computer-based testing system. All midterm and final exams at the school are administered on ExamSoft. During exams, the student’s laptop is locked down, preventing them from accessing the internet and any course materials on the device. Exam questions are predominantly multiple-choice questions, some of which are case-based to assess students’ ability to apply basic science knowledge in clinical scenarios. Other question formats such as fill-in-the-blank and short answers are also used in some courses, but to a much lesser extent.

Using the “Categories” feature in ExamSoft, course directors had tagged each exam question to one or more content areas that it assessed. There are 13 basic science content areas, such as physiology, biochemistry, and pathology, that were developed by department chairs for question categorization. The resulting exam performance report generated from the ExamSoft instructor portal provided a longitudinal view of students’ performance in each category (content area) over the span of the entire academic year. This longitudinal content area performance was expressed as the percentage of questions answered correctly out of all questions tagged to the specific content area. Simple regression analyses were then conducted to examine the correlation between frequency of video watching and students’ longitudinal performance in each content area.
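The per-content-area analysis described above amounts to a simple (bivariate) ordinary least squares regression. The sketch below illustrates the computation with entirely hypothetical data pairing each student’s video views in one content area with their longitudinal percentage of questions answered correctly; the numbers are invented for illustration and do not come from the study.

```python
import numpy as np

# Hypothetical data: one row per student in a single content area
views = np.array([0, 12, 45, 80, 150, 210, 300, 410], dtype=float)  # video views
pct = np.array([72, 70, 78, 75, 81, 79, 85, 88], dtype=float)        # % correct

n = len(views)
x_c = views - views.mean()
b = (x_c @ (pct - pct.mean())) / (x_c @ x_c)   # OLS slope
a = pct.mean() - b * views.mean()              # intercept
resid = pct - (a + b * views)                  # residuals
se_b = np.sqrt((resid @ resid) / (n - 2) / (x_c @ x_c))  # std. error of slope
t = b / se_b                                   # t statistic, df = n - 2
print(f"b = {b:.4f}, t = {t:.2f}")
```

A positive slope whose t statistic exceeds the critical value at p < 0.05 would mirror the significant biochemistry and nutrition findings reported in the Results.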

Results

Perceived Value of Animated Videos

A total of 65 students (45.5% response rate) took the survey. The mean score of students’ perceived value of the videos was 3.2 on a 4-point scale, suggesting that the videos were well received by students. Forty-eight (73.8%) of the respondents reported that the videos were “Valuable” or “Highly Valuable” for their learning. The breakdown of responses was as follows: Not Valuable at All (n = 2; 3.1%); Somewhat Valuable (n = 15; 23.1%); Valuable (n = 19; 29.2%); and Highly Valuable (n = 29; 44.6%).

For the open-ended question that asked students what they liked most about the learning platform, 19 (57.6%) of the 33 students who answered this question mentioned the animated videos. The most frequently reported benefits of the videos were: the videos were short and concise; the narration was clear and easy to follow; and the animation format was engaging. Other less frequently mentioned benefits of the videos included: reliable content; the recap at the end of each video; the transcript that accompanies the video; and the ability to adjust the playback speed. Some illustrative student comments included: “The videos have top notch production.”; “I like how they break down topics into pieces of information that is easy for viewers to retain. I also enjoy the animations.”; and “I understand that not all topics that we cover in class are available in a video, but it's really nice when there is a video to fall back on.”

For the open-ended question that asked students what suggestions they had for improving the learning platform, seven (25%) of the 28 students who answered this question commented that the platform interface could be made easier to navigate. One student suggested that faculty integrate the videos into the school’s curriculum. Another student emphasized the importance of alignment between course exam questions and the content of videos by stating that, “If professors are going to use the videos, please have exam questions deriving from the videos.”

Video Watching Patterns

The platform analytics revealed that the class watched a total of 10,919 videos throughout the academic year. Figure 2 shows the total number of videos watched by the class each week. The peaks of video watching fell near the start and end of each quarter. The number of videos watched by individual students ranged from 0 to 627 (see Table 1 for breakdowns). Two students never watched any videos. One student reported in the survey that he/she didn't watch the videos because he/she preferred reading. Regression analysis failed to reveal a significant relationship between class rank and the number of videos watched.

Fig. 2

Weekly videos watched by number of students. Take the week of July 22, 2019 as an example. During this week, 132 students watched a total of 2,014 videos

Table 1 Number of videos watched by individual students throughout the academic year

Furthermore, as illustrated in Fig. 3, the top five content areas that were most watched by the class were: biochemistry (1,197 views), cell physiology (1,120 views), genetics (750 views), musculoskeletal (738 views), and immunology (503 views).

Fig. 3

Total video watches by content areas

Correlation with Exam Performance

Regression analyses revealed a statistically significant positive correlation between the number of video views and students’ longitudinal exam performance in two content areas: biochemistry (b = 0.00002, t = 2.44, p = 0.02), and nutrition (b = 0.0002, t = 2.18, p = 0.03). The number of video views was not correlated with students’ exam performance in other content areas.

Discussion

Perceived Value and Video Watching Patterns

Survey responses revealed that first-year dental students had a favorable attitude towards the whiteboard animated videos as a supplementary learning tool. Students appreciated that the videos were short, clear, and engaging. Their comments on the brevity of the videos echoed other researchers’ finding that shorter tutorials enhanced student engagement while longer videos might decrease student engagement (Guo et al., 2014; Nilson & Goodson, 2021). The results are in agreement with Hudder et al. (2019) and Tackett et al. (2021) who also reported medical students' positive experience with learning resources on Osmosis.

There was widespread adoption of the animated videos by the class, as indicated by the large number of video views throughout the year. Hudder et al. (2019) identified medical students’ lack of time to learn how to use the Osmosis platform as a barrier to adoption. In the case of our study, we provided a one-hour training session to dental students in the first week of school, which might have contributed to the popularity of the animated videos on this learning platform. Our study reinforces the importance of student training to address potential learning curves with new technology. The platform analytics also revealed that the frequency of video watching varied significantly among individual students. This might be explained by students’ preferred learning methods (e.g., preferring reading to video watching) and alternative learning resources used outside of the Osmosis platform. Overall, our result replicated that of Hudder et al. (2019) who reported that the adoption of the Osmosis platform varied among medical students.

It was also clear from the analytics that videos in some content areas were more frequently watched than those in other content areas. To reiterate, videos on biochemistry, cell physiology, genetics, musculoskeletal, and immunology were most frequently watched by the class. This result is not surprising, as the dental school’s curriculum has a heavy focus on these content areas during the first academic year. Another potential contributor to the number of videos watched is faculty championship, which has been shown to be an important factor in medical students’ adoption of the videos on the Osmosis platform (Hudder et al., 2019). In our study, while dental students’ video views were primarily self-directed, the Biochemistry and Nutrition course director had, to our knowledge, recommended relevant videos for students to watch to supplement course lectures. In light of this finding, faculty could encourage students to use the videos by identifying and recommending relevant videos for them to watch when applicable. Suggesting videos for students to watch could also reduce the time that students have to spend searching for videos on the platform on their own, leading to more efficient use of their study time.

Correlation with Exam Performance

Another significant finding was that the number of videos watched had a positive correlation with dental students’ longitudinal exam performance in the two content areas of biochemistry and nutrition. This result reflects the analytics which revealed that videos on biochemistry were watched most frequently. It is also in line with the aforementioned fact that the course director of the Biochemistry and Nutrition course actively recommended videos for students to watch. While a positive correlation with longitudinal exam performance was not found for other examined basic science content areas, our mixed results fit well with the literature at large that has reported conflicting findings on the educational impact of instructional animations.

There are several explanations for why a positive correlation was not observed in other content areas. These might include alternative learning resources used and personal differences in prior knowledge and spatial ability (Ruiz et al., 2009). Additionally, Hwang et al. (2012) and Weiss et al. (2002) proposed that animations are more effective for certain types of content, such as dynamic concepts that involve complicated interactions of different components. If the content is simple enough to be effectively presented in a static format, the use of animations is not necessary and may even have an adverse effect due to the unnecessary extraneous cognitive processing required to process the dynamic visualizations. This implies that the nature of the content also needs to be considered when developing and implementing animations.

Students’ video watching behaviors might also have impacted their learning from the videos. First of all, the number of video views for some content areas might be too small to produce a positive impact on exam performance. Secondly, Lowe (2004) suggested that if the content is difficult and unfamiliar, the cognitive processing load might exceed the learner’s processing capacity. Supporting this statement, the study by Marsh et al. (2008) concluded that animated materials were more useful after students had some familiarity with the material. Therefore, whether dental students in our present study watched the videos before or after receiving faculty instruction on the topics (preview vs. review) might impact their comprehension of the video content. Thirdly, video viewing strategies may have impacted students’ comprehension. Pausing, changing the video play speed, and re-watching portions of the video have been reported to be beneficial for learning from videos (Brame, 2016; Costley et al., 2021; Kalyuga, 2008; Schwan & Riempp, 2004). Self-testing following video watching is another proven strategy to improve knowledge retention (Roediger & Karpicke, 2006). In contrast, if students watch videos without being actively engaged, the passive experience may lead to limited learning (Owston et al., 2011).

It is also important to note that the content of the animated videos on this platform is not always reflective of the dental school curriculum. As previously mentioned, the videos were originally developed for medical students. Dental students may not study basic sciences in the same level of depth as medical students. Nevertheless, this may change with an increased dental curriculum focus on applied biomedical content to support care of medically complex patients. Lastly, most of the time video watching was voluntary. The videos students chose to watch might not always align with the content of course lectures and exams. While video watching in this case might not directly benefit students’ course exam performance, it is still valuable because students used the videos to pursue personal interest and support their own unique learning needs. Therefore, it is important to consider the value of video watching beyond the measurable effect on exam performance.

To sum up, our results highlight the importance of taking into consideration the complexities of cognitive processing, content topics most suitable for the animated format, learners’ personal differences, faculty direction, and active learning strategies that could facilitate enhanced learning from animated videos. Höffler and Leutner (2007) noted that animations have advantages but are not a panacea. The mixed findings presented here and those in related literature suggest a need for further investigation into the role of this specific type of learning modality in health professions education.

Limitations and Future Directions

While a previous cohort who had no access to the videos could have served as a control group, yearly changes in the curriculum and teaching methods could not be controlled for. Personal differences among students, such as prior knowledge, learning behaviors in and out of class, and learning resources used, might also have impacted the outcomes. Furthermore, since these videos were offered to students as supplementary materials, it is likely that the students who chose to watch them were already intrinsically motivated. This could make it challenging to distinguish the effects of the videos from those of motivation. Thus, a future direction may be to use controlled study designs to account for such variables. It may also be worthwhile to examine students’ video watching strategies and differences in the videos’ impact on high- and low-performing students. Finally, while our results were consistent with prior studies in medical education, this present study was conducted in a single dental school and the results may not be generalizable to other settings. We are planning a follow-up study to compare findings between two different dental schools to further validate the results.

Conclusions

This exploratory study revealed that first-year predoctoral dental students frequently used the whiteboard animated videos to supplement their learning in basic sciences, although usage varied among individual students. The majority of students perceived the videos to be a reliable, effective, and engaging means of learning basic sciences. Watching the videos had a positive correlation with students’ exam performance in some but not all content areas. These results are overall promising and support the value of well-designed whiteboard animations to augment student learning. Our study paves the way for future studies to further explore the use and educational value of whiteboard animations.