Animated Demonstrations: Evidence of Improved Performance Efficiency and the Worked Example Effect

  • David Lewis
  • Ann Barron
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5619)

Abstract

The purpose of this study was to assess the efficiency and effectiveness of animated demonstrations, specifically to determine whether learners using animated demonstrations would exhibit the worked example effect [1] and a delayed performance decrement described as Palmiter’s animation deficit [2], [3]. The study measured relative condition efficiency (RCE) [4] and developed a construct called performance efficiency (PE). Results revealed that the animated demonstration groups assembled the week-one problem in significantly less time than the practice group, providing evidence for the worked example effect with animated demonstrations. In addition, subjects in the demonstration groups were significantly more efficient (as measured by performance efficiency) than those in the practice group. Finally, group performance did not differ a week later, providing no evidence of Palmiter’s animation deficit.
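For context, the relative condition efficiency measure cited from [4] combines a standardized performance score with a standardized mental-effort rating; a minimal sketch of that formula (notation ours, not taken from this paper) is:

\[
E = \frac{z_{\text{performance}} - z_{\text{effort}}}{\sqrt{2}}
\]

where \(z_{\text{performance}}\) and \(z_{\text{effort}}\) are the group-standardized performance and self-reported mental-effort scores; higher \(E\) indicates better performance attained at relatively lower reported effort. The performance efficiency (PE) construct developed in this study is a related efficiency measure whose exact formulation is given in the full paper and is not reproduced here.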

Keywords

Animation · Cognitive load · Performance efficiency


References

  1. Sweller, J., Chandler, P.: Evidence for cognitive load theory. Cognition and Instruction 8(4), 351–362 (1991)
  2. Animation as documentation: A replication with reinterpretation, http://www.stc.org/proceedings/ConfProceed/1998/PDFs/00006.PDF
  3. Palmiter, S.L., Elkerton, J., Baggett, P.: Animated demonstrations vs. written instructions for learning procedural tasks: a preliminary investigation. Int. J. Man Mach. Stud. 34, 687–701 (1991)
  4. Paas, F.G.W.C., van Merrienboer, J.J.G.: The efficiency of instructional conditions: An approach to combine mental effort and performance measures. Hum. Factors 35(4), 737–743 (1993)
  5. Sweller, J., Cooper, G.A.: The use of worked examples as a substitute for problem solving in learning algebra. Cognition and Instruction 2(1), 59–89 (1985)
  6. Tuovinen, J.E., Sweller, J.: A comparison of cognitive load associated with discovery learning and worked examples. J. Educ. Psychol. 91(2), 334–341 (1999)
  7. Lewis, R.D.: Demobank: a method of presenting just-in-time online learning. In: Proceedings of the Association for Educational Communications and Technology (AECT) Annual International Convention, Orlando, FL, October 2005, vol. 2, pp. 371–375 (2005)
  8. Waterson, P.E., O’Malley, C.E.: Using animated demonstrations in multimedia applications: Some suggestions based upon experimental evidence. In: Salvendy, G., Smith, M.J. (eds.) Human-Computer Interaction: Software and Hardware Interfaces, Proceedings of the Fifth International Conference on Human-Computer Interaction (HCI International 1993), pp. 543–548 (1993)
  9. Paas, F., Tuovinen, J.E., Tabbers, H.K., Van Gerven, P.W.M.: Cognitive load measurement as a means to advance cognitive load theory. Educ. Psychol. 38(1), 63–71 (2003)
  10. Brünken, R., Plass, J.L., Leutner, D.: Direct measurement of cognitive load in multimedia learning. Educ. Psychol. 38(1), 53–61 (2003)
  11. Gagné, R.M.: Problem solving. In: Melton, A.W. (ed.) Categories of Human Learning. Academic Press, New York (1964)
  12. Lewis, R.D.: The acquisition of procedural skills: An analysis of the worked-example effect using animated demonstrations. Unpublished doctoral dissertation, University of South Florida (2008)
  13. Cooper, G., Sweller, J.: Effects of schema acquisition and rule automation on mathematical problem-solving transfer. J. Educ. Psychol. 79(4), 347–362 (1987)
  14. Stevens, J.: Applied Multivariate Statistics for the Social Sciences. Erlbaum, Mahwah (2002)
  15. TechSmith: TechSmith Camtasia Studio 4.0 [Computer program]. Okemos, MI (2006)
  16. Adobe Systems: Adobe Photoshop Elements 2.0 [Computer program]. Mountain View, CA (1990–2002)
  17. TechSmith: TechSmith Morae 1.0.1 [Computer program]. Okemos, MI (2004)
  18. Glass, G., Hopkins, K.: Statistical Methods in Education and Psychology, 2nd edn. Allyn and Bacon, Boston (1984)
  19. Macro to test multivariate normality, http://support.sas.com/kb/24/983.html
  20. Bruner, J.S.: The act of discovery. Harv. Educ. Rev. 31(1), 21–32 (1961)
  21. Ausubel, D.P.: The Psychology of Meaningful Verbal Learning: An Introduction to School Learning. Grune and Stratton, New York (1963)
  22. Mayer, R.: Multimedia Learning. Cambridge University Press, Cambridge (2001)
  23. Kirschner, P.A., Sweller, J., Clark, R.E.: Why minimal guidance during instruction does not work: an analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educ. Psychol. 41(2), 75–86 (2006)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • David Lewis (1)
  • Ann Barron (1)
  1. University of South Florida, Tampa, USA
