A Comparison of the Effects of Nine Activities within a Self-directed Learning Environment on Skill-Grained Learning

  • Ari Bader-Natal
  • Thomas Lotze
  • Daniel Furr
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6738)

Abstract

Self-directed learners value the ability to make decisions about their own learning experiences. Educational systems can accommodate these learners by providing a variety of different activities and study contexts among which learners may choose. When creating a software-based environment for these learners, system architects incorporate activities designed to be both effective and engaging. Once these activities are made available to students, researchers can evaluate them by analyzing observed usage and performance data, asking: Which of these activities are most engaging? Which are most effective? Answers to these questions enable a system designer to highlight and encourage those activities that are both effective and popular, to refine those that are either effective or popular, and to reconsider or remove those that are neither effective nor popular. In this paper, we discuss Grockit – a web-based environment offering self-directed learners a wide variety of activities – and fit a mixed-effects logistic regression model to estimate the effectiveness of nine of these supplemental interventions on skill-grained learning.
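The modeling approach described above can be illustrated with a small simulation. The sketch below (Python with NumPy only; all variable names, effect sizes, and sample sizes are hypothetical and not taken from the paper) generates responses under the kind of model the abstract describes: the probability that a learner answers a question correctly is a logistic function of a per-learner random effect, a per-skill effect, and an additive effect for having used a given activity.

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic(x):
    """Inverse-logit link used by logistic regression."""
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical population: 200 learners, 5 skills.
n_students, n_skills = 200, 5
ability = rng.normal(0.0, 1.0, n_students)   # random effect per learner
easiness = rng.normal(0.0, 0.5, n_skills)    # effect per skill
activity_effect = 0.8                        # hypothetical boost from one activity

def simulate(used_activity):
    """Simulate one correct/incorrect response per (learner, skill) pair."""
    s = np.repeat(np.arange(n_students), n_skills)
    k = np.tile(np.arange(n_skills), n_students)
    eta = ability[s] + easiness[k] + (activity_effect if used_activity else 0.0)
    return rng.random(eta.shape) < logistic(eta)

base = simulate(False).mean()     # correct-answer rate without the activity
boosted = simulate(True).mean()   # correct-answer rate with the activity
print(round(base, 3), round(boosted, 3))
```

Fitting the reverse direction — estimating `activity_effect` from observed response data while treating learner ability as a random effect — is what a mixed-effects logistic regression does; in practice this is the job of a package such as R's lme4 rather than hand-rolled code.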

Keywords

self-directed learning, learner control, skill-grained evaluation



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Ari Bader-Natal¹
  • Thomas Lotze¹
  • Daniel Furr¹
  1. Grockit, Inc., San Francisco, USA