Understanding the Learners’ Actions when Using Mathematics Learning Tools

  • Paul Libbrecht
  • Sandra Rebholz
  • Daniel Herding
  • Wolfgang Müller
  • Felix Tscheulin
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7362)

Abstract

The use of computer-based mathematics tools is widespread in learning. Depending on the way that these tools assess the learner’s solution paths, one can distinguish between automatic and semi-automatic assessment tools. Automatic assessment tools directly provide the learners with all necessary feedback, while semi-automatic assessment tools involve the teachers as part of the assessment process: the teachers are provided with as much information as possible on the learners’ interactions with the tool.

How can the teachers know how the learning tools were used and which intermediate steps led to a solution? How can the teachers respond to a learner’s question that arises while using a computer tool? Little is available to answer these questions beyond interacting directly with the computer and performing a few manipulations to understand the tool’s state.

This paper presents SMALA, a web-based logging architecture that addresses these problems by recording, analyzing and representing user actions. While respecting the learner’s privacy, the SMALA architecture supports the teachers by offering fine-grained representations of the learners’ activities as well as overviews of the progress of a classroom.
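The paper itself specifies the SMALA architecture; purely as an illustration, the kind of pseudonymized event record such a logging back end might store can be sketched as follows. All names here (LearnerEvent, pseudonymize, the salt, the action labels) are invented for this sketch and are not SMALA’s actual API:

```python
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class LearnerEvent:
    """One logged interaction, stored under a pseudonym, not the real user name."""
    user_pseudonym: str   # stable hash of the real identifier; preserves privacy
    tool_id: str          # which learning tool emitted the event
    action: str           # e.g. "step-entered", "hint-requested" (labels invented here)
    payload: dict         # tool-specific details of the intermediate step
    timestamp: str        # ISO-8601, UTC

def pseudonymize(user_id: str, salt: str = "course-salt") -> str:
    """Derive a stable pseudonym so a teacher can follow one learner's
    session across events without ever seeing the real identity."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:12]

def log_event(user_id: str, tool_id: str, action: str, payload: dict) -> str:
    """Serialize one event; a server would append this to the activity log."""
    event = LearnerEvent(
        user_pseudonym=pseudonymize(user_id),
        tool_id=tool_id,
        action=action,
        payload=payload,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    return json.dumps(asdict(event))
```

Keying all records by a stable pseudonym is what lets the teacher-facing views reconstruct one learner’s fine-grained solution path, and aggregate classroom-level overviews, without exposing identities.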

Keywords

Learning Tool, Learning Management System, Event Object, Cognitive Tutor, SMALA Server
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Paul Libbrecht (1)
  • Sandra Rebholz (2)
  • Daniel Herding (3)
  • Wolfgang Müller (2)
  • Felix Tscheulin (2)
  1. Institute for Mathematics and Informatics, Karlsruhe University of Education, Germany
  2. Media Education and Visualization Group, Weingarten University of Education, Germany
  3. Computer-Supported Learning Research Group, RWTH Aachen University, Germany
