Understanding the Learners’ Actions when Using Mathematics Learning Tools
The use of computer-based mathematics tools is widespread in learning. Depending on how these tools assess the learner's solution paths, one can distinguish between automatic and semi-automatic assessment tools. Automatic assessment tools provide all necessary feedback to the learners directly, while semi-automatic assessment tools involve the teachers as part of the assessment process: the teachers are provided with as much information as possible about the learners' interactions with the tool.
How can the teachers know how the learning tools were used and which intermediate steps led to a solution? How can the teachers respond to a learner's question that arises while using a computer tool? Little is available to answer these questions beyond interacting directly with the computer and performing a few manipulations to understand the tool's state.
This paper presents SMALA, a web-based logging architecture that addresses these problems by recording, analyzing and representing user actions. While respecting the learner’s privacy, the SMALA architecture supports the teachers by offering fine-grained representations of the learners’ activities as well as overviews of the progress of a classroom.
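The abstract describes logging user actions while respecting the learner's privacy. The following is a minimal sketch of what such a log record and its pseudonymization might look like; the class name, field names, and the salted-hash scheme are illustrative assumptions, not SMALA's actual API.

```python
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ToolEvent:
    """One user action inside a learning tool (hypothetical log record)."""
    user: str       # pseudonymized learner id, never the real identity
    tool: str       # which learning tool produced the event
    action: str     # e.g. "apply-rule", "undo", "submit"
    payload: dict   # tool-specific details of the intermediate step
    timestamp: str  # ISO-8601 time of the action

def pseudonymize(real_id: str, salt: str = "per-course-salt") -> str:
    # Hash the real identifier so teachers see activity, not identity.
    # The same learner maps to the same pseudonym within one course.
    return hashlib.sha256((salt + real_id).encode("utf-8")).hexdigest()[:12]

def make_event(real_id: str, tool: str, action: str, payload: dict) -> ToolEvent:
    return ToolEvent(
        user=pseudonymize(real_id),
        tool=tool,
        action=action,
        payload=payload,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

# A tool would serialize such events and send them to the logging server.
event = make_event("learner-42", "equation-tool", "apply-rule", {"rule": "distribute"})
print(json.dumps(asdict(event), indent=2))
```

Pseudonymizing at logging time, rather than storing real identifiers and filtering on display, keeps identity out of the stored data entirely while still letting a teacher follow one learner's solution path across events.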
Keywords: Learning Tool, Learning Management System, Event Object, Cognitive Tutor, SMALA Server