Proposal of a Numerical Calculation Exercise System for SPI2 Test Based on Academic Ability Diagnosis

Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 14)

Abstract

This paper describes a concept for a calculation exercise system that lets students preparing for the SPI2 Test practice numerical calculation repeatedly. We aim to develop a system that generates questions dynamically for each student, easing the teacher's burden of preparing many original questions. To raise the learning effect in such a system, it is important to measure each student's academic ability accurately and to distribute questions matched to that ability. In this paper, we propose a method for estimating students' understanding from an academic ability diagnostic test based on item response theory, and a method for controlling which questions to distribute using the hierarchical structure among the questions in each study unit.

Keywords

e-learning, SPI2 Test, numerical calculation exercise, academic ability diagnosis, generation of questions, item response theory

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  1. Kyushu Junior College of Kinki University, Iizuka, Japan