Student Modeling Based on Problem Solving Times



Student modeling in intelligent tutoring systems is mostly concerned with modeling the correctness of students’ answers. As interactive problem solving activities become increasingly common in educational systems, it is useful to also exploit the timing information associated with problem solving. We argue that a focus on timing is natural for certain types of educational problems, and we describe a simple model of problem solving times which assumes a linear relationship between a latent problem solving skill and the logarithm of the time needed to solve a problem. The model is closely related to models from two different areas: item response theory and collaborative filtering. We describe two parameter estimation techniques for the model and several extensions: models with multidimensional skill, learning, or variability of performance. We describe an application of the proposed models in a widely used computerized practice system. Using both simulated data and real data from the system, we evaluate the model, analyze its parameter values, and discuss the insight into problem difficulty that the model provides.
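The core model described above can be sketched as a small simulation. The snippet below is an illustrative assumption, not the authors' implementation: it posits ln(time) = b[p] + a[p]·theta[s] + noise for student s and problem p, generates synthetic data from that relationship, and fits the parameters jointly by stochastic gradient descent, in the spirit of the collaborative-filtering connection the abstract mentions. All names, constants, and the fitting procedure are hypothetical choices for illustration.

```python
import math
import random

random.seed(0)

# Assumed model: ln(t_sp) = b[p] + a[p] * theta[s] + noise, where theta[s] is
# a latent skill, b[p] a problem difficulty, a[p] a discrimination parameter.
n_students, n_problems = 50, 20
theta = [random.gauss(0, 1) for _ in range(n_students)]   # latent skills
b = [random.gauss(4, 1) for _ in range(n_problems)]       # problem difficulties
a = [-1.0] * n_problems                                   # higher skill -> shorter time

# Simulate observed log-times with Gaussian noise.
data = [(s, p, b[p] + a[p] * theta[s] + random.gauss(0, 0.3))
        for s in range(n_students) for p in range(n_problems)]

# Joint estimation by stochastic gradient descent, analogous to
# collaborative-filtering matrix factorization applied to log-times.
est_theta = [0.0] * n_students
est_b = [0.0] * n_problems
est_a = [-1.0] * n_problems
lr = 0.005
for _ in range(200):
    random.shuffle(data)
    for s, p, log_t in data:
        err = est_b[p] + est_a[p] * est_theta[s] - log_t
        grad_theta = err * est_a[p]
        est_b[p] -= lr * err
        est_a[p] -= lr * err * est_theta[s]
        est_theta[s] -= lr * grad_theta

# Training error should approach the simulated noise level.
rmse = math.sqrt(sum((est_b[p] + est_a[p] * est_theta[s] - log_t) ** 2
                     for s, p, log_t in data) / len(data))
print(f"training RMSE of log-times: {rmse:.3f}")
```

Because skills are latent, the parameters are identifiable only up to a shift and scaling of the skill scale; fixing the discrimination initialization (here at -1) is one simple way to anchor the solution in this sketch.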


Keywords: Intelligent tutoring systems · Student modeling · Problem solving times · Item response theory · Collaborative filtering · Computerized adaptive practice



We thank the anonymous reviewers, whose many comments and specific suggestions significantly improved the paper, and our collaborators who participated in some of the reported research: Petr Boroš, Juraj Nižnan, Matěj Klusáček, and Jiří Řihák. We also thank the developers of the Problem Solving Tutor system, particularly Vít Stanislav.




© International Artificial Intelligence in Education Society 2015

Authors and Affiliations

  Faculty of Informatics, Masaryk University, Brno, Czech Republic
