
Cross-System Transfer of Machine Learned and Knowledge Engineered Models of Gaming the System

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9146)

Abstract

Replicable research on the behavior known as gaming the system, in which students try to succeed by exploiting the functionalities of a learning environment instead of learning the material, has shown that it is negatively correlated with learning outcomes. As such, many researchers have developed models that automatically detect gaming behaviors, with the goal of deploying them in online learning environments. Both machine learning and knowledge engineering approaches have been used to create such models for a variety of software systems, but developing these models is often quite time consuming. In this paper, we investigate how well different kinds of models generalize across learning environments, specifically studying how effectively four gaming models previously created for the Cognitive Tutor Algebra tutoring system function when applied to data from two other learning environments: the scatterplot lesson of Cognitive Tutor Middle School and ASSISTments. Our results suggest that both the similarity between the source and target systems and the nature of the approach used to create the model affect how well it transfers to new systems.
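The evaluation strategy described above can be illustrated with a minimal sketch: fit a gaming detector on labeled clips from a source system, then score labeled clips from a different target system and compare the two AUC values. The file names, feature columns, and choice of logistic regression here are illustrative assumptions, not the detectors or datasets used in the paper.

```python
# Minimal sketch of cross-system transfer evaluation for a gaming-the-system
# detector. Column names, file paths, and the classifier are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

FEATURES = ["time_since_last_action", "num_hints", "num_attempts", "pct_errors"]

# Source system: labeled action clips used to fit the detector.
source = pd.read_csv("cognitive_tutor_algebra_clips.csv")
# Target system: labeled clips from the system the detector is transferred to.
target = pd.read_csv("assistments_clips.csv")

detector = LogisticRegression(max_iter=1000)
detector.fit(source[FEATURES], source["gaming_label"])

# Within-system performance (fit and evaluated on the source system).
auc_within = roc_auc_score(
    source["gaming_label"],
    detector.predict_proba(source[FEATURES])[:, 1])

# Transfer performance (fit on the source, evaluated on the target system).
auc_transfer = roc_auc_score(
    target["gaming_label"],
    detector.predict_proba(target[FEATURES])[:, 1])

print(f"AUC within source system:  {auc_within:.3f}")
print(f"AUC transferred to target: {auc_transfer:.3f}")
```

A large drop from the within-system AUC to the transfer AUC would indicate poor generalization, which is the kind of comparison the study makes across its four models and two target environments.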

Keywords

Gaming the system · Cognitive tutors · ASSISTments · Machine learning · Cognitive modeling · Cross-system transfer



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Teachers College, Columbia University, New York, USA
  2. Google, New York, USA
