Evaluating the Robustness of Learning Analytics Results Against Fake Learners

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11082)


Massive Open Online Courses (MOOCs) collect large amounts of rich data. A primary objective of Learning Analytics (LA) research is to study these data in order to improve the pedagogy of interactive learning environments. Most studies make the underlying assumption that the data represent truthful and honest learning activity. However, previous studies have shown that MOOCs can have large cohorts of users who break this assumption and achieve high performance through behaviors such as cheating using multiple accounts or unauthorized collaboration; we therefore denote them fake learners. Because of their aberrant behavior, fake learners can bias the results of LA models. The goal of this study is to evaluate the robustness of LA results when the data contain a considerable number of fake learners. Our methodology follows the rationale of ‘replication research’. We challenge the results reported in a well-known, and one of the first, LA/Pedagogic-Efficacy MOOC papers by replicating its results with and without the fake learners (identified using machine learning algorithms). The results show that fake learners exhibit very different behavior compared to true learners. However, even though they are a significant portion of the student population (~15%), their effect on the results is not dramatic (it does not change trends). We conclude that the LA study that we challenged was robust against fake learners. While these results carry an optimistic message on the trustworthiness of LA research, they rely on data from one MOOC. We believe that this issue should receive more attention within the LA research community, as it may explain some ‘surprising’ research results in MOOCs.
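The replication logic described above can be illustrated with a minimal sketch: compute the same summary statistic on the full learner population and on the population with flagged fake learners removed, then compare. The function name, the toy data, and the boolean flags below are hypothetical; the paper's actual analyses are psychometric (e.g., IRT) and its fake-learner detection uses dedicated machine learning classifiers.

```python
import numpy as np

def with_and_without_fakes(scores, is_fake):
    """Replication-style check: a statistic over all learners vs.
    over presumed-true learners only (illustrative, not the paper's model).

    scores  -- per-learner performance values
    is_fake -- boolean flags from some (hypothetical) fake-learner detector
    """
    scores = np.asarray(scores, dtype=float)
    is_fake = np.asarray(is_fake, dtype=bool)
    all_learners = scores.mean()          # statistic on the raw data
    true_learners = scores[~is_fake].mean()  # statistic after removing fakes
    return all_learners, true_learners

# Toy example: one of seven learners (~15%) flagged as fake,
# mirroring the share reported in the abstract.
scores = [0.95, 0.60, 0.55, 0.50, 0.65, 0.70, 0.58]
is_fake = [True, False, False, False, False, False, False]
all_m, true_m = with_and_without_fakes(scores, is_fake)
```

If the conclusion drawn from `all_m` and from `true_m` is the same (trends unchanged), the original result would be judged robust against the fake-learner cohort, which is the comparison the study performs at scale.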


Keywords: Learning analytics · Educational data mining · MOOCs · Fake learners · Reliability · IRT


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Weizmann Institute of Science, Rehovot, Israel
  2. Massachusetts Institute of Technology, Cambridge, USA
  3. University of Houston, Houston, USA