Designing Learning Analytics Experiences

Abstract

With the increasing presence of digital devices and technology-mediated activities, obtaining a detailed account of the events that occur during a learning experience is now feasible. Learning analytics is the discipline that uses these data to improve the overall effectiveness of the learning experience. The use of learning analytics can be divided into five stages: collect, analyze, predict, act, and refine. This chapter analyzes each of these stages and discusses some of the solutions currently in use. Privacy and security are issues orthogonal to all five stages. The chapter also discusses the principles to take into account when designing a learning analytics experience, noting the dependencies between stages and the concerns that might guide the decision to take a learning analytics platform from idea to execution.
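To make the five-stage cycle concrete, the following minimal Python sketch walks a batch of hypothetical learning events through collect, analyze, predict, act, and refine. Every function name, the event format, and the risk threshold are illustrative assumptions for this sketch, not an interface defined in the chapter.

from collections import Counter

def collect(event_log):
    """Stage 1: gather raw events, e.g. one record per LMS interaction."""
    return [e for e in event_log if e.get("student_id") is not None]

def analyze(events):
    """Stage 2: derive a simple per-student activity count."""
    return Counter(e["student_id"] for e in events)

def predict(activity, threshold):
    """Stage 3: flag students whose activity falls below a risk threshold."""
    return {s for s, n in activity.items() if n < threshold}

def act(flagged):
    """Stage 4: trigger an intervention; here, just report flagged students."""
    for student in sorted(flagged):
        print(f"Contact {student}: low engagement detected")

def refine(threshold, flagged, cohort_size):
    """Stage 5: feed outcomes back, relaxing the threshold if it over-flags."""
    return threshold - 1 if len(flagged) > cohort_size // 2 else threshold

threshold = 2
events = [
    {"student_id": "s1", "action": "view"},
    {"student_id": "s2", "action": "post"},
    {"student_id": "s1", "action": "view"},
]
activity = analyze(collect(events))
flagged = predict(activity, threshold)
act(flagged)
threshold = refine(threshold, flagged, cohort_size=len(activity))

In an actual deployment, collect would read from an institutional event store, and act and refine would involve instructors and course redesign rather than print statements and a numeric threshold.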

Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

School of Electrical and Information Engineering, The University of Sydney, Sydney, Australia
