Technology, Knowledge and Learning

Volume 21, Issue 1, pp 5–19

Exploratory Analysis in Learning Analytics

  • David Gibson
  • Sara de Freitas
Work-in-Progress

Abstract

This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a game-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex datasets were analyzed and iteratively modeled with a variety of computationally intensive methods to provide the most effective outcomes for learning assessment, performance management and learner tracking. The article presents the research contexts, the tools and methods used in the exploratory phases of analysis, the major findings and the implications for learning analytics research methods.
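
As a purely illustrative sketch of the kind of exploratory, iterative modeling the abstract refers to (not the authors' pipeline or data), the Python snippet below clusters a synthetic table of student activity features and scans a small range of cluster counts. The feature names, the synthetic distributions, and the choice of k-means with a silhouette criterion are all assumptions introduced here for illustration.

    # Hypothetical exploratory-analysis sketch; not the study's actual method or data.
    import numpy as np
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    rng = np.random.default_rng(0)

    # Synthetic stand-in for per-student activity features (assumed column names).
    students = pd.DataFrame({
        "logins_per_week": rng.poisson(5, 500),
        "avg_session_minutes": rng.gamma(2.0, 15.0, 500),
        "assessment_score": rng.normal(65, 12, 500),
    })

    # Standardise features so no single scale dominates the distance metric.
    X = StandardScaler().fit_transform(students)

    # Iteratively try several cluster counts and keep the best silhouette score,
    # mirroring an explore-model-refine loop.
    best_k, best_score = None, -1.0
    for k in range(2, 7):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        score = silhouette_score(X, labels)
        if score > best_score:
            best_k, best_score = k, score

    print(f"best k={best_k}, silhouette score={best_score:.2f}")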

Keywords

Learning analytics · Computationally intensive mixed methods research · Game-based learning · Virtual performance assessment


Copyright information

© Springer Science+Business Media Dordrecht 2015

Authors and Affiliations

  1. Curtin University, Bentley, Perth, Australia
  2. Murdoch University, Perth, Australia
