
Journal of Computing in Higher Education, Volume 29, Issue 2, pp 201–217

Transaction-level learning analytics in online authentic assessments

  • Rob Nyland
  • Randall S. Davies
  • John Chapman
  • Gove Allen

Abstract

This paper presents a case for using transaction-level data when analyzing automated online assessment results to identify knowledge gaps and misconceptions for individual students. Transaction-level data, which record every step a student takes to complete an assessment item, are preferable to traditional assessment formats that capture only the final answer, because the step-by-step record allows the system to detect persistent misconceptions. In this study we collected transaction-level data from 996 students enrolled in an online introductory spreadsheet class. Each student's final answer and step-by-step attempts were coded for misconceptions or knowledge gaps in the use of absolute references across four assessment occasions. Overall, the level of error revealed in the step-by-step processes was significantly higher than in the final submitted answers. Further analysis suggests that students' misconceptions most often involve non-critical errors. The data also suggest that misconceptions identified at the transaction level persist over time.
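The abstract does not describe the authors' implementation, but the core idea — logging every formula a student enters rather than only the final submission, then scanning those attempts for a specific misconception such as a missing absolute reference — can be sketched as follows. This is a minimal illustrative example; the class and function names (`AttemptLog`, `missing_absolute_ref`) are hypothetical and are not taken from the paper.

```python
import re
from dataclasses import dataclass, field

# An absolute (or mixed) reference anchors the column, the row, or both
# with "$", e.g. $C$1, $C1, or C$1. A purely relative reference like C1
# has no "$" and will shift when the formula is copied to another cell.
ABS_REF = re.compile(r"\$[A-Z]+\$?\d+|\b[A-Z]+\$\d+")

@dataclass
class AttemptLog:
    """Transaction-level log: every formula a student enters for one cell,
    not just the final submitted answer."""
    attempts: list = field(default_factory=list)

    def record(self, formula: str) -> None:
        self.attempts.append(formula)

    def missing_absolute_ref(self) -> list:
        """Indices of attempts that use only relative references."""
        return [i for i, f in enumerate(self.attempts) if not ABS_REF.search(f)]

log = AttemptLog()
for step in ["=B2*C1", "=B2*C$1", "=B2*$C$1"]:  # successive edits to one cell
    log.record(step)

# The final answer is correct, so final-answer grading would report no error,
# but the first attempt reveals the absolute-reference misconception.
print(log.missing_absolute_ref())  # -> [0]
```

A final-answer-only system would see `=B2*$C$1` and mark the item correct; the transaction-level log preserves the earlier relative-reference attempt, which is exactly the kind of persistent misconception the study set out to detect.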

Keywords

Educational data mining · Assessment · Data logs · Learning analytics


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Brigham Young University, Provo, USA
