
Learning analytics in programming courses: Review and implications

Education and Information Technologies

A Correction to this article was published on 03 April 2023


Abstract

Learning analytics (LA) has become a significant field of study for examining and identifying the difficulties novice programmers face while learning to program. Although the community has produced notable research in this area, little work has synthesized these efforts or identified the dimensions that should guide future research on learning analytics in programming courses (LAPC). This work reviews learning analytics research on introductory programming courses by exploring the types and sources of data used for LA, and by evaluating pertinent facets of reporting, prediction, intervention, and refinement reported in the literature. Based on the reviewed aspects, a taxonomy of LAPC research is proposed along with its associated benefits. The results reveal that most learning analytics studies in programming courses use assessment data generated by conventional assessment processes. However, analysis based on more granular data covering cognitive dimensions and concept-specific facets could improve accuracy and reveal more precise aspects of learning. In addition, coding analysis parameters can broadly be categorized into code quality and coding process; these categories are further classified into twenty-five sub-categories of coding parameters for analyzing the behaviors of novice programmers. Moreover, efforts are required for early identification of effective and ineffective behavioral patterns through performance prediction in order to deliver timely interventions. Lastly, this review emphasizes the integration of related processes to optimize future research on learning analytics for programming courses.



Data availability

This work is a review and meta-analysis of research articles available in the respective digital repositories.

Change history

References

  • Ahadi, A., Hellas, A., & Lister, R. (2017). A contingency table derived method for analyzing course data. ACM Transactions on Computing Education (TOCE), 17(3), 13.


  • Ahadi, A., Lister, R., Haapala, H., & Vihavainen, A. (2015). Exploring machine learning methods to automatically identify students in need of assistance. In: Proceedings of the eleventh annual International Conference on International Computing Education Research (pp. 121–130). ACM.

  • Ahmad, R., Sarlan, A., Hashim, A. S., & Hassan, M. F. (2017). Relationship between hands-on and written coursework assessments with critical thinking skills in structured programming course. In: 2017 7th World Engineering Education Forum (WEEF) (pp. 231–235). IEEE.

  • Ahmad, A., Zeshan, F., Khan, M. S., Marriam, R., Ali, A., & Samreen, A. (2020). The impact of gamification on learning outcomes of computer science majors. ACM Transactions on Computing Education (TOCE), 20(2), 1–25.


  • Ala-Mutka, K. M. (2005). A survey of automated assessment approaches for programming assignments. Computer Science Education, 15(2), 83–102.


  • Albluwi, I. (2018). A Closer Look at the Differences Between Graders in Introductory Computer Science Exams. IEEE Transactions on Education, 61(3), 253–260.


  • Allinjawi, A. A., Al-Nuaim, H. A., & Krause, P. (2014). An Achievement Degree Analysis Approach to Identifying Learning Problems in Object-Oriented Programming. ACM Transactions on Computing Education (TOCE), 14(3), 20.


  • Al-Rifaie, M. M., Yee-King, M., & d'Inverno, M. (2017). Boolean prediction of final grades based on weekly and cumulative activities. In: 2017 Intelligent Systems Conference (IntelliSys) (pp. 462–469). IEEE.

  • Ashenafi, M. M., Riccardi, G., & Ronchetti, M. (2015). Predicting students' final exam scores from their course activities. In: 2015 IEEE Frontiers in Education Conference (FIE) (pp. 1–9). IEEE.

  • Avella, J. T., Kebritchi, M., Nunn, S. G., & Kanai, T. (2016). Learning analytics methods, benefits, and challenges in higher education: A systematic literature review. Online Learning, 20(2), 13–29.


  • Azcona, D., Hsiao, I. H., & Smeaton, A. F. (2018). Personalizing computer science education by leveraging multimodal learning analytics. In: 2018 IEEE Frontiers in Education Conference (FIE) (pp. 1–9). IEEE.

  • Carter, A. S., Hundhausen, C. D., & Adesope, O. (2017). Blending measures of programming and social behavior into predictive models of student achievement in early computing courses. ACM Transactions on Computing Education (TOCE), 17(3), 12.


  • Baker, R. S., & Inventado, P. S. (2014). Educational data mining and learning analytics. In: Learning analytics (pp. 61–75). Springer.

  • Berges, M., Striewe, M., Shah, P., Goedicke, M., & Hubwieser, P. (2016). Towards deriving programming competencies from student errors. In: 2016 International Conference on Learning and Teaching in Computing and Engineering (LaTICE) (pp. 19–23). IEEE.

  • Bhatia, S., Kohli, P., & Singh, R. (2018). Neuro-symbolic program corrector for introductory programming assignments. In: 2018 IEEE/ACM 40th International Conference on Software Engineering (ICSE) (pp. 60–70). IEEE.

  • Blikstein, P., Worsley, M., Piech, C., Sahami, M., Cooper, S., & Koller, D. (2014). Programming pluralism: Using learning analytics to detect patterns in the learning of computer programming. Journal of the Learning Sciences, 23(4), 561–599.


  • Carter, A. S., Hundhausen, C. D., & Adesope, O. (2015). The normalized programming state model: Predicting student performance in computing courses based on programming behavior. In: Proceedings of the eleventh annual International Conference on International Computing Education Research (pp. 141–150). ACM.

  • Caspari-Sadeghi, S. (2022). Applying learning analytics in online environments: Measuring learners’ engagement unobtrusively. Frontiers in Education, 7.

  • Castro-Wunsch, K., Ahadi, A., & Petersen, A. (2017). Evaluating neural networks as a method for identifying students in need of assistance. In: Proceedings of the 2017 ACM SIGCSE Technical Symposium on Computer Science Education (pp. 111–116). ACM.

  • Chatti, M. A., Dyckhoff, A. L., Schroeder, U., & Thüs, H. (2012). A reference model for learning analytics. International Journal of Technology Enhanced Learning, 4(5–6), 318–331.


  • Chaweewan, C., Surarerks, A., Rungsawang, A., & Manaskasemsak, B. (2018). Development of Programming capability framework based on aptitude and skill. In: 2018 3rd International Conference on Computer and Communication Systems (ICCCS) (pp. 104–108). IEEE.

  • Chung, C. Y., & Hsiao, I. H. (2020). Investigating patterns of study persistence on self-assessment platform of programming problem-solving. In: Proceedings of the 51st ACM Technical Symposium on Computer Science Education (pp. 162–168).

  • Delev, T., & Gjorgjevikj, D. (2017). Static analysis of source code written by novice programmers. In: 2017 IEEE Global Engineering Education Conference (EDUCON) (pp. 825–830). IEEE.

  • Dorodchi, M., Dehbozorgi, N., & Frevert, T. K. (2017). “I wish I could rank my exam's challenge level!”: An algorithm of Bloom's taxonomy in teaching CS1. In: 2017 IEEE Frontiers in Education Conference (FIE) (pp. 1–5). IEEE.

  • Doshi, J. C., Christian, M., & Trivedi, B. H. (2014). Effect of conceptual cue based (CCB) practical exam evaluation of learning and evaluation approaches: a case for use in process-based pedagogy. In: 2014 IEEE sixth international conference on technology for education (pp. 90–94). IEEE.

  • Du, X., Yang, J., Shelton, B. E., Hung, J. L., & Zhang, M. (2021). A systematic meta-review and analysis of learning analytics research. Behaviour & Information Technology, 40(1), 49–62.


  • Echeverría, L., Cobos, R., Machuca, L., & Claros, I. (2017). Using collaborative learning scenarios to teach programming to non-CS majors. Computer Applications in Engineering Education, 25(5), 719–731.


  • Edwards, S. H., Tilden, D. S., & Allevato, A. (2014a). Pythy: improving the introductory python programming experience. In: Proceedings of the 45th ACM technical symposium on Computer science education (pp. 641–646). ACM.

  • Edwards, S. H., Shams, Z., & Estep, C. (2014b). Adaptively identifying non-terminating code when testing student programs. In: Proceedings of the 45th ACM technical symposium on Computer science education (pp. 15–20). ACM.

  • Effenberger, T., Pelánek, R., & Cechák, J. (2020). Exploration of the robustness and generalizability of the additive factors model. In: Proceedings of the Tenth International Conference on Learning Analytics & Knowledge (pp. 472–479).

  • España-Boquera, S., Guerrero-López, D., Hermida-Pérez, A., Silva, J., & Benlloch-Dualde, J. V. (2017). Analyzing the learning process (in Programming) by using data collected from an online IDE. In: 2017 16th International Conference on Information Technology Based Higher Education and Training (ITHET) (pp. 1–4). IEEE.

  • Estey, A., Keuning, H., & Coady, Y. (2017). Automatically classifying students in need of support by detecting changes in programming behaviour. In: Proceedings of the 2017 ACM SIGCSE Technical Symposium on Computer Science Education (pp. 189–194). ACM.

  • Esteero, R., Khan, M., Mohamed, M., Zhang, L. Y., & Zingaro, D. (2018). Recursion or iteration: Does it matter what students choose?. In: Proceedings of the 49th ACM Technical Symposium on Computer Science Education (pp. 1011–1016). ACM.

  • Farooq, M. S., Hamid, A., Alvi, A., & Omer, U. (2022). Blended Learning Models, Curricula, and Gamification in Project Management Education. IEEE Access, 10, 60341–60361. https://doi.org/10.1109/ACCESS.2022.3180355


  • Farooq, M. S., Omer, U., Tehseen, R., & Nisah, I. U. (2021). Software project management education: a systematic review. VFAST Transactions on Software Engineering, 9(3), 102–119.

  • Funabiki, N., Ishihara, N., & Kao, W. C. (2016). Analysis of fill-in-blank problem solution results in Java programming course. In: 2016 IEEE 5th Global Conference on Consumer Electronics (pp. 1–2). IEEE.

  • Fu, X., Shimada, A., Ogata, H., Taniguchi, Y., & Suehiro, D. (2017). Real-time learning analytics for c programming language courses. In: Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 280–288). ACM.

  • Gomes, A., Correia, F. B., & Abreu, P. H. (2016). Types of assessing student-programming knowledge. In: 2016 IEEE Frontiers in Education Conference (FIE) (pp. 1–8). IEEE.

  • Gomes, A., & Correia, F. B. (2018). Bloom’s taxonomy based approach to learn basic programming loops. In: 2018 IEEE Frontiers in Education Conference (FIE) (pp. 1–5). IEEE.

  • Grawemeyer, B., Halloran, J., England, M., & Croft, D. (2022). Feedback and engagement on an introductory programming module. In: Computing Education Practice 2022 (pp. 17–20).

  • Guzmán-Valenzuela, C., Gómez-González, C., Rojas-Murphy Tagle, A., & Lorca-Vyhmeister, A. (2021). Learning analytics in higher education: A preponderance of analytics but very little learning? International Journal of Educational Technology in Higher Education, 18(1), 1–19.


  • Hashima, A. S., Hamoud, A. K., & Awadh, W. A. (2018). Analyzing students’ answers using association rule mining based on feature selection. Journal of Southwest Jiaotong University, 53(5).

  • Hellings, J., & Haelermans, C. (2020). The effect of providing learning analytics on student behaviour and performance in programming: a randomised controlled experiment. Higher Education, 1–18.

  • Heinonen, K., Hirvikoski, K., Luukkainen, M., & Vihavainen, A. (2014). Using CodeBrowser to seek differences between novice programmers. In: Proceedings of the 45th ACM technical symposium on Computer science education (pp. 229–234). ACM.

  • Hijon-Neira, R., Velázquez-Iturbide, Á., Pizarro-Romero, C., & Carriço, L. (2014). Merlin-know, an interactive virtual teacher for improving learning in Moodle. In: 2014 IEEE Frontiers in Education Conference (FIE) Proceedings (pp. 1–8). IEEE.

  • Hilton, S., & Rague, B. (2015). Is video feedback more effective than written feedback?. In: 2015 IEEE Frontiers in Education Conference (FIE) (pp. 1–6). IEEE.

  • Hsiao, I. H., Huang, P. K., & Murphy, H. (2017). Integrating programming learning analytics across physical and digital space. IEEE Transactions on Emerging Topics in Computing.

  • Hsu, W. C., & Mimura, Y. (2017). Understanding the secondary digital gap: Learning challenges and performance in college introductory programming courses. In: 2017 IEEE 9th International Conference on Engineering Education (ICEED) (pp. 59–64). IEEE.

  • Hundhausen, C. D., Olivares, D. M., & Carter, A. S. (2017). IDE-based learning analytics for computing education: A process model, critical review, and research agenda. ACM Transactions on Computing Education (TOCE), 17(3), 1–26.


  • Ihantola, P., Vihavainen, A., Ahadi, A., Butler, M., Börstler, J., Edwards, S. H., ... & Rubio, M. Á. (2015). Educational data mining and learning analytics in programming: Literature review and case studies. In: Proceedings of the 2015 ITiCSE on Working Group Reports (pp. 41–63). ACM.

  • Iqbal Malik, S., & Coldwell-Neilson, J. (2017). Impact of a new teaching and learning approach in an introductory programming course. Journal of Educational Computing Research, 55(6), 789–819.


  • Jamjoom, M., Alabdulkreem, E., Hadjouni, M., Karim, F., & Qarh, M. (2021). Early prediction for at-risk students in an introductory programming course based on student self-efficacy. Informatica, 45(6).

  • Janke, E., Brune, P., & Wagner, S. (2015). Does outside-in teaching improve the learning of object-oriented programming?. In: Proceedings of the 37th International Conference on Software Engineering (Volume 2, pp. 408–417). IEEE Press.

  • Khalil, M., Prinsloo, P., & Slade, S. (2022). A Comparison of learning analytics frameworks: a systematic review. In: LAK22: 12th International Learning Analytics and Knowledge Conference (pp. 152–163).

  • Khan, I., Ahmad, A. R., Jabeur, N., & Mahdi, M. N. (2021). Machine learning prediction and recommendation framework to support introductory programming course. International Journal of Emerging Technologies in Learning, 16(17).

  • Kaliisa, R., Kluge, A., & Mørch, A. I. (2022). Overcoming challenges to the adoption of learning analytics at the practitioner level: A critical analysis of 18 learning analytics frameworks. Scandinavian Journal of Educational Research, 66(3), 367–381.


  • King, C. E. (2018). Feasibility and acceptability of peer assessment for coding assignments in large lecture based programming engineering courses. In: 2018 IEEE Frontiers in Education Conference (FIE) (pp. 1–9). IEEE.

  • Kitchenham, B., & Charters, S. (2007). Guidelines for performing systematic literature reviews in software engineering. Retrieved from https://www.elsevier.com/__data/promis_misc/525444systematicreviewsguide.pdf. Accessed 7 Jun 2022

  • Koong, C. S., Tsai, H. Y., Hsu, Y. Y., & Chen, Y. C. (2018). The Learning effectiveness analysis of JAVA programming with automatic grading system. In: 2018 IEEE 42nd Annual Computer Software and Applications Conference (COMPSAC) (Vol. 2, pp. 99–104). IEEE.

  • Kumar, A. N. (2017). Learning styles of computer science I students. In: 2017 IEEE Frontiers in Education Conference (FIE) (pp. 1–6). IEEE.

  • Lagus, J., Longi, K., Klami, A., & Hellas, A. (2018). Transfer-Learning Methods in Programming Course Outcome Prediction. ACM Transactions on Computing Education (TOCE), 18(4), 19.


  • Liao, S. N., Zingaro, D., Thai, K., Alvarado, C., Griswold, W. G., & Porter, L. (2019). A robust machine learning technique to predict low-performing students. ACM Transactions on Computing Education (TOCE), 19(3), 18.


  • Lin, C. C., Liu, Z. C., Chang, C. L., & Lin, Y. W. (2018). A Genetic algorithm-based personalized remedial learning system for learning object-oriented concepts of Java. IEEE Transactions on Education.

  • Mangaroska, K., & Giannakos, M. N. (2018). Learning analytics for learning design: A systematic literature review of analytics-driven design to enhance learning. IEEE Transactions on Learning Technologies.

  • Malliarakis, C., Satratzemi, M., & Xinogalos, S. (2016). CMX: The effects of an educational MMORPG on learning and teaching computer programming. IEEE Transactions on Learning Technologies, 10(2), 219–235.


  • Marcolino, A. S., Santos, A., Schaefer, M., & Barbosa, E. F. (2018). Towards a Catalog of Gestures for M-learning Applications for the Teaching of Programming. In: 2018 IEEE Frontiers in Education Conference (FIE) (pp. 1–9). IEEE.

  • McCall, D., & Kölling, M. (2019). A New Look at Novice Programmer Errors. ACM Transactions on Computing Education (TOCE), 19(4), 38.


  • Ninrutsirikun, U., Imai, H., Watanapa, B., & Arpnikanondt, C. (2020). Principal Component clustered factors for determining study performance in computer programming class. Wireless Personal Communications, 1–20.

  • Omer, U., Farooq, M. S., & Abid, A. (2020). Cognitive Learning Analytics Using Assessment Data and Concept Map: A Framework-Based Approach for Sustainability of Programming Courses. Sustainability, 12(17), 6990.


  • Omer, U., Farooq, M. S., & Abid, A. (2021). Introductory programming course: Review and future implications. PeerJ Computer Science, 7, e647.


  • Ouhbi, S., Idri, A., Fernández-Alemán, J. L., & Toval, A. (2015). Requirements engineering education: A systematic mapping study. Requirements Engineering, 20(2), 119–138.


  • Pardo, A. (2014). Designing learning analytics experiences. Learning analytics: From research to practice (pp. 15–38). New York: Springer.

  • Pereira, F. D., Fonseca, S. C., Oliveira, E. H., Cristea, A. I., Bellhäuser, H., Rodrigues, L., ... & Carvalho, L. S. (2021). Explaining individual and collective programming students’ behavior by interpreting a black-box predictive Model. IEEE Access, 9, 117097–117119.

  • Pereira, F. D., Oliveira, E. H., Oliveira, D. B., Cristea, A. I., Carvalho, L. S., Fonseca, S. C., ... & Isotani, S. (2020). Using learning analytics in the Amazonas: understanding students’ behaviour in introductory programming. British Journal of Educational Technology.

  • Premchaiswadi, W., Porouhan, P., & Premchaiswadi, N. (2018). Process modeling, behavior analytics and group performance assessment of e-learning logs via fuzzy miner algorithm. In: 2018 IEEE 42nd Annual Computer Software and Applications Conference (COMPSAC) (Vol. 2, pp. 304–309). IEEE.

  • Qazdar, A., Er-Raha, B., Cherkaoui, C., & Mammass, D. (2019). A machine learning algorithm framework for predicting students performance: A case study of baccalaureate students in Morocco. Education and Information Technologies, 24(6), 3577–3589.


  • Rojas-López, A., Rincón-Flores, E. G., Mena, J., García-Peñalvo, F. J., & Ramírez-Montoya, M. S. (2019). Engagement in the course of programming in higher education through the use of gamification. Universal Access in the Information Society, 18(3), 583–597.


  • Romero, C., & Ventura, S. (2020). Educational data mining and learning analytics: An updated survey. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 10(3), e1355.


  • Rosiene, C. P., & Rosiene, J. A. (2015). Flipping a programming course: The good, the bad, and the ugly. In: 2015 IEEE Frontiers in Education Conference (FIE) (pp. 1–3). IEEE.

  • Rubio, M. A., Romero-Zaliz, R., Mañoso, C., & Angel, P. (2014). Enhancing an introductory programming course with physical computing modules. In: 2014 IEEE Frontiers in Education Conference (FIE) Proceedings (pp. 1–8). IEEE.

  • Santana, B. L., Figueredo, J. S. L., & Bittencourt, R. A. (2018). Motivation of engineering students with a mixed-contexts approach to introductory programming. In: 2018 IEEE Frontiers in Education Conference (FIE) (pp. 1–9). IEEE.

  • Scott, M. J., Counsell, S., Lauria, S., Swift, S., Tucker, A., Shepperd, M., & Ghinea, G. (2015). Enhancing practice and achievement in introductory programming with a robot Olympics. IEEE Transactions on Education, 58(4), 249–254.


  • Seeling, P., & Eickholt, J. (2017). Levels of active learning in programming skill acquisition: From lecture to active learning rooms. In: 2017 IEEE Frontiers in Education Conference (FIE) (pp. 1–5). IEEE.

  • Seeling, P. (2016). Evolving an introductory programming course: Impacts of student self-empowerment, guided hands-on times, and self-directed training. In: 2016 IEEE Frontiers in Education Conference (FIE) (pp. 1–5). IEEE.

  • Seanosky, J., Guillot, I., Boulanger, D., Guillot, R., Guillot, C., Kumar, V., ... & Munshi, A. (2017). Real-time visual feedback: a study in coding analytics. In: 2017 IEEE 17th International Conference on Advanced Learning Technologies (ICALT) (pp. 264–266). IEEE.

  • Simkins, D., & Decker, A. (2016). Examining the intermediate programmers understanding of the learning process. In: 2016 IEEE Frontiers in Education Conference (FIE) (pp. 1–4). IEEE.

  • Society for Learning Analytics Research. (2012). About. Retrieved from http://www.solaresearch.org/about/. Accessed 10 Mar 2012

  • Su, X., Wang, T., Qiu, J., & Zhao, L. (2015). Motivating students with new mechanisms of online assignments and examination to meet the MOOC challenges for programming. In: 2015 IEEE Frontiers in Education Conference (FIE) (pp. 1–6). IEEE.

  • Tempelaar, D. (2021). Learning analytics and its data sources: Why we need to foster all of them. International Conference on Cognition and Exploratory Learning in Digital Age (CELDA).

  • Turner, S. A., Pérez-Quiñones, M. A., & Edwards, S. H. (2018). Peer Review in CS2: Conceptual Learning and High-Level Thinking. ACM Transactions on Computing Education (TOCE), 18(3), 13.


  • Ullah, Z., Lajis, A., Jamjoom, M., Altalhi, A. H., Shah, J., & Saleem, F. (2019). A Rule-Based Method for Cognitive Competency Assessment in Computer Programming Using Bloom’s Taxonomy. IEEE Access, 7, 64663–64675.


  • Ureel, L. C., & Wallace, C. (2015). WebTA: Automated iterative critique of student programming assignments. In: 2015 IEEE Frontiers in Education Conference (FIE) (pp. 1–9). IEEE.

  • Ureel II, L. C., & Wallace, C. (2019). Automated critique of early programming antipatterns. In: Proceedings of the 50th ACM Technical Symposium on Computer Science Education (pp. 738–744). ACM.

  • Veerasamy, A. K., Laakso, M. J., & D’Souza, D. (2022). Formative assessment tasks as indicators of student engagement for predicting at-risk students in programming courses. Informatics in Education, 21(2), 375–393.


  • Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The current landscape of learning analytics in higher education. Computers in Human Behavior, 89, 98–110.


  • Wainer, J., & Xavier, E. C. (2018). A Controlled Experiment on Python vs C for an Introductory Programming Course: Students’ Outcomes. ACM Transactions on Computing Education (TOCE), 18(3), 12.


  • Watson, C., & Li, F. W. (2014). Failure rates in introductory programming revisited. In: Proceedings of the 2014 conference on Innovation & technology in computer science education (pp. 39–44). ACM.

  • Watson, C., Li, F. W., & Godwin, J. L. (2014). No tests required: comparing traditional and dynamic predictors of programming success. Association for Computing Machinery (ACM).

  • Wood, Z., & Keen, A. (2015). Building worlds: Bridging imperative-first and object-oriented programming in CS1-CS2. In: Proceedings of the 46th ACM Technical Symposium on Computer Science Education (pp. 144–149). ACM.

  • Wohlin, C. (2014). Guidelines for snowballing in systematic literature studies and a replication in software engineering. In: Proceedings of the 18th international conference on evaluation and assessment in software engineering (pp. 1–10).

  • Xinogalos, S. (2015). Object-oriented design and programming: An investigation of novices’ conceptions on objects and classes. ACM Transactions on Computing Education (TOCE), 15(3), 13.


  • Yeomans, L., Zschaler, S., & Coate, K. (2019). Transformative and Troublesome? Students’ and Professional Programmers’ Perspectives on Difficult Concepts in Programming. ACM Transactions on Computing Education (TOCE), 19(3), 23.


  • Zheng, L., Zhen, Y., Niu, J., & Zhong, L. (2022). An exploratory study on fade-in versus fade-out scaffolding for novice programmers in online collaborative programming settings. Journal of Computing in Higher Education, 1–28.

  • Zur, E., & Vilner, T. (2014). Assessing the assessment—Insights into CS1 exams. In: 2014 IEEE Frontiers in Education Conference (FIE) Proceedings (pp. 1–7). IEEE.


Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Uzma Omer.

Ethics declarations

Competing interest

There is no competing interest associated with this research.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The original online version of this article was revised: the original publication contained inappropriate numbering of figures, and some figures were missing.

Appendix 1

Table 10 Classification and quality scoring

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Omer, U., Tehseen, R., Farooq, M.S. et al. Learning analytics in programming courses: Review and implications. Educ Inf Technol 28, 11221–11268 (2023). https://doi.org/10.1007/s10639-023-11611-0

