Student performance prediction, risk analysis, and feedback based on context-bound cognitive skill scores

Education and Information Technologies

Abstract

In recent times, Educational Data Mining and Learning Analytics have been used extensively to model decision-making and improve teaching/learning ecosystems. However, adapting student models across domains/courses requires a balance between generalization and context specificity to reduce the redundancy of creating domain-specific models. This paper explores the predictive power and generalizability of a feature, the context-bound cognitive skill score, in estimating the likelihood of a student's success or failure in a traditional higher education course so that appropriate interventions can be provided to help the student. To identify at-risk students in different courses, we applied classification algorithms to students' context-bound cognitive skill scores to estimate the chances of success or failure, especially failure. The context-bound cognitive skill scores were aggregated by the learning objectives of a course to generate meaningful visual feedback for teachers and students, so that they can understand why some students are predicted to be at risk. Evaluation of the generated model shows that this feature is applicable across a range of courses and mitigates the effort of engineering features/models for each domain. We submit that, overall, context-bound cognitive skill scores are effective in flagging student performance when accurate metrics related to students' learning activities and social behaviors are unavailable.
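
The pipeline outlined above (a classifier applied to per-student context-bound cognitive skill scores to flag risk of failure, followed by aggregation of those scores by learning objective for feedback) can be illustrated with a minimal sketch. This is not the authors' implementation: the column names, synthetic data, risk labeling rule, and the choice of logistic regression as the classifier are all illustrative assumptions.

```python
# Hypothetical sketch: classify at-risk students from context-bound cognitive
# skill scores, then aggregate the scores by learning objective for feedback.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Assumed skill scores in [0, 1]: one column per (learning objective, cognitive level).
columns = ["LO1_remember", "LO1_apply", "LO2_apply", "LO2_analyze", "LO3_evaluate"]
X = pd.DataFrame(rng.uniform(0, 1, size=(200, len(columns))), columns=columns)

# Synthetic pass/fail label: students with low average skill scores tend to fail (1 = at risk).
y = (X.mean(axis=1) + rng.normal(0, 0.1, size=len(X)) < 0.45).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Logistic regression stands in for "classification algorithms" in general.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))

# Estimated probability of failure for each test-set student.
risk = pd.Series(clf.predict_proba(X_test)[:, 1], index=X_test.index, name="p_fail")

# Aggregate skill scores by learning objective (mean over cognitive levels) to
# show *where* a flagged student is weak, e.g. for the highest-risk student.
lo_of = lambda col: col.split("_")[0]
lo_means = X_test.T.groupby(lo_of).mean().T  # rows: students, cols: LO1, LO2, LO3
worst = risk.idxmax()
print("Highest-risk student:", worst, "p_fail =", round(risk[worst], 2))
print(lo_means.loc[worst].sort_values())
```

In such a setup, the per-objective means for a flagged student could feed the kind of visual feedback the abstract describes, indicating which learning objectives drive the predicted risk.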

Author information

Corresponding author

Correspondence to Soumya MD.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

MD, S., Krishnamoorthy, S. Student performance prediction, risk analysis, and feedback based on context-bound cognitive skill scores. Educ Inf Technol 27, 3981–4005 (2022). https://doi.org/10.1007/s10639-021-10738-2
