
Evaluating the impact of prior required scaffolding items on the improvement of student performance prediction

Published in Education and Information Technologies.

Abstract

Tracking student behavior has become an essential step in constructing adaptive educational systems. Researchers have developed various machine-learning methods to better trace students' knowledge, and most of these methods estimate student features effectively and predict future performance accurately. However, these methods have notable limitations: they rely only on the correctness of prior student responses, ignoring many other informative student behaviors. In addition, previous work has treated scaffolding items purely as a means of learning, without analyzing student performance while answering them. Our purpose in this article is to conduct an experiment that evaluates how best to use data about prior required scaffolding items to predict future student performance. To this end, we propose two models. The first identifies whether a student required scaffolding items before the main question or answered it immediately without assistance. The second, an improvement of the first, models student performance under the constraint of having answered scaffolding items. Both models are evaluated against the original Performance Factors Analysis algorithm to mark the differences. The results show that both proposed models yield a positive improvement in predicting students' future performance, and that the second model reliably increases predictive accuracy.
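Performance Factors Analysis, the baseline model above, computes the log-odds of a correct response from a per-skill difficulty plus weighted counts of the student's prior successes and failures on that skill; the proposed models add information about prior scaffolding use. A minimal sketch, assuming a simple additive scaffolding term (the parameter names and the extra term are illustrative, not the paper's exact specification):

```python
import math

def pfa_logit(beta, gamma, rho, successes, failures):
    """Standard PFA log-odds: skill difficulty (beta) plus weighted
    counts of prior successes (gamma) and failures (rho)."""
    return beta + gamma * successes + rho * failures

def pfa_probability(beta, gamma, rho, successes, failures,
                    delta=0.0, scaffolded=0):
    """PFA probability of a correct response, extended with an
    illustrative term delta * scaffolded flagging whether the student
    needed scaffolding items before the main question (an assumption
    here, not the authors' exact formulation)."""
    m = pfa_logit(beta, gamma, rho, successes, failures) + delta * scaffolded
    return 1.0 / (1.0 + math.exp(-m))  # logistic link

# With a negative delta, a student who needed scaffolding gets a
# lower predicted probability than an otherwise identical student:
p_no_scaffold = pfa_probability(0.0, 0.2, -0.1, 3, 1)
p_scaffold = pfa_probability(0.0, 0.2, -0.1, 3, 1, delta=-0.5, scaffolded=1)
```

In practice the weights would be fit by logistic regression over the response log, as in the original PFA formulation.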

Figs. 1, 2 and 3 appear in the full article text.



Author information

Correspondence to Amal ASSELMAN.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

ASSELMAN, A., KHALDI, M. & AAMMOU, S. Evaluating the impact of prior required scaffolding items on the improvement of student performance prediction. Educ Inf Technol 25, 3227–3249 (2020). https://doi.org/10.1007/s10639-019-10077-3

