Fuzzy-based active learning for predicting student academic performance using autoML: a step-wise approach

Abstract

Predicting students’ learning outcomes is one of the main topics of interest in the area of Educational Data Mining and Learning Analytics. To this end, a plethora of machine learning methods have been successfully applied to a variety of predictive problems. However, it is of utmost importance for both educators and data scientists to develop accurate learning models at low cost. Fuzzy logic constitutes an appropriate approach for building models of high performance and transparency. In addition, active learning reduces both the time and the cost of the labeling effort by exploiting a small set of labeled data along with a large set of unlabeled data in the most efficient way. Moreover, choosing the proper method for a given problem formulation and configuring the optimal parameter setting is a demanding task, considering the high-dimensional input space and the complexity of machine learning algorithms. As such, it is important to explore the potential of automated machine learning (autoML) strategies for automating these choices. In this context, the present study introduces a fuzzy-based active learning method for predicting students’ academic performance which combines, in a modular way, autoML practices. A series of experiments was carried out, revealing the efficiency of the proposed method for the accurate prediction of students at risk of failure. These insights may have the potential to support the learning experience and to be useful to the wider science of learning.
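The pool-based active learning loop mentioned in the abstract can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' exact pipeline: the paper pairs a fuzzy classifier (e.g. from the fylearn library) with autoML-tuned hyperparameters, whereas here a plain scikit-learn RandomForestClassifier and synthetic data stand in, and the least-confident query strategy is one common choice among several.

```python
# Minimal sketch of pool-based active learning with uncertainty sampling.
# The classifier and dataset are stand-ins, not those used in the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Start with a small labeled pool and a large unlabeled pool.
labeled = list(range(20))
unlabeled = list(range(20, 500))

model = RandomForestClassifier(n_estimators=50, random_state=0)
for _ in range(10):  # query budget: 10 additional labels
    model.fit(X[labeled], y[labeled])
    proba = model.predict_proba(X[unlabeled])
    # Least-confident strategy: query the instance whose most probable
    # class has the lowest predicted probability.
    least_confident = int(np.argmin(proba.max(axis=1)))
    queried = unlabeled.pop(least_confident)
    labeled.append(queried)  # an "oracle" (the educator) supplies y[queried]

model.fit(X[labeled], y[labeled])
accuracy = model.score(X[unlabeled], y[unlabeled])
```

The key design point is that each queried instance is the one the current model is least certain about, so the limited labeling budget is spent where it is expected to be most informative.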



Notes

  1. https://github.com/JasperSnoek/spearmint

  2. https://github.com/sorend/fylearn

  3. https://scikit-learn.org/stable/
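The autoML side of the method (selecting a model configuration and tuning its hyperparameters) can be sketched with a simple randomized search. This is a hedged stand-in: the footnotes point to Bayesian-optimization tooling (spearmint), whereas the example below uses scikit-learn's RandomizedSearchCV, a cheaper strategy, and an arbitrary classifier and search space chosen purely for illustration.

```python
# Hedged sketch of autoML-style hyperparameter optimization using
# randomized search; a Bayesian optimizer such as spearmint would
# replace the random sampling with a model-guided search.
from scipy.stats import randint
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions={
        "n_estimators": randint(10, 100),  # illustrative ranges only
        "max_depth": randint(2, 10),
    },
    n_iter=10,          # number of sampled configurations
    cv=3,               # 3-fold cross-validation per configuration
    random_state=0,
)
search.fit(X, y)
best_params = search.best_params_
best_score = search.best_score_
```

Each sampled configuration is scored by cross-validation, and the best-scoring one is refit on the full training set; a Bayesian optimizer improves on this by using earlier evaluations to propose the next configuration.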



Acknowledgements

Results presented in this work have been produced using the Aristotle University of Thessaloniki (AUTh) High Performance Computing Infrastructure and Resources.

Author information


Corresponding author

Correspondence to Maria Tsiakmaki.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Tsiakmaki, M., Kostopoulos, G., Kotsiantis, S. et al. Fuzzy-based active learning for predicting student academic performance using autoML: a step-wise approach. J Comput High Educ (2021). https://doi.org/10.1007/s12528-021-09279-x


Keywords

  • Educational data mining
  • Learning analytics
  • Fuzzy classification
  • Automated machine learning
  • Model selection
  • Hyperparameter optimization
  • Performance prediction
  • Pool-based active learning