
Models to provide guidance in flipped classes using online activity

Journal of Computing in Higher Education

Abstract

The flipped classroom gives students the flexibility to organize their learning, while teachers can monitor their progress by analyzing their online activity. In massive courses offering a variety of activities, automated analysis techniques are required to process the large volume of information generated and to help teachers take timely and appropriate action. In these scenarios, it is convenient to classify students into a small number of groups that can each receive dedicated support. Using online activity alone has proven insufficient to characterize relevant groups; this study therefore proposes explaining differences in online activity through differences in course status and learning experience, using data from a programming course (n = 409). The resulting model shows that learning experience can be categorized into three groups, each with different academic performance and distinct online activity. The relationship between these groups and online activity allowed us to build classifiers that detect students who are at risk of failing the course (AUC = 0.84) or who need special support (AUC = 0.73), providing teachers with a useful mechanism for predicting and improving student outcomes.
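The risk classifiers described in the abstract are logistic models over online-activity features evaluated by AUC. The paper's actual features and parameters are not reproduced here; the following is a minimal, purely illustrative sketch on invented synthetic data (the feature names "videos watched" and "exercises tried" are assumptions, not the study's variables), showing how such a classifier is fit and how AUC is computed:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=200):
    """Plain stochastic-gradient logistic regression; w[0] is the intercept."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            g = p - yi  # gradient of the log-loss w.r.t. the linear score
            w[0] -= lr * g
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * g * xj
    return w

def predict(w, xi):
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

def auc(scores, labels):
    """AUC as the probability that a random positive outranks a random negative."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Synthetic stand-in for online-activity features: [videos watched, exercises tried].
random.seed(1)
X, y = [], []
for _ in range(200):
    at_risk = random.random() < 0.4
    X.append([random.gauss(2 if at_risk else 6, 1.5),
              random.gauss(3 if at_risk else 8, 2.0)])
    y.append(1 if at_risk else 0)

w = fit_logistic(X, y)
scores = [predict(w, xi) for xi in X]
print(round(auc(scores, y), 2))  # high AUC on this deliberately separable data
```

In practice a held-out test set (not the training data, as above) would be used to report AUC, which is why the AUC values reported in the abstract are the meaningful quantities.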


Figs. 1–8 (figure images not included in this extract)


Acknowledgements

This work was partially funded by Grant CONICYT-PCHA/doctorado Nacional/2013-21130045, by FONDECYT (11150231), and by the European Commission through the project LALA (586120-EPP-1-2017-1-ES-EPPKA2-CBHE-JP).

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Pablo Schwarzenberg.

Ethics declarations

Conflict of interest

The authors have no conflict of interest with the execution or outcomes of this study.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix: Statistical models

See Tables 15, 16 and 17.

Table 15 Latent class model
Table 16 Parameters of logistic risk classifiers
Table 17 Parameters of logistic guidance classifiers
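Table 15 describes the study's latent class model, whose parameters are not reproduced in this extract. As a purely illustrative sketch of the latent class technique, the following fits a minimal EM estimator for a latent class model over binary survey items, on synthetic data with invented item profiles (nothing here comes from the study's dataset):

```python
import random

def lca_em(data, k=3, iters=50, seed=0):
    """Minimal EM for a latent class model over binary items:
    class c answers item j 'yes' with probability theta[c][j]."""
    rng = random.Random(seed)
    n, m = len(data), len(data[0])
    pi = [1.0 / k] * k                                        # class sizes
    theta = [[rng.uniform(0.3, 0.7) for _ in range(m)] for _ in range(k)]
    resp = []
    for _ in range(iters):
        # E-step: posterior responsibility of each class for each respondent.
        resp = []
        for x in data:
            lik = []
            for c in range(k):
                p = pi[c]
                for j, xj in enumerate(x):
                    p *= theta[c][j] if xj else (1.0 - theta[c][j])
                lik.append(p)
            s = sum(lik) or 1e-12
            resp.append([l / s for l in lik])
        # M-step: re-estimate class sizes and item probabilities.
        for c in range(k):
            nc = sum(r[c] for r in resp) or 1e-12
            pi[c] = nc / n
            theta[c] = [sum(r[c] * x[j] for r, x in zip(resp, data)) / nc
                        for j in range(m)]
    return pi, theta, resp

# Synthetic yes/no responses drawn from three planted answer profiles.
rng = random.Random(42)
profiles = [[0.9, 0.8, 0.9, 0.7], [0.5, 0.5, 0.4, 0.5], [0.1, 0.2, 0.1, 0.3]]
data = [[1 if rng.random() < p else 0 for p in profiles[rng.randrange(3)]]
        for _ in range(300)]

pi, theta, resp = lca_em(data)
print([round(p, 2) for p in sorted(pi)])
```

Each student's group assignment is the class with the highest posterior responsibility; in the paper this grouping is what links learning experience to the online-activity classifiers of Tables 16 and 17.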


About this article


Cite this article

Schwarzenberg, P., Navon, J. & Pérez-Sanagustín, M. Models to provide guidance in flipped classes using online activity. J Comput High Educ 32, 282–306 (2020). https://doi.org/10.1007/s12528-019-09233-y

