Design, Deployment and Evaluation of a Flipped Learning First-Year Engineering Course

  • Abelardo Pardo
  • Negin Mirriahi


This chapter focuses on the design of a flipped learning experience and, in particular, on the types of learning activities and their scheduling. The problem can be described as how to stratify the activities and distribute them over time so that students follow a gradual and engaging path towards the learning outcomes. The flipped learning design model described in this chapter has been deployed in a first-year engineering course on computer systems at a higher education institution. In the remainder of the chapter, we assume that, for each topic, students need to traverse the levels of the revised Bloom's taxonomy, starting with acquiring basic knowledge about a concept and progressing to the point where they can evaluate or create artefacts within that area.


Keywords: Flipped learning · Learning analytics · Engineering education · Educational technology · Analytics



Copyright information

© Springer Nature Singapore Pte Ltd. 2017

Authors and Affiliations

  1. The University of Sydney, Sydney, Australia
  2. University of New South Wales, Sydney, Australia
