
A methodology for improving active learning engineering courses with a large number of students and teachers through feedback gathering and iterative refinement

International Journal of Technology and Design Education

Abstract

In the last decade, engineering education has evolved in many ways to meet society's demands. Universities offer more flexible curricula and put considerable effort into helping students acquire professional engineering skills. In many universities, the courses in the first years of different engineering degrees share a common program and objectives and involve a large number of students and teachers. These common courses are expected to provide students with meaningful learning experiences, which can be achieved through active learning. The use of active learning in engineering courses improves on traditional teaching by promoting students’ participation and engagement, although active learning courses can be very sensitive to differences in learning pace or to team conflicts, which is a challenge for the widespread adoption of active learning in courses with many students and teachers. This paper proposes a methodology that facilitates the detection of and reaction to problems in active learning engineering courses with many students and teachers. The methodology is based on gathering feedback (from students and teachers) and on decision-making processes at selected milestones. It integrates intra-edition mechanisms to detect problems and react while the courses are being taught, and inter-edition mechanisms to ensure that necessary changes persist in the course design. The methodology has been successfully applied during four consecutive editions to improve an undergraduate active learning programming course with an average of 257 students and 9 teachers per edition. An extended validation with expert educators suggests that this methodology can also be applied to traditional engineering courses.
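
The milestone-based feedback and decision loop described above can be illustrated with a minimal programmatic sketch. The sketch below is only an illustration of that loop, not code from the paper: all class names, fields, and thresholds (FeedbackItem, Milestone, CourseEdition, severity >= 4) are hypothetical. At each selected milestone, feedback is gathered from students and teachers; severe problems trigger an intra-edition reaction while the course is running, and the remaining issues are recorded as inter-edition changes to be persisted in the next edition's design.

# Minimal sketch (hypothetical names, assumed severity threshold) of the
# milestone feedback loop: gather feedback, react intra-edition, persist
# the rest as inter-edition design changes.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class FeedbackItem:
    source: str      # "student" or "teacher"
    issue: str       # e.g. "team conflict", "pace too fast"
    severity: int    # 1 (minor) .. 5 (blocking)

@dataclass
class Milestone:
    name: str
    gather: Callable[[], List[FeedbackItem]]   # survey / meeting results

@dataclass
class CourseEdition:
    milestones: List[Milestone]
    design_changes: List[str] = field(default_factory=list)  # inter-edition log

    def run(self) -> None:
        for m in self.milestones:
            for item in m.gather():
                if item.severity >= 4:
                    # intra-edition reaction: act while the course is running
                    print(f"[{m.name}] intra-edition action on: {item.issue}")
                else:
                    # inter-edition persistence: record for the next redesign
                    self.design_changes.append(f"{m.name}: {item.issue}")

# Example usage with made-up feedback
edition = CourseEdition(milestones=[
    Milestone("week-4", lambda: [FeedbackItem("student", "team conflict in one group", 5)]),
    Milestone("week-8", lambda: [FeedbackItem("teacher", "lab session too long", 2)]),
])
edition.run()
print("Changes proposed for the next edition:", edition.design_changes)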



Acknowledgments

The authors want to thank the teachers involved in the four-year enactment of this course; we appreciate all your time and effort. We also want to thank the external teachers who completed the survey; thank you for taking the time to help us improve our teaching.

Author information

Corresponding author

Correspondence to Iria Estévez-Ayres.

Additional information

This work has been funded by the Spanish Ministry of Economy and Competitiveness “EEE” project TIN2011-28308-C03-01, the Regional Government of Madrid “eMadrid” project S2009/TIC-1650, and the postdoctoral fellowship Alianza 4 Universidades.


About this article


Cite this article

Estévez-Ayres, I., Alario-Hoyos, C., Pérez-Sanagustín, M. et al. A methodology for improving active learning engineering courses with a large number of students and teachers through feedback gathering and iterative refinement. Int J Technol Des Educ 25, 387–408 (2015). https://doi.org/10.1007/s10798-014-9288-6

