
A data analytics approach to contrast the performance of teaching (only) vs. research professors

  • Technical Paper
  • Published:
International Journal on Interactive Design and Manufacturing (IJIDeM)


Abstract

This research article presents a study comparing the teaching performance of teaching-only versus teaching-and-research professors at higher education institutions. It is commonly believed that teaching-only professors generally outperform research professors at teaching-and-research universities, according to student perceptions reflected in course surveys. This case study presents experimental evidence showing that this is not always the case and that, under certain circumstances, the opposite can hold. The case study comes from Tecnologico de Monterrey (Tec), a private teaching-and-research university in Mexico that has developed a research profile over the last two decades using a mix of teaching-only and teaching-and-research faculty members; during this period, the university has steadily risen in world university rankings. Data from an institutional student survey, the ECOA, were used. The data set covers more than 118,000 graduate and undergraduate courses taught over five semesters (January 2017 to May 2019). The results were derived from statistical and data mining methods, including analysis of variance (ANOVA) and logistic regression, applied to this data set, which spans the more than nine thousand professors who taught those courses. The results show that teaching-and-research professors perform as well as or better than teaching-only professors. Differences in teaching performance with respect to attributes such as professors' gender, age, and research level are also presented.
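
To make the analysis pipeline concrete, the following minimal sketch shows how an ANOVA and a logistic regression of the kind described in the abstract could be run on course evaluation records of this shape. This is not the authors' code: the column names (ecoa_score, professor_type, gender, age), the "high score" cutoff, and the synthetic data are hypothetical placeholders for the institutional ECOA data set, and the sketch assumes a Python environment with pandas, SciPy, and statsmodels.

```python
# Hedged sketch, not the study's actual pipeline: one-way ANOVA and a logistic
# regression on ECOA-style course evaluation records. All column names and the
# synthetic data below are hypothetical stand-ins for the institutional data.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "ecoa_score": rng.normal(9.0, 0.8, n).clip(0, 10),  # survey score per course
    "professor_type": rng.choice(["teaching_only", "teaching_research"], n),
    "gender": rng.choice(["F", "M"], n),
    "age": rng.integers(28, 70, n),
})

# One-way ANOVA: does the mean ECOA score differ between professor types?
groups = [g["ecoa_score"].to_numpy() for _, g in df.groupby("professor_type")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.3f}, p = {p_value:.3f}")

# Logistic regression: probability that a course reaches a "high" score
# (arbitrary illustrative cutoff of 9.5) given professor attributes.
df["high_score"] = (df["ecoa_score"] >= 9.5).astype(int)
model = smf.logit("high_score ~ C(professor_type) + C(gender) + age", data=df).fit(disp=False)
print(model.summary())
```

In the study itself, the outcome and predictor definitions come from the ECOA instrument and institutional faculty records; the cutoff and predictors above are illustrative only.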



Author information


Corresponding author

Correspondence to Francisco J. Cantu-Ortiz.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Chavez, M.D., Ceballos, H.G. & Cantu-Ortiz, F.J. A data analytics approach to contrast the performance of teaching (only) vs. research professors. Int J Interact Des Manuf 14, 1577–1592 (2020). https://doi.org/10.1007/s12008-020-00713-5
