Broken Promises? Examining the Effectiveness of Promising Practices in STEM Lectures by Student Subgroups


Traditional approaches to teaching and learning in large undergraduate STEM lectures have not kept pace with the massification and diversification of higher education. Efforts to alleviate learning obstacles call for improved instruction, stipulating the utility of specific pedagogical techniques delineated as “promising practices”; however, little evidence supports their effectiveness. In this study, a quasi-experimental method is applied to a large panel dataset containing course observations and institutional records to investigate the effect of promising practices on student learning. Additionally, differential effects are examined across several important nontraditional and historically underserved populations, such as Hispanic, first-generation, low-income, and lower prior-ability students. Results suggest that two information-relaying techniques—the use of prior content and reviewing exam content—were positively and significantly associated with student outcomes, and estimates were found to be stable across subgroups. Implications for higher education teaching and learning, as well as for empirical research in these areas, are discussed.


Data Availability

Not applicable.


Notes

  1.

    Epistemology is used here to refer to an explicit body of instructional techniques. It does not refer to a particular theory or theories of knowledge, nor to philosophical study.

  2.

    Active learning is often used in two ways: (1) In reference to, and synonymous with, learner-centered teaching, and (2) with respect to an explicit body of instructional techniques. From this point forward, I use it in the latter sense.

  3.

    Further justification for using the subsequent course grade as the outcome is provided in the measures section.

  4.

    The use of promising practices varies by course and instructor—not just by course level. For instance, two different instructors can teach the same course, or the same instructor can teach the same course across different terms or sections, and each offering can involve a different implementation of promising practices. This variation at the instructor level allows for the inclusion of instructor fixed effects, which control for time-invariant features of the instructor (e.g., race, ethnicity, background).

  5.

    The empirical distribution of this variable was not normally distributed. However, the current methodological and statistical modeling literature suggests that modeling an outcome as normal (Gaussian), even when the sample's empirical distribution is not, remains justified if current scientific knowledge does not suggest that the population distribution is otherwise (McElreath, 2020, p. 81). Additionally, Li, Wong, Lamoureux, and Wong (2012) demonstrate that, “…in a large sample, the use of a linear regression technique, even if the dependent variable violates the ‘normality assumption’ rule, remains valid.” Schmidt and Finan (2018) further note that such transformations are unnecessary in large samples and can actually bias estimates. Still, to test the robustness of the results, regression results for three transformations of the dependent variable are provided in the appendix. The estimates remain largely stable across these specifications, and the general conclusions agree. Given the literature-based rationale just mentioned, the reporting focuses on the original, non-transformed variable.

  6.

    Column 5 in Table 4 is also presented in Table 3. It is repeated here for the reader's ease in comparing subgroup outcomes with those of the full sample.

  7.

    I would like to thank the anonymous journal reviewer for drawing attention to this possibility.
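The within-instructor fixed-effects logic described in note 4 can be sketched with a toy simulation (hypothetical data and effect sizes, not the study's dataset): demeaning outcomes and practice intensity within each instructor removes instructor-specific intercepts, so the remaining slope recovers the practice effect even when instructors differ substantially in baseline outcomes.

```python
import random
from collections import defaultdict

random.seed(0)
BETA = 0.5  # assumed true effect of practice intensity on the outcome

# Simulate three instructors with very different baseline (fixed) effects.
rows = []
for inst, fixed_effect in {"A": 2.0, "B": -1.0, "C": 0.5}.items():
    for _ in range(200):
        practice = random.random()  # practice intensity in [0, 1)
        outcome = fixed_effect + BETA * practice + random.gauss(0, 0.1)
        rows.append((inst, practice, outcome))

# Within transformation: compute instructor-level means, then demean.
sums = defaultdict(lambda: [0.0, 0.0, 0])
for inst, x, y in rows:
    s = sums[inst]
    s[0] += x; s[1] += y; s[2] += 1
means = {k: (v[0] / v[2], v[1] / v[2]) for k, v in sums.items()}

sxy = sxx = 0.0
for inst, x, y in rows:
    mx, my = means[inst]
    sxy += (x - mx) * (y - my)
    sxx += (x - mx) ** 2
beta_hat = sxy / sxx  # within (fixed-effects) estimate of BETA
print(round(beta_hat, 2))
```

Despite instructor baselines ranging from -1.0 to 2.0, the within estimate lands close to the assumed slope, which is the sense in which instructor fixed effects absorb invariant instructor characteristics.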
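The large-sample argument in note 5 can likewise be illustrated with a small simulation (all values here are illustrative assumptions): even when residuals are strongly skewed rather than Gaussian, an OLS slope computed on a large sample remains centered on the true coefficient.

```python
import random

random.seed(1)
BETA = 2.0   # assumed true slope
n = 10_000   # large sample, per the cited large-sample argument

xs, ys = [], []
for _ in range(n):
    x = random.random()
    # expovariate(1.0) has mean 1; subtracting 1 centers a heavily
    # right-skewed error at zero (violating the normality assumption).
    err = random.expovariate(1.0) - 1.0
    xs.append(x)
    ys.append(1.0 + BETA * x + err)

# Ordinary least squares slope from centered cross-products.
mx = sum(xs) / n
my = sum(ys) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
sxx = sum((x - mx) ** 2 for x in xs)
beta_hat = sxy / sxx
print(round(beta_hat, 2))
```

The recovered slope sits close to the assumed value despite the skewed errors, consistent with the note's rationale for analyzing the non-transformed outcome.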


References

  1. Angrist, J. D., & Pischke, J. S. (2008). Mostly harmless econometrics: An empiricist’s companion. Princeton University Press.

  2. Arum, R., & Roksa, J. (2011). Academically adrift: Limited learning on college campuses. University of Chicago Press.

  3. Ball, D. L., & Forzani, F. M. (2007). What makes education research “educational.” Educational Researcher, 36(9), 529.

  4. Blasco-Arcas, L., Buil, I., Hernández-Ortega, B., & Sese, F. J. (2013). Using clickers in class. The role of interactivity, active collaborative learning and engagement in learning performance. Computers & Education, 62, 102-110.

  5. Blasiman, R. N. (2017). Distributed concept reviews improve exam performance. Teaching of Psychology, 44(1), 46-50.

  6. Bolkan, S. (2017). Instructor clarity, generative processes, and mastery goals: Examining the effects of signaling on student learning. Communication Education, 66(4), 385-401.

  7. Bolkan, S., Goodboy, A. K., & Myers, S. A. (2017). Conditional processes of effective instructor communication and increases in students’ cognitive learning. Communication Education, 66(2), 129-147.

  8. Carrell, S. E., & West, J. E. (2010). Does professor quality matter? Evidence from random assignment of students to professors. Journal of Political Economy, 118(3), 409-432.

  9. Chen, H. T. (1990). Theory-driven evaluations. Sage.

  10. Di Leonardi, B. C. (2007). Tips for facilitating learning: The lecture deserves some respect. The Journal of Continuing Education in Nursing, 38(4), 154-161.

  11. Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410-8415.

  12. Gelula, M. H. (1997). Effective lecture presentation skills. Surgical Neurology, 47(2), 201-204.

  13. Hill, H. C., & Erickson, A. (2019). Using implementation fidelity to aid in interpreting program impacts: A brief review. Educational Researcher, 48(9), 590-598.

  14. Hill, H. C., Kapitula, L., & Umland, K. (2011). A validity argument approach to evaluating teacher value-added scores. American Educational Research Journal, 48(3), 794-831.

  15. Hill, J., & West, H. (2019). Improving the student learning experience through dialogic feed-forward assessment. Assessment & Evaluation in Higher Education, 1-16.

  16. Hornsby, D. J., & Osman, R. (2014). Massification in higher education: Large classes and student learning. Higher Education, 67(6), 711-719.

  17. Hornsby, D. J., Osman, R., & De Matos-Ala, J. (2013). Large-class pedagogy: Interdisciplinary perspectives for quality higher education. African Sun Media.

  18. Hsiao, C., Pesaran, M. H., & Tahmiscioglu, A. K. (2002). Maximum likelihood estimation of fixed effects dynamic panel data models covering short time periods. Journal of Econometrics, 109(1), 107-150.

  19. Kennedy, R. R. (2009). The power of in-class debates. Active Learning in Higher Education, 10(3), 225-236.

  20. Kromka, S. M., & Goodboy, A. K. (2019). Classroom storytelling: Using instructor narratives to increase student recall, affect, and attention. Communication Education, 68(1), 20-43.

  21. Li, X., Wong, W., Lamoureux, E. L., & Wong, T. Y. (2012). Are linear regression techniques appropriate for analysis when the dependent (outcome) variable is not normally distributed? Investigative Ophthalmology & Visual Science, 53(6), 3082-3083.

  22. McElreath, R. (2020). Statistical rethinking: A Bayesian course with examples in R and Stan. CRC Press.

  23. Mervis, J. (2013). Transformation is possible if a university really cares. Science, 340(6130), 292-296.

  24. Mitchell, A., Petter, S., & Harris, A. (2017). Learning by doing: Twenty successful active learning exercises for information systems courses. Journal of Information Technology Education: Innovations in Practice, 16(1), 21-46.

  25. Morton, A. (2008). Lecturing to large groups. In A handbook for teaching and learning in higher education (pp. 76-89). Routledge.

  26. National Research Council. (2011). Promising practices in undergraduate science, technology, engineering, and mathematics education: Summary of two workshops. National Academies Press.

  27. National Research Council. (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. National Academies Press.

  28. Norman, D. A., & Spohrer, J. C. (1996). Learner-centered education. Communications of the ACM, 39(4), 24-27.

  29. Pelikan, J. (1992). The idea of the university: A reexamination. Yale University Press.

  30. Reimer, L. C., Nili, A., Nguyen, T., Warschauer, M., & Domina, T. (2016). Clickers in the wild: A campus-wide study of student response systems. Transforming Institutions: 21st Century Undergraduate STEM Education, 383.

  31. Reimer, L. C., Schenke, K., Nguyen, T., O'Dowd, D. K., Domina, T., & Warschauer, M. (2016). Evaluating promising practices in undergraduate STEM lecture courses. RSF: The Russell Sage Foundation Journal of the Social Sciences, 2(1), 212-233.

  32. Rodriguez, F., Rivas, M. J., Matsumura, L. H., Warschauer, M., & Sato, B. K. (2018). How do students study in STEM courses? Findings from a light-touch intervention and its relevance for underrepresented students. PLoS ONE, 13(7), e0200767.

  33. Schmidt, A. F., & Finan, C. (2018). Linear regression and the normality assumption. Journal of Clinical Epidemiology, 98, 146-151.

  34. Smith, C. V., & Cardaciotto, L. (2011). Is active learning like broccoli? Student perceptions of active learning in large lecture classes. Journal of the Scholarship of Teaching and Learning, 11(1), 53-61.

  35. Vu, V. Q. (2017). Documenting instructional practices in large introductory STEM lecture courses (Doctoral dissertation, UC Irvine).

  36. Weimer, M. (2002). Learner-centered teaching: Five key changes to practice. Jossey-Bass.

  37. Wind, S. A., & Jones, E. (2019). Not just generalizability: A case for multifaceted latent trait models in teacher observation systems. Educational Researcher, 48(8), 521-533.

  38. Wright, G. B. (2011). Student-centered learning in higher education. International Journal of Teaching and Learning in Higher Education, 23(1), 92-97.

  39. Xu, D., & Solanki, S. (2020). Tenure-track appointment for teaching-oriented faculty? The impact of teaching and research faculty on student outcomes. Educational Evaluation and Policy Analysis, 42(1), 66-86.

  40. Zambrano, J., Kirschner, F., Sweller, J., & Kirschner, P. A. (2019). Effects of prior knowledge on collaborative and individual learning. Learning and Instruction, 63, 101214.



Acknowledgements

I would like to thank Mark Warschauer and Di Xu for providing the data and guidance that made this project possible.


Funding

This work was supported by the National Science Foundation under Grant Number 1256500.

Author information




Not applicable.

Corresponding author

Correspondence to Gabe Avakian Orona.

Ethics declarations

Conflicts of Interest/Competing Interests

Not applicable.

Ethics Approval


Consent to Participate

Not applicable.

Consent for Publication

I consent to have this article published and confirm that it is neither published nor under consideration for publication elsewhere.

Code Availability

Available upon request.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information


(DOCX 38 kb)


About this article


Cite this article

Orona, G.A. Broken Promises? Examining the Effectiveness of Promising Practices in STEM Lectures by Student Subgroups. Innov High Educ 46, 223–239 (2021).



Keywords

  • STEM
  • Higher education
  • Promising practices
  • Undergraduate education
  • Fixed effects