Higher Education, Volume 69, Issue 5, pp 823–838

Student perceptions of effective instruction and the development of critical thinking: a replication and extension

  • Chad N. Loes
  • Mark H. Salisbury
  • Ernest T. Pascarella


This study used data from the Wabash National Study of Liberal Arts Education to test the robustness of research conducted by Pascarella et al. (J Coll Stud Dev 37:7–19, 1996), which explored the relationship between student perceptions of exposure to organized and clear instruction and growth in critical thinking skills among college freshmen. To accomplish this, we created fully specified models that included statistical controls for an array of potential confounding influences, such as student race, sex, pre-college critical thinking ability, pre-college tested academic ability, parental educational degree attainment, pre-college academic motivation, and a measure of interaction with high school teachers. Net of these influences, our findings generally replicate those uncovered by Pascarella et al. (J Coll Stud Dev 37:7–19, 1996), suggesting that student perceptions of organized instruction are positively associated with gains in critical thinking. Perceptions of instructional clarity, however, failed to exert a statistically significant influence on the dependent variable. Finally, our analyses suggest that the effect of perceived instructional organization on critical thinking is similar across students, regardless of tested academic preparation (ACT or equivalent score), sex, or pre-college critical thinking level.


Keywords: Effective teaching · Instruction · Critical thinking · Replication



The research on which this study was based was supported by a generous grant from the Center of Inquiry in the Liberal Arts at Wabash College to the Center for Research on Undergraduate Education at The University of Iowa.


  1. Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Tamim, R., et al. (2008). Instructional interventions affecting critical thinking skills and dispositions: A stage 1 meta-analysis. Review of Educational Research. doi: 10.3102/0034654308326084.
  2. Abrami, P., d’Apollonia, S., & Rosenfield, S. (2007). The dimensionality of student ratings of instruction: What we know and what we do not. In R. Perry & J. Smart (Eds.), Effective teaching in higher education: Research and practice (pp. 385–456). The Netherlands: Springer.
  3. American College Testing Program (ACT). (2011). CAAP technical handbook 2011–2012.
  4. Astin, A. W. (1993). What matters in college? Four critical years revisited. San Francisco, CA: Jossey-Bass.
  5. Bailey, J. F. (1979). The effects of an instructional paradigm on the development of critical thinking of college students in an introductory botany course. Dissertation Abstracts International, 40(6), 3138A.
  6. Belsley, D. A., Kuh, E., & Welsch, R. E. (1980). Regression diagnostics: Identifying influential data and sources of collinearity. doi: 10.1002/0471725153.
  7. Bowman, N. A. (2010). Can 1st-year college students accurately report their learning and development? American Educational Research Journal. doi: 10.3102/0002831209353595.
  8. Braskamp, L. A., & Ory, J. C. (1994). Assessing faculty work. San Francisco, CA: Jossey-Bass.
  9. Cashin, W. E. (1999). Student ratings of teaching: Uses and misuses. Unpublished manuscript, Kansas State University, Manhattan.
  10. Cashin, W. E., Downey, R. G., & Sixbury, G. R. (1994). Global and specific ratings of teaching effectiveness and their relation to course objectives: Reply to Marsh. Journal of Educational Psychology. doi: 10.1037/0022-0663.86.4.649.
  11. Cohen, J., Cohen, P., West, S. G., & Aiken, L. S. (2003). Applied multiple regression/correlation analysis for the behavioral sciences. Mahwah, NJ: L. Erlbaum Associates.
  12. Dale, P. M., Ballotti, D., Handa, S., & Zych, T. (1997). An approach to teaching problem solving in the classroom. College Student Journal, 31, 76–79.
  13. d’Apollonia, S., & Abrami, P. C. (1997). Navigating student ratings of instruction. American Psychologist. doi: 10.1037/0003-066X.52.11.1198.
  14. Ennis, R. H. (1993). Critical thinking assessment. Theory into Practice. doi: 10.1080/00405849309543594.
  15. Ethington, C. A. (1997). A hierarchical linear modeling approach to studying college effects. In J. C. Smart (Ed.), Higher education: Handbook of theory and research (Vol. XII, pp. 165–194). New York: Agathon Press.
  16. Evanschitzky, H., Baumgarth, C., Hubbard, R., & Armstrong, J. (2007). Replication research’s disturbing trend. Journal of Business Research. doi: 10.1016/j.jbusres.2006.12.003.
  17. Feldman, K. A. (1989). The association between student ratings of specific instructional dimensions and student achievement: Refining and extending the synthesis of data from multisection validity studies. Research in Higher Education. doi: 10.1007/BF00992392.
  18. Feldman, K. A. (1994). Identifying exemplary teaching: Evidence from course and teacher evaluations. Paper commissioned by the National Center on Postsecondary Teaching, Learning, and Assessment. Stony Brook, NY: State University of New York at Stony Brook.
  19. Feldman, K. A. (1996). Reflections on the study of effective college teaching and student ratings: One continuing quest and two unresolved issues. Unpublished manuscript, State University of New York, Stony Brook.
  20. Feldman, K. A. (1997). Identifying exemplary teachers and teaching: Evidence from student ratings. In R. Perry & J. Smart (Eds.), Effective teaching in higher education: Research and practice (pp. 368–395). New York, NY: Agathon.
  21. Gellin, A. (2003). The effect of undergraduate student involvement on critical thinking. Journal of College Student Development. doi: 10.1353/csd.2003.0066.
  22. Greenwald, A. G. (1997). Validity concerns and usefulness of student ratings of instruction. American Psychologist. doi: 10.1037/0003-066X.52.11.1182.
  23. Groves, R., Fowler, F., Couper, M., Lepkowski, J., Singer, E., & Tourangeau, R. (2004). Survey methodology. Hoboken, NJ: Wiley-Interscience.
  24. Halpern, D. F. (1993). Assessing the effectiveness of critical-thinking instruction. The Journal of General Education, 42, 238–254.
  25. Hays, W. (1994). Statistics (5th ed.). Fort Worth, TX: Harcourt Brace.
  26. Hines, C., Cruickshank, D., & Kennedy, J. (1985). Teacher clarity and its relationship to student achievement and satisfaction. American Educational Research Journal. doi: 10.3102/00028312022001087.
  27. Hyndman, R. J. (2010). Encouraging replication and reproducible research. International Journal of Forecasting. doi: 10.1016/j.ijforecast.2009.12.003.
  28. Klein, S., Liu, O., & Sconing, J. (2009). Test validity study (TVS) report. Washington, DC: Fund for the Improvement of Postsecondary Education.
  29. Makel, M. C., & Plucker, J. A. (2013). Facts are more important than novelty: Replication in the education sciences. Manuscript submitted for publication.
  30. Marquardt, D. W. (1970). Generalized inverses, ridge regression and biased linear estimation. Technometrics. doi: 10.2307/1267205.
  31. Marsh, H. W. (1987). Students’ evaluations of university teaching: Research findings, methodological issues, and directions for future research. International Journal of Educational Research. doi: 10.1016/0883-0355(87)90001-2.
  32. Marsh, H., & Dunkin, M. (1997). Students’ evaluations of university teaching: A multi-dimensional perspective. In R. Perry & J. Smart (Eds.), Effective teaching in higher education: Research and practice (pp. 241–320). New York, NY: Agathon.
  33. Marsh, H. W., & Roche, L. A. (1997). Making students’ evaluations of teaching effectiveness effective: The critical issues of validity, bias, and utility. American Psychologist. doi: 10.1037/0003-066X.52.11.1187.
  34. McKeachie, W. J. (1997). Student ratings: The validity of use. American Psychologist. doi: 10.1037/0003-066X.52.11.1218.
  35. Mokhtari, K., Reichard, C. A., & Gardner, A. (2009). The impact of internet and television use on the reading habits and practices of college students. Journal of Adolescent & Adult Literacy. doi: 10.1598/JAAL.52.7.6.
  36. Myers, R. (1990). Classical and modern regression with applications (2nd ed.). Boston, MA: Duxbury Press.
  37. National Center for Education Statistics. (2005). The condition of education (NCES 2005-094). Washington, DC: U.S. Government Printing Office.
  38. National Endowment for the Arts. (2004). Reading at risk: A survey of literary reading in America. Washington, DC: Author.
  39. National Endowment for the Arts. (2007). To read or not to read: A question of national consequence. Washington, DC: Author.
  40. Neuliep, J. W., & Crandall, R. (1990). Editorial bias against replication research. Journal of Social Behavior & Personality, 5(4), 85–90.
  41. Pascarella, E. T. (1985). College environmental influences on learning and cognitive development: A critical review and synthesis. Higher Education: Handbook of Theory and Research, 1(1), 1–61.
  42. Pascarella, E. T. (2001). Using student self-reported gains to estimate college impact: A cautionary tale. Journal of College Student Development, 42(5), 488–492.
  43. Pascarella, E. T. (2006). How college affects students: Ten directions for future research. Journal of College Student Development. doi: 10.1353/csd.2006.0060.
  44. Pascarella, E., Bohr, L., Nora, A., & Terenzini, P. (1995). Cognitive effects of 2-year and 4-year colleges: New evidence. Educational Evaluation and Policy Analysis. doi: 10.3102/01623737017001083.
  45. Pascarella, E., Edison, M., Nora, A., Hagedorn, L., & Braxton, J. (1996). Effects of teacher organization/preparation and teacher skill/clarity on general cognitive skills in college. Journal of College Student Development, 37, 7–19.
  46. Pascarella, E. T., Salisbury, M. H., & Blaich, C. (2011). Exposure to effective instruction and college student persistence: A multi-institutional replication and extension. Journal of College Student Development. doi: 10.1353/csd.2011.0005.
  47. Pascarella, E. T., & Terenzini, P. T. (1991). How college affects students: Findings and insights from twenty years of research. San Francisco, CA: Jossey-Bass.
  48. Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research (Vol. 2). San Francisco, CA: Jossey-Bass.
  49. Pascarella, E., Wang, J., Trolian, T., & Blaich, C. (2013). How the instructional and learning environments of liberal arts colleges enhance cognitive development. Higher Education, 66(5), 569–583.
  50. Pascarella, E. T., Wolniak, G. C., & Pierson, C. T. (2003). Explaining student growth in college when you don’t think you are. Journal of College Student Development. doi: 10.1353/csd.2003.0007.
  51. Perry, R. P. (1991). Perceived control in college students: Implications for instruction in higher education. In J. C. Smart (Ed.), Higher education: Handbook of theory and research (pp. 1–56). New York: Agathon.
  52. Perry, R. P., & Smart, J. C. (2007). The scholarship of teaching and learning in higher education: An evidence-based perspective. The Netherlands: Springer.
  53. Pike, G. R. (1996). Limitations of using students’ self-reports of academic development as proxies for traditional achievement measures. Research in Higher Education. doi: 10.1007/BF01680043.
  54. Pryor, J. H., Hurtado, S., Sáenz, V. B., Santos, J. L., & Korn, W. S. (2007). The American freshman: Forty year trends. Los Angeles: Higher Education Research Institute.
  55. Rabinowitz, M., & Glaser, R. (1985). Cognitive structure and process in highly competent performance. In F. Horowitz & M. O’Brien (Eds.), The gifted and talented: Developmental perspectives (pp. 75–98). Washington, DC: American Psychological Association. doi: 10.1037/10054-003.
  56. Raudenbush, S. W., & Bryk, A. S. (2001). Hierarchical linear models: Applications and data analysis methods. Thousand Oaks, CA: Sage Publications.
  57. Schonwetter, D., Menec, V., & Perry, R. (1995, April). An empirical comparison of two effective college teaching behaviors: Expressiveness and organization. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.
  58. Seifert, T., Pascarella, E., Goodman, K., Salisbury, M., & Blaich, C. (2010). Liberal arts colleges and good practices in undergraduate education: Additional evidence. Journal of College Student Development. doi: 10.1353/csd.0.0113.
  59. Shim, W. J., & Walczak, K. (2012). The impact of faculty teaching practices on the development of students’ critical thinking skills. International Journal of Teaching and Learning in Higher Education, 24(1), 16–30.
  60. Smith, D. G. (1977). College classroom interactions and critical thinking. Journal of Educational Psychology, 69(2), 180–190. doi: 10.1037/0022-0663.69.2.180.
  61. Wachtel, H. K. (1998). Student evaluation of college teaching effectiveness: A brief review. Assessment & Evaluation in Higher Education. doi: 10.1080/0260293980230207.
  62. Wanous, J. P., & Hudy, M. J. (2001). Single-item reliability: A replication and extension. Organizational Research Methods. doi: 10.1177/109442810144003.
  63. Weimer, M., & Lenze, L. (1997). Instructional interventions: Review of the literature on efforts to improve instruction. In R. Perry & J. Smart (Eds.), Effective teaching in higher education: Research and practice (pp. 154–168). New York: Agathon Press.

Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  1. Chad N. Loes (corresponding author), Mount Mercy University, Cedar Rapids, USA
  2. Mark H. Salisbury, Augustana College, Rock Island, USA
  3. Ernest T. Pascarella, N440 Lindquist Center, The University of Iowa, Iowa City, USA
