Abstract
Assessing gains in learning has received increased attention as one dimension of institutional accountability, both in the USA (Arum & Roksa, 2011) and abroad (OECD, 2012, 2013). Current approaches to assessing college learning are dominated by objective tests and by student self-report questionnaires such as the National Survey of Student Engagement (NSSE). This study examined how the three NSSE deep approaches to learning scales contribute to the narrative on academic rigor at a large, public research institution. Confirmatory factor analyses and structural equation modeling showed that the three deep approaches to learning constructs were internally valid, but deep learning was not related to GPA. Findings raise questions regarding sound measurement of student learning and whether students are rewarded for rigorous performance.
Notes
We used Stata’s mvtest normality command to estimate univariate, bivariate, and multivariate tests of departure from normality.
We relied on EQS’s Mardia’s test.
Those items are: ‘analyze’, ‘synthetize’, ‘applying’, ‘occide’, ‘integr’, ‘intede’, ‘othrvi’, and ‘chngui’.
Those items are ‘analyze’ and ‘integrate’ corresponding to Higher Order Learning and Integrative Learning respectively.
Four correspond to Higher Order Learning (‘analyze’, ‘synthesize’, ‘evaluate’, and ‘apply’), two belong to Integrative Learning (‘divclass’ and ‘integr’), and one is an indicator of Reflective Learning (‘ownview’).
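The Mardia coefficients reported by Stata’s mvtest normality and by EQS can be sketched as follows (a minimal Python illustration of the underlying statistics, not the authors’ code; the function name and return values are our own):

```python
import numpy as np

def mardia(X):
    """Mardia's multivariate skewness and kurtosis for an (n, p) data matrix.

    Returns (b1p, b2p, chi2_skew, z_kurt), where chi2_skew follows a
    chi-square with p(p+1)(p+2)/6 df and z_kurt is approximately standard
    normal when the data are multivariate normal.
    """
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    Xc = X - X.mean(axis=0)                 # center each variable
    S = Xc.T @ Xc / n                       # ML covariance estimate
    D = Xc @ np.linalg.solve(S, Xc.T)       # D[i, j] = (x_i - xbar)' S^-1 (x_j - xbar)
    b1p = (D ** 3).sum() / n ** 2           # multivariate skewness
    b2p = (np.diag(D) ** 2).sum() / n       # multivariate kurtosis
    chi2_skew = n * b1p / 6                 # skewness test statistic
    z_kurt = (b2p - p * (p + 2)) / np.sqrt(8 * p * (p + 2) / n)
    return b1p, b2p, chi2_skew, z_kurt
```

Under multivariate normality, b2p should be close to p(p + 2); large chi2_skew or |z_kurt| values signal the kind of non-normality that motivates robust SEM estimators.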
References
Allen, J., Robbins, S. B., Casillas, A., & Oh, In-Sue. (2008). Third-year college retention and transfer: Effects of academic performance, motivation, and social connectedness. Research in Higher Education, 49, 647–664.
Arum, R., & Roksa, J. (2011). Academically adrift: Limited learning on college campuses. Chicago: University of Chicago Press.
Astin, A. (2011). In ‘Academically Adrift,’ data don’t back up sweeping claim. Chronicle of Higher Education. Retrieved from http://chronicle.com/article/Academically-Adrift-a/12637.
Babcock, P., & Marks, M. (2010). Leisure college USA: The decline in study time. Report No. 7. Washington, DC: American Enterprise Institute for Public Policy Research.
Banta, T. W., Pike, G. R., & Hansen, M. J. (2009). The use of engagement data in accreditation, planning, and assessment. In T. W. Banta (Ed.), Using NSSE in institutional research (pp. 21–34). New Directions for Institutional Research, No. 141. San Francisco: Jossey-Bass.
Biggs, J. B., & Tang, C. (2011). Teaching for quality learning at university (4th ed.). Buckingham: Open University Press.
Bollen, K. A. (1989). Structural equations with latent variables. New York: Wiley.
Brown, T. A. (2006). Confirmatory factor analysis for applied research. New York: Guilford Press.
Byrne, B. M. (2006). Structural equation modeling with EQS (2nd ed.). Newbury Park: Sage.
Campbell, C. M., & Cabrera, A. F. (2011). How sound is NSSE? Investigating the psychometric properties of NSSE at a public, research-extensive institution. The Review of Higher Education, 35(1), 77–103.
Carle, A. C., Jaffe, D., Vaughn, N. W., & Eder, D. (2009). Psychometric properties of three new national survey of student engagement based engagement scales: An item response theory analysis. Research in Higher Education, 50, 775–794.
Entwistle, N. (1991). Approaches to learning and perceptions of the learning environment. Higher Education, 22, 201–204.
Entwistle, N. (1997). Reconstituting approaches to learning: A response to Webb. Higher Education, 33, 213–218.
Entwistle, N. J., & Ramsden, P. (1983). Understanding student learning. London: Croom Helm.
Finney, S. J., & DiStefano, C. (2006). Non-normal and categorical data in structural equation modeling. In G. R. Hancock & R. O. Muller (Eds.), Structural Equation Modeling: A second course (pp. 269–314). Greenwich: IAP.
Geiser, C. (2013). Data analysis with Mplus. New York: Guilford Press.
Gordon, J., Ludlum, J., & Hoey, J. J. (2008). Validating NSSE against student outcomes: Are they related? Research in Higher Education, 49, 19–39.
Hall, C. W., Bolen, L. M., & Gupton, R. H. (1995). Predictive validity of the Study Process Questionnaire for undergraduate students. College Student Journal, 29, 234–239.
Hu, L.-T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 1–55.
Jaschik, S. (2011, January 18). ‘Academically Adrift.’ Inside Higher Education. Retrieved from http://www.insidehighered.com/news/2011/01/18/study_finds_large_numbers_of_college_students_don_t_learn_much.
Kline, R. B. (2011). Principles and practices of structural equation modeling. New York: Guilford Press.
LaNasa, S., Cabrera, A. F., & Tangsrud, H. (2009). The construct validity of student engagement: A confirmatory factor analysis. Research in Higher Education, 50, 313–352.
Liu, O. L., Bridgeman, B., & Adler, R. M. (2012). Measuring learning outcomes in higher education: Motivation matters. Educational Researcher, 41, 352–362.
Marton, F., & Säljö, R. (1976). On qualitative differences in learning I: Outcome and process. British Journal of Educational Psychology, 46, 4–11.
Mayhew, M., Seifert, T. A., Pascarella, E. T., Nelson Laird, T. F., & Blaich, C. F. (2012). Going deep into mechanisms for moral reasoning growth: How deep learning approaches affect moral reasoning development for First-year students. Research in Higher Education, 53, 26–46.
Möller, J., Retelsdorf, J., Köller, O., & Marsh, H. W. (2011). The reciprocal internal/external frame of reference model: An integration of models of relations between academic achievement and self-concept. American Educational Research Journal, 48, 1315–1346.
Mueller, R. O., & Hancock, G. R. (2008). Best practices in structural equation modeling. In J. W. Osborne (Ed.), Best practices in quantitative methods (pp. 488–508). Thousand Oaks: Sage Publications Inc.
Mueller, R. O., & Hancock, G. R. (2010). Structural equation modeling. In G. R. Hancock & R. O. Mueller (Eds.), The Reviewer’s guide to quantitative methods in the social sciences (pp. 371–384). New York: Routledge.
Nelson Laird, T.F, Garver, A. K., Niskode-Dossett, A. S., & Banks, J. V. (2008a, Nov). The Predictive Validity of a Measure of Deep Approaches to Learning. Paper presented at the Annual Meeting of the Association for the Study of Higher Education, Jacksonville, FL.
Nelson Laird, T., Shoup, R., & Kuh, G. (2006, May). Measuring deep approaches to learning using the National Survey of Student Engagement. Paper presented at the Annual Forum of the Association for Institutional Research, Chicago, IL.
Nelson Laird, T. F., Shoup, R., Kuh, G., & Schwarz, M. (2008b). The effects of discipline on deep approaches to student learning and college outcomes. Research in Higher Education, 49, 469–494.
Nusche, D. (2008). Assessment of learning outcomes in higher education: A comparative review of selected practices. Paris: Organization for Economic Co-operation and Development.
Office of Institutional Research (OIR; 2011). NSSE Survey Deep Learning Items: Comparison of first-year and senior student responses. The University of Rhode Island.
Organization for Economic Co-operation and Development (OECD; 2012). Testing student and university performance globally: OECD’s AHELO. Retrieved from: http://www.oecd.org/document/22/0,3746,en_2649_39263238_40624662_1_1_1_1,00.html.
Organization for Economic Co-operation and Development (OECD; 2013). Assessment of Higher Education learning outcomes. Feasibility study report. Volume 2: Data analysis and national experiences. Retrieved from http://www.oecd.org/edu/skills-beyond-school/AHELOFSReportVolume2.pdf.
Pascarella, E. (2006). How college affects students: Ten directions for future research. Journal of College Student Development, 47, 508–520.
Pascarella, E. T., Blaich, C., Martin, G. L., & Hanson, J. M. (2011). How robust are the findings of Academically Adrift? Evidence from the Wabash National Study. Change: The Magazine of Higher Learning, 44(3), 20–24.
Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research. San Francisco: Jossey-Bass.
Pike, G. R. (2006). The convergent and discriminant validity of NSSE scalelet scores. Journal of College Student Development, 47, 550–563.
Prosser, M., & Millar, R. (1989). The “how” and “why” of learning physics. European Journal of Psychology of Education, 4, 513–528.
Ramsden, P. (2003). Learning to teach in higher education. London: RoutledgeFalmer.
Reason, R. D., Cox, B. E., McIntosh, K., & Terenzini, P. T. (2010, May). Deep learning as an individual, conditional, and contextual influence on first-year student outcomes. Paper presented at the Annual Forum of the Association for Institutional Research, Chicago, IL.
Sternberg, R. J. (2011, February 8). Who Is Really Adrift? Inside Higher Education. Retrieved from http://www.insidehighered.com/views/2011/02/08/a_critique_of_academically_adrift_and_the_test_behind_many_of_the_findings.
Tagg, J. (2003). The learning paradigm college. Boston: Anker.
Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.). Chicago: University of Chicago Press.
Zhang, L. (2000). University students’ learning approaches in three cultures: An investigation of Biggs’s 3P model. The Journal of Psychology, 134(1), 37–55.
Campbell, C. M., & Cabrera, A. F. (2014). Making the mark: Are grades and deep learning related? Research in Higher Education, 55, 494–507. https://doi.org/10.1007/s11162-013-9323-6