Making the Mark: Are Grades and Deep Learning Related?

Research in Higher Education

Abstract

Assessing gains in learning has received increased attention as one dimension of institutional accountability both in the USA (Arum and Roksa, Academically adrift: Limited learning on college campuses, 2011) and abroad (OECD, http://www.oecd.org/document/22/0,3746,en_2649_39263238_40624662_1_1_1_1,00.html, 2013, http://www.oecd.org/edu/skills-beyond-school/AHELOFSReportVolume2.pdf, 2012). Current approaches to assessing college learning are dominated by objective tests and student self-report questionnaires, such as the National Survey of Student Engagement (NSSE). This study examined how the three NSSE deep approaches to learning scales contribute to the narrative on academic rigor at a large, public research institution. Using confirmatory factor analysis and structural equation modeling, results showed that the three deep approaches to learning constructs were internally valid, but deep learning was not related to GPA. The findings raise questions about how well student learning is measured and whether students are rewarded for rigorous performance.


Notes

  1. We used Stata’s mvtest normality command to estimate univariate, bivariate, and multivariate tests of departure from normality.

  2. We relied on EQS’s Mardia test.

  3. Those items are: analyze, synthetize, applying, occide, integr, intede, othrvi, and chngui.

  4. As noted by Brown (2006) and Geiser (2013), higher-order factor analysis allows the researcher to test, in a parsimonious manner, hypotheses seeking to explain shared variance across similar constructs.

  5. Those items are ‘analyze’ and ‘integrate’ corresponding to Higher Order Learning and Integrative Learning respectively.

  6. Four correspond to Higher Order Learning (analyze, synthesis, evaluate, and apply), two belong to Integrative Learning (divclass and integr), and one is an indicator of Reflective Learning (ownview).

References

  • Allen, J., Robbins, S. B., Casillas, A., & Oh, In-Sue. (2008). Third-year college retention and transfer: Effects of academic performance, motivation, and social connectedness. Research in Higher Education, 49, 647–664.

  • Arum, R., & Roksa, J. (2011). Academically adrift: Limited learning on college campuses. Chicago: University of Chicago Press.

  • Astin, A. (2011). In ‘Academically Adrift,’ Data Don’t Back Up Sweeping Claim. Chronicle of Higher Education. Retrieved from http://chronicle.com/article/Academically-Adrift-a/12637.

  • Babcock, P., & Marks, M. (2010). Leisure college USA: The decline in study time (Report No. 7). Washington, DC: American Enterprise Institute for Public Policy Research.

  • Banta, T. W., Pike, G. R., & Hansen, M. J. (2009). The use of engagement data in accreditation, planning, and assessment. In T. W. Banta (Ed.), Using NSSE in institutional research (New Directions for Institutional Research, No. 141, pp. 21–34). San Francisco: Jossey-Bass.

  • Biggs, J. B., & Tang, C. (2011). Teaching for quality learning at university (4th ed.). Buckingham: Open University Press.

  • Bollen, K. A. (1989). Structural equations with latent variables. New York: Wiley.

  • Brown, T. A. (2006). Confirmatory Factor Analysis for Applied Research. New York: Guilford Press.

  • Byrne, B. M. (2006). Structural equation modeling with EQS (2nd ed.). Newbury Park: Sage.

  • Campbell, C. M., & Cabrera, A. F. (2011). How Sound Is NSSE?: Investigating the Psychometric Properties of NSSE at a Public, Research-Extensive Institution. The Review of Higher Education, 35(1), 77–103.

  • Carle, A. C., Jaffe, D., Vaughn, N. W., & Eder, D. (2009). Psychometric properties of three new national survey of student engagement based engagement scales: An item response theory analysis. Research in Higher Education, 50, 775–794.

  • Entwistle, N. (1991). Approaches to learning and perceptions of the learning environment. Higher Education, 22, 201–204.

  • Entwistle, N. (1997). Reconstituting approaches to learning: A response to Webb. Higher Education, 33, 213–218.

  • Entwistle, N. J., & Ramsden, P. (1983). Understanding student learning. London: Croom Helm.

  • Finney, S. J., & DiStefano, C. (2006). Non-normal and categorical data in structural equation modeling. In G. R. Hancock & R. O. Muller (Eds.), Structural Equation Modeling: A second course (pp. 269–314). Greenwich: IAP.

  • Geiser, C. (2013). Data analysis with Mplus. New York: Guilford Press.

  • Gordon, J., Ludlum, J., & Hoey, J. J. (2008). Validating NSSE against student outcomes: Are they related? Research in Higher Education, 49, 19–39.

  • Hall, C. W., Bolen, L. M., & Gupton, R. H. (1995). Predictive validity of the Study Process Questionnaire for undergraduate students. College Student Journal, 29, 234–239.

  • Hu, L.-T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 1–55.

  • Jaschik, S. (2011, January 18). ‘Academically Adrift.’ Inside Higher Education. Retrieved from http://www.insidehighered.com/news/2011/01/18/study_finds_large_numbers_of_college_students_don_t_learn_much.

  • Kline, R. B. (2011). Principles and practices of structural equation modeling. New York: Guilford Press.

  • LaNasa, S., Cabrera, A. F., & Tangsrud, H. (2009). The construct validity of student engagement: A confirmatory factor analysis. Research in Higher Education, 50, 313–352.

  • Liu, O. L., Bridgeman, B., & Adler, R. M. (2012). Measuring Learning Outcomes in Higher Education: Motivation Matters. Educational Researcher, 41, 352–362.

  • Marton, F., & Säljö, R. (1976). On qualitative differences in learning I: Outcome and process. British Journal of Educational Psychology, 46, 4–11.

  • Mayhew, M., Seifert, T. A., Pascarella, E. T., Nelson Laird, T. F., & Blaich, C. F. (2012). Going deep into mechanisms for moral reasoning growth: How deep learning approaches affect moral reasoning development for First-year students. Research in Higher Education, 53, 26–46.

  • Möller, J., Retelsdorf, J., Köller, O., & Marsh, H. W. (2011). The reciprocal internal/external frame of reference model: An integration of models of relations between academic achievement and self-concept. American Educational Research Journal, 48, 1315–1346.

  • Mueller, R. O., & Hancock, G. R. (2008). Best practices in structural equation modeling. In J. W. Osborne (Ed.), Best practices in quantitative methods (pp. 488–508). Thousand Oaks: Sage Publications Inc.

  • Mueller, R. O., & Hancock, G. R. (2010). Structural equation modeling. In G. R. Hancock & R. O. Mueller (Eds.), The Reviewer’s guide to quantitative methods in the social sciences (pp. 371–384). New York: Routledge.

  • Nelson Laird, T. F., Garver, A. K., Niskode-Dossett, A. S., & Banks, J. V. (2008a, Nov). The Predictive Validity of a Measure of Deep Approaches to Learning. Paper presented at the Annual Meeting of the Association for the Study of Higher Education, Jacksonville, FL.

  • Nelson Laird, T., Shoup, R., & Kuh, G. (2006, May). Measuring deep approaches to learning using the National Survey of Student Engagement. Paper presented at the Annual Forum of the Association for Institutional Research, Chicago, IL.

  • Nelson Laird, T. F., Shoup, R., Kuh, G., & Schwarz, M. (2008b). The effects of discipline on deep approaches to student learning and college outcomes. Research in Higher Education, 49, 469–494.

  • Nusche, D. (2008). Assessment of Learning Outcomes in Higher Education: A comparative review of selected practices. Organization for Economic Co-operation and Development.

  • Office of Institutional Research (OIR; 2011). NSSE Survey Deep Learning Items: Comparison of first-year and senior student responses. The University of Rhode Island.

  • Organization for Economic Co-operation and Development (OECD; 2012). Testing student and university performance globally: OECD’s AHELO. Retrieved from: http://www.oecd.org/document/22/0,3746,en_2649_39263238_40624662_1_1_1_1,00.html.

  • Organization for Economic Co-operation and Development (OECD; 2013). Assessment of Higher Education learning outcomes. Feasibility study report. Volume 2: Data analysis and national experiences. Retrieved from http://www.oecd.org/edu/skills-beyond-school/AHELOFSReportVolume2.pdf.

  • Pascarella, E. (2006). How college affects students: Ten directions for future research. Journal of College Student Development, 47, 508–520.

  • Pascarella, E. T., Blaich, C., Martin, G. L., & Hanson, J. M. (2011). How robust are the findings of Academically Adrift? Evidence from the Wabash National Study. Change: The Magazine of Higher Learning, 44(3), 20–24.

  • Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research. San Francisco: Jossey-Bass.

  • Pike, G. R. (2006). The convergent and discriminant validity of NSSE scalelet scores. Journal of College Student Development, 47, 550–563.

  • Prosser, M., & Millar, R. (1989). The “how” and “why” of learning physics. European Journal of Psychology of Education, 4, 513–528.

  • Ramsden, P. (2003). Learning to teach in higher education. London: RoutledgeFalmer.

  • Reason, R. D., Cox, B. E., McIntosh, K., & Terenzini, P. T.(2010, May). Deep learning as an individual, conditional, and contextual influence on first-year student outcomes. Paper presented at the Annual Forum of the Association for Institutional Research, Chicago, IL.

  • Sternberg, R. J. (2011, February 8). Who Is Really Adrift? Inside Higher Education. Retrieved from http://www.insidehighered.com/views/2011/02/08/a_critique_of_academically_adrift_and_the_test_behind_many_of_the_findings.

  • Tagg, J. (2003). The learning paradigm college. Boston: Anker.

  • Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.). Chicago: University of Chicago Press.

  • Zhang, L. (2000). University students’ learning approaches in three cultures: An investigation of Biggs’s 3P model. The Journal of Psychology, 134(1), 37–55.

Corresponding author

Correspondence to Corbin M. Campbell.

Cite this article

Campbell, C.M., Cabrera, A.F. Making the Mark: Are Grades and Deep Learning Related? Res High Educ 55, 494–507 (2014). https://doi.org/10.1007/s11162-013-9323-6
