
Student Engagement and Student Learning: Examining the Convergent and Discriminant Validity of the Revised National Survey of Student Engagement

Published in Research in Higher Education.

Abstract

The present study examined the relationships between student engagement, represented by two versions of the National Survey of Student Engagement (NSSE), and self-reported gains in learning. The study drew on institution-level data from institutions that participated in 2011 and 2013. The objective of the research was to compare evidence of convergence and discrimination for the two versions of NSSE using canonical correlation analysis. Results indicated that both versions of NSSE provided clear evidence of convergence: student engagement measures were significantly and positively related to perceived gains in learning. However, only the most recent version of NSSE provided strong evidence of discrimination (i.e., differential relationships between engagement measures and self-reported learning outcomes). Thus, the revised NSSE appears to offer substantial advantages for institutions seeking a more nuanced understanding of the relationships between student engagement and perceived learning outcomes. Implications are discussed for educators who aim to enhance student learning and for researchers who compare complex sets of data.




Corresponding author

Correspondence to John Zilvinskis.

Appendices

Appendix 1: Items Comprising the NSSE Scalelets

Course challenge (Eρ² = 0.73)

 How often have you … worked harder than you thought you could to meet an instructor’s standards or expectations? [workhard]

 How often have you … come to class without completing readings or assignments? {Reverse Scored} [clunprep]

 To what extent have … your examinations during the current school year challenged you to do your best work? [exams]

 How many hours a week do you spend … preparing for class (studying, reading, writing, rehearsing, and other activities related to your academic program)? [acadpr01]

 To what extent does your institution emphasize … spending significant amounts of time studying and on academic work? [envschol]

Writing (Eρ² = 0.75)

 How often have you … prepared two or more drafts of a paper or assignment before turning it in? [rewropap]

 How often have you … worked on a paper or project that required integrating ideas or information from various sources? [integrat]

 During the current school year … number of written papers or reports of 20 pages or more? [writemor]

 During the current school year … number of written papers or reports between 5 and 19 pages? [writemid]

 During the current school year … number of written papers or reports of fewer than 5 pages? [writesml]

Higher-order thinking skills (Eρ² = 0.77)

 During the current school year, to what extent has your coursework emphasized … memorizing facts, ideas, or methods from your courses and readings so you can repeat them in pretty much the same form? {Reverse Scored} [memorize]

 During the current school year, to what extent has your coursework emphasized … analyzing the basic elements of an idea, experience, or theory, such as examining a particular case or situation in depth and considering its components? [analyze]

 During the current school year, to what extent has your coursework emphasized … synthesizing and organizing ideas, information, or experiences into new, more complex interpretations and relationships? [synthesz]

 During the current school year, to what extent has your coursework emphasized … making judgments about the value of information, arguments, or methods such as examining how others gathered and interpreted data and assessing the soundness of their conclusions? [evaluate]

 During the current school year, to what extent has your coursework emphasized … applying theories or concepts to practical problems or in new situations? [applying]

Active learning (Eρ² = 0.84)

 How often have you … asked questions in class or contributed to class discussions? [clquest]

 How often have you … made a class presentation? [clpresen]

 How often have you … participated in a community-based project as part of a regular course? [commproj]

Collaborative learning (Eρ² = 0.72)

 How often have you … worked with other students on projects during class? [classgrp]

 How often have you … worked with classmates outside of class to prepare class assignments? [occgrp]

 How often have you … tutored or taught other students (paid or voluntary)? [tutor]

 How often have you … discussed ideas from your readings or classes with others outside of class (students, family members, coworkers, etc.)? [oocideas]

Course interaction (Eρ² = 0.80)

 How often have you … discussed grades or assignments with an instructor? [facgrade]

 How often have you … discussed ideas from your readings or classes with faculty members outside of class? [facideas]

 How often have you … received prompt feedback from faculty on your academic performance (written or oral)? [facfeed]

Out-of-class interaction (Eρ² = 0.84)

 How often have you … talked about career plans with a faculty member or advisor? [facplans]

 How often have you … worked with faculty members on activities other than coursework (committees, orientation, student-life activities, etc.)? [facother]

 Have you, or do you plan to, … work on a research project with a faculty member outside of course or program requirements? [research]

Support for student success (Eρ² = 0.83)

 To what extent does your institution emphasize … providing the support you need to help you succeed academically? [envsuprt]

 To what extent does your institution emphasize … helping you cope with your non-academic responsibilities (work, family, etc.)? [envnacad]

 To what extent does your institution emphasize … providing the support you need to thrive socially? [envsocal]

Interpersonal environment (Eρ² = 0.80)

 Quality of your relationships with … other students? [envstu]

 Quality of your relationships with … faculty members? [envfac]

 Quality of your relationships with … administrative personnel and offices? [envadm]
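Each scalelet above is a short composite of items, with some items (marked {Reverse Scored}) flipped before aggregation. The following is a minimal illustrative sketch of that scoring logic, not the authors' code: the function name is hypothetical, and a common 1–4 response scale is assumed for all items (in the actual survey, items such as acadpr01 use their own response scales).

```python
# Hypothetical sketch: score a scalelet as the mean of its items,
# reverse-scoring flagged items on an assumed 1-4 response scale.
def score_scalelet(responses, items, reverse=(), scale_min=1, scale_max=4):
    """responses: dict mapping item codes (e.g., 'workhard') to one student's answers."""
    total = 0.0
    for item in items:
        value = responses[item]
        if item in reverse:  # e.g., 'clunprep' in Course challenge
            value = scale_max + scale_min - value
        total += value
    return total / len(items)

# Course challenge items from Appendix 1 (all treated as 1-4 for simplicity)
items = ["workhard", "clunprep", "exams", "acadpr01", "envschol"]
student = {"workhard": 3, "clunprep": 4, "exams": 4, "acadpr01": 2, "envschol": 3}
print(score_scalelet(student, items, reverse={"clunprep"}))  # → 2.6
```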

Appendix 2: Items Comprising the NSSE Engagement Indicators

Higher-Order Learning (Eρ² = 0.85)

During the current school year, how much has your coursework emphasized the following:

 Applying facts, theories, or methods to practical problems or new situations [Hoapply]

 Analyzing an idea, experience, or line of reasoning in depth by examining its parts [Hoanalyze]

 Evaluating a point of view, decision, or information source [Hoevaluate]

 Forming a new idea or understanding from various pieces of information [Hoform]

Reflective & Integrative Learning (Eρ² = 0.87)

During the current school year, how often have you:

 Combined ideas from different courses when completing assignments [RIintegrate]

 Connected your learning to societal problems or issues [RIsocietal]

 Included diverse perspectives (political, religious, racial/ethnic, gender, etc.) in course discussions or assignments [RIdiverse]

 Examined the strengths and weaknesses of your own views on a topic or issue [RIownview]

 Tried to better understand someone else’s views by imagining how an issue looks from his or her perspective [RIperspect]

 Learned something that changed the way you understand an issue or concept [RInewview]

 Connected ideas from your courses to your prior experiences and knowledge [RIconnect]

Learning Strategies (Eρ² = 0.77)

During the current school year, how often have you:

 Identified key information from reading assignments [LSreading]

 Reviewed your notes after class [LSnotes]

 Summarized what you learned in class or from course materials [LSsummary]

Quantitative Reasoning (Eρ² = 0.86)

During the current school year, how often have you:

 Reached conclusions based on your own analysis of numerical information (numbers, graphs, statistics, etc.) [QRconclude]

 Used numerical information to examine a real-world problem or issue (unemployment, climate change, public health, etc.) [QRproblem]

 Evaluated what others have concluded from numerical information [QRevaluate]

Collaborative Learning (Eρ² = 0.81)

During the current school year, how often have you:

 Asked another student to help you understand course material [CLaskhelp]

 Explained course material to one or more students [CLexplain]

 Prepared for exams by discussing or working through course material with other students [CLstudy]

 Worked with other students on course projects or assignments [CLproject]

Discussions with Diverse Others (Eρ² = 0.89)

During the current school year, how often have you had discussions with people from the following groups:

 People from a race or ethnicity other than your own [DDrace]

 People from an economic background other than your own [DDeconomic]

 People with religious beliefs other than your own [DDreligion]

 People with political views other than your own [DDpolitical]

Student–Faculty Interaction (Eρ² = 0.83)

During the current school year, how often have you:

 Talked about career plans with a faculty member [SFcareer]

 Worked with a faculty member on activities other than coursework (committees, student groups, etc.) [SFotherwork]

 Discussed course topics, ideas, or concepts with a faculty member outside of class [SFdiscuss]

 Discussed your academic performance with a faculty member [SFperform]

Effective Teaching Practices (Eρ² = 0.85)

During the current school year, to what extent have your instructors done the following:

 Clearly explained course goals and requirements [ETgoals]

 Taught course sessions in an organized way [ETorganize]

 Used examples or illustrations to explain difficult points [ETexample]

 Provided feedback on a draft or work in progress [ETdraftfb]

 Provided prompt and detailed feedback on tests or completed assignments [ETfeedback]

Quality of Interactions (Eρ² = 0.84)

Indicate the quality of your interactions with the following people at your institution:

 Students [QIstudent]

 Academic advisors [QIadvisor]

 Faculty [QIfaculty]

 Student services staff (career services, student activities, housing, etc.) [QIstaff]

 Other administrative staff and offices (registrar, financial aid, etc.) [QIadmin]

Supportive Environment (Eρ² = 0.89)

How much does your institution emphasize the following:

 Providing support to help students succeed academically [SEacademic]

 Using learning support services (tutoring services, writing center, etc.) [SElearnsup]

 Encouraging contact among students from different backgrounds (social, racial/ethnic, religious, etc.) [SEdiverse]

 Providing opportunities to be involved socially [SEsocial]

 Providing support for your overall well-being (recreation, health care, counseling, etc.) [SEwellness]

 Helping you manage your non-academic responsibilities (work, family, etc.) [SEnonacad]

 Attending campus activities and events (performing arts, athletic events, etc.) [SEactivities]

 Attending events that address important social, economic, or political issues [SEevents]

Appendix 3: Items Comprising the NSSE Learning Gains Factors

Academic and Interpersonal Gains (Eρ² = 0.86, 2011; Eρ² = 0.88, 2013)

How much has your experience at this institution contributed to your knowledge, skills, and personal development in the following areas?

 Writing clearly and effectively [pgwrite]

 Speaking clearly and effectively [pgspeak]

 Thinking critically and analytically [pgthink]

 Developing or clarifying a personal code of values and ethics [pgvalues]

 Understanding people of other backgrounds (economic, racial/ethnic, political, religious, nationality, etc.) [pgdiverse]

 Being an informed and active citizen [pgcitizen]

Application Gains (Eρ² = 0.86, 2011; Eρ² = 0.79, 2013)

How much has your experience at this institution contributed to your knowledge, skills, and personal development in the following areas?

 Analyzing numerical and statistical information [pganalyze]

 Acquiring job- or work-related knowledge and skills [pgwork]

 Working effectively with others [pgothers]

 Solving complex real-world problems [pgprobsolve]
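Judging discrimination between engagement measures and the gains factors above typically involves structure coefficients: the correlations between each original measure and the canonical variates. The sketch below is a generic illustration of that computation (the function name is ours, and the sanity-check data are synthetic), not a reproduction of the study's analysis.

```python
# Illustrative sketch: structure coefficients for canonical variates.
import numpy as np

def structure_coefficients(X, variates):
    """Correlate each original variable (column of X) with each canonical variate."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    Vs = (variates - variates.mean(axis=0)) / variates.std(axis=0)
    return Xs.T @ Vs / len(X)  # mean cross-product of z-scores = Pearson r

# Sanity check: a variable identical to a variate correlates 1.0 with it
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
variates = X[:, [0]]  # pretend the first variable is itself the variate
coefs = structure_coefficients(X, variates)
```

Measures with large coefficients on one variate and small coefficients on others are the ones that support a discriminant interpretation.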


Cite this article

Zilvinskis, J., Masseria, A.A. & Pike, G.R. Student Engagement and Student Learning: Examining the Convergent and Discriminant Validity of the Revised National Survey of Student Engagement. Res High Educ 58, 880–903 (2017). https://doi.org/10.1007/s11162-017-9450-6
