Introduction

We have often wished that higher education had something akin to a customs office, whereby every stakeholder would come through a central lane and declare their academic goods. Are you a student who performed an internship last summer? Great, share what you learned in the booth to your left. Are you a professor who utilized service-learning or international virtual exchange pedagogy in your classroom? Fantastic, step straight ahead and let’s talk student learning outcomes. Are you a director who created metacognitive learning activities for a group of students in a learning community? Thanks for your ingenuity and commitment to education.

Alas, since the academic customs office remains an elusive dream, those of us whose work intersects high-impact educational practices, curriculum and pedagogy, student success, and equity must find other ways of acquiring the full story.

In these pages, we recount such a story. We describe how we embarked on an institutional research and mapping project of defining and seeking data on undergraduate engagement in high-impact educational practices at a large, public institution. In mapping where high-impact practices were occurring in the curriculum and co-curriculum, we were also able to map who was and was not engaging, and we began to determine why and what the implications for improvement might be. The work entailed partial use of institutional-level data, partial use of programmatic-level data, and some good old-fashioned walking around campus to talk to those who had pieces of the student engagement puzzle. Most importantly, we were able to put that puzzle together to provide a detailed longitudinal and equity-informed snapshot of how undergraduate students navigate curricular and co-curricular engagement opportunities at a complex institution. The longitudinal component followed an entering cohort of students by retroactively identifying each student's involvement in high-impact practices each semester, for a full six years or until they left the institution with or without a degree. We intend for our project to provide a roadmap for others seeking to fully understand undergraduate engagement and drive change on their campuses.

Why High-Impact Educational Practices

What if every undergraduate college student had opportunities to do more than just sit at a desk or in front of a screen, take tests, and receive a degree? What if doing more meant that they could be co-creators of their own education rather than mere participants in it? What if faculty and staff were intentionally creating opportunities for these students, supported by institutional resources, rewards, recognition, and a data ecosystem that made these opportunities equitable and achievable for all students? Enter the concept of high-impact educational practices (HIPs) and the changing role of higher educational institutions in guiding student learning (Kezar & Holcombe, 2017; Kuh, 2008; Kuh & O’Donnell, 2013; Schwartz & Miller, 2020).

The list of HIPs includes activities such as first-year seminars & experiences, learning communities, student-faculty research, study abroad and diversity/global learning, service-learning, internships, senior experiences/capstones, common intellectual experiences, writing intensive courses, and collaborative assignments and projects. However, continuing research shows that simply pursuing a checkbox manner of HIP cataloguing at an institution is not beneficial. Rather, examining the quality of practice, asking deeper questions about student learning, and assessing and mitigating equity gaps are ways of ensuring that high-impact educational practices are, in fact, high impact (Finley, 2019; Finley & McNair, 2013; McNair et al., 2020). The relationships that students build with teachers and mentors while engaging in HIPs are a key commonality among them (Felten & Lambert, 2020). Research also supports the idea that helping new majority college students – those from underrepresented backgrounds, communities with fewer socioeconomic resources, and/or households with less social capital and knowledge of how to navigate college – to access positive mentoring relationships and prosocial classroom environments leads to their success in college and beyond (Heinrich et al., 2021; Ro et al., 2021; Seemiller & Grace, 2016).

If simply listing and checking off student access to HIPs does not an inclusive institution make, then the need for accurate, data-informed discussions is paramount. Quality institutional alignment for HIPs has implications for improved curricular planning, strategic initiative generation, and democratization of data for decision-making. Increasingly, institutions of higher education around the globe are interested in confirming the benefits of HIPs and measuring the impact on diverse students (Kinzie et al., 2020). Determining all measures and impacts is beyond the scope of this paper, but a few illustrative examples are worth noting. The U.S.-based National Association of System Heads (NASH) networked their member institutions to study and create a roadmap for HIPs at scale, organized in an educational toolkit open to all (NASH, n.d.). The Comprehensive Learner Record project, jointly administered by the American Association of Collegiate Registrars and Admissions Officers (AACRAO) and NASPA: Student Affairs Administrators in Higher Education, provided a variety of methods across the 20+ partner institutions to record and communicate student learning in HIPs (AACRAO, 2015). (Both the NASH and the AACRAO/NASPA projects were funded by The Lumina Foundation.) Virginia Commonwealth University (Richmond, Virginia, United States) has increased graduation rates for underrepresented students by 15 percent, using high-impact practices as a driver of a student success initiative (Goldman, 2021). The University of Guelph (Guelph, Ontario, Canada) created a joint initiative to better understand student perspectives on high-impact practices while establishing a baseline of student participation and developing tools to enhance retention and student success (Cook et al., 2017). Researchers at the University of Edinburgh (Scotland) have taken an interdisciplinary approach to examining how faculty develop real-world learning opportunities for students through a wicked problems framework (McCune et al., 2021), thus addressing the impact of HIPs at a pedagogical level.

We find through discussions at our own institution and with our counterparts at other institutions that integrated data systems for HIPs and collaboration across units on shared values are essential to an engaged institution, a belief also supported in the literature (Kinzie et al., 2015; Kinzie & Franklin, 2020; Nadasen & Alig, 2021). Shared values also lead to shared language and definitions (IUPUI RISE Initiative, 2019), which are necessary for any institution embarking on efforts aimed at equitable HIP access across sectors of the student experience. We next discuss how our institutional goals align with our mapping of HIP patterns, along with some of the challenges of collecting this information.

Institutional Context and Purpose of the Study

Clemson University has a history as a public, land-grant institution with a strong undergraduate educational focus. However, with a Carnegie reclassification within the past several years as an R1 institution (doctoral universities: very high research activity), it is important to understand the practices that impact undergraduate student engagement and learning so we can maintain quality as we continue to evolve. Because much of our data are not centralized, we have not been able to capture the full undergraduate high-impact educational practices landscape: which students are and are not engaging, and what student demographics, backgrounds, and/or curricular choices may be at play over time as we seek to provide equitable engagement opportunities at an institutional level. We drew inspiration from the quotation attributed to the sixteenth-century scientist Galileo Galilei: “measure what is measurable and make measurable what is not so.” As a first step, we sought to undertake a project to identify, collect, and analyze HIP experiences. As the more important second step, we worked with a range of stakeholders to make meaning via identification and initiation of steps for improvement, based firmly within an inclusive excellence framework for ensuring our educational opportunities benefit all students.

Methods

Methods of data acquisition, analysis, and interpretation occurred in an iterative process: stakeholders identified areas to address, data were acquired and interpreted for mapping to student HIP involvement, and additional groups of stakeholders have been making meaning and driving institutional change. The mapping process for high-impact educational practices involved determining which curricular and co-curricular activities were occurring, along with the demographics of the students who were or were not engaging.

Stakeholder Focus Group

To properly map the HIPs occurring at our institution, identifying our institutional scope of HIPs was a necessary first step: although Kuh and O’Donnell have gathered national data and definitions on HIPs (Kuh, 2008; Kuh & O’Donnell, 2013), there is not always a one-to-one mapping between the HIPs they define and what an institution offers. To accomplish this mapping, we conducted a two-part focus group with faculty and staff stakeholders. The focus group comprised key personnel such as directors, faculty, deans, and provosts from various areas of academic affairs, college administration, housing, student affairs, and our innovation center.

In bringing these stakeholders together, our goal was to consult individuals most involved in undergraduate student engaged learning to advance identification and measurement of student engagement. We reviewed national data on HIPs (Kuh, 2008; Kuh & O’Donnell, 2013) along with our institutional strategic planning dashboards. We identified key questions to consider and discuss:

  • How do we currently measure the involvement of our undergraduate students in high-impact practices? How can we improve this process?

  • What high-impact practices might our students be doing that are not captured in the strategic plan and dashboards? How can we better capture this information?

  • How do we collaborate on data to determine which student population groups and subgroups are receiving opportunities and which students are not?

In the second meeting, we used a SWOT+S (Strengths, Weaknesses, Opportunities, Threats, plus Strategies) strategic planning framework to direct our conversation. The primary strategy identified was to initiate an institutional research study of HIPs by gathering cross-institutional data and correlating student involvement with undergraduate outcomes. We determined that the HIPs study would be the driver of all other strategies to improve and equitably deliver high-impact educational practices to a diverse student body.

Student Population

We focused on the 2012 first-time, first-year entering student cohort. For the 2012 cohort, we had full demographic and academic data, including majors of entry and 6-year graduation dates. Going earlier than 2012 would not have been possible, as the institution had changed student information systems prior to that time. Going later would not have been possible, as we would not have had the 6-year graduation rate data at the time the project was initiated (2018).

Once we established the method for matching cohort members to HIP data, we added the 2013 transfer entering student cohort to the same database. At Clemson, transfer admission is typically limited to students with 30+ earned college credit hours, and approximately 2/3 of our transfer students enter from a bridge program that is aligned with our own first college year, so we determined that the 2013 transfer cohort was an appropriate group to add to the 2012 first-time, first-year student group. Both cohorts were navigating the undergraduate experience at the same time. In all, we had 3,435 first-time, first-year student records and 1,383 transfer student records. We had access to 57 demographic fields for each student, and we accounted for students’ HIP engagement across eighteen semesters (6 years of fall, spring, and summer terms).

Data Collection

We identified eight main types of high-impact educational practices occurring with our Clemson undergraduate students: living-learning communities, first-year cohort programs, service-learning, student-faculty research, study abroad, internships, cooperative education (co-op), and capstones. To determine the engagement of our student population in Clemson’s HIPs, we had to identify where the data for each resided. Data collection was often complicated, but the stakeholders from the focus group were highly involved. To ensure confidentiality and proper research practices in data collection, we received approval through the Clemson Institutional Review Board, a necessary step for engaging in human subjects research. In the IRB application, we outlined that we would only collect data that was non-identifiable by our research team. Clemson’s Office of Institutional Research created a set of unique student identifiers, minimizing the likelihood that students in the data set could be identified.
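Conceptually, the pseudonymization step can be as simple as issuing random surrogate keys and keeping the crosswalk solely with the Office of Institutional Research. Below is a minimal sketch in Python; the function, record layout, and field names are hypothetical illustrations, not the office's actual procedure:

```python
import secrets

def pseudonymize(records, id_field="student_id"):
    """Swap real student IDs for random surrogate keys.

    The crosswalk from real to surrogate IDs stays with the Office of
    Institutional Research; the research team sees only the scrubbed
    records. Record layout and field names here are hypothetical.
    """
    crosswalk = {}
    scrubbed = []
    for rec in records:
        # Reuse the same surrogate for repeat appearances of a student
        surrogate = crosswalk.setdefault(rec[id_field], secrets.token_hex(8))
        scrubbed.append({**rec, id_field: surrogate})
    return scrubbed, crosswalk
```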

Table 1 provides an overview of the data collection and mapping. In the table, the rows list the types of HIPs, while the columns provide information on how data were acquired and tracked (if available/applicable). For some of the HIPs, data were collected through institutional sources or codes, and for others, data were collected from various offices and programs that maintained lists of courses or student participants. The final column summarizes limitations on data collection for the project, although the process of even undertaking such a project illustrated areas for improvement in HIP institutionalization and scaling. Data acquisition took approximately a semester.

Table 1 Summary of identified data sources and limitations, by high-impact educational practice (HIP) type

Institutional Sources

There were multiple sources of data in this mapping project, and we often had to look to different business systems. The Office of Institutional Research was the most instrumental partner; institutional data are the most accurate and verifiable. Table 1 provides the summary of tracking via institutional data. Through discussions with directors, we determined that internships and co-ops should be separated in our database. Students involved in a co-op experience participate in a structured program across three semesters, with additional requirements to meet while in the program. The structure and experiences are distinctive enough to warrant separation.

Programmatic Sources

The most laborious component of the research study was to identify areas where students were engaging in HIPs but where the data resided with an individual program or program director rather than at the institutional level. We often colloquially referred to this component of the project as “where’s Waldo?”

  • Clemson University does not currently have a first-year seminar program, but our stakeholder focus group identified that we have a number of first-year cohort experiences that incorporate the tenets of high-impact learning (Heinrich et al., 2021). Students in first-year cohort programs have additional co-curricular activities aimed at improving belonging, including dedicated programming, guest seminars and speakers, tutors, and often specially designated lounges for casual group gathering and study. We were able to obtain lists of students involved from the program directors.

  • For service-learning, we supplemented the institution’s Banner-tagged courses with a list of students engaged in co-curricular service-learning such as alternative breaks and service-leadership experiences that are not based in the academic curriculum, but that engage students deeply in all components of impactful service-learning – inclusive of design, implementation, and reflection cycles (Bringle et al., 2010; Felten & Clayton, 2011; Jacoby, 2009). We worked with the staff in the center for student leadership and engagement to define and obtain the student list.

  • We were able to supplement the identification of student-faculty research participants through a list of all departmental undergraduate research courses. The list was obtained through an advisory committee with faculty members from each academic college; the faculty members had been recently charged with identifying undergraduate research courses in their departments.

  • Although Clemson does not have an institutional definition or tag for capstone courses, courses with the keywords “capstone” or “senior seminar” were identified, and we analyzed course descriptions and syllabi for capstone HIP indicators (Laye et al., 2020; McGill, 2012; Rowles et al., 2004; Zilvinskis, 2019). Additionally, academic programs accredited by ABET (the Accreditation Board for Engineering & Technology) must have a capstone or integrating experience (ABET, 2020), so we were able to identify capstone programs through a list maintained in the engineering college.

Decisions and Limitations

In performing the work of the HIPs study, we were very clear with colleagues and stakeholders that this was an institutional research study, not an institutional research report. Accordingly, we had to make some decisions about which data to include versus which data seemed inaccurate or were unavailable. There were limitations to the data collection, and in sharing those limitations, we hope our transparency helps other institutions undertake similar projects.

  • Living-learning communities: Students in a living-learning community are co-housed in a specific on-campus residence hall or floor, so data were available to match students who were participating to this type of HIP. We only acquired data for the first year and second year of our student population, as many students seek off-campus housing after the second year.

  • First-year cohort programs: Since 2012, we have added first-year cohort programs for underrepresented students in STEM majors (general engineering, science and mathematics, agricultural sciences, etc.), students of color regardless of major, and neurodiverse students. The impacts of these experiences are not represented in our current data set.

  • Service-learning: We uncovered (and did not use) several academic programs that were using service-learning but were counting client encounters or student hours rather than tracking involvement of individual students. Our Banner system has certain courses tagged as “service-learning optional,” “community service required,” and “community service optional,” but we did not use the data from those tags, as we did not find them to be a fully reliable indicator of students’ high-impact practice. Even for the courses tagged as “service-learning required,” we are unsure whether regular and ongoing quality control measures ensure accuracy, since we currently have no central unit in charge of or supporting service-learning.

  • Student-faculty research: Some undergraduate students are performing research for pay with faculty members, but we are unable to easily find and track the student data. A portion of our courses tagged as creative inquiry do not involve student-faculty research. The creative inquiry office has now launched new reporting methods for determining creative inquiry course attributes, but such specific methodology was not available for the data collection timeframe for our student population group. Other populations of students may be conducting research in the summer (via NSF REUs or other programs), but we were unable to locate or track that information.

  • Study abroad: We did not count short trips or non-course programs as high-impact practices in our data collection, because we could not verify that these experiences had the qualities that are a hallmark of a high-impact study abroad experience (Kuh et al., 2017; Tarrant et al., 2014; Thomas & Kerstetter, 2020).

  • Internships: We know that several students are performing internships that they find on their own or through their personal or professional networks, but without a central reporting mechanism for students to let Clemson University know what they are doing – and when and where and for how long – we did not feel that any data emanating from student self-reports would be accurate.

  • Capstones: Clemson has no universal definition of what a capstone course entails. Our study revealed that a better understanding of capstones is needed before an institutional-level Banner tag could be created.

  • Managing multiple and overlapping HIPs: As faculty, students, and other stakeholders engage, HIPs can get messy. A student might be doing an internship abroad for an international company, but it is only tracked by the internship office. A student might be performing student-faculty research that incorporates service-learning, identifiable in Banner as a research course but not as a service-learning course. Other students may be performing student-faculty research as part of a capstone course while students in a different capstone course are not performing research in the same way. The lack of a clear way to study and affirm when HIPs blend is a limitation of our research.

Data Set

Once the data on our institutional high-impact educational practices were collected, we matched activities from institutional tags, course enrollments, and programmatic sources for each student over the span of the 18 semesters (Fall 2012 through Summer 2018). At this stage, all data were in Microsoft Excel, with one row per student. Each student activity was given a binary input (1 for involvement, 0 for none) according to our eight HIP categories for each semester. We worked with our institutional research department on anonymization of student records before incorporating the full set of 57 demographic factors. Once the data set was complete, we loaded it into Tableau and built a Tableau workbook for further data analysis and visualization.
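As an illustration of this structure, the sketch below is a minimal reconstruction in Python with pandas; the column names, semester numbering, and build step are our own illustrative assumptions, not the actual workbook's:

```python
import pandas as pd

HIPS = ["living_learning", "first_year_cohort", "service_learning",
        "research", "study_abroad", "internship", "coop", "capstone"]
SEMESTERS = range(1, 19)  # semester 1 = Fall 2012 ... semester 18 = Summer 2018

def build_matrix(students, participation):
    """Binary engagement matrix: one row per anonymized student,
    one 0/1 column per HIP per semester.

    `participation` is a set of (student_id, hip, semester) tuples
    assembled from institutional tags, course enrollments, and
    programmatic lists.
    """
    students = list(students)
    cols = {
        f"{hip}_{sem}": [1 if (sid, hip, sem) in participation else 0
                         for sid in students]
        for hip in HIPS for sem in SEMESTERS
    }
    return pd.DataFrame(cols, index=students)
```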

Findings

Our data set is quite large, and covering every possible result is beyond the scope of this manuscript (and likely any single one). And yet, with data in hand, we were able to talk more specifically with a range of faculty, staff, and student stakeholders. We conducted a dozen listening and presenting sessions with approximately 400 individuals in total over the span of six months. We wanted to ensure that our collective analysis was guided by appropriate questions and made sense to all, rather than just attempting to probe data with no plan of what we wanted answered. This step cannot be overlooked, as stakeholder input helped us to make meaning from data as we sought to determine how our students and colleagues could inform specific ideas or questions.

Key questions initially asked and answered were: what percentage of students overall did and did not engage in a high-impact educational practice, what was the relationship between HIPs and graduation rate, and what were the engagement levels by class (first-year, sophomore, junior, senior, beyond senior/those who took more than 4 years)? We found that 78% of our first-time, first-year entering cohort and 73% of our transfer cohort were engaging in at least one HIP before graduating. For students from the full data set who did not graduate, only 34% participated in a HIP, meaning that 66% of students who did not graduate from the institution did not engage. We performed a chi-square test for independence, which showed that graduation rates are related to students’ experience with HIPs. In performing an odds ratio calculation, we found that students with at least some level of engagement had seven times the odds of graduating with a baccalaureate degree within six years compared with those with no engagement.
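For readers who wish to reproduce these two calculations on their own data, here is a minimal sketch; the cell counts are invented to mirror the reported pattern (an odds ratio near seven), not our actual data:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = (engaged in >=1 HIP, not engaged),
# columns = (graduated within 6 years, did not graduate).
# a, b, c, d are illustrative counts, not the study's actual cells.
a, b = 3000, 300   # engaged: graduated / did not graduate
c, d = 700, 490    # not engaged: graduated / did not graduate

chi2, p, dof, expected = chi2_contingency([[a, b], [c, d]])

# Odds ratio: odds of graduating given engagement vs. given no engagement
odds_ratio = (a / b) / (c / d)
print(f"chi2={chi2:.1f}, p={p:.3g}, OR={odds_ratio:.1f}")  # OR = 7.0 here
```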

Higher engagement is related to higher graduation rates, but students who did not leave the institution also had more time and opportunities across multiple semesters. Figure 1 shows the comparative volume of students engaged in HIPs by year. The numbers are totaled for each year rather than by discrete students; thus, if a student started a research project in year 2 and continued it in later academic years, they would appear in the figure for each of those years. The student year “4+” refers to students in a ninth semester or beyond. Because first-year cohort programs and living-learning communities were targeted toward first-year students, all (100%) of that engagement fell in the first year. On the whole, the greatest engagement is seen in the third year of college for the student population examined. The patterns of engagement show trends across students’ academic trajectories. Engagement in undergraduate research begins to rise in the second year of college, whereas students are doing more study abroad in the third year. Service-learning, internships, and co-ops are more likely in the third and fourth years, and students engage in capstone experiences and courses in their final semesters. Institutions wishing to improve student engagement may wish to further examine the curricular and co-curricular implications of these patterns and/or determine the patterns in their own data.

Fig. 1 Volume of HIP engagement by student academic year
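The year-by-year totals behind Fig. 1 can be derived from the semester-level matrix by mapping each of the 18 semesters to an academic year and counting a student once per HIP per year. A sketch, assuming the hypothetical column naming from the earlier example:

```python
def yearly_volume(matrix, hips=HIPS, sems_per_year=3, n_years=6):
    """Total students engaged in each HIP in each academic year.

    A student active in the same HIP across several years is counted
    in each of those years, matching the totals behind Fig. 1
    (years 4-6 can then be pooled into the "4+" group).
    """
    volumes = {}
    for hip in hips:
        for year in range(1, n_years + 1):
            sems = range((year - 1) * sems_per_year + 1,
                         year * sems_per_year + 1)
            cols = [f"{hip}_{s}" for s in sems]
            # Engaged this year if any semester column in the year is 1
            volumes[(hip, year)] = int((matrix[cols].sum(axis=1) > 0).sum())
    return volumes
```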

We were particularly inspired by the Kuh and O’Donnell (2013) report using disaggregated student data from the National Survey of Student Engagement (NSSE) to demonstrate percent participation in high-impact activities by institutional and student characteristics. We and our Clemson University colleagues were interested in producing a similar report. We used our Tableau workbook to sort our student HIP participation across multiple factors, demonstrated in Table 2. Row A shows overall participation by HIP in aggregate, although we did provide separate results for our first-time, first-year cohort and our transfer cohort at the senior level. The data in row B come from Clemson’s spring 2013 administration of NSSE to the 2012 first-time, first-year cohort (n=771, 23% response rate) and Clemson’s spring 2016 administration of NSSE to the student population during their senior year (n=1,040, 22% response rate). Discrepancies are apparent between our verifiable institutional data in row A and student self-reports via NSSE in row B; student self-reports are inflated. We included the national data from Kuh and O’Donnell’s (2013) report in row C as a comparison to row B, and we see that Clemson University students are self-reporting slightly to considerably higher engagement in HIPs than the self-reports of all NSSE students surveyed. There are likely a number of student engagement activities occurring that are not reported or reflected in institutional data, and there are also likely a number of places where students do not entirely understand how to connect their experiences to the NSSE survey items (discussed further in the Conclusions and Implications section below). For instance, institutional categorization of student-faculty research does not include sponsored research, whereas students may include that activity in their NSSE self-report. We identified internships and co-ops, but NSSE also includes activities such as field experiences and student teaching in the internship category. The NSSE question prompts give some examples of what is meant by each category, but the engaged learning portfolio at Clemson is not always a complete match, thus limiting direct correlations between the two data sources.

Table 2 Clemson University first-time, first-year 2012 entering student cohort (FY) and transfer 2013 entering student cohort (TR), percent participation in high-impact practices by aggregate, drawn from institutional research, course enrollment, and programmatic sources

We made the determination to report our first-time, first-year cohort engagements separately from our transfer cohort engagements. Approximately 30% of all Clemson students are transfer students, and nearly all of those students enter the institution with 30+ college credits. Effectively, the transfer students miss out on the support and intensive efforts directed at first-year students at a 4-year institution, and research shows that their participation in HIPs is often negatively impacted (Bonet & Walters, 2016; Center for Community College Student Engagement, 2014; Zilvinskis & Dumford, 2018). We disaggregated the transfer student cohort in the data to provide a more accurate view of their experiences in different HIP types.

Our primary purpose in creating Table 2 was to separate out our student demographics by gender, ethnicity, first-generation status, Pell Grant eligible status, and major. (“Pell Grant eligible” is a U.S.-specific item related to students’ financial need, as determined by the Free Application for Federal Student Aid, or FAFSA. Income limits vary year by year, but undergraduate students who are Pell Grant eligible typically have an estimated family contribution of less than $6,000 toward college expenses. For demographic purposes, they represent students with high financial need.) Our Tableau workbook contained our student demographics, but we did have to perform an additional analysis to translate our specific Clemson University majors into comparable national majors. Because NSSE (and, by extension, Kuh and O’Donnell) used CIP (Classification of Instructional Programs) codes for academic grouping, we did the same. CIP codes are commonly used by IPEDS (the Integrated Postsecondary Education Data System) to track fields of study. Some academic programs represented in IPEDS groupings are not in line with where they are housed at Clemson. For instance, Construction Science Management at Clemson is a department within our College of Architecture, Arts and Humanities, but its CIP code family is Business. Computer Science and Computer Information Systems majors at Clemson are in the College of Engineering, Computing and Applied Sciences, but the CIP code separates them into their own category. There are a few other examples where the CIP codes do not represent our own Clemson academic structure, but for the purposes of Table 2, using the groupings from a standardized national source was key.
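In practice, this translation amounts to a lookup table from local major codes to two-digit CIP families. A minimal sketch follows; the local major codes are hypothetical stand-ins, while the CIP families shown (52 = Business; 11 = Computer and Information Sciences; 14 = Engineering) are real two-digit codes:

```python
# Hypothetical crosswalk from local major codes to two-digit CIP families.
# A real mapping would cover every major; these entries only illustrate
# the mismatches discussed above.
MAJOR_TO_CIP_FAMILY = {
    "CSM":  "52",  # Construction Science Management -> Business
    "CPSC": "11",  # Computer Science -> Computer & Information Sciences
    "CIS":  "11",  # Computer Information Systems -> same CIP family
    "ME":   "14",  # Mechanical Engineering -> Engineering
}

def cip_family(major_code: str) -> str:
    """Return the two-digit CIP family for a local major code."""
    return MAJOR_TO_CIP_FAMILY.get(major_code, "unknown")
```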

In examining our demographic data, we see gender parity across HIP participation (rows D-E). The main difference is that male transfer students are participating in capstones at 17% higher rates than female students. A large portion of male transfer students at Clemson are graduating in engineering fields, and higher capstone participation is common in engineering (per the ABET requirements discussed in the Data Collection section). We also do not see large differences in HIP participation across student ethnicity (rows F-K). Although Hispanic students made up a small percentage of the student population examined, that share continues to grow. Improving avenues for Hispanic students to engage, particularly starting with first-year cohort programs and learning communities, is an important implication to consider. We also do not see many gaps in HIP participation with our first-generation or Pell Grant eligible students (rows L-O). Clemson has long had a program for first-generation college students to introduce them to the necessary skills, knowledge, and aptitudes for navigating college life. Additionally, we have a specific on-campus internship program, the University Professional Internship and Co-op Program (UPIC), that prioritizes students from underrepresented backgrounds and those with financial need (Clemson University Center for Career and Professional Development, n.d.). Not only do those students have a HIP internship experience, but their supervisors are also trained as mentors to advocate for and coach the students to successfully navigate the undergraduate academic experience at a large, public institution, a promising practice documented in the literature (Felten et al., 2016; McNair et al., 2016). Through both analyzing our data to create Table 2 and consulting with our stakeholders on desired data-supported outcomes, we see evidence that many institutional programs are working to equilibrate the undergraduate experience for everyone.

When we look at student HIP participation across majors (rows P-Y), a number of differences are evident. For instance, students in education majors do not study abroad at high frequency, typically due to the inflexibility of a curriculum that necessitates coursework in educational theory, specific disciplines, and practicum prior to licensure. Transfer students in the physical and social sciences participate in internships at rates 30% below those of first-time, first-year students. Likewise, transfer students across nearly every disciplinary category participate in study abroad and co-op at lower rates. The commonality among internships, study abroad, and co-op is that they require advance planning. When transfer students have fewer semesters at Clemson overall to complete their degree requirements, they also seem less likely to engage in these types of HIPs. Improved academic advising for transfer students is needed so that they are aware of opportunities such as internships, study abroad, and co-op and of how they can integrate participation into their academic trajectories.

We also see unexpected trends in research experiences for STEM fields and in capstones. In talking with several department chairs and curriculum coordinators, we found that uneven communication between departmental registration coordinators and the registrar’s office is a likely source of error. Common understanding of, and regular training on, HIP Banner codes, what they mean, when they should be applied, and when they need to be removed from courses are much needed.

In addition to Table 2, our Tableau workbook provides multiple avenues for further analyzing and making meaning from data. Table 3 describes the data fields and visualizations that are possible with our Tableau workbook dashboards. For example, the faculty in one growing engineering subfield were interested in further disaggregation of their students’ HIP engagement data to support an ongoing curricular revision. We were able to show that women students, students of underrepresented ethnicities, financially needy students, first-generation students, and transfer students engaged at lower rates across every HIP type. This was a powerful indicator to the department that its aim of revising the curriculum toward an equity focus and infusing HIPs into courses at the sophomore year and beyond was justified. Figure 2 shows another example of the results of our research: determining the footprint of HIP involvement across a college. In the Fig. 2 snapshot, we see the undergraduate majors in the College of Business. For many disciplines, high levels of internship placements are desired. The graphic communications major has the highest student involvement in internships, while students in management have lower engagement in internships. However, the trend is reversed when we look at capstones. Management students are the most likely to engage in a capstone experience, while students in graphic communications are the least likely. This type of visualization allows colleges and departments to analyze where their students are engaging and to determine departmental-level priorities for maintaining or accelerating student involvement in those areas. With the creation of the Tableau workbook, we have enabled our colleagues to develop and answer their own questions in just a few clicks.

Table 3 Contents of Tableau workbook by dashboard
Fig. 2 Visualization of capstones and internships in the College of Business

We also have the ability to analyze student involvement in multiple HIPs as a result of our research. In moving from a flat Excel sheet or a report like Table 2 to a Tableau workbook, we can now examine multiple, overlapping factors. For instance, we know that study abroad occurs at low rates in our transfer student population. For those who are both transfer students and first-generation college students, study abroad experiences are one-tenth as likely to occur as they are for students with transfer status alone. This determination has implications for fundraising and for helping the institution deploy resources where they can be of most benefit. In a different analysis, African American students who are first-generation college students are eight times more likely to participate in a Clemson cohort program than African American students who are not first-generation. This point helps us to know that the first-year cohort programs are aimed appropriately at the students who most need the attention. Having done the hard work of establishing the database, the possibilities are now endless for how we can continue to use it to improve the delivery of our academic and equity-focused mission.
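With one row per student and boolean flags for demographics and HIP participation, these intersectional comparisons reduce to filtering and rate ratios. A sketch under that assumption; all column names are hypothetical stand-ins for the workbook's fields:

```python
import math

def participation_rate(df, hip_col, **flags):
    """Share of students matching the given demographic flags who
    engaged in the HIP. `df` is the student-level table with the
    demographic flags merged in; column names are hypothetical.
    """
    group = df
    for col, value in flags.items():
        group = group[group[col] == value]
    return group[hip_col].mean() if len(group) else math.nan

# Given df, the student-level DataFrame described above:
# transfer-only vs. transfer + first-generation students.
transfer_only = participation_rate(df, "study_abroad", transfer=1, first_gen=0)
both = participation_rate(df, "study_abroad", transfer=1, first_gen=1)
ratio = transfer_only / both  # the text reports roughly a tenfold gap
```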

Conclusions and Implications for Continuous and Equitable Improvement

We entered into this work with the need to address the key questions delineated in the Stakeholder Focus Group section above. The project was successful in providing answers and insights to those questions, along with a high-quality data set to guide multiple stakeholders with new and emerging questions. High-impact educational practices can be complicated and overlapping, but projects such as this help institutions move from scattered snapshots toward strategies.

Quality control of institutional data is a major issue, the “where’s Waldo?” component, as we referred to it. Part of what makes high-impact practices so high-impact is that they frequently involve mentoring and close relationships. The student and faculty/staff/mentor participants in those relationships do not always disclose their activities to the keepers of an institution’s data ecosystem. It is likely that, even with broad and careful data collection methods such as those described herein, a number of engagements go unreported, which is potentially one of the reasons for the discrepancies between student-reported data and institutional data seen in Table 2, rows A and B.

It is important to have “gatekeeper champions” for HIPs at an institutional level: those who advocate for the work, but also pass only certain activities through the quality control gate. Having well-resourced centers and low turnover among personnel ensures better quality and transparency for HIPs, as concrete changes cannot be built on a foundation of shaky data. Just as there is a mandate for equity within the student subpopulations served, there is a need for equity in supporting different types of HIPs, as priorities differ among departments and institutional missions. These findings also have implications for non-U.S.-based institutions. HIPs are more common in North American institutions, but as others around the globe seek to improve student engagement and agency, the role of the centralized gatekeeper champion unit should be considered. Future research could allow for collaborations across countries to determine where higher education HIPs intersect, where they are not easily implementable, and where they prove less impactful.

An institution-wide mapping of student engagement supports a number of potential uses across institutional priorities. It is important to have data to support continuing resource allocation and reallocation, as well as to support garnering new resources through grants and fundraising. For instance, our data analysis supported the need for improved internships and training for food science students, resulting in $487,000 from the USDA to fund student internships and enable curriculum revision within the food science academic program. The dean of one of our colleges is using the data on underrepresented student engagement in study abroad when she talks with potential donors to endow a scholarship program. Using data to find and address gaps supports institutional strategic planning, including the more recently popularized, and essential, area of diversity strategic planning. Shared accountability propagates real and lasting changes in the student experience.

Improvement of HIPs within degree programs is advantageous. Because we looked at student engagement over time and disaggregated by major (Table 2, rows P-Y), we see places for change. The spring of the junior year is a prime time for study abroad, whereas the spring of the senior year is the most popular semester for internships. Departments seeking to encourage these trends may wish to look to their curriculum maps to allow increased flexibility during those semesters. Alternatively, other departments may wish to intentionally alter their curricular sequences and/or pedagogical choices to enable students to participate in HIPs earlier in their academic trajectories. Since education students do not study abroad at high rates, could education faculty consider incorporating international virtual exchange? Since students in some physical science majors are not performing student-faculty research, could they incorporate course-based undergraduate research experiences into their 1000- and 2000-level foundational courses? Transfer students do not participate in HIPs when advance planning is needed (study abroad, internships, co-op), but they do participate when HIPs are part of the curriculum (capstones, research). Making courses more HIP-like or improving conversations with transfer students about HIP opportunities could help them engage at higher rates.

Several Clemson-specific initiatives have begun as an outcome of the project. Pell Grant eligible students are participating in internships at high rates, likely because of the aforementioned on-campus, mentoring-intensive UPIC program that prioritizes students with financial need or with underrepresented identities. Nationally, financially needy students do not fully pursue internships, especially unpaid internships, because they need a wage-earning job while in school. Developing more intentional internship opportunities like UPIC could be a promising practice for other institutions. Likewise, peer tutoring and supplemental instruction programs are evolving to incorporate mentoring and reflection for the student leaders, and researchers anticipate that these practices will become more HIP-like as the trend continues (Kuh et al., 2020). Despite a long period without a first-year seminar program, Clemson undergraduate education is now turning a corner, piloting first-year seminars in our Honors College and within our general education curriculum, supported by the Teagle Foundation and the National Endowment for the Humanities (Teagle Foundation, 2021). Leaders of HIP-embedded initiatives are empowered when they have a clear and transparent source of data upon which to build their efforts.

A final, but essential, point from the research study relates to the approach undertaken. One cannot just acquire data, write a report, and put it on the shelf. To make meaning from the data – and especially to use the data in driving equity-minded change – you have to understand your institution and the way that the people who make up the institution deliver its mission. You must understand your colleagues and how they may have a different understanding or vision of priorities. The intellectual diversity of the international higher educational ecosystem is one of its strengths. The goal of a mapping project is not to increase student involvement in every single area in the shortest amount of time possible. Rather, the data enable the conversations, and the conversations enable the people to make changes over time to shift institutional culture and impact student learning for the better.