Introduction

The number of graduate school applications in Biomedical Engineering (BME) is rising steadily every year,29,41 making competition stiffer and application reviews more resource intensive. Admissions committees review hundreds of applications each year to identify talented students who will not only be academically successful but whose interests also match specific research areas or labs within the program.41,21,24 This is no small task given that application packages are lengthy and contain a broad range of materials, including but not limited to academic transcripts, standardized test scores, personal statements, recommendation letters, and research interest descriptions. Evaluation methods for each of these elements vary widely. For example, personal statements and recommendation letters require qualitative review involving some level of subjective reviewer judgment,39,38,2,34 whereas quantitative metrics such as grade point average (GPA), Graduate Record Examination (GRE) scores, and Test of English as a Foreign Language (TOEFL) scores39,2,9,10 may be reviewed against minimum institutional requirements. It is well-established that screening and selecting graduate applicants based solely on academic metric thresholds (e.g., GRE and GPA cutoffs) does not guarantee graduate student performance and success in the program.36,32,31 In addition, recent evidence has highlighted that such practices generate selection bias14,27,33 against marginalized and underrepresented minority (URM) groups in science, technology, engineering and mathematics (STEM) disciplines9,18 and may also contribute to privileging the selection of applicants with backgrounds similar to those of the selection committee members.34,19,20

To address these challenges, graduate admissions committees across the US have begun to employ holistic review methods.39,34,16,7,40,30,1,17 Holistic review aims to assess each applicant’s capabilities in a flexible, individualized way, giving balanced consideration to prior experiences, personal attributes, academic performance metrics, and how each individual might contribute value to the target program.16 This method was designed to make the selection process more evidence-based and to incorporate diversity as a key element of student selection into academic programs.7,27,11,18 It has also increased the transparency of application reviews for prospective students at programs that publish their holistic review frameworks on their websites. However, implementing holistic review processes is a non-trivial task and continues to be a work in progress.3 Some holistic review models and rubrics recommend tiered application reviews, with more thorough examination of applications that have lower or borderline standardized test scores, GPA, or other academic metrics. Several holistic review frameworks and rubrics focus on four criteria, namely: (1) Academic Performance, (2) Scholarly/Research Potential, (3) Commitment, Persistence and Leadership, and (4) Life Experiences and Background.30 The implementation and weighting of these criteria to ensure fair, consistent review of applications is extremely challenging and a topic of intense debate. Furthermore, these rubrics do not always replace the traditional selection processes, in which faculty or the institution’s graduate school at large make initial triage decisions using high achievement thresholds that may undermine diversity and reinforce reliance on GPA or standardized test scores such as the GRE.39,16 In fact, there remains some concern that the holistic review process runs the risk of selection bias and leaves gaps between ideal use and actual practice.5 Nevertheless, there is broad agreement that sole reliance on historical metrics (e.g., GPA and GRE) is not adequate for graduate admission reviews.21,36,32,31 There is an urgent need for rigorous consideration of each applicant’s unique experiences, background, and non-metric-based factors to ensure fair, just, and equitable review of prospective students.

In this paper, we explore the value of historical metrics of academic performance, such as GPA, and of scholarly research potential in relation to graduate admissions in BME programs. We share data from a three-year retrospective review of our own institution’s graduate BME applications and admission rates and examine the roles of GPA, standardized test scores (e.g., GRE), and prior research experience in graduate school admission. We also examine how the waiving of GRE requirements has changed the landscape of BME graduate applications in recent years. Finally, we discuss efforts taken by our institution and others to develop and implement holistic reviews of graduate applications that encourage students from underrepresented backgrounds to apply and successfully gain admission to graduate school.

Historical Metrics of Academic Performance

Grade Point Average (GPA)

GPA is an average obtained by dividing the total number of grade points earned by the total number of credits taken. It appears on every student’s transcript, college application, and graduate/professional school application. It is one of the oldest metrics of academic achievement, didactic knowledge, and student performance. Typically reported on a 0.0–4.0 scale, the coveted 4.0 GPA has become synonymous with “academic excellence” and “perfection.” Yet it has often been asked, “Does GPA matter?” and “Are good grades and a high GPA statistically shown to correlate with career success?”35 The answer is not straightforward.
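To make the calculation above concrete, the short sketch below computes a GPA from a hypothetical transcript. The course names, grade-to-point mapping, and credit values are illustrative assumptions only, not data from any program.

```python
# Minimal GPA calculation sketch (hypothetical transcript; 4.0 scale assumed).
GRADE_POINTS = {"A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0, "B-": 2.7, "C+": 2.3, "C": 2.0}

# Each entry: (course, letter grade, credit hours); illustrative values only.
transcript = [
    ("Intro to Biomechanics", "A", 3),
    ("Organic Chemistry", "B+", 4),
    ("Signals and Systems", "A", 3),
    ("Technical Writing", "B", 2),
]

# Grade points earned = grade value x credits; GPA = total grade points / total credits.
total_points = sum(GRADE_POINTS[grade] * credits for _, grade, credits in transcript)
total_credits = sum(credits for _, _, credits in transcript)
print(f"GPA = {total_points / total_credits:.2f}")  # GPA = 3.60 for this hypothetical transcript
```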

An “A” grade (or a “high” GPA, e.g., > 3.5) is generally a good measure of a student’s mastery of the coursework taken. For some students, the GPA may also reflect a composite of their grades and personality traits. For example, students who are extrinsically motivated by grades or other rewards, diligent in turning in homework assignments, and have the self-discipline and perseverance to study tend to have higher GPAs.8 Conversely, students who are not motivated to turn in assignments or study for exams, or who have life circumstances affecting their course performance, may end up with low GPAs; but does this mean they do not know the subject matter well? Or that they will not succeed in a graduate program aligned with their intrinsic interests? Not necessarily, as evidenced by Steve Jobs graduating high school with a 2.65 GPA or Bill Gates dropping out of college. For decades, psychologists and educators have noted the divergence between academic performance and outcomes in creative and entrepreneurial careers. Researchers have found an inverse relationship between GPA and creativity/innovation; that is, as GPA decreased, innovation tended to increase.25

Decades of research substantiate that GPA hardly tells the whole story of a person’s academic performance. GPA does not reflect who students are outside of an academic environment. It does not measure a student’s leadership or comfort with risk and uncertainty, nor is it an indicator of emotional intelligence or interpersonal skills such as networking. All of these traits have been shown to be essential for business/financial and career success.35 Despite this, GPAs continue to be the gatekeepers for many opportunities, including graduate school, medical school, internships, scholarships, and job applications. A quick search of top BME graduate programs shows that most schools adopt a “minimum GPA” ranging between 3.0 and 3.5 for graduate admissions. In fact, our own BME program and institutional guidelines indicate a preference for admitting students with a minimum GPA of 3.5 and a TOEFL score of 100. Based on a three-year retrospective review of our BME program’s graduate admissions, consisting of 628 domestic and international applications from 2016 to 2018, we observed that the average GPA of applicants was 3.5 ± 0.4 (Table 1). This suggests that prospective students with low undergraduate GPAs (i.e., < 3.0) are less likely to apply. Furthermore, those individuals with GPAs < 3.0 who do apply to our institution are rarely considered for graduate program admission because institutional policies require a minimum GPA of 3.0 for graduate BME admission.

Table 1 Summary of applicant demographics and academic record.

To determine the predictive power of GPA scores on graduate school acceptance, financial and stipend offers, and subsequent enrollment in our BME graduate program, we performed a logistic regression analysis on our 2016–2018 cohort. Our analysis revealed that undergraduate GPA explained a relatively large portion of the variability in graduate admissions, financial offers, and enrollment, but it was not the strongest predictor. The strongest and most consistent predictor variable from our graduate application was “How the applicants heard about the graduate program.” The answer to this question was categorized as Online, Conference, Department Research, Faculty, Internship-REU, Peer, or Student-Alumni; responses not fitting any of these categories were classified as “Other.” Using “Online” as the referent group in our statistical analyses, how the students heard about the program was strongly associated not only with graduate admission but also with receiving a financial offer and ultimately enrolling in our BME program (Figure 1). Interestingly, prior research experience (coded as a binary yes/no) was not strongly associated with graduate program acceptance at our institution. However, applicants who had participated in an REU program did have higher odds of being accepted to our institution, as we discuss in the next section. These findings underscore the value of networking, interpersonal skills, and REU program participation.
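For readers interested in reproducing this type of analysis on their own admissions data, the sketch below fits a logistic regression of acceptance on applicant metrics and computes a Nagelkerke R2 from the fitted and null log-likelihoods. The file name, column names, and use of statsmodels are illustrative assumptions; a full description of our own statistical methods is provided in the supplement.

```python
# Sketch: logistic regression of admission outcome on applicant metrics,
# with Nagelkerke R^2 computed from the fitted and null log-likelihoods.
# The file and column names (gpa, gre_aw, how_heard, admitted) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("applications.csv")  # hypothetical table: one row per applicant

# Dummy-code the categorical "how did you hear about the program" variable,
# dropping the "Online" level so it serves as the referent group.
how_heard = pd.get_dummies(df["how_heard"], prefix="heard", dtype=float)
how_heard = how_heard.drop(columns=["heard_Online"])

X = sm.add_constant(pd.concat([df[["gpa", "gre_aw"]], how_heard], axis=1))
y = df["admitted"]  # 1 = accepted, 0 = not accepted

model = sm.Logit(y, X).fit(disp=False)
print(np.exp(model.params))  # odds ratios, e.g., exp(beta) per GRE AW point

# Nagelkerke R^2: Cox-Snell R^2 rescaled to a maximum of 1.
n = len(y)
cox_snell = 1 - np.exp((2.0 / n) * (model.llnull - model.llf))
nagelkerke = cox_snell / (1 - np.exp((2.0 / n) * model.llnull))
print("Nagelkerke R^2 =", round(nagelkerke, 2))
```

The same model can be refit with a financial-offer or enrollment indicator as the outcome to mirror the three outcomes reported in Figure 1.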

Figure 1

Illustration of associations between applicant metrics and graduate BME program acceptance, financial offer, and enrollment. Bars represent the three outcomes: graduate program acceptance (black), program acceptance with a financial offer (gray), and enrollment in the BME graduate program (gray striped). Asterisks mark the strongest Nagelkerke R2 estimates that were statistically significant (p < 0.05). “How hear” refers to the categorical variable describing how applicants heard about the program, categorized as Online (referent group), Conference, Department Research, Faculty, Internship-REU, Peer, Student-Alumni, and Other. How students heard about the program returned Nagelkerke R2 values of 0.40 (p < 0.0001), 0.37 (p = 0.0002), and 0.37 (p = 0.007) for admissions, financial offers, and enrollment, respectively. Hence, nearly 40% of the variation in program acceptance is explained by how the students heard about the program, making this variable the strongest predictor of graduate admissions (stronger than GPA and standardized GRE scores). Prior research experience coded as a binary variable (i.e., yes/no) yielded a low Nagelkerke R2, presumably because > 90% of our applicant pool indicated that they had prior research experience. REU-type experiences, in contrast, were strongly associated with graduate admissions. A full description of our statistical methods is provided in the supplement.

So how do we break the reliance on, or reduce the emphasis placed on, GPA? Some non-quantitative alternatives to GPA have been introduced by select colleges. For example, Brown University does not calculate GPAs and only reports letter grades on its transcripts. Brown University’s College Curriculum Council6 states that “Employers as well as graduate and professional schools seek Brown graduates for their analytical ability, independence, creativity, communication, and leadership skills, qualities not necessarily reflected in a GPA.”6 Other universities have adopted Pass/Fail grading, where students who achieve and demonstrate core competencies in each class earn a passing grade. This type of grading became more prevalent during the COVID-19 pandemic, but unless all of a student’s classes are graded Pass/Fail, the mix can skew the GPA calculation and give a distorted view of the applicant’s academic performance. Furthermore, there are concerns about grade inflation and the unequal weighting of GPAs from prestigious top-tier institutions or Ivy League schools compared with lower-tier academic institutions; this is a potential source of bias confounding the effect of GPA on graduate admissions. Graduate BME programs like ours have tackled this issue by forming a graduate admissions committee that individually reviews any applicant who has a low undergraduate GPA but stellar letters of recommendation and/or research or work experience. This process has helped to support students who do not meet the traditional metrics of academic performance as measured by GPA alone and has become part of our holistic review process in recent years.

Standardized Tests: GRE or GRExit?

The GRE is the second metric historically used to evaluate prospective graduate applicants. While the exam has evolved over time, it currently has three scored sections: Quantitative Reasoning, Verbal Reasoning, and Analytical Writing. For most engineering programs, including BME, there has been an expectation for students to score well, particularly in the Quantitative section. However, there is very little evidence of a strong relationship between GRE scores and graduate student success.19 In the early 1990s, systematic analyses and reviews were first conducted by national agencies to determine the relationship between the GRE and graduate student acceptance and performance.39,38,9 These studies and others revealed that the GRE was not an indicator of success, was biased toward physical science majors, and selected against socioeconomically disadvantaged populations. Further, there were major disparities in GRE scores across gender and race; women and URMs often had lower GRE scores compared with white males.21,39,36,32,9,20,12,26,28 However, it was not until the early 2000s (nearly a decade later) that universities began to adopt “GRE optional” policies for graduate school admissions.19,23,22 In a study conducted by faculty at Vanderbilt University, the authors failed to detect a correlation between GRE scores and graduate student success or performance.28 In our own retrospective review from 2016 to 2018, we found that GRE scores were not strong predictors of graduate BME admissions, financial offers, or enrollment at our institution (p = 0.2, Table 2). Only the Analytical Writing score explained a portion of the variability in graduate program acceptance, financial offer, and enrollment (Nagelkerke R2 values of 0.22, 0.17, and 0.12, respectively, from Figure 1, and odds ratio of 1.96 from Table 2). This was not the case for the Quantitative and Verbal Reasoning scores. Aggregate GRE scores (i.e., a composite of the Analytical Writing, Quantitative Reasoning, and Verbal Reasoning scores) explained less than 24% of the variability. In terms of prediction, the GRE Analytical Writing score (range 0–6) was modestly predictive of graduate program acceptance, with the odds of acceptance roughly doubling for each one-point increase in the score (odds ratio = 1.96, p = 0.0006, Table 2). Counter to our intuition, higher Quantitative Reasoning scores significantly predicted lower odds of actual enrollment in our program, potentially suggesting that these top-performing students hold multiple acceptance offers from various universities and may choose to enroll elsewhere.
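To make the “doubling odds” interpretation concrete, the short sketch below converts the reported odds ratio into predicted acceptance probabilities for a hypothetical applicant. The baseline odds value is an illustrative assumption, not a quantity from our model.

```python
# Interpreting an odds ratio of ~1.96 per point of GRE Analytical Writing.
# baseline_odds is a hypothetical starting value chosen only for illustration.
odds_ratio = 1.96    # from Table 2 (per one-point increase in GRE AW)
baseline_odds = 0.5  # hypothetical odds of acceptance at a reference AW score

for extra_points in range(4):
    odds = baseline_odds * odds_ratio ** extra_points
    prob = odds / (1 + odds)  # convert odds back to a probability
    print(f"+{extra_points} AW point(s): odds = {odds:.2f}, P(accept) = {prob:.2f}")

# Each additional point roughly doubles the odds, although the gain in
# probability shrinks as the probability approaches 1.
```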

Table 2 Logistic Regressions for predicting graduate acceptance.

Despite these findings, the GRE requirement was not fully removed by institutions until many years later. Between 2017 and 2019, there was growing opposition to the GRE related to the disproportionate scores and biases affecting URM populations and students of low socioeconomic status.32 However, it was not until 2020, during the global COVID-19 pandemic, that action for change was adopted by hundreds of universities. The inability of students to schedule and take the GRE during the pandemic was the final nudge needed for institutions across the US to make GRE test scores optional or not required at all, a trend that has been dubbed “GRExit.”19 According to King et al., more than 300 departments and programs in the biomedical sciences had adopted GRExit as of 2020. As a result of this trend, graduate programs have been forced to take a closer look at non-quantitative metrics when evaluating student potential. Furthermore, removing the GRE requirement has alleviated the financial burden, stress, and anxiety associated with taking a standardized exam. In many ways, the removal of the GRE has made applying to graduate school more accessible to a broader and more diverse student population.

Non-quantitative Applicant Metrics: A Step Towards Holistic Reviews

Prior Research and Work Experience: Assessing Research and Scholarly Potential

Prior research experience has undoubtedly become one of the most important factors in assessing prospective graduate applicants. Graduate training at both the MS and PhD levels (in thesis-based programs) requires students to pursue an independent research project. Therefore, students who have prior experience with methodologies, techniques, or skills directly aligned with a faculty mentor’s research are particularly attractive and gain a competitive edge over their peers. In our own retrospective review of graduate admissions from 2016 to 2018, nearly 95% of applicants reported some form of prior research and/or relevant work experience. But are all research experiences equal? The short answer is no.

Most schools and BME programs have a strong preference for hands-on, immersive research experiences and/or internships in which undergraduate students can showcase their creativity in problem solving, their aptitude for teamwork and leadership, and their ability to disseminate and communicate scientific work. For example, Research Experiences for Undergraduates (REU) programs funded by the National Science Foundation (NSF) have been one of the preferred modes of research experience, because students engage in meaningful research and gain valuable professional development skills that translate readily to graduate school. Further, students who have a positive REU experience are more likely to apply to graduate programs.

Data from our 2016–2018 cohort illustrate this point. Considering prior research experience simply as a binary variable (i.e., yes/no) yielded low correlations with BME admissions, financial offers, and enrollment (Nagelkerke R2 estimates < 0.2, Figure 1). However, applicants who had participated in an REU summer program at our institution (at either campus location) had a very high rate of success (~ 50%) in receiving an acceptance letter. Furthermore, compared with students who reported hearing about our BME graduate program online, former REU students had nearly four times the odds (odds ratio = 3.69, p = 0.005) of being accepted, receiving a financial offer, and enrolling in the graduate BME program, after adjusting for GPA and GRE scores (Table 2). These findings reveal the power not only of “REU-like” experiences but also of personal connections with faculty prior to joining a lab as a graduate student.

To some, these statistics may be worrisome because they suggest an unfair advantage for those who have access to REUs. While REUs are one pipeline program for URMs and other disadvantaged populations, they are not the only way for students to participate in meaningful research; we discuss this concern further in the lessons section below. We also recognize that our BME program allows faculty mentors to directly recruit students into their labs based on research funding. Unlike other BME programs that may have formal rotations, our program directly matches students to a faculty mentor, so our findings may not be generalizable to all BME graduate programs. Still, there are other means to assess research and scholarly potential, including recommendation letters, number of publications, awards and honors earned, skill sets, and mastery of techniques. Importantly, academic research experience should not trump other valuable work experience. As holistic reviews become more prevalent, there is a need to better quantify and reliably assess each candidate’s research and scholarly potential.

Commitment, Persistence and Leadership

Traditionally, demonstration of leadership or commitment would come from the applicant’s CV and their list of prior involvement with student organizations and/or community outreach programs. This section has evolved, however: applicants can now share personal hardships or obstacles they have encountered and explain how they overcame them. Moreover, there is a growing appreciation for evaluating the individual and their personality traits, because the pursuit of graduate education is long and arduous, and successful students need strong commitment, perseverance, and leadership. In fact, personality has been shown to be one of the most important predictors of career success.35,4 Therefore, offering applicants space to voice their personal stories provides a glimpse into their personal attributes, or intrinsic self, and can be helpful for recruiting diverse graduate students. We believe it can also aid in their successful retention in graduate school by matching them to appropriate faculty mentors.

Quantifying the Unquantifiable: Measuring Up One’s Life Experiences, Diversity, and Background

Diversity and inclusion in academic research and engineering environments are of utmost importance. There is growing evidence supporting the value of diverse, multidisciplinary teams, and this evidence should inform graduate admissions. Formative life experiences and one’s ethnic/cultural background have emerged as important non-quantitative factors for holistic review. But how does one assess the “unquantifiable”?

Most universities are adopting personal narrative prompts that allow applicants to share how their diverse backgrounds bring value to the academic and research environment. Even referees are encouraged to speak to the applicant’s contributions to diversity and the community at large. However, these letters and narratives are subject to individual interpretation, and there is no single way of quantifying one’s background. Other indicators of diversity may include attendance at Historically Black Colleges and Universities (HBCUs), tribal schools, or community colleges, or the number of languages spoken. Undoubtedly, this section is one of the hardest to assess.

Given that historically marginalized groups and URMs often require additional financial support, should the approach be to reserve graduate school spots and funding specifically for disadvantaged and URM students? While the exact solution remains unknown, several institutions and graduate programs have taken steps toward bridging the gap for URM students. For example, the Virginia Tech Initiative for Maximizing Student Development (IMSD) is a training program (https://imsd.apsc.vt.edu/) designed to increase the number of minorities earning a PhD in the biomedical and behavioral sciences and engineering. This program, and several others across the US, aims not only to support URM PhD students financially but also to provide tailored mentorship and training to ensure the retention of diverse scientists and biomedical engineers.

From Retrospective Review to Action: Next Steps

Lesson 1: What Does Your Applicant Pool Look Like (and Why)?

Performing the retrospective review highlighted key gaps in our applicant pool, which we describe in this section. Over the 2016–2018 review period, we received 628 graduate school applications, roughly evenly distributed across the three years; 201 applicants (32%) were admitted into the program, 146 (23.2%) were offered a financial stipend, and 102 (16.2%) ultimately accepted the offer and enrolled in the graduate BME program (Figure 2). On average, we accepted 27.6% of MS applicants and 33.6% of PhD applicants. Half of the students who received acceptance letters enrolled in our graduate BME program.

Figure 2

Number of graduate applicants and acceptance rates between 2016 and 2020. We performed a retrospective review of our graduate admissions from 2016 to 2018 and then, at the end of 2018, implemented changes to our recruitment methods, waived the GRE requirement, and adopted more holistic reviews. We have observed significant improvements in the recruitment of URMs between 2019 and 2021: we offered acceptance letters to 9 URM students (28%) in 2019, 16 (48%) in 2020, and 24 (40%) in 2021.

The demographics and characteristics of our graduate applicants are summarized in Table 1. Between 2016 and 2018, nearly half of the applicants (44.1%) were female. Applicants were predominantly Caucasian (55.3%), with Asian applicants (32.5%) being the second most common ethnic group. The majority of applicants were US citizens (61.8%), although there was a substantial pool of international applicants (n = 229) and a few with permanent residency status (i.e., Green Card, n = 11). Almost all applicants (93.5%) reported having some form of prior research experience in BME. Quantitative admission metrics were stable across all three years, with averages (mean ± standard deviation) of 3.5 ± 0.4 for GPA, 102.6 ± 8.4 for TOEFL, 4.0 ± 0.7 for GRE Analytical Writing, 161.2 ± 5.5 for GRE Quantitative Reasoning, and 155.3 ± 6.5 for GRE Verbal Reasoning. These standardized test scores and GPAs illustrate the high caliber of the applicant pool and the stiff competition students face.

What we learned from conducting a thorough retrospective review was the scarcity of URM applicants to our program. Less than 5% of our applicants (pooled over three years) were African American and less than 10% were Hispanic. These statistics were alarming and prompted our department to make a more concerted effort to engage with URM populations in 2018. We began to offer application fee waivers to URM students and did more outreach at targeted conferences, including the Annual Biomedical Research Conference for Minority Students (ABRCMS), the annual National Society of Black Engineers (NSBE) convention, and regional meetings of the Society of Hispanic Professional Engineers (SHPE). Through these efforts, we have significantly improved our applicant pool demographics. By 2021, we had tripled the number of African American applications, and we currently have ~ 18% URM students in our graduate BME class (Table 3). In addition, we learned that we did not have a gender disparity: nearly half of our applicant pool was female, and we maintain 40–50% female representation in our graduate student population. By performing a deep retrospective review, we were able to objectively identify areas of weakness and develop targeted strategies for improvement. We recognize that each institution may face its own unique challenges, and therefore we strongly recommend that all graduate programs conduct periodic quantitative reviews of their admissions data.
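For programs that want to conduct this kind of periodic review, the sketch below computes a simple admissions funnel and demographic breakdown from an applications table. The file name and column names (year, ethnicity, gender, admitted, funded, enrolled) are hypothetical placeholders for however a program stores its records.

```python
# Sketch of a periodic admissions review: funnel counts and rates by group.
import pandas as pd

df = pd.read_csv("applications.csv")  # hypothetical table: one row per applicant

# Overall funnel: applications -> admits -> funded offers -> enrollments.
funnel = {
    "applications": len(df),
    "admitted": int(df["admitted"].sum()),
    "funded": int(df["funded"].sum()),
    "enrolled": int(df["enrolled"].sum()),
}
print(funnel)

# Admission rates by self-reported ethnicity and application year, to surface
# gaps (e.g., a scarcity of URM applicants) and track them over time.
by_group = (
    df.groupby(["year", "ethnicity"])
      .agg(applicants=("admitted", "size"), admit_rate=("admitted", "mean"))
      .round(3)
)
print(by_group)

# Gender balance among enrolled students.
print(df.groupby("gender")["enrolled"].mean().round(3))
```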

Table 3 Number of total graduate students and URMs enrolled in our BME program.

Lesson 2: Engage the Faculty and Overcome Implicit Bias

When we first conducted our retrospective review, some faculty in our department were skeptical of waiving the GRE requirement. Despite overwhelming data published by others, it was not until we showed that the GRE was not predictive within our own graduate cohort that there was more willingness to change the admission criteria. This is a common human characteristic and something we cannot ignore. The periodic reviews of admissions data described in Lesson 1 can help, but we also cannot ignore our own biases that may affect how we review prospective applicants.

As we move toward a more diverse, equitable, and inclusive environment in graduate school, we need to become aware of our own inherent biases, first as individuals and then collectively as a department, program, or institution. One of the best ways to address this is to engage in “inclusive mentorship” and DEI seminars that discuss how to overcome (or at least become aware of) one’s own implicit bias. Our department, like many others, now offers such resources and requires completion of workshops and/or mini-courses (e.g., those from the Center for the Improvement of Mentored Experiences in Research, CIMER). Even funding agencies, including the NIH and NSF, are requiring faculty participating in training grants (e.g., NIH T32s and NSF REUs) to undergo formal, regular training in specialized mentorship of URMs at all levels of education. Second, within the context of BME graduate admission committees, we strongly encourage every program to perform a retrospective review of its own data and practices. We found this process quite informative regarding which factors were strong predictors of admission at our institution, and we recommend that BME graduate admissions committees review their admissions data every 3–4 years. These reviews are useful not only for accreditation but also to inform changes to admission procedures that may be unique to one’s program or geographic region.

Lesson 3: Writing Skills Matter

One of our noteworthy findings related to the GRE was the influence of Analytical Writing scores (Table 2), which have rarely been considered in prior analyses. Specifically, our results indicated that strength in writing is an important factor in graduate student selection, with each one-point increase in the GRE Analytical Writing score nearly doubling the odds of acceptance. While this finding is new to the literature, it is intuitive: effective oral and written communication skills are essential to the success of any graduate student, and our data suggest this is a factor that faculty and our program weigh heavily in selecting students to admit. Perhaps future standardized testing could be tailored around the writing section of the GRE alone. Alternatively, programs could objectively assess the writing quality of applicants’ personal statements (or answers to short questions). We believe that evaluating writing quality could help identify applicants who may not necessarily have the highest academic scores. One approach to this type of evaluation is the use of natural language processing (NLP) methods and informatics-based tools to gain an objective assessment of writing samples provided by the prospective applicant.
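As one possible way to operationalize such an NLP-based assessment, the sketch below fits a simple model that relates TF-IDF text features of personal statements to reviewer-assigned writing scores and checks it by cross-validation. The data file, column names, and choice of model are illustrative assumptions, not a validated instrument; a tool like this would at most flag statements for closer human review.

```python
# Sketch: relating personal-statement text to reviewer writing scores with NLP.
# The CSV and its columns (statement_text, reviewer_writing_score) are hypothetical.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

df = pd.read_csv("personal_statements.csv")
texts = df["statement_text"]
scores = df["reviewer_writing_score"]  # e.g., an existing rubric score on a 0-6 scale

# TF-IDF over word unigrams and bigrams feeding a regularized linear model.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2, stop_words="english"),
    Ridge(alpha=1.0),
)

# Cross-validated agreement with human reviewer scores.
cv_r2 = cross_val_score(model, texts, scores, cv=5, scoring="r2")
print("Cross-validated R^2:", round(cv_r2.mean(), 2))
```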

Lesson 4: Prior Research Experience and Importance of BME Tracks

In performing our retrospective review, we found that not all prior research experience is evaluated equally. Nearly 95% of our applicants reported having some form of prior research experience, but students who had participated in an REU (or another hands-on, immersive program) had significantly higher odds of being accepted into graduate school compared with those who simply applied online. What this means is that undergraduate students who have access to research labs are more likely to succeed in getting into graduate school. This may be worrisome for URMs and students from lower socioeconomic backgrounds, who often have limited (and sometimes no) access to research programs. While the NSF REU programs are indeed a pipeline for URMs and disadvantaged individuals, they are not enough. Furthermore, if the expectation in the holistic review process is to weigh students who have already excelled in prior research more heavily, we are again creating a hidden bias against URMs. If we know that URMs are not getting the same opportunities as their Caucasian counterparts, then we cannot expect them to have the same levels of research exposure. There is a need to weigh prior work and research experience fairly. Strategies that aim to identify skills and competencies may provide a more objective method for assessing scholarly and research potential.

Additionally, we found defining competencies for prospective BME students to be very research-track specific. For example, assessing an applicant’s research potential for a biomechanics-based project is very different from assessing it for a cell culture-based project, yet current holistic review rubrics do not differentiate between these skill sets. Moving forward, we think it would be useful to generate BME track-specific competencies as a potential alternative to general graduate school competencies (e.g., communication, organization, and time-management skills). For example, tissue engineering and biomaterials labs may prefer students with prior experience in 3D printing, cell culture, and biological assays, whereas transportation safety labs may prefer students with prior experience in automotive safety, motion capture, and mechanical engineering; a simple encoding of this idea is sketched below. By identifying these track-specific competencies, we can also better inform prospective students about which skill sets are most helpful for succeeding in graduate school. This could also lead to the generation of new pipelines for historically underrepresented groups to pursue BME-related careers.
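The sketch below shows one minimal way such track-specific competencies could be encoded and used to tally an applicant’s reported skills. The track names, competency lists, and matching logic are illustrative assumptions, not our program’s rubric.

```python
# Hypothetical mapping of BME research tracks to track-specific competencies.
TRACK_COMPETENCIES = {
    "tissue_engineering_biomaterials": {"3d printing", "cell culture", "biological assays"},
    "transportation_safety": {"automotive safety", "motion capture", "mechanical engineering"},
}

def competency_coverage(applicant_skills, track):
    """Return the fraction of a track's competencies reported by the applicant, and which ones."""
    wanted = TRACK_COMPETENCIES[track]
    matched = wanted & {skill.lower() for skill in applicant_skills}
    return len(matched) / len(wanted), sorted(matched)

# Example: a hypothetical applicant applying to a tissue engineering lab.
skills = ["Cell culture", "ELISA", "3D printing"]
frac, matched = competency_coverage(skills, "tissue_engineering_biomaterials")
print(f"Coverage: {frac:.0%}, matched: {matched}")  # Coverage: 67%, matched: ['3d printing', 'cell culture']
```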

Lesson 5: Domestic vs. International Applicants

Within the context of diversity, equity, and inclusion, it is important to acknowledge that most current holistic review rubrics are developed with the domestic applicant in mind. International applicants face a number of unique challenges, including differential financial costs and visa obligations that complicate the graduate application review process. Indeed, from an admissions committee perspective, it is more challenging to objectively assess the quality of students from international schools because of differences in grading, curriculum, and training. International applicants also often cannot visit campus or attend on-site interviews because of travel and/or visa restrictions, and budgetary constraints may preclude programs from hosting them. During the COVID-19 pandemic, when travel was banned (or restricted) for most individuals, several graduate programs began to develop online content and virtual tours to attract students. Our graduate program now requires virtual interviews, irrespective of whether a candidate is domestic or international, before extending admission and financial offers to applicants. These online/virtual forums have significantly improved the ability of international students to interact with faculty and program directors and have made the graduate recruitment process more equitable between international and domestic applicants. We also encourage schools and BME programs to consider showcasing more of their current research and lab tours online to reach more prospective candidates. Through virtual “Meet the Faculty” and “Meet the Graduate Student” sessions, our program has been able to effectively share content and network with both domestic and international students who are unable to visit our campus in person.

Conclusions

In summary, we found that GPA continues to be a quantitative metric used to assess academic potential in graduate school applications. Although it has been found to be a statistically significant predictor of graduate school acceptance, its effect is relatively small (possibly because so many applicants have high GPAs). Rather, we found that how students heard about our graduate BME program was the best overall predictor of graduate school acceptance, financial offers, and enrollment, particularly for PhD applicants. Higher GPA (3.5 on average) and higher GRE Analytical Writing scores explained a portion of the variability in graduate school acceptance, but less of the variability in actual enrollment in our BME program, possibly because high-scoring students receive more than one acceptance offer. Importantly, the waiving of GRE requirements (or GRExit) has been a positive movement that has allowed for more inclusive and holistic review of prospective graduate students.

Our retrospective review and findings from other BME programs highlight the importance of students having direct contact with their prospective institutions (e.g., conference contact and faculty referrals); such contact increases the likelihood of graduate acceptance and of receiving financial offers compared with applying to the program online without prior contact. Students who had personal contact with the institution and/or faculty members via an internship, an REU at the institution, or a student/alumni reference were also more likely to enroll in the program. These non-quantitative factors are included in most holistic review rubrics, but their fair assessment remains challenging. Holistic assessment of prospective BME graduate applicants has been increasingly adopted over the past five years, and the results thus far are encouraging, but there remain inherent biases that each institution and faculty member may need to account for. Efforts aimed at engaging faculty to overcome their own implicit biases, offering more opportunities for URM students to pursue hands-on, research-intensive programs such as REUs and other internships, and networking with diverse student populations have strong potential to improve the diversity of BME graduate programs and ultimately our future STEM workforce.