Introduction

The use of surveys to better understand students’ experiences and teaching quality in higher education has a long history of implementation and critical review, dating back to the introduction of such surveys in United States universities in the mid-1920s (Reisenwitz, 2016). Although countries and universities take varying approaches to survey processes and data, scholars such as Marsh (2007) and Spooren et al. (2013) have identified their main functions as: providing information for academics on the effectiveness of their teaching and for students to use when selecting units of study; collecting data on teaching and learning for promotion/appointment decisions and for research purposes; and contributing to quality assurance processes. Although research on student feedback surveys has highlighted a number of areas of concern, discussed below, a well-designed student survey on teaching and learning provides a strong foundation of evidence to inform action by academics and universities (Flodén, 2017; Winstone et al., 2021).

In 2020, the global pandemic changed higher education and the delivery of teaching and learning activities in rapid and radical ways (García-Peñalvo et al., 2020; Gewin, 2020). The year witnessed the sector respond to the demands of COVID-19 with rapid changes across its teaching and learning, research, policy and systems spaces. The response to the virus created a situation in which change was the only possible course of action. This change may, however, be set to continue, with universities considering the future delivery of learning material post-pandemic (Ross, 2021).

In Australia, 2 weeks after the beginning of the first semester of 2020, on-campus students had to move their learning fully online (Gewin, 2020). This was particularly significant for first year students, who had only had 2 weeks of on-campus experience. While non-first-year students faced similar circumstances, they had at least one semester of on-campus experience prior to 2020. By comparing these first year, first semester students with other students, this article explores students’ behaviour with regard to these surveys. As the literature points out, moving student surveys online has historically decreased response rates (Jara & Mellar, 2010), but what of a circumstance in which the teaching itself moved into a fully online environment for on-campus students, in otherwise trying and unforeseen circumstances?

This question is aligned with the call from Williamson et al. (2020) for future research on the impact of the COVID-19 pandemic on higher education, especially with regard to the rise of digital forms of teaching and learning, described by some as “emergency remote education” (Williamson et al., 2020, p. 108).

During this time of change, the feedback received by universities should be critically examined to take on the challenge set by Green et al. (2020, p. 1309) “of reflecting in action”, to “gain a new perspective on a problem, rather than necessarily arrive at a fixed or final solution”. It is the aim of this article to take on this challenge and to contribute to the expanding body of literature (Green et al., 2020) on the impacts of COVID-19 on the experiences of higher education students. To reach this aim, the article first explores the literature on survey response rates and then introduces Bourdieu’s conceptual tools. It then analyses the response rates for the student feedback surveys on teaching and learning at one university, Western Sydney University, and highlights the importance of peer interaction among students in increasing survey completion rates.

Survey response rate

Prior to the pandemic of 2020, analyses of student surveys on teaching and learning (e.g. Fosnacht et al., 2017; Reisenwitz, 2016) highlighted lower rates of participation than in previous decades. Further to this, with the move to online surveys, this already low rate declined even further (Lowe et al., 2019; Reisenwitz, 2016; Tucker, 2013). Specific to the Australian context, Tucker (2013) explains this overall drop through factors such as survey fatigue, non-response resulting from excessive emailing of students, lack of confidence in the anonymity of these surveys, and technical problems when administering them.

This raises concerns about the validity of the data collected to inform teaching quality, as low response rates could produce biased results. Recently in Australia, Ward et al. (2020) examined the representativeness of respondents to the national Student Experience Survey (SES) and distinguished between response propensity (i.e. the likelihood of students with specific characteristics responding to the survey) and the response rate (i.e. the percentage of the invited population answering the survey). They used variables such as age, study area, course level and gender as predictors of non-response. While there were differences in response rates across these characteristics, they had minimal impact on the survey outcomes and little effect on the analysis of the results. Fosnacht et al. (2017) analysed a national college survey in the United States, the National Survey of Student Engagement, and found that low response rates (even 5 to 10%) can still provide reliable data, provided that the sample frame includes at least 500 students. They concluded that more time should be spent evaluating and using the data instead of aiming to achieve a high response rate. Reisenwitz (2016), on the other hand, argues on the basis of data collected through his own survey at a US institution that there is a non-response bias based on gender, race and grade point average (GPA), and thus that a low response rate can produce unrepresentative results.

In the hope of collecting more reliable data, academics may try to motivate their students to answer these surveys. This can involve asking students to fill out the survey in class and/or online on their personal device. Other strategies, not necessarily supported by the sector, might “include reward or coercion, withholding grades or giving additional grades for survey submission, and mandatory participation” (Tucker, 2013, p. 624).

The importance of an on-campus learning experience is, however, specifically explored here to understand its impact on these rates, as we postulate that the lack of on-campus forms of social interaction is a contributing factor to the drop in survey responses, even when the survey itself is online. Attempts to increase response rates are usually operationalised by having academics and professional staff remind students of the surveys’ importance, typically via e-mail or social media. What we explore in this article, however, is the importance of peer interaction. Students might refer to these surveys in a coffee line, at a desk in the library, during group work in an on-campus tutorial, or simply by sitting beside someone and starting a conversation; activities that were difficult to perform in Australia in 2020 during COVID-19. As the Tertiary Education Quality and Standards Agency (TEQSA) report into the student experience of online learning in Australia during 2020 noted, students across Australia were indeed suffering from a lack of academic and peer interaction (TEQSA, 2020).

Student feedback

This article examines the impacts of the pandemic on the uptake of student surveys on teaching and learning at Western Sydney University (WSU), one of the eight Innovative Research Universities within Australia. WSU is a multi-campus university in the Western Sydney region of New South Wales with close to 50,000 students. The Student Feedback on Units (SFU) survey (units are called subjects at some other institutions) is run for every unit in every teaching session, including those delivered by third party providers. Students are asked to provide feedback and rate various aspects of the delivery of their unit, such as the learning materials they were provided, the assessments they had to complete, and whether they were given opportunities for collaboration. They are also asked to comment on the technological environment, their workload, the feedback and support they received, their perceived development of critical and analytical skills, their work-related skills, and their overall satisfaction. The Likert-scale front end of the survey is complemented by two qualitative questions asking students to comment on the best aspects of the unit and on the areas in need of improvement. In compliance with WSU policy, the results are anonymised and provided to the teaching staff responsible for the unit (unit coordinators), to the relevant University executive members and to several professional offices involved with quality assurance and improvement. Unit coordinators are required to use this information to review their units every year and to make use of these data to improve their teaching delivery. Aggregate survey data are used in annual course reporting to measure course quality and effectiveness. These data can also be used by staff for promotion and by executive staff to assess the viability of certain units.

WSU also provides an opportunity for students to complete the Student Feedback on Teaching (SFT) survey, which focusses on the perceived performance and ability of an individual teaching staff member (a lecturer and/or tutor). Unlike SFU results, SFT results are used only by individual teachers to evaluate their teaching and can be discussed with their supervisor to identify areas for improvement.

In addition to institution-specific surveys such as WSU’s, the Australian Government Department of Education, Skills and Employment runs a national survey for higher education. The Student Experience Survey (SES) is the only comprehensive nationwide survey of undergraduate and postgraduate coursework students across both public and private higher education providers in Australia. The survey is completed by commencing students and later-year students (generally students in their last year of study). It measures teaching and learning outcomes and reports on multiple facets of the student experience. The survey is one of four that contribute to the Quality Indicators for Learning and Teaching (QILT) performance data for Australian higher education, with the remaining three dedicated to graduate outcomes and employer satisfaction. QILT, funded by the Australian Government, provides data “to drive quality improvement” in higher education institutions, and also allows prospective students to compare the “quality of education and student experiences” (QILT, n.d.). In this space, prospective students can access feedback on the experiences of current and past students, but at the course/degree level rather than the unit/subject level. These results are used to compare and contrast the performance of higher education institutions, which are publicly accountable. This institutional and national survey practice has been strong in Australia for the past 30 years, owing to a combined institutional and national interest in capturing the student-focussed learning perspective on teaching quality and to the higher education sector moving towards perceiving students as consumers of higher education services (Barrie et al., 2008, p. 2). While there is no clear agreement on what defines student satisfaction or a good educational experience, and while there are deficiencies in survey-based assessment of higher education quality, such surveys are nevertheless widely used (Langan & Harris, 2019).

Since 2020, this use of survey results has been further complicated by changes in Australian federal legislation and university funding, both of which underscore performance indicators. Performance-Based Funding for the Commonwealth Grant Scheme moved from being capped in 2020 to growing in line with population growth in the 18–64-year-old age bracket for bachelor level places. Funding is contingent, however, on universities meeting specified performance requirements. These include graduate employment outcomes (40% weighting in the performance funding formula), student experience (20%), student success/attrition (20%) and equity group participation (20%) (Parliament of Australia, 2020).

Social capital in higher education

Bourdieu’s (1984) classic theory of capital can help us theorise peer interaction on campus. His theories have already been used to study higher education. For example, Beattie (2017) explored the multiplicity of power struggles within the field of higher education in the United Kingdom. She found, for example, that students with richer cultural capital have an advantage in admission over those with less capital, and that in the administrative sub-field, cultural capital, as well as economic and social capital, provides symbolic capital to those at the higher end of the hierarchy. Mishra (2020) conducted a systematic review on social networks and capital in higher education with a focus on “underrepresented students”, finding that minority students should not have to rely solely on their own networks (families and communities) for support and that more university services, such as peer tutoring programmes and counselling, should be provided to support them. Morrison (2017) used Bourdieu’s theory to analyse class-based inequalities in the higher education system.

For Bourdieu, capital is not only financial; it is any resource (be it financial, cultural, social and/or symbolic), in any setting, from which one can profit. The profit can be achieved by gaining capital or by preventing someone else from gaining it and thus from accumulating a higher amount. Bourdieu distinguishes four forms of capital:

Cultural Capital: the culture that a person has accumulated throughout his or her life. This can be the way a person speaks, the cultural awareness he or she has gained with regard to his or her society and/or the arts in general, his or her aesthetic preferences, knowledge of the social and hard sciences, and degrees accumulated in the education system.

Economic Capital: all the material and financial assets that a person owns.

Symbolic Capital: a form of power which may be based on honour, fame or social prestige. Those who recognise a person’s symbolic capital may offer recognition, deference, obedience, and even their services.

Social Capital: all the social contacts that a person has accumulated in his or her lifetime. These contacts can be family, family friends, friends from school and university, contacts from work networks, and so on. Knowing the ‘right’ people and having access to these social resources affords a person high social capital. In a university context, this capital builds on the connections that a student makes with fellow students and academics. The higher this capital during the candidature, the more a student will have access to resources (e.g. a meaningful conversation with a top student or a tutor outside the classroom) that enhance his or her university experience and also results (e.g. getting extra, informal tips outside of the teaching space). This social capital continues to accumulate post-graduation, as students will have gained key contacts during their candidature that can be of benefit later in life (e.g. having a university friend who has become a politician, a business person, or an influential person in the community).

Specific to this research, the article hypothesises that commencing students did not have the opportunity to develop their social capital with fellow students due to COVID-19. On-campus opportunities to connect before or after class, while standing in line for lunch or a coffee, striking up a conversation at the library, taking part in group work, or attending a social event were missing for these students. This deficit in communication reduced their opportunity to reflect with their peers about the teaching quality of their units, be it good, simply average, or bad, and would not have incentivised, or even sensitised, them to complete a survey about their learning experience. Continuing students, as we will explore, would already have accumulated this social capital before COVID-19, for at least one semester, and would already have been familiar with these surveys. We revisit this theory at length in the discussion section below, after the response rate analysis.

Data collection

Students at WSU fill out their online surveys through what is called their vUWS site, hosted on Blackboard. At the end of the semester, each student is sent a Student Feedback on Units (SFU) survey for each unit they studied in that session, and a Student Feedback on Teaching (SFT) survey for each teacher they had in each unit. Individual links to the surveys are hosted in a ‘survey portal’ which is embedded in a Blackboard site.

To access the online surveys, whether on or off campus, students must first log into Blackboard using their student ID. They can then access the “Student Feedback on Units and Teaching” Blackboard site, where all their survey links will be listed. Within Blackboard, students can complete each survey and return to the survey portal for the next link.

Each unit has its own site, and because students access the survey through this site using their student number, rather than simply clicking on a link provided in an e-mail, they are prevented from answering multiple surveys for the same unit. While the responses provided remain anonymous, it is possible to track students’ survey participation behaviour, in particular to distinguish those who never complete any survey from those who are more active and involved (see below).
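To make this mechanism concrete, the sketch below illustrates one way a survey portal can block duplicate submissions per unit while keeping response content separate from identity. It is a minimal sketch under our own assumptions, not the actual WSU/Blackboard implementation, and all names (e.g. submit_survey, the pseudonymised key) are hypothetical.

```python
# Illustrative sketch only: the actual WSU/Blackboard portal logic is not
# described in this article; identifiers and structures here are hypothetical.
import hashlib

participation = {}   # pseudonymous student key -> set of unit codes already surveyed
responses = []       # anonymous response records, stored apart from any identity

def submit_survey(student_id: str, unit_code: str, answers: dict) -> bool:
    """Record one SFU response, rejecting duplicates for the same unit."""
    key = hashlib.sha256(student_id.encode()).hexdigest()      # pseudonymise the login ID
    done = participation.setdefault(key, set())
    if unit_code in done:
        return False                                            # duplicate submission blocked
    done.add(unit_code)                                         # participation behaviour is trackable...
    responses.append({"unit": unit_code, "answers": answers})   # ...while the answers stay anonymous
    return True

# Example: a second attempt for the same unit is rejected.
print(submit_survey("12345678", "COMM1001", {"overall": 4}))    # True
print(submit_survey("12345678", "COMM1001", {"overall": 5}))    # False
```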

As this article explores the impact of unplanned change between Autumn 2019 and Autumn 2020, there are three other factors, beyond the restriction on social capital gained on campus, that could have affected the survey response rate. These would, however, have affected all students, be they commencing or continuing:

  • Due to COVID-19, from mid-March 2020, students were required to study online and as such there were no longer opportunities for face-to-face reminders or encouragement from teaching staff to complete the survey.

  • The many challenges the pandemic posed to students in 2020 (and the potential impacts on retention) led to a substantial increase in student feedback surveys issued by university officials that year. Students were frequently being asked how they were coping, what they needed, and what they thought of the University’s approach throughout Autumn semester, potentially magnifying the effects of digital and survey fatigue by the time the SFU was issued in May.

  • Usual promotional efforts established in previous semesters were interrupted in 2020. On-campus promotional material via digital screens was no longer accessible to students. The University’s social media team had also changed their approach to campaigns, decreasing the number of scheduled posts on social media platforms.

Rates of completion

Comparing the rates of completion of these two student feedback surveys in the Autumn session between 2019 (pre COVID-19) and 2020 (at the start of COVID-19) for the whole university (see Table 1), Figs. 1 and 2 indicate a drop from 19 to 17% for the SFT (Fig. 1) and from 24 to 21% for the SFU (Fig. 2). What is of interest is that these rates only start to deviate from day 15 in both charts; before that time, the rates of completion are identical. This indicates that the difference in response rates between the 2019 (on-campus) and 2020 (online) settings emerges only 2 weeks after the release of the survey. While it can be argued that these first days correspond to the remaining period of regular contact with the lecturer/tutor, whether face-to-face or online, it is after that time that we observe a departure. We assume that this drop is due to a reduction in on-campus activities, as we continue to explore below.

Table 1 Student characteristics and response rate
Fig. 1 SFT cumulative daily response rate, Autumn & 1H

Fig. 2 SFU cumulative daily response rate, Autumn & 1H

With regard to the content of these surveys, students’ satisfaction levels did not deviate significantly between 2019 and 2020, except for one category, group work, which saw a significant drop of 13%. This is largely explained by the move to fully online delivery, where students had to do group work remotely. It indicates a gap in satisfaction in this online ‘communal’ space, where students interact in a different fashion than they would on campus. It might also indicate a lack of connection between students in class, connection that might otherwise have helped build social capital outside of their virtual classrooms. This drop in satisfaction might therefore also indicate a drop in social capital in the university context, one that would have affected commencing first year students more.

Table 1 breaks down the information per student characteristic for the SFU only. It does not include information on students’ disciplines, as the intention of this research is to focus on the attributes of the students rather than on their aspirations. The counts are based on unit enrolments, which means that a student is counted multiple times according to the number of units in which they are enrolled. In 2019, there were 27,641 responses, that is a response rate of 24.5% for a total of 110,868 units. In 2020, the 23,204 completed surveys out of a population of 110,868 represented a response rate of 20.9%. The overall drop from 2019 to 2020 is therefore 3.6 percentage points (from 24.5% to 20.9%). In our analysis, anything higher than this amount indicates an above-average drop in response rate. While students aged 25 and above do not seem to have been affected (−2.1 points for those between 25 and 29; −1.8 between 30 and 39; and −0.8 between 40 and 49), those under 21 years have been significantly affected, moving from a response rate of 24.1% in 2019 to 18.5% in 2020, a drop of 5.6 points. Those between 21 and 24 show a sizeable drop (−3.1 points), but it is still below the average (−3.6 points).
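To make the scale of these changes explicit, the short sketch below (illustrative only, using just the rates quoted above) recomputes two of the drops both as percentage points and as relative declines against the 2019 baseline.

```python
# Minimal sketch using the SFU response rates quoted in the text (in %).
rates_2019_2020 = {
    "All students":      (24.5, 20.9),
    "Students under 21": (24.1, 18.5),
}

for group, (r2019, r2020) in rates_2019_2020.items():
    drop_points = r2019 - r2020                      # drop in percentage points
    relative_decline = 100 * drop_points / r2019     # drop relative to the 2019 baseline
    print(f"{group}: -{drop_points:.1f} points ({relative_decline:.0f}% relative decline)")
# All students: -3.6 points (15% relative decline)
# Students under 21: -5.6 points (23% relative decline)
```

Expressed this way, the under-21 cohort lost close to a quarter of its 2019 respondents, whereas the university as a whole lost roughly one in seven.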

The second significant finding concerns commencing versus continuing students. In terms of response rates, those who had just started university dropped from 30.5 to 25.3%, a drop of 5.1 percentage points. However, even though this drop is large compared with other factors, the completion rate of commencing students (25.3%) is still higher than that of the whole population (20.9%). The drop also tends to be concentrated among domestic and English-speaking students, rather than international (−1.8 points) and NESB students (−0.8 points). Indigenous students also show an above-average drop, and there is little difference with regard to gender.

Recent research from Ward et al. (2020) points out that students more advanced in their degree are more likely to respond to these types of surveys. However, this is not what we have found in this study. While continuing students (that is, from Semester 2 to the end of their degree) completing the survey are more numerous (74,860 in 2019) than commencing students in their first semester (16,370 in 2019), the proportions are quite different (a 30.5% response rate for commencing students against 21.9% for continuing students in 2019). Focussing on the two variables that show the biggest drops in response rate in Table 1 (commencing status and age), Table 2 provides a bi-variate analysis. It confirms the initial observations from Table 1: those above 25 years tend to have a higher response rate (30.4% in 2019 and 28.5% in 2020, a drop of 1.9 percentage points), while those below 25 years have a response rate of 22.4% in 2019 and 18.2% in 2020, a drop of 4.2 points. When looking at commencing versus continuing status, the rate remains higher among commencing students (30.5% compared to 21.9% for continuing students in 2019, the latter dropping to 19.0% in 2020); however, the biggest drop is among commencing students below 25 years old in 2020, a drop of 5.9 points.

Table 2 Commencing status and age

Table 2 could indicate that COVID-19 disproportionately affected students below 25 years in their commencing year, a cohort that, in this case study, outperforms other student cohorts in terms of response rates to these surveys. These commencing students are also the only cohort who had only 2 weeks of campus experience before moving online in response to the pandemic. While it can be argued that survey fatigue contributed to this drop, fatigue should have affected continuing students more, as they had been exposed to more than a full semester of surveys in their student life, compared with those who had just started and had been exposed to fewer surveys in their university experience. While survey fatigue might explain part of this drop, it does not give a full picture of what happened in Autumn 2020 at WSU.

To shed more light on these findings, data from WSU Online, a virtual campus of the University, were brought in for comparison. Beyond being a fully online environment, its teaching year is organised across three trimesters rather than the two semesters (not including the summer session) scheduled on the physical campuses of WSU. Trimester 1 commences 1 week before the Autumn session and, in 2019, 22.4% of WSU Online students filled out the survey (compared to 24.5% on WSU main campuses). This went down to 19.3% in Trimester 1 2020 (compared to 20.9% on WSU main campuses), a drop of 3.1 percentage points at WSU Online compared to 3.6 points on WSU main campuses. In accordance with the literature (Reisenwitz, 2016), fewer online students filled out these surveys than on-campus ones; however, in this context, WSU Online students experienced a slightly smaller drop due to COVID-19.

Using a bi-variate analysis for WSU Online in the same way as for the main campuses in Table 2, Table 3 indicates a sharp contrast. Those below 25 years and commencing saw a slight increase (+0.4 percentage points) in response rate between the two years. The biggest drop is for continuing students above 25 years (−5.3 points). Our hypothesis is that all students were affected by survey fatigue, but some more than others. This would be expected of those more advanced in their life and studies, who would have been exposed to a greater demand for completing surveys at university. As such, the drop among continuing students above 25 years at WSU Online points to a category of students used to, and perhaps even fatigued by, surveys. This, as Table 3 suggests, affected commencing and younger students less.

Table 3 Commencing status and age—WSU Online

The following section seeks more evidence to back up this hypothesis and explores further data to understand why younger commencing students at WSU main campuses would show such a drop in response rate at the start of COVID-19.

Students’ behaviour

The overall response rate is based on students completing the survey per unit. However, this rate does not take into account the behaviour of individual students. Full-time students tend to enrol in four units per semester and would thus have been invited to fill out one SFU survey per unit, that is, a total of four. One might wonder whether the students who answered participated occasionally (e.g. one or two surveys per semester) or fully (four surveys per semester), and what of those who never fill out a survey at all? If one were to attempt to increase the response rate, would it be more impactful to convince those who are not involved at all or those who are somewhat, but not fully, involved?

For Autumn 2019, the total line of Table 4 indicates that 64% of students did not complete a survey at all. Thirty-six percent completed at least one: more specifically, 14% completed just one, 8% two, 6% three and 8% four or more. This means that among those who filled out a survey, around one third completed only one survey and less than a quarter completed four or more. Looking at each line of Table 4, of those who were enrolled part-time in only one unit and were asked to complete only one survey, 78% did not respond. For those enrolled in two units, non-response went down to 71%, as at least 10% of them agreed to fill out one of the two. It goes down further to 58% for full-time students who were asked to fill out four or more surveys. The fewer surveys a student is asked to fill out, the more likely that student is not to participate at all.

Table 4 Response rates of SFU surveys across units - Autumn 2019

Table 5 focusses on Autumn 2020, during COVID-19, and reveals some interesting results. The percentage of students who completed at least one survey dropped from 36 to 29%. There is, however, only a slight decrease among those completing two or more surveys (e.g. from 8% in 2019 to 7% in 2020 for two surveys, and from 8% in 2019 to 7% in 2020 for four or more). The data thus indicate that the drop in response rate is driven mainly by students not completing any evaluation survey at all, rather than by students simply completing fewer of them.

Table 5 Percentage of responses of all SFU surveys - Autumn 2020
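As a methodological aside, a distribution of this kind can be derived from unit-level invitation records by grouping anonymised responses per student. The sketch below (with hypothetical column names and toy data, not the actual WSU data structure) shows one possible way to do so.

```python
# Illustrative only: column names and toy data are hypothetical.
import pandas as pd

# One row per survey invitation: a student, the unit, and whether the SFU was completed.
records = pd.DataFrame({
    "student":   ["s1", "s1", "s1", "s1", "s2", "s2", "s3"],
    "unit":      ["U1", "U2", "U3", "U4", "U1", "U5", "U2"],
    "completed": [1,    0,    1,    0,    0,    0,    1],
})

per_student = records.groupby("student").agg(
    invited=("unit", "count"),        # surveys the student was asked to complete
    completed=("completed", "sum"),   # surveys actually submitted
)

# Rows = number of surveys invited, columns = number completed, as row percentages
# (the same shape as Tables 4 and 5).
distribution = (pd.crosstab(per_student["invited"], per_student["completed"],
                            normalize="index") * 100).round(0)
print(distribution)
```

Applied to the full anonymised dataset, the row percentages would separate students who never complete a survey from partial and full participants, which is the distinction Tables 4 to 7 draw.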

Comparing these results with those from WSU Online in the total line reveals an important contrast. Table 6 indicates that 25% of WSU Online students completed at least one survey in 2019, and this went down only slightly to 24% in 2020 (Table 7). In this online context, COVID-19 does not seem to have had much impact on response rates. Also, contrary to WSU students on main campuses, online students do not appear to have been markedly affected by the number of surveys they were asked to fill out. Except for those taking only two units, where the non-participation rate is lower (72% in 2019, with a slight increase to 75% in 2020), the proportion of non-engaged students remains stable, albeit higher than for students from the main campuses. This is in contrast with the main campuses where, as observed above, there was an increase in total withdrawal from the process. This means that there has been an increase in a specific “disengaged” group on the main campuses, one that, we argue below, has not been able to accumulate much social capital outside of the digital world.

Table 6 WSU Online Trimester 1 2019
Table 7 WSU Online Trimester 1 2020

Discussion

With regard to survey response rates, the results of our analysis indicate that first year, first semester students, who were on campus for only 2 weeks, were the most affected in terms of survey responses during COVID-19. What could explain this, when the lines of communication from the university and academics to students remained the same during COVID-19, albeit mainly online? Our findings indicate that it is not because the pedagogy moved online, but because students moved online. Indeed, the SFU results clearly show that students had major issues with group work, indicating a lack of connection with fellow students. COVID-19 brought to these students a deficit in their connection with fellow students that continuing students did not face to the same extent. As such, it can be assumed that, compared to previous years, commencing students’ social capital (in terms of being networked with fellow students and staff at university) was at its lowest level due to COVID-19.

This crisis has highlighted a key factor that has been completely omitted from the literature on the decreasing response rates of student feedback surveys. Making calls at lectures and tutorials, or sending reminders via e-mail or on Blackboard, are of course important, but they are not necessarily what pushes a non-engaged student to fill out at least one survey; the social capital students accumulate through conversations with their peers on campus can also play this role.

While the research on social networks and social capital in higher education is vast (e.g. Mishra, 2020), the role of students’ connections with other students in providing the impetus to fill out these surveys has not been extensively addressed. In their exploration of the world of art, Lipovetsky and Serroy (2013) argue that what really stands out when a piece becomes successful is what people, be they friends or family members, say about it among themselves, which is part of social capital. The key to promotion is not always large marketing strategies alone, but getting people to talk among themselves.

Conclusions

This research points to a new factor to take into account when discussing the rates of survey completion by students: social capital. The move online during COVID-19 affected first year students more strongly than more advanced students because they were not able to build strong social capital on campus. This is reflected in a drop in survey response rates among this cohort.

While the data in this research indicate that the key to improving the survey response rate is to have students speak to each other, this remains a theory that needs to be tested. The decrease in social capital on campus due to COVID-19 has highlighted the potential importance of this factor, but further research (e.g. interviews and/or focus groups asking students about their experience of completing these surveys) is needed to test the validity of this theory. Another avenue would be to promote the importance of completing these surveys during Peer Assisted Study Sessions (PASS), a peer-facilitated learning programme available at some universities, where the message could be carried from student to student and made more meaningful to them. Further promotion of these programmes would also increase the social capital of first year students. The analysis of the response rates of these surveys points in that direction, but stronger evidence is indeed needed.