Introduction

Academic administrators and faculty members have long lamented students’ disrespect for academic integrity. Angell’s (1928) description of students’ academically dishonest behaviours is just as fitting today as it was when he wrote his book in the early twentieth century. There has also been no shortage of advice from administrators and faculty on ways educational institutions can improve their culture of academic integrity (e.g., Christensen Hughes & McCabe, 2006b; McCabe et al., 2012; Morris, 2018; Whitley & Keith-Spiegel, 2002). To put it bluntly, some percentage of students violate academic integrity, many faculty and administrators complain that they do, and at various points in an institution’s history—often in response to a cheating scandal or a growing unease that the situation has gotten out of hand—faculty and administration introduce new programs, policies and/or pedagogical innovations designed to improve academic integrity (e.g., Raman & Ramlogan, 2020).

For these efforts to improve academic integrity to be successful, however, administration, faculty and students must recognize that there is a problem, be motivated to solve the problem, and be willing to change their attitudes and behaviours accordingly (Burnes & Jackson, 2011; Christensen Hughes & McCabe, 2006b; Vakola, 2013). Change is also more likely to be enduring when solutions incorporate the concerns and recommendations of all affected stakeholders (Eury & Treviño, 2019).

In our review of the literature we found that faculty members and university administrators frequently suggested ways to improve students’ adherence to academic integrity. Many of the suggestions were valuable, particularly those related to best practices in pedagogy (e.g., Murdock & Anderman, 2006; Tammeleht et al., 2019) and those derived from students’ self-reported behaviours and attitudes (e.g., Chapman et al., 2004; Fontaine et al., 2020). Largely missing from the discussion, however, was student-generated advice on ways to improve a school’s culture of academic integrity (cf. Eury & Treviño, 2019; Hendershott et al., 2000; McCabe & Pavela, 2000). In short, students were frequently surveyed on what they do and why, but only sometimes consulted on whether they perceived their behaviour as problematic, and if yes, what they thought could be done to improve that behaviour.

Our research addressed this shortcoming in the literature by using a methodology—computer-facilitated or electronic focus groups—that to our knowledge had not been used previously to study academic integrity. Computer-facilitated focus groups combine anonymous written entries with oral conversation, making them an ideal method for discussing confidential and sensitive issues. Students should have felt as comfortable supplying their honest opinions about academic integrity as they would have in anonymous surveys. Unlike anonymous surveys, however, they also engaged in a conversation with their peers and the facilitator, which enabled a potentially deeper evaluation of the topic. The outcome was a window into conversations among students about how they viewed academic integrity and what they thought were the best ways to improve the culture of academic integrity in their program.

Sources of Student-Derived Insight on Academic Integrity

To encourage individuals to respond openly and honestly about stigmatized behaviours such as academic dishonesty, researchers have used several methods to maximize the likelihood of accurate responses. For example, many have used anonymous surveys with assurances of confidentiality to encourage truthful responses, generating a reasonable, quantifiably comparable understanding of a population’s attitudes toward and engagement in academic integrity violations. Bowers (1964, 1966), in his landmark census-style analysis of university students across the United States, McCabe, Treviño, Butterfield and colleagues, in their 20-plus year longitudinal study of students’ academic integrity behaviour around the world (e.g., Christensen Hughes & McCabe, 2006a; McCabe et al., 2012; McCabe & Bowers, 1994; McCabe & Treviño, 1997), and innumerable other researchers have used survey-based methods to provide us with a good understanding of the personal and situational factors that have influenced attitudes toward, and self-reported engagement in, academically dishonest behaviours (see, e.g., Whitley (1998) and Lang (2013) for literature reviews).

To overcome some of the limitations of fixed answers, a hallmark of surveys, some authors have included open-ended questions to understand in a more nuanced way why students violated academic integrity (e.g., LaBeff et al., 1990; McCabe et al., 1999). For example, in related research, we used neutralization theory (Sykes & Matza, 1957) and moral disengagement (Bandura, 1999) to categorize students’ volunteered rationales for violating academic integrity to demonstrate that students relied on different mechanisms to justify specific trivial violations (e.g., unauthorized collaboration) versus violations in general (Packalen & Rowbotham, 2021). Researchers have also asked students to predict how they would behave in scenarios where there was a potential to cheat, with key contextual factors modified among scenarios to enable systematic comparison of factors (Bernardi et al., 2004; Rettinger et al., 2004; Steininger et al., 1964).

A shortcoming of the scenario method, however, is that people, when asked “what would you do?”, have tended to predict that they would behave more morally than they actually would in those circumstances (Kang & Glassman, 2010). As such, researchers have tried numerous creative approaches designed to capture rates of actual versus self-reported or predicted behaviour. One of these methods was to compare self-graded and independently graded scores on tests (Antion & Michael, 1983; Ward, 1986). More recently, behavioural economists have used experimental or quasi-experimental designs to record participants’ tendencies to act dishonestly when put into tempting situations. Although each individual experiment has been limited to a narrow situation, as a group these experiments have provided us with information about which situations lend themselves to more dishonest behaviour and the extent to which individuals behave dishonestly.

Moving beyond surveys, scenarios, and (quasi-) experimental designs, a limited number of researchers have used methodologies designed to more fully engage with students. For example, Gullifer and Tyson (2010) used traditional focus groups to elicit students’ perceptions on plagiarism; McCabe (1999) and Aljurf et al. (2020) used them to understand students’ views on academic dishonesty more generally. Moreover, Blum’s (2009) ethnography, which used semi-structured interviews, has been an exemplar for gaining an in-depth understanding of students’ attitudes toward plagiarism specifically and conceptions of authorship and individualism more generally.

From these rich pools of data, faculty members and university administrators have made recommendations on how to improve adherence to academic integrity. These data-driven recommendations, however, have tended to be top-down and may or may not have resonated with the students to whom they were directed. For example, McCabe et al. (2012) described a surprising failed vote among students at one school attempting to implement an honour code. The post-hoc investigation revealed that students were, in general, in favour of an academic honour code, but they were not in favour of adopting it if it meant that they were required to report peers whom they witnessed violating academic integrity. If the honour code had not included this one clause, the vote very likely would have passed; this suggested that students had not been adequately consulted during the development phase.

To address the dearth of student-driven recommendations in the prior literature, we asked a student population with diverse attitudes toward academic integrity not only how they thought about academic integrity but also what they perceived to be effective solutions for improving the culture of academic integrity in their institution.

Method

Research Setting and Context

Our participants were students in an undergraduate business program at a research-intensive Canadian university. The 1912 students in the 2018–2019 academic year were divided about equally among the four years. Most students in the program spent much of their time together, whether in class, in extracurricular activities, socializing, and/or cohabiting in shared accommodations on or near campus.

In a companion study to this chapter we reported survey results from 852 students (45 percent of all students) in the same program in March 2019 (Packalen & Rowbotham, 2020). The results from that survey provided us with a general and representative understanding of the population and thus the environment in which our research participants were situated. Specifically, 85 percent of those surveyed self-reported engaging in at least one questionable behaviour in the 2018–2019 academic year and their average rate of academic misconduct was 7.05 (standard deviation = 6.50; see Footnote 1). Respondents thought that on average 78 percent of their peers violated academic integrity, and their assessment of the culture of academic integrity within the school, as measured by an 11-item scale, was just above neutral (3.46 on a five-point Likert scale from strongly disagree to strongly agree, standard deviation = 0.73). Table 18.1, which was adapted from Packalen and Rowbotham (2020), highlights the differing behaviour, beliefs and attitudes in each of the four years of the program.

Table 18.1 Academic integrity (A.I.) related behaviours, perceptions and attitudes by year among the population from which our focus group participants were drawn (n = 852)
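As a purely hypothetical illustration of how summary statistics of the kind reported above could be computed (the actual instrument, items and scoring are described in Packalen & Rowbotham, 2020; all data and variable names below are invented), consider the following Python sketch:

    # Hypothetical sketch only: invented data illustrating the descriptive
    # statistics reported above (percent with at least one behaviour, mean
    # and SD of misconduct counts, mean of an 11-item five-point scale).
    from statistics import mean, stdev

    # One misconduct count per respondent for the academic year (invented).
    misconduct_counts = [0, 2, 5, 7, 9, 14, 3, 6, 0, 12]

    # One list of 11 Likert items (1 = strongly disagree ... 5 = strongly agree)
    # per respondent (invented).
    culture_items = [
        [3, 4, 3, 4, 3, 3, 4, 3, 3, 4, 3],
        [4, 4, 3, 3, 4, 4, 3, 4, 3, 4, 4],
    ]

    pct_at_least_one = 100 * sum(c > 0 for c in misconduct_counts) / len(misconduct_counts)
    avg_rate, sd_rate = mean(misconduct_counts), stdev(misconduct_counts)
    culture_scores = [mean(items) for items in culture_items]  # per-respondent scale mean

    print(f"{pct_at_least_one:.0f}% reported at least one behaviour; "
          f"mean rate {avg_rate:.2f} (SD {sd_rate:.2f}); "
          f"culture scale mean {mean(culture_scores):.2f}")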

Recruitment

Following the methodology approved by our general research ethics board, in January 2019 we posted four separate recruitment ads on the research pool portal seeking up to 14 participants per pre-scheduled focus group (see Footnote 2). To participate, students had to be available at the time the session for their year was scheduled and be members of the research pool affiliated with the business school. We restricted each focus group to a particular year because we assumed that students within the same year would both feel more comfortable participating among their peers and have similar cohort-related experiences on which to build (see Footnote 3). In return for participating in three hours’ worth of research studies, students received a grade bump of one third of a letter grade in a maximum of one course per semester. Students who participated in our computer-facilitated focus groups received 1.5 hours of research credit. In the final week of January 2019, 10 first year, 13 second year, 10 third year and 11 fourth year students participated in their year-specific sessions, for a total of 44 focus group participants.

From prior experience running sessions on academic integrity and managing an anonymous email address for reporting questions, concerns and violations, we knew that gathering students voluntarily to discuss academic integrity tended to draw those who had strong opinions, particularly those who had not violated academic integrity themselves and who also thought that academic integrity was a very serious problem. While these students were vocal about their opinions, they may not have been representative of the student body overall. The main benefit of recruiting through the research pool was that a little over two-thirds of the students, associated with 43 different courses in the program, participated in the research pool in Winter 2019, and students in the pool tended to select studies primarily to obtain research credit and only secondarily because they were interested in the topic being studied. Importantly, this meant that we recruited students whose opinions on academic integrity were diverse.

Data Gathering and In-Situ Analysis Using Computer-Facilitated Focus Groups

To promote forthright and honest responses, numerous steps were taken to protect the confidentiality of the students who participated in the computer-facilitated or electronic focus groups. For example, we were not in the room during the sessions; we used a professional facilitator whose reputation was based on maintaining his clients’ confidentiality; students typed their comments anonymously; and we did not include any identifying information (e.g., names, gender, ethnicity) in the transcription of the oral component of the session. In short, the only identifying information that we retained about the students was the year in which they were enrolled in the program.

Computer-facilitated or electronic focus groups combine the facilitator-led verbal discussion of traditional focus groups with computer-based written interactions using a software-based group decision support system (GDSS). We used the software ThinkTank 4.9 (GroupSystems, 2018), which enabled participants to type anonymous comments in response to open-ended, survey-style questions and to vote on, rank and rate their agreement with other participants’ statements to gauge the (lack of) consensus among the group. ThinkTank was designed to overcome many of the downsides of traditional group discussions, such as domination by a select few members, interruptions, not getting a turn to speak until the topic has passed, evaluation apprehension and pressure to conform to a dominant idea (Nunamaker et al., 1991).

To enable post-hoc comparison and/or aggregation of responses across groups, the facilitator used the same semi-structured format of questions in all sessions. The specific prompts that the facilitator used were:

1. What is the attitude among yourself and your peers with respect to academic integrity? Why?

2. Do you think academic integrity is a pressing problem?

3. What steps can students take to improve the culture of academic integrity?

4. What steps can faculty and administration take to improve the culture of academic integrity?

For prompt one the facilitator asked participants to anonymously electronically brainstorm a list and then clarify their comments with oral discussion and/or additional anonymous written explanation. For prompt two the facilitator set up a yes/no vote to which he asked students to respond.

For prompts three and four the facilitator asked participants to anonymously electronically brainstorm a list and then clarify their suggestions with oral discussion and/or additional anonymous written explanation. Next, the facilitator engaged in in-situ analysis and combined similar options in the electronic file into higher-order constructs in real time. This step was conducted with participant involvement to ensure agreement on how comments were aggregated. After aggregation was completed, the facilitator asked students to vote on the grouped suggestions that they thought could be most successfully implemented. Specifically, the facilitator asked students to select up to half of the grouped suggestions. For example, if there was a list of ten grouped suggestions, students would be asked to pick up to five suggestions.
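To make the voting step concrete, the following is a minimal, hypothetical sketch (not the ThinkTank software itself; the suggestion labels and ballots are invented) of the rule that each participant may select up to half of the grouped suggestions and of how the percent of a cohort agreeing with each suggestion, as later reported in Tables 18.2 and 18.3, might be tallied:

    # Hypothetical sketch of the voting rule described above; the suggestion
    # labels and ballots below are invented and do not reproduce study data.
    from collections import Counter

    def vote_cap(num_grouped_suggestions):
        # Each participant may pick up to half of the grouped suggestions,
        # e.g., up to 5 picks from a list of 10.
        return num_grouped_suggestions // 2

    def percent_agreement(ballots, suggestions):
        # Share of a cohort's ballots that selected each grouped suggestion.
        counts = Counter(choice for ballot in ballots for choice in ballot)
        return {s: round(100 * counts[s] / len(ballots)) for s in suggestions}

    # Invented example: ten students voting on four grouped suggestions.
    suggestions = ["Suggestion A", "Suggestion B", "Suggestion C", "Suggestion D"]
    cap = vote_cap(len(suggestions))  # up to 2 picks each
    ballots = [{"Suggestion A", "Suggestion B"}] * 5 + [{"Suggestion B", "Suggestion C"}] * 5
    assert all(len(ballot) <= cap for ballot in ballots)
    print(percent_agreement(ballots, suggestions))
    # {'Suggestion A': 50, 'Suggestion B': 100, 'Suggestion C': 50, 'Suggestion D': 0}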

Post-sessions Analysis

At the end of each session we received a text document that contained the record of all written comments, how they were aggregated and the results of any voting (12,086 words total for the four sessions). The software clearly distinguished original comments from later additions (such as the facilitator’s additions to clarify a specific comment or headings used to label a group of comments). We also received an audio file which the first author transcribed (20,190 words total for the four sessions). Our transcripts of the oral discussion distinguished between the facilitator and students and between two students in a back-and-forth exchange. We were unable, however, to track a student’s comments throughout the entire session (i.e., we could not reliably determine if the student who made comment 1 at the start of the session was the same student who made comment 15 three-quarters of the way through the session).

Next, we systematically read and manually categorized students’ written responses to the open-ended question asked at the start of the session: “What is the attitude among yourself and your peers with respect to academic integrity? Why?” For example, we coded perceived level of adherence to academic integrity. When students provided a rationale for their attitude, we also coded the source of that rationale (e.g., competition). We applied the same process to the oral discussion that accompanied this first question and to the facilitator’s ending question: “Is there anything else you want to add?”.

Once we had finished coding within each year, we then compared responses across years; where appropriate we further aggregated the responses into themes. For example, we grouped the recommendations students made for themselves under themes of: perspective and attitude; foster and respect a culture of integrity; and proactive actions.

Results and Discussion

The first noticeable difference between the groups was that students in first year were unwilling to speak and made all their comments electronically and anonymously. Only once did a student respond to a request for clarification from the facilitator, and that was only after the student could no longer bear the awkward silence following the facilitator’s repeated request. The second year students spoke a bit, the third year students more so, and the fourth year students were very open in sharing their opinions verbally with the facilitator and their peers in the session. This openness was especially evident in the prolonged conversation that followed the official end of the session, which provided further explanations for the low levels of adherence to academic integrity among themselves and their peers. In other work drawing on survey data from the same population, we found that students in fourth year, as compared to those in first year, were significantly more morally disengaged in that they viewed it as acceptable to engage in trivial violations of academic integrity in many more scenarios (Packalen & Rowbotham, 2021). We did not find it surprising, therefore, that students from a cohort that perceived more trivial violations to be acceptable were also more willing to speak openly about those behaviours.

Student-Perceived Attitudes Toward Academic Integrity

Perceived Levels of Adherence

Many of the students answered the question “What is the attitude among yourself and your peers with respect to academic integrity? Why?” by writing about what they perceived to be the level of adherence to academic integrity in the program. We summarized their responses and explanations for the given levels of adherence in Fig. 18.1. Our first observation was that there were diverse opinions on the level of adherence, a fact that three students noted, including the following third year student, who wrote, “It seems as though there are three types of attitudes regarding academic integrity. People either don't know what exactly it is, completely disregard it or try to follow it.”

Fig. 18.1 Students’ perceived attitudes towards academic integrity among themselves and their peers

These three individuals were the only students who spoke to there being a group of students who made their best effort to adhere to the policies and behave with academic integrity. As Fig. 18.1 demonstrates, those who made their best effort did so because they felt that academic integrity was serious, doing so was important and/or they were motivated by what they perceived to be significant consequences for not following academic integrity. These explanations were markedly different from the reasons that students in Miller and colleagues’ (2011) study provided for what they would do if put in a situation where a professor left the answer key to an upcoming exam visible. In that situation about 94 percent of students said that they would not violate academic integrity and provided reasons that could be categorized into four main groups: they were afraid of the punishment or consequences; it was not in line with their moral character; it was simply wrong; or it undermined their goals of learning. The students in Miller et al.’s (2011) study, however, had much lower rates of academic integrity violations when compared to the population from which our sample was drawn and also appeared to have a much stronger culture of academic integrity (see Footnote 4).

Disregard for Academic Integrity. At the opposite end of the spectrum from those who did their best to adhere were those who perceived that they and their peers had a complete disregard for academic integrity. This viewpoint was shared by four students, all of whom were in fourth year. The main rationale for this disregard, as summed up by one fourth year student was: “Most people don't care about it or follow the rules. There's so much pressure to do well and get a high GPA that people will do whatever is necessary to get a high grade.”

In their open-ended oral discussion, the fourth year students provided additional explanations for their disregard for academic integrity. Chief among these was their view that if professors did not make an effort to provide sufficient practice resources and new tests and assignments each year, there was little reason for them not to copy resources from prior years. As another fourth year student noted:

Something that bothers me at least. I find it very hard to find motivation to do what we are supposed to do when the professors are very lazy on their end when they repeat tests, assignments, questions. When they don’t provide ample resources for you to learn the content on your own, not necessarily for an assignment, but for a test. They should have resources for you to be able to do that. And it’s very frustrating on our part when we don’t have those resources. When you see the laziness and then you don’t feel motivated to not be lazy yourself. Like we pay a lot of money to be here and they shouldn’t be doing that. So that’s something I feel very strongly about.

The students in our focus groups were not unique in morally disengaging by euphemistically labelling (Bandura, 1999) their academically dishonest acts as laziness in light of their perception that their professors were lazy. Christensen Hughes (2017, p. 58) found similar explanations among the nearly 15,000 Canadian students that she and McCabe had surveyed (2006a, 2006b), noting that “students cheat when they feel cheated.”

Poor Adherence to Academic Integrity. Five students, representing all four years of the program, perceived their peers and themselves to poorly adhere to academic integrity. Reasons given for poor adherence included that people frequently violated academic integrity because they were unaware of the rules, while others did so because “professors often make it easy to violate academic integrity as they reuse material year after year.” (Third year student).

Although the aforementioned disregard for academic integrity and poor adherence had similar outcomes—frequent violations—the groups differed. Unlike those who had no regard for academic integrity, students in this group often prefaced their statement about poor adherence with an ideal. Consider the difference between this fourth year student’s response, “Academic integrity is a non-essential concept in terms of succeeding in this program as most students disregard any policies, warnings, or ideas given to us by the Program Office.”, and the following response from a first year whom we classified as believing they and their peers adhered poorly to academic integrity: “The majority of my peers appear to be very concerned with AI, but in reality there are AI violations being committed every day.”

Selective Adherence to Academic Integrity. The most common attitude mentioned was selective adherence to academic integrity. This was the attitude described by 14 of the participants, half of whom were in second year. As one fourth year student noted, “I believe that there is almost an unwritten rule when it comes to academic integrity among students that outlines what is okay and what is not.” As such, as another fourth year student told us, “I think we all care deeply about the grades on our transcript and if it is easy to get away with cheating then we will do it.”

This group of students did not approach the decision to violate academic integrity as a moral decision, but as a rational cost–benefit calculation, not unlike the types of decisions they were often encouraged to make when analyzing case studies in their business courses. Again, these students were not unique in their approach. Christensen Hughes (2017) found a similar attitude among some of the nearly 15,000 Canadian students she and McCabe had surveyed (2006a, 2006b). This business, rather than ethical, mindset was also one of the blind spots that Bazerman and Tenbrunsel (2012) identified in their work on infamous business decisions in American corporate history.

Students may have thought that this selective adherence was a smart way to approach academic integrity; they knew that some of the most common types of violations, like unauthorized collaboration and use of material (e.g., textbook answer keys or case solutions), were the most difficult to “catch” and were typically connected to less significant assignments (i.e., those worth a smaller percentage of the student’s final grade), meaning that in the unlikely event they were caught, sanctions tended to be minimal. Unfortunately, this process of moral disengagement, whereby students convinced themselves that it was okay to violate academic integrity in some circumstances, was found to be susceptible to turning into another of the blind spots that Bazerman and Tenbrunsel (2012) identified, namely the slippery slope (Gino & Bazerman, 2009), whereby small violations lead to more significant violations over time. In other work, we demonstrated that, contrary to what the fourth year student above stated, students did not all share the same opinion of what was and was not okay; moreover, the more situations in which students believed it was okay to violate academic integrity, the higher their rates of violations, both for the specific trivial behaviours they were evaluating and for minor and major violations more generally (Packalen & Rowbotham, 2021).

Unknown if Adhering. The last group, based on type of adherence, was mentioned by five students from first and second year. This group was distinguished by the fact that they generally had good intentions, but sometimes unknowingly violated academic integrity. As one second year student wrote, “Some people breach academic integrity because they do not know the rules rather than it being intentional.”

These students were “nervous that they will break the rules without meaning to and get kicked out of the program” (first year student). They also spoke to the fact that “some forms of academic integrity are hard to distinguish. What is allowed vs what is not” (third year student), and that they thought “people don't intentionally commit academic integrity for the most part” (second year student).

Student-Written Recommendations for Students, Faculty and Administration

In the computer-facilitated focus groups we also asked students for their suggestions of actions that students could take to improve the culture of academic integrity and actions that faculty and administration could take. Table 18.2 provides the summary of suggestions for students and Table 18.3 provides the summary of suggestions for faculty and administration.

Table 18.2 Summary of students’ suggestions of actions students can take to improve the culture of academic integrity and the percent of each year that agreed with the suggestion (n = 44)
Table 18.3 Summary of students’ suggestions of actions faculty and administration can take to improve the culture of academic integrity and the percent of each year that agreed with the suggestion (n = 44)

We aggregated students’ individual comments into representative ideas and grouped those ideas by themes. The original comments upon which the representative ideas were based are included in an online Appendix on SpringerLink’s website for this book. Within each theme we grouped ideas roughly by years in which such ideas were mentioned. In this way we could see which themes were more predominant among different groups and how suggested actions within a theme changed by cohort. For example, as shown in Table 18.2, first year students tended to focus on changes in individual perspective and attitude as a way to improve the culture of academic integrity, while second year students largely made suggestions around proactive actions designed to limit the likelihood of both violating academic integrity themselves as well as acting as a facilitator in others’ violations of academic integrity. Third and fourth year students took a more holistic view and provided suggestions meant to foster and respect a culture of academic integrity.

Such cohort patterns were not observed to the same extent among the suggestions to faculty and administration. Rather, we saw an increase in the overall number of suggestions as compared to the number of suggestions students had for themselves and some of the suggestions were mentioned by at least three of the four years. We grouped suggestions into those that addressed the policy, structure and culture of the program, those that were specific to the policy and its enforcement, and suggestions that addressed several aspects of assignments.

Reflections on Students’ Recommendations

Recommendations for Themselves

Our initial reaction when reading through the recommendations that students provided was that they understood many of the drivers of the relatively poor culture of academic integrity in the program. These included a culture of competition, a pressure to excel in all aspects of life (academics, extracurricular activities, social and professional), and an academic environment that made the ease of violating academic integrity high and the likelihood of consequences low.

Our second reaction was that there was a big difference between knowing what you should do and doing the work needed to accomplish that task. For example, as one first year student suggested,

There are different ways by which to measure success! Not just marks or people’s opinions - maybe creativity, the interesting books you've read and learned from them, how much you learn in general, etc.

Yet, while half the students in the first year focus group agreed with this statement as a means to help improve the culture of integrity in the program, and 80 percent agreed with another first year’s comment to “Learn to accept failure and not be so competitive to prove yourself to others,” we suspect that, without specific guidance on how to change their mindset and without repeated messaging from their peers, faculty, and program administrators to help them build resilience, students would struggle to make that change.

Perhaps this is why the first year students, who had the most frequent and recurring messaging about academic integrity and were relatively new to the program and its demands, were the cohort that took the most responsibility for their own perspective and attitudes. By second year, students, who were in their most challenging year academically, were much more focused on the low-hanging fruit as a means to improve academic integrity. Their suggestions included avoiding public spaces where they would be pressured to share information on quizzes and/or work together on individual assignments. While these suggestions might make a dent in the number of violations, they did not address the underlying culture of academic integrity.

In third year, students were beginning to perceive that the system in which they were operating was broken, and by fourth year students were struggling to manage recruiting, interviewing and coursework (Packalen & Rowbotham, 2020). Thus, we saw the students’ suggestions shift from individually focused actions to ways they could improve respect for, and the culture of, academic integrity more broadly. For example, 60 percent of third year students agreed with their peer’s suggestion to “Create an environment where breaches of academic integrity are looked down upon.” Interestingly, several of the fourth year students’ suggestions, including the three that received the most votes, were not about how to improve the culture among their own cohort, but about what they as a cohort could do to improve the culture for those in lower years of the program. One interpretation of this finding was that the fourth year students viewed themselves as a lost cause and thus felt their efforts would be better directed toward solutions that would improve the situation of those for whom all was not lost.

Recommendations for Faculty and Administration

Turning to the suggestions that students had for faculty and administration, we saw a more consistent message across cohorts. First and foremost, students said faculty should do everything possible to eliminate the temptation for students to violate academic integrity. These suggestions spanned policy, attendance (which was often required in courses), providing ample supports for learning course material, logistics related to assignments and tests, and pedagogical best practices related to assignment design and evaluation. Faculty and administrators need not agree with all of the suggestions—we, for one, did not think that being more generous with grades, as first and second year students suggested, was the answer—but instead of dismissing them outright we considered what might be driving them. In this case, the requests for easier grading and the suggestions that faculty and administration stop focusing so much on grades likely connected to the aforementioned pressure to succeed.

The first author of this chapter has repeatedly said that “academic integrity violations are often a symptom of a larger problem.” At the individual level, the underlying problems have regularly been extenuating circumstances related to mental health and/or addiction concerns. At the program level, as the students in our focus groups identified, they tied back to issues around respect for, and the culture of, academic integrity. Yet when we considered some of the other suggestions that students had for faculty and administration, particularly from those in later years as they attempted to manage school work, find a job post-graduation and do the right type and quantity of extracurricular activities to stand out among their peers and land that desired job, we were struck by a new possibility that we are excited to investigate in future research.

At a preliminary level, we wondered if one way to improve a culture of academic integrity was to better align the expectations of both faculty and students. For example, for many faculty members, teaching is only one aspect of their job. Yet many of the students’ suggestions and comments implied that they thought the primary, if not exclusive, responsibility of faculty members was to teach. Their expectation of 100 percent new material each year was neither realistic nor sound from a pedagogical perspective. Certain lectures, cases and assignments have been repeated because they are the material that best serves the learning goals of the course. At the same time, however, we thought students’ recommendations not to use the same exams, midterms or problem sets from one year to the next were some of the fastest and easiest ways to reduce students’ abilities to violate academic integrity. We were also encouraged to see that students welcomed assignments designed to encourage learning rather than memorization of facts. In this respect they validated the advice that education specialists have shared with faculty members for years regarding ways to proactively decrease the likelihood of students violating academic integrity.

The idea of workload, however, was not just about improving students’ understanding that faculty had responsibilities other than teaching; it was also about faculty having a better understanding of students’ workload and competing demands. If we thought of students’ school-related workload as comprising three or four main activities (coursework, job searching, and resume-building undertakings such as extracurricular activities and/or part-time jobs), we wondered how students would allocate their time among these activities and how faculty would do the same. Although future research is required to answer this question, we are quite confident that many instructors view coursework as students’ number one priority and, as such, think students should allocate the greatest proportion of their school-related workload to it. In contrast, many of the students’ suggestions for faculty and administration, such as allowing extensions when asked, not penalizing students who missed class for extracurricular or recruiting activities, and having an official form to fill out for recruiting absences, implied that students thought faculty were unsupportive or unwilling to help them manage these competing demands. If these tensions were clarified, not only in terms of priorities, as one fourth year student wrote, “Administration making goals of program more clear: they pride the program on the student conferences and how well students do in recruiting but provide little support to help students achieve this.”, but also in terms of a reasonable percentage of time to commit to each activity, perhaps there would be more room both to implement the suggestions that students made for faculty and administration and for students to implement the suggestions they made for themselves.

Limitations of Study and Computer-Facilitated Focus Group Methodology

Our sample size was small and we conducted a limited number of focus groups. In addition, the focus groups prioritized depth of discussion over standardized responses. For example, only one student explicitly distinguished between themselves and their peers, stating, “I personally take academic integrity seriously but my friends think that it's a grey area where they can get away with it time to time” (second year student). As such, it was impossible to determine how much the statements reflected students’ own attitudes versus the attitude that they thought was prevalent among the student body. The lack of standardization was also evident when we looked at the guidance provided and then voted upon (Tables 18.2 and 18.3). These lists were generated first by students brainstorming possibilities and second by students voting on those options. Therefore, unless a student in a group offered a solution, their group could not vote on that suggestion. Practically, this meant that while we could see agreement between years on certain features, the absence of a vote on a particular item did not mean that students would not have voted for the item if they had been given the chance to do so.

Closely related to the aforementioned limitations, the facilitator who ran the computer-facilitated focus groups was a professional who had been facilitating these types of sessions for well over a decade. He had therefore developed a good sense of how to establish trust and openness with the groups he facilitated and, with the group’s help, identify themes in the data to aid the aggregation of participants’ comments for later voting. He was not, however, a subject matter expert. As such, some of the ideas that he combined were ones that we would not have combined, given that we would expect students to react differently to specific comments within the grouped set. Thus, when students ranked or voted on these combined items, we were unable to sort out whether they were reacting to what they perceived to be the most favourable, least favourable or average opinion in the group of combined ideas.

Finally, the demographic and socioeconomic profile of the student body was perhaps less diverse than that of programs in other universities, and thus some of the attitudes, behaviours and suggestions could be less applicable to larger programs, those that rely more on commuter or part-time students, and/or those that have a more demographically diverse population. For example, in this program the passing of notes and assignments from one class to the next through membership in exclusive clubs has been an ongoing challenge, one acknowledged both in the suggestions of behaviours students should avoid and in the advice to faculty. The shared cloud-based file folders used by these clubs have been the modern-day version of assignment filing cabinets in fraternities (Stannard & Bowers, 1970). For schools that do not have such a tight-knit group and/or strong connections between program years, we suspect that paid note-sharing sites have become the digital-era equivalent of those filing cabinets, available to anyone who is willing and able to pay a fee to obtain them. Moreover, the similarity in behaviours and attitudes between our students and a larger Canadian population of students (Christensen Hughes, 2017; Christensen Hughes & McCabe, 2006a, b) reassured us that the general trends we observed among our small group of students were more reflective of the larger population than not.

Conclusion

We undertook this study because we recognized the importance of obtaining the student voice, not only as related to students’ self-reported engagement in violations of academic integrity, but also with respect to their attitudes towards academic integrity and their own suggestions on what they, as well as faculty and administration, might do to improve the culture of academic integrity. Their responses revealed that students understood how the environment in which they were situated could foster a culture that undermined academic integrity; they also understood what they could do at both a macro- and micro-level to improve their own academic integrity and the culture of academic integrity in the program. Importantly, the students also reminded us that, absent faculty and administration support and willingness to make macro- and micro-level changes such as the ones they suggested, their efforts would meet with limited success. Academic integrity is not a student issue but an institutional issue, one that requires administration, faculty and students alike to do their part in fostering a culture of academic integrity.