Challenge Statement

In coping with the effects of the COVID-19 pandemic, student mental health has become central to creating a successful learning environment. The stresses and restrictions associated with the pandemic have put university students at greater risk of developing mental health issues, which may significantly impair their academic success, social interactions, and future career opportunities. In particular, students have reported heightened stress, sadness, loss of routines, anxiety, and depression46 due to shifts in academic and social environments, which can affect testing and grades.8, 24 Moreover, as remote education became necessary during the spread of COVID-19, student motivation and engagement in the classroom were adversely impacted.42 Student anxiety in higher education has also been a growing concern in recent years.7, 20, 38

There is a strong, direct path from general anxiety to test anxiety.34 Test anxiety can have a significant negative impact on student well-being,33 academic performance,25, 45 and the persistence of marginalized or underrepresented students in science, technology, engineering, and mathematics (STEM) degree programs.8, 27 Test anxiety is defined as a set of phenomenological, psychological, and behavioral responses that accompany concern about possible negative consequences or failure in an exam or a similar evaluative situation.13 Several factors influence test anxiety among college students, including students' perception of their own knowledge as well as the instructor's teaching and evaluation methods. With the transition to remote learning during the pandemic, test anxiety may be exacerbated by factors related to student access to resources, study habits, and exam formats,24 and student performance is adversely affected.8 As we return to in-person learning, investigating the contribution of these factors will inform future data collection and analysis to better distinguish the effects of exam and learning formats on student anxiety and academic performance. This is the motivation for our study.

Test anxiety has two fundamental elements: a cognitive component and a physiological component.29 The cognitive component refers to worry, which undermines one's confidence in one's problem-solving and test-taking abilities. The physiological component, also referred to as the emotionality component, is manifested in bodily signs such as nervousness, sweaty palms, and an increased heart rate arising from being in an evaluative situation.39 Another theory of test anxiety is the deficit model, which hypothesizes that students with test anxiety are affected by feelings of expected failure and self-criticism in testing situations.36 This pattern is observed more often in engineering and STEM classrooms, where students have reported low confidence when it comes to taking exams.27 Both theories suggest that students dealing with test anxiety can experience a myriad of thoughts and feelings that divide their attention between stress management and test-taking.18 Without the proper tools to handle test anxiety, worry and decreased productivity can leave a student overcome with self-criticism and concerns about performance.34 These experiences can manifest as classroom disengagement and frustration with learning, two factors that can further exacerbate test anxiety.

Learning may be measured in various ways, including homework assignments, class activities, and exams. Instructors need new evaluation and assessment strategies that decrease learners' anxiety and help students engage with learning in their courses.30 In this regard, incorporating academic games is a powerful technique to potentially relieve test anxiety,40, 41 reduce students' overall stress, improve academic performance,23, 45 and promote engagement23, 40 as well as student learning by offering alternative interfaces to traditional tests.26 Game-based learning, or gamification, is an approach that incorporates game design elements into a non-game context such as education.9 Gamification can be used for many aspects of education, whether in teaching, studying, or assessment. For STEM education in particular, this type of pedagogy is uniquely suited to creating an environment that subverts the traditional classroom setting,6 especially since research shows that students in STEM report higher levels of stress when taking exams in their field.27

Moreover, incorporating more game-based learning into classrooms within a STEM context could be one way to engage students by forming positive perceptions15 and to motivate them to pursue undergraduate and graduate careers in STEM-related areas.31 Gamification techniques have been reviewed previously in certain STEM areas31 and, in particular, in engineering education.2 Different aspects have been explored, including the effect of gamified STEM practices on the learning experience and student satisfaction3 as well as on academic performance.40

However, gamifying evaluation activities has proven challenging (see Ref. 22 for a comprehensive review of gamification research), since the associated mechanisms and methods are not all well understood, and knowledge of how to adequately gamify an activity in accordance with the specifics of the educational context is still limited.11 More precisely, because of the limited literature on implementation guidelines for gamified designs, there is a gap between theory and practice in the study of gamification.1 Moreover, most studies have been descriptive in nature and rarely explain what is meant by gamification or how it works in engineering education, and no work to date has focused exclusively on biomedical engineering (BME).

The inclusion of game-based assessments and their impact on test anxiety has been previously explored for an English course using the Quizizz platform,33 a foreign language writing class using Edmodo gaming software,44 the evaluation of first aid knowledge,37 and software engineering education in a self-learning computer-based environment called GSEELS.40 These studies suggest that incorporating games into course evaluations can alleviate the anxiety associated with tests and therefore promote student academic performance and contribute positively to mental well-being. However, we would like to highlight that this prior research does not address test anxiety in STEM education. Moreover, the GSEELS platform presented in Ref. 40 for software engineering education cannot be easily developed by other educators or integrated into exams in face-to-face classes. Adopting game-based practices for assessment is not usually promoted in traditional BME courses; doing so is our goal. We provide a description of how to incorporate academic games as a method to alleviate test anxiety, with a detailed plan for using games in a BME classroom.

Novel Initiative

The goal of this manuscript is to explore the application of gamification and its effect on alleviating student test anxiety, and to describe the process of including game-based assessment as part of a traditional exam in an undergraduate BME classroom. Students were evaluated using game activities in their midterm exam and the exam review session. Detailed instructions on content development, the specific implementation of the games, and a preliminary assessment using student feedback are provided to help BME educators envision how these games could be implemented in classes on various topics. The proposed strategy is applicable to both virtual and in-person teaching. While gamification in pedagogy is not new, its effect on test anxiety has not been previously explored in a BME setting.

Game activities, such as Pictionary and crossword puzzles, were implemented in a Biofluid Mechanics class, a third-year BME core course. The course teaches introductory concepts of cardiovascular fluid mechanics and the characteristics of biological systems. The objectives of the course were for students to become familiar with the cardiovascular system and to understand and apply fluid mechanics concepts to characterize biological flows. Student performance in the course was evaluated using the following components: three exams, eight homework assignments, and 12 in-class exercises and group activities. Each exam and homework assignment addressed multiple course objectives, such as calculating biofluid properties given appropriate information, using the concepts of viscosity and shear stress, determining the pressure at various locations in a biofluid at rest, applying the Bernoulli equation and conservation of mass to solve fluid flow problems, analyzing the flow around a catheter, and identifying and understanding various characteristics of blood flow modeling in the cardiovascular system as well as the fluid dynamics of cerebrovascular and cardiovascular diseases. Each course objective was addressed through lectures and in-class activities, assigned homework, and specific exam questions.
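As a concrete illustration of the kind of calculation covered by the viscosity and shear stress objective, a minimal worked example using the Newtonian shear stress relation is sketched below; the numerical values are assumed purely for illustration and are not taken from the course materials.

```latex
% Newtonian shear stress between adjacent fluid layers: \tau = \mu \, du/dy.
% Assumed illustrative values: whole-blood viscosity \mu \approx 3.5 \times 10^{-3}\ \text{Pa s}
% and a near-wall shear rate du/dy = 100\ \text{s}^{-1}.
\[
  \tau = \mu \,\frac{du}{dy}
       = \left(3.5 \times 10^{-3}\ \text{Pa s}\right)\left(100\ \text{s}^{-1}\right)
       = 0.35\ \text{Pa}.
\]
```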

The games were incorporated into the exam review session and one of the midterm exams. The total enrollment in the class was 29 students. Participation in the game activity during the review session was voluntary; however, students were expected to attend the exam session and complete the puzzle activity. We observed that 27 students participated in the review session (93% of the course enrollment), and 100% of the class attended the midterm exam session. Students were not notified before exam day that a gamified activity would be included as part of the test.

Figure 1 illustrates the steps for playing these academic games in the BME classroom for course evaluation. Six key elements must be present in an academic game: goal, preparation, tools, implementation of the activity, monitoring, and assessment.4 We have previously elaborated on these elements to implement an academic game in an online biomedical engineering course to improve motivation through engagement.12 In this study, we ensured that these sequential phases were addressed to create games for course evaluation.

Figure 1

Overview of the steps of an academic game for evaluation, including game preparation (top), Pictionary as part of the exam review session and a crossword puzzle as part of a classic test (middle), and assessment of the gamified activity using a student feedback survey.

The first step, the “goal”, starts with the instructor establishing a well-defined learning objective and specifying what students need to achieve by engaging in the gaming process. The “preparation” and “tools” steps involve the design of the activity. The incorporation of BME concepts takes place in the “preparation” stage of gamification, regardless of the type of game. For game preparation, a list of keywords is composed and included in both the crossword puzzle and Pictionary activities (Fig. 1). The selected keywords should reflect course material to fulfill the learning objective of practicing and reviewing the terminology. For example, the keyword list can include, but is not limited to: streamline, density, artery, viscosity, Bernoulli, stent, stenosis, blood flow, velocity profile, pressure gradient, laminar flow, inviscid, volumetric flow rate, aneurysm, and cardiac cycle. Preparation also includes creating the appropriate tools to play the game, such as handouts and other materials required for the game. Uploading instructions about the tools to the Blackboard Learn management system ahead of time is helpful, and instructors can record and post a video to tell the class about the planned game and its goal. The “implementation” phase covers how the game is initiated and how the activity unfolds during student evaluation activities in face-to-face BME classrooms; the presented approach can also be used in remote and hybrid classes with minor alterations. While students are playing the game, it is helpful to observe their performance and intervene when necessary, in the “monitoring” phase, to ensure that students are playing the game in the desired manner. Finally, quantitative data can be collected from pre-, mid-, and post-surveys for the “assessment” phase.
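To make the “preparation” step concrete, the sketch below (not part of the original study) shows one way an instructor might deal the keyword list above into per-group handouts for the Pictionary activity; the group count and the number of terms per handout are illustrative assumptions.

```python
import random

# Example keyword list from the course material (terms listed in the text above)
keywords = [
    "streamline", "density", "artery", "viscosity", "Bernoulli",
    "stent", "stenosis", "blood flow", "velocity profile",
    "pressure gradient", "laminar flow", "inviscid",
    "volumetric flow rate", "aneurysm", "cardiac cycle",
]

def make_handouts(terms, n_groups, terms_per_group, seed=None):
    """Deal each group a random subset of the keyword list.

    Terms may overlap between groups, but within a single handout
    every term is unique.
    """
    rng = random.Random(seed)
    return [rng.sample(terms, terms_per_group) for _ in range(n_groups)]

# Hypothetical class layout: eight groups of 3-4 students, 10 terms per handout
for i, handout in enumerate(make_handouts(keywords, 8, 10, seed=1), start=1):
    print(f"Group {i}: {', '.join(handout)}")
```

The same keyword list can be reused when building the crossword puzzle or Bingo cards described below.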

Pictionary was employed for the assessment of student learning through an in-class group activity during the exam review session. Pictionary can be played in two different ways. In the first, the instructor presents a picture or a demonstration relating to a course concept, such as a diagram of laminar flow, and students explain the meaning and application of the picture. In biofluid mechanics, for instance, the concept of viscosity could be illustrated by noting that blood has a higher viscosity than water. In the second, students draw a picture for each keyword or concept. Both methods require students to have a prior understanding of the material in order to guess or draw the pictures.

During the exam review class, students played Pictionary using the second method (Fig. 1, “implementation” panel). Instead of diving into a formal lecture for the exam review, the instructor used the first part of the class for a hands-on activity so that students could explore some of the concepts they had learned. Students were asked to work in small groups (3-4 students per group) and were given a set of 10 biofluid key terms associated with the course material. The instruction for the Pictionary activity was to find or draw a picture associated with each term; students were asked to illustrate and explain each term using a picture, preferably related to the BME field. During the exercise, the instructor walked around, observed discussions, and helped students think through the gamified activity. After the Pictionary activity, the instructor went through a practice exam, made sure to vocalize certain keywords that were included in Pictionary, and summarized how these terms relate to biofluid concepts. This approach was used to test whether students could identify the keywords, definitions, formulas, and concepts applied in the exam problem set. A sample Pictionary activity is presented in Fig. 2.

Figure 2

A sample of a completed Pictionary activity during the exam review session: the instructor provided a list of 10 keywords from course materials related to the learning objectives. Students were divided into teams and asked to find or draw a picture that explained the meaning or application of each concept.

Additional game ideas that can be used for assessment, particularly for an exam review session, are provided in Fig. 1, including Bingo and Kahoot. In Kahoot, a trivia game-based learning platform, students answer questions on their own rather than in groups. Questions and answers are entered into a game on kahoot.com. When it is time to play, the instructor shares a screen that gives students a code to join the game on their phones or mobile devices. During the game, multiple-choice or true-false questions appear on the screen and students select the corresponding answer on their phones. After everyone answers a question, Kahoot shows the distribution of how many students selected each answer option. The preparation of Bingo involves creating a list of keywords and using an online Bingo card generator to make and distribute the cards; a sample generated Bingo card is shown in Fig. 1. Bingo cards can be used for any content area to reinforce definitions. In the classroom, the instructor asks students to select a random card and specifies how students are to mark the keywords on their Bingo cards. One method is for the instructor to solve exam review problems while students mark keywords as they notice definitions, formulas, or concepts applied in the problem set. Students who mark five keywords in the same horizontal, vertical, or diagonal line call out “Bingo”. We have previously presented a tutorial for implementing Bingo with different gameplay in BME classrooms,12 and we refer interested readers to that reference for additional details.
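As an alternative to an online generator, the following sketch produces 5x5 Bingo cards with a free center square from a keyword list; it is not part of the original study, and the terms beyond those listed earlier are hypothetical additions included only to reach the 24 entries a card requires.

```python
import random

def make_bingo_card(terms, seed=None):
    """Build one 5x5 Bingo card (list of rows) with a free center square."""
    if len(terms) < 24:
        raise ValueError("need at least 24 distinct terms for a 5x5 card")
    rng = random.Random(seed)
    picks = rng.sample(terms, 24)
    picks.insert(12, "FREE")  # center square of the 5x5 grid
    return [picks[r * 5:(r + 1) * 5] for r in range(5)]

# First 15 terms are from the course keyword list above; the rest are
# hypothetical additions used only to fill out the card.
keywords = [
    "streamline", "density", "artery", "viscosity", "Bernoulli",
    "stent", "stenosis", "blood flow", "velocity profile",
    "pressure gradient", "laminar flow", "inviscid",
    "volumetric flow rate", "aneurysm", "cardiac cycle",
    "shear stress", "Reynolds number", "capillary", "systole",
    "diastole", "no-slip condition", "Newtonian fluid", "hematocrit", "pulsatile flow",
]

for row in make_bingo_card(keywords, seed=42):
    print(" | ".join(f"{cell:^20}" for cell in row))
```

Using a different seed (or omitting it) for each student yields a different card, which mirrors the “select a random card” step described above.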

During the midterm exam in this course, a crossword puzzle was incorporated as one of the questions and handed out to students in the classroom during the exam, to determine whether gamification during testing effectively reduces student test anxiety. The exam also included two computational problems: applying the Bernoulli equation to solve a flow problem in a hypodermic syringe containing medicine, and calculating flow characteristics and streamline patterns from a given velocity field inside the bulge of an aneurysm. The puzzle was based on biofluid concepts and terminology, and students used their knowledge of fluid dynamics to fill it out. To achieve the learning objective, the instructor created a list of course concepts and definitions to use in the crossword puzzle. This list was loaded into an online crossword generator (https://puzzlemaker.discoveryeducation.com/) to create the puzzle, which was then modified with images as potential hints (Fig. 3). Clues were definitions of terminology or formulas for students to identify. Students were required to find the word that fit the description of a fluid mechanics principle or a cardiovascular concept and fill in the corresponding blank. After the exam time was over, students were given until the end of the day to upload a picture or screenshot of their completed puzzle to the course website. The option of submitting the puzzle after the exam was provided because we did not want a lack of time to cause anxiety: the midterm was scheduled during a 50-minute class period, which might not be enough for some students to solve the computational questions as well as complete the puzzle. Games similar to crossword puzzles, such as word searches, can also be incorporated into homework assignments or course exams. The puzzle can also be distributed as an online link or downloadable PDF via email or a file-sharing service such as Blackboard or Canvas.

Figure 3

Sample of a completed crossword puzzle that was included in the midterm exam of the biofluid mechanics class. Puzzle answers are provided next to each statement in the bottom panel. Clues were definitions of terminology or formulas for students to identify while applying these concepts in problems on the exam.
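For readers who want a sense of the computational portion of the exam described above, a minimal worked example of the syringe-type Bernoulli problem is sketched below; all numerical values are assumed for illustration only and are not taken from the actual exam.

```latex
% Bernoulli equation between the syringe barrel (1) and the needle exit (2),
% neglecting elevation change and viscous losses:
%   p_1 + \tfrac{1}{2}\rho v_1^2 = p_2 + \tfrac{1}{2}\rho v_2^2
% Conservation of mass gives A_1 v_1 = A_2 v_2; since A_1 \gg A_2, v_1 \approx 0.
% Assumed values: plunger gauge pressure \Delta p = p_1 - p_2 = 10\ \text{kPa},
% medicine density \rho \approx 1000\ \text{kg/m}^3, needle diameter d_2 = 0.5\ \text{mm}.
\begin{align*}
  v_2 &= \sqrt{\frac{2\,\Delta p}{\rho}}
       = \sqrt{\frac{2 \times 10^{4}\ \text{Pa}}{10^{3}\ \text{kg/m}^3}}
       \approx 4.5\ \text{m/s},\\[4pt]
  Q   &= A_2 v_2 = \frac{\pi d_2^{\,2}}{4}\, v_2 \approx 0.9\ \text{mL/s}.
\end{align*}
```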

Evaluation and Reflection

The goal of including puzzles and Pictionary within the students' exam process was to reduce test anxiety. The effectiveness of the approach was evaluated by collecting student feedback in an anonymous survey using Google Forms. The survey contained no sensitive personal questions or any personal information that could stigmatize an individual and was deemed exempt by the Institutional Review Board. We distributed the survey after students completed the exam; students answered fifteen questions on a five-point Likert response scale ranging from 1 (strongly disagree) to 5 (strongly agree). Of the class participants, 93% (27 students) completed the questionnaire. The statements on the post-game survey are listed in Table 1, and the results are presented in Figs. 4, 5, and 6.

Table 1 Survey questions on student engagement, emotional response to test anxiety, and student perception of academic performance and study skills.
Figure 4

Anonymous survey data on student perception of engagement. Results represent the percentage of students participating in the game and responding to the statements Q1 to Q5 provided in Table 1.

Figure 5

Anonymous survey data on student perception of test anxiety. Results represent the percentage of students participating in the game and responding to the statements Q6 to Q10 provided in Table 1.

Figure 6

Anonymous survey data on student perception of game activity and their academic performance. Results represent the percentage of students participating in the game and responding to the statements Q11 to Q15 provided in Table 1.
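The agreement percentages discussed below can be tabulated directly from the survey export. A minimal sketch is shown here, assuming a hypothetical CSV file exported from Google Forms with one column per statement (Q1-Q15) and integer responses from 1 to 5; the file name and column labels are illustrative assumptions, not part of the original study.

```python
import csv
from collections import Counter

LIKERT = {1: "strongly disagree", 2: "disagree", 3: "neutral",
          4: "agree", 5: "strongly agree"}

def summarize(path):
    """Print, for each question, the percentage of agree/strongly agree
    responses and the full Likert distribution."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    questions = [name for name in rows[0] if name.startswith("Q")]
    for q in questions:
        ratings = [int(r[q]) for r in rows if r[q].strip()]
        counts = Counter(ratings)
        pct_agree = 100 * (counts[4] + counts[5]) / len(ratings)
        dist = ", ".join(f"{LIKERT[k]}: {counts[k]}" for k in sorted(LIKERT))
        print(f"{q}: {pct_agree:.0f}% agree/strongly agree ({dist})")

# Hypothetical export file name
summarize("post_game_survey.csv")
```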

The survey questions were divided into three sections: five questions gauging student emotional and behavioral engagement while playing the games, five questions gauging test anxiety, and five questions about student perception of the gamified activity and their confidence and performance in the class. The first section covers key aspects of test anxiety involving engagement and understanding of the material. This section is included because test anxiety is identified as a variable related to emotional engagement.28 More precisely, reducing test anxiety has been repeatedly linked to learning strategies that increase student engagement, and it has been shown that test anxiety negatively affects engagement and student perception of academic performance.5 It should be noted that student engagement is measured as interest in the activity and subject matter35 and can be defined by student participation both in the game and during class, such as asking questions, as well as by positive emotional reactions to peers, instructors, and self. The questions for this portion of the survey are taken from established engagement-disengagement scales17, 19 designed to assess various aspects of student motivation and engagement, including emotional, cognitive, and behavioral dimensions. Interested readers are referred to Refs. 10 and 12 for a description of each of these engagement categories.

The second section of the survey addresses the cognitive and emotional components of test anxiety and focuses on how the student feels before and during an exam. Lastly, the perception-of-academic-performance section considers how students feel about their academics and collects feedback on the gamified exam experience. Individual perception is critical: encouraging self-reported reflection on the game after it has concluded helps students truly connect the activity with the lesson. We therefore included questions that can demonstrate the effectiveness of our gamification strategy from the student point of view and how students felt about their academic performance while playing the game. Questions for these two sections were obtained from the Cognitive Test Anxiety Scale (CTAS),39 the Motivated Strategies for Learning Questionnaire (MSLQ),32 and Ref. 27, which used the MSLQ to study the effects of test anxiety on engineering students.

In the engagement part of the survey, 96% of students noted enjoying learning new things and participating in discussions in class (Q1). Furthermore, 86% of students agreed that, when playing games during this class, they try to summarize the key concepts in their own words (Q2), which indicates that student engagement during the gamified activity was high, as the game captured their attention and encouraged them to focus on the concepts. Next, 85% of students agreed or strongly agreed that they had fun participating in gamified activities during this class (Q3), which suggests that gamification can bring an entertaining aspect that keeps students engaged in learning. In addition, 100% of students agreed that playing academic games encourages them to ask questions (Q4), which stimulates learning, and 100% agreed or strongly agreed that they attend class regularly (Q5). Based on these responses to the engagement portion of the survey, the results suggest that the gamified activities promoted engagement (Fig. 4).

In the test anxiety portion of the survey, 52% of students agreed or strongly agreed that, when taking an exam, they focus on items they do not know how to answer (Q6). This number is notable because roughly half of the class experiences the kind of negative thoughts that can disrupt performance when presented with questions they do not know how to answer, a stress response related to the components of test anxiety.32

Another important piece of data is that 41% of students either agreed or strongly agreed that they have an upset or uneasy feeling while taking an exam (Q7), which affects their ability to perform. Similarly, 41% of students agreed or strongly agreed that they think about how poorly they are doing compared to other students (Q8), and 44% agreed or strongly agreed that their heart rate increases while taking an exam (Q9). Notably, 85% of students agreed or strongly agreed that they feel more confident or relaxed before an exam if they know the test will include game activities (Q10). This result suggests that even students who do not show symptoms of test anxiety find relief and confidence in knowing that games will be included. The higher level of agreement with Q10 also suggests that students may experience test anxiety at varying levels and might not even be aware that it affects them. The distributions of ratings for the anxiety section of the survey are shown in Fig. 5.

The final section of the survey involved student perception of academic performance. In this section, 89% of students either agreed or strongly agreed that games help them study for the class and clarify course concepts (Q11). This result indicates that participating in the games helped students acquire a greater understanding of the course material and stimulated their thinking during assessment activities. The survey data also showed that 93% of students felt that playing a game during the review session decreased their anxiety about the upcoming exam (Q12), which demonstrates positive student feedback on the usefulness of the game in decreasing stress. Additionally, most students agreed or strongly agreed that the game activities were integrated well into the course and material review (Q13), while 93% indicated that they are confident they can do an excellent job on the assignments and tests in this course (Q14). The high percentage for this question indicates that the majority of students have confidence in their academic performance. Likewise, 89% of students either agreed or strongly agreed that the puzzle was less intimidating than a classic exam format (Q15), implying that students felt less stress than with a traditional exam question. The results of this section of the survey are shown in Fig. 6.

Based on the survey, students agreed that they were more engaged in class and felt more comfortable and relaxed when gamification was included in the course evaluation. Our data also suggest that gamification reduced test anxiety and improved student perception of their performance on the exam in a BME course. Game-based learning can therefore contribute to a feeling of confidence and appears to be effective in lowering test anxiety and nervousness, suggesting that using games in exams is beneficial to students who experience test anxiety and have trouble performing in traditional exam settings.

This study offers a simple strategy for evaluation and for the assessment of test anxiety that can be easily employed by other BME educators; however, there are situations where games might offer no useful elements at all (see Ref. 14 for details on this controversy). For example, integration with the curriculum is a key challenge: it can be difficult for the instructor to match the game with instructional goals, and a poor fit will hamper learning.43 Moreover, if the game is not simple, it can be more time-consuming and distracting than engaging. Therefore, relying on games alone for course assessment and student performance, without an understanding of course objectives, can be detrimental. Games cannot replace pedagogy, but they can be integrated into course assessment to alleviate test anxiety and enhance the overall learning experience.

For future studies, the success of the game approach will be measured using course marks, lecturer evaluations, lecture attendance, and more detailed student feedback. Our assessment in future studies can be strengthened by incorporating more comprehensive, established assessment tools available in the literature. Student feedback will be obtained using reflection assignments, course surveys, engagement scales, blog posts, activity evaluation rubrics, and individual meetings with students, to improve our understanding of student thought processes and to continuously improve the game activity. The current post-survey was deliberately kept short and easy to complete in order to generate high participation in the post-game questionnaire, as students are unlikely to respond to long surveys. Moreover, we did not offer any reward, so that we could evaluate whether students had intrinsic motives for participating in the game. A reward or incentive can be offered to induce motivation and participation; however, this reward must be carefully selected. Reward ideas that avoid undermining intrinsic motivation are listed in Ref. 21. Furthermore, our study is limited by its small sample size; the number of participants should be increased, and parametric statistics should be used in future studies. Finally, the time-consuming nature of the gamification process may outweigh the potential benefits. Games often require a great amount of time to implement in the classroom, which is a challenge given the time constraints in most formal classrooms. Before implementing gamification into classroom evaluation for the first time, instructors should plan to devote a significant amount of time to preparation, adjusting assessments and content sequence, and ensuring that game mechanics and components are present in the design of the course. Instructors wishing to use a computer-based game or simulation should expect to devote additional preparation time to integrating their content, assignments, and storylines. It has been found that bringing gamification into the classroom roughly doubles course preparation time on average,16 although the time commitment in subsequent semesters is significantly reduced. Still, the time and effort required to develop games that are both engaging and educational may make this approach prohibitive for many instructors.