E-proctored exams during the COVID-19 pandemic: A close understanding

Abstract

Researchers have focused on evaluating and exploring the online examination experience during the COVID-19 pandemic. However, understanding the perceptions of using an e-proctoring tool within the online examination experience is still limited. This study explores the first unique experience for students’ attitudes and concerns using an e-proctoring tool in their final exams during the COVID-19 pandemic. It also highlights the e-tools’ impact on students’ performances to guide educational institutions towards appropriate practices going forward, especially as the pandemic is expected to have far-reaching consequences. A mixed-methods analysis was used to examine heterogeneous sources of data including self-reported data and officially documented data. The data was analyzed by a qualitative analysis of the focus group and quantitative analyses of the survey questions and exam attempts. In June 2020, students participated in a focus group to elaborate on their attitudes and concerns pertaining to their e-proctoring experience. Based on the preliminary outcomes, a survey was developed and distributed to a purposive sample (n = 106) of students from information technology majors who had taken at least one e-proctored exam during the COVID-19 pandemic. Finally, 21 online exams with 815 total attempts were analyzed to assess how well students performed under an e-proctored test. The study’s findings shed light on students’ perceptions of their e-proctoring experience, including their predominant concerns over privacy and various environmental and psychological factors. The research also highlights challenges in implementing the e-proctoring tool as well as its impact on students’ performance.

Introduction

E-Learning has become an increasingly popular teaching method in the past few years. In fact, more than 6 million students took at least one online course in the USA back in 2016 (Seaman et al. 2018). These online courses were not only aimed at attracting learners from abroad but also allowed students on campus to pursue their education, leading to the localization of online classes and e-learning environments (Seaman et al. 2018). However, the COVID-19 pandemic has forced nearly all students globally to either stop their education or depend entirely on online distance learning.

E-Learning has been studied thoroughly in the literature from different points of view; i.e., attitudes and performances (Kumi-Yeboah et al. 2017; Yeh et al. 2019), learning management systems (Janson et al. 2017), and online courses and e-activities (Bovermann and Bastiaens 2020). However, conducting an “e-assessment” (Boitshwarelo et al. 2017) of students’ performances under e-proctoring tools is still very limited (Milone et al. 2017; Boitshwarelo et al. 2017), especially during the COVID-19 pandemic.

Before implementing e-proctoring tools, academic communities should therefore consider students’ difficulties and concerns about them.

One of the primary concerns for the educational system is the integrity of online assessments (Milone et al. 2017); that is, the need to conduct the exams using the appropriate tools and methods (Burgess and Sievertsen 2020). With the sudden increase in online distance learning and the need to ensure academic integrity, universities have adopted different e-proctoring technologies to monitor online exams. This technology validates students’ identities and flags suspicious activities during the exam to discourage cheating. During the COVID-19 pandemic, in particular, the rapid digital transformation has been astounding. One online proctoring company switched from having 100 customers per year to having 120 customers per day (Drew 2020). Although there has been a vast movement towards online proctoring, schools are still expecting to “have larger measurement errors than usual” (Burgess and Sievertsen 2020).

In this study, students’ attitudes and experiences are explored regarding the use of e-proctoring tools in their final exams during the COVID-19 pandemic. The study aims to investigate: (1) students’ concerns over the e-proctoring tool, (2) students’ attitudes towards academic integrity under e-proctoring, (3) the impact of using e-proctoring tools on students’ performance, and (4) students’ satisfaction with the e-proctoring tool.

To achieve the objectives of the study, Section 2 covers the essential background for this research, Section 3 focuses on the methods and materials used, and Section 4 analyzes the results. Finally, a discussion is presented in Section 5, and the conclusion in Section 6.

Background

Online proctoring tools

The online proctor can watch students during online exams by accessing their webcams, screens, and microphones (Drew 2020) to ensure that students are complying with the rules. Usually, the online proctor is paired with an artificial intelligence system that analyzes students’ movements and their environment to decide whether to flag potential cheating behavior. E-proctoring tools are usually sourced from for-profit companies that contract with educational institutions to provide real-time, online proctoring services from any location with internet access. Students schedule their exams and then connect with their proctor through these online services (Hollister and Berenson 2009; Hylton et al. 2016). The e-proctor has access to students’ computer microphones and webcams, including a 360° view of the students’ workspaces, to ensure that no unauthorized materials are present. Students are required to maintain both an audio and visual connection to the proctor throughout and must first verify their identity by showing their ID. As such, the use of e-proctoring tools requires technical capabilities above and beyond the baseline requirements set by the university (González-González et al. 2020).

Context of UAE higher education during the COVID-19 pandemic

Since February 2020, schools and universities in UAE have been deeply affected by the COVID-19 pandemic. Spring break, for example, was moved from March 29th to March 3rd (The National 2020), and shortly afterward, the Ministry of Education enforced distance learning to protect students, faculty, and staff (Godinho 2020). In fact, on March 30th, the spokesperson for the Ministry of Education announced that “the implementation of the distance learning system will continue throughout the 2019-2020 academic year” (Godinho 2020). This declaration was followed by Ministerial Decree No. 237, which asked universities to switch entirely to distance learning until September 2020 (ECT 2020). The decree also stated that appropriate remote assessment tools should be put in place to preserve academic integrity and maintain educational standards. In response to the decree, universities and colleges began the process of obtaining the tools to oversee the educational process, implementing the needed internal policies to ensure the success of this unprecedented period, and training the faculty and students on how to use these tools. Perhaps the most critical requirement for the full distance learning system was the online proctor tool. Colleges have been racing to obtain a good online proctor tool to maintain the integrity of their education system (Drew 2020).

Previous studies

The transition to online learning during the COVID-19 pandemic has occurred “on an untested and unprecedented scale” (Burgess and Sievertsen 2020). Different factors such as job security and health anxiety could affect students’ achievements and experiences in this context. However, our focus is on exploring the online exam experience associated with e-proctoring tools and how this technology can affect students’ perceptions and testing outcomes. Many studies have examined e-proctoring tools from different dimensions, which are outlined below. Milone et al. (2017) studied the impact of using e-proctoring systems on the educational experience of pharmacy students in a non-pandemic situation. Based on their study, their college decided to discontinue online proctoring, although they found some benefits when handling courses with a large number of students. The researchers stated that the extreme technological requirements, the presence of technical difficulties, and the additional cost associated with each exam were the key factors that outweighed the benefits of the e-proctoring system.

Wesley Schultz (2001) researched environmental concerns among college students from 10 countries. The study predicted that “concerns for the consequences of environmental damage would form three correlated factors organized around self, other people, and the biosphere”. Also, Gloria and Ho (2003) investigated the environmental, social, and psychological experiences of undergraduate students. They outlined the interrelation between the three domains affecting the students’ self-efficacy, self-attitudes, and achievement expectancies, which were significant predictors of the students’ academic performance.

Research has also been conducted to understand the effects of “relationship factors and intra-individual psychological factors” on online exam experience (Vayre and Vonthron 2019). Psychological factors, including self-efficacy and peer support, were considered to be indirect antecedents that affected the online learning process, including the online exam experience. Moreover, cultural factors were considered “determinant[s] of e-learning success and [moderators] of the relationship between use and individual performance” (Aparicio et al. 2016).

Privacy has been identified as the leading concern with eLearning, and in fact, is the most integral aspect covered by the online learning literature (Majeed et al. 2016). Accordingly, the level of trust in the privacy of these eLearning systems was found to be the most decisive factor when considering the adoption of e-proctoring technology (González-González et al. 2020).

In addition to privacy concerns, academic integrity is another critical issue (Drew 2020; Hollister and Berenson 2009). Even though online exams can be more challenging for students and are subject to rigorous monitoring processes, studies have revealed that students may think it is easier to cheat on online tests (King et al. 2009). However, other research has provided evidence that this type of cheating does not pay off (Arnold 2016).

Based on the previous research, this study aims to assess the likelihood of success with e-proctoring technology as well as students’ attitudes and experiences with it during the COVID-19 pandemic. Therefore, this paper is set up to answer the following research questions:

  1. RQ1:

    Were students well-prepared technologically for online exams?

  2. RQ2:

    What are the concerns of students while using e-proctoring tools?

  3. RQ3:

    Do students believe that using an e-proctoring tool is important for academic integrity?

  4. RQ4:

    Are students satisfied with the overall experience of using e-proctoring tools in exams?

  5. RQ5:

    Does using an e-proctoring tool affect students’ academic performance?

Based on the above research questions, the study will contribute to an understanding of the holistic experience of using e-proctoring tools, especially during a crisis. The findings will give educational bodies insights into how to use these tools in different situations to increase students’ motivation and engagement. They will also provide guidance for academics in enhancing the richness of assessments and fostering their integrity.

Methods and materials

Qualitative and quantitative research methods were used to explore the underlying beliefs, attitudes, and performance of students whose final online exams were invigilated by e-proctoring tools. The population included 350 undergraduate students at two universities in Abu Dhabi, UAE who were enrolled in different information technology courses during Spring 2020 when the COVID-19 pandemic hit.

Distance learning during COVID-19 was a critical period regarding changes in students’ learning behaviors and education tools. It is agreed in the literature that self-reported data do not give an accurate picture of a subject unless combined with official data (Althubaiti 2016). Thus, to achieve the study objectives, the researchers used three investigation tools (as shown in Fig. 1): (1) a focus group to identify and contrast the key issues faced by students and to recognize the main indicators of their experiences; (2) a questionnaire, developed from the indicators generated by the focus group and distributed to reach more participants and gain a deeper understanding of further factors; and (3) a quantitative analysis of exam attempts to provide additional insights into students’ performance while using the e-proctoring tool.

Fig. 1

The research data sources and methods

Focus group

Following recommended focus group methodology (Billups 2012; Suh 2002), a semi-structured question guide was developed by the research team, aiming to identify factors influencing university students’ e-proctored exams and related behaviors. After intensive collaboration with experts with ample focus group experience, the questions were carefully developed using appropriate literature (Billups 2012).

When development was complete, the question guide was tested and revised by the research team and then pilot-tested with a group of ten university students. Because no major changes were needed, the ‘pilot’ discussion results were included in the later analysis (Morgan et al. 1998).

The focus group was conducted via a Zoom meeting in June 2020 with students from two universities in Abu Dhabi, UAE. To ensure diversity of perspectives and opinions, an email was sent to all instructors from both universities asking them to nominate two students from their classes who had participated in the e-proctoring experience to be part of the focus group. After one week, 57 names were received and listed. A Zoom meeting invitation was emailed to the students, asking them to confirm their participation in the meeting the following week. The invitation also asked participants to consent to having the meeting recorded. Confirmation was received from 37 students (16 and 21 from the two universities, respectively). However, only 20 students ultimately attended the meetings, 9 and 11 from each university, respectively.

The aim of the study was clarified at the start of the focus group, and informed consent was verbally affirmed by every participant. The focus group lasted 120 min and was supported by an assistant moderator (observer), who took notes during the discussion and ensured that no participant attempting to add remarks was overlooked.

The question guide contained opening and introductory questions that allowed participants to acclimate and feel connected, and that steered the conversation towards the primary aim of the discussion. Finally, students were asked to share thoughts about their experiences in the e-proctored exams, including any negative perceptions and attitudes. During the focus group discussions, the moderator asked side questions to elicit more information and deeper knowledge about specific topics.

Survey

In accordance with the previous literature and the findings obtained from the focus group, students’ attitudes and concerns about the e-proctoring experience were translated into main factors and indicators, from which an online survey was generated. The survey measured multifaceted themes: e-proctored exam concerns, technical readiness, and attitudes towards integrity, in addition to students’ overall satisfaction with the e-proctoring experience. The survey also included several demographic variables to investigate any possible relationships.

The survey consisted of two sections: the first contained demographic questions on gender, age, marital status, and academic level, while the second was divided into four subsections with a total of 21 questions soliciting students’ attitudes and beliefs. The questions used a three-point Likert scale ranging from Agree (3) to Disagree (1), along with a neutral option (2).

A purposive sample included all students from the information technology field in the two universities who took at least one e-proctored exam during the COVID-19 pandemic; the total number of students was 350 (240 and 110 from each university, respectively). The survey was distributed to the students via email two weeks after completing their online exams.

Exam attempts analysis

Quantitative analysis for exams’ attempts was done to gain insights about students’ performances while using the e-proctoring tool. As there have been several debates on how to measure the academic performance of students (e.g., (Kirschner and Karpinski 2010)), this study adopted the conventional measurement of using the exam score as the main indicator by examining the relationship between demographic variables, exam score, exam duration, and the number of automated reported incidents. The reported incidents were generated by the e-proctor tool based on the exams’ perceived integrity level. Both universities held their exams in May 2020. The exam data was available for the 21 courses offered by the two universities and spanned a total of 815 attempts.

Analysis and results

Focus group findings

The main purpose of the focus group was to generate a list of factors perceived by students as influencing their e-proctored exam experience. Immediately after completing the session, data obtained from the recorded Zoom meeting were transcribed. The collected data were reviewed and organized by the two researchers. To guarantee the reliability of the data, interpretations were conducted by the two researchers independently, and discrepancies were deliberated with an expert colleague in qualitative analysis until agreement was reached. To facilitate the process, criteria of internal homogeneity and external heterogeneity were used (Braun and Clarke 2006); that is, no data can fall between two categories, nor can it fit into more than one category. Using an inductive thematic approach, quotes were examined for repeated occurrences, which were then systematically recognized across the data set and grouped under a manual coding method (Morgan et al. 1998). Similar codes were grouped together into more general main categories, from which the researchers created thematic categories. These categories were then narrowed down to specific e-proctoring indicators used to solicit students’ comments and concerns in a questionnaire.

The students in the focus group expressed diverse opinions about their experiences with their e-proctored online exams. The following extracts taken from students’ discussion on the experience of e-proctoring exams provide an example of their perceptions and attitudes:

  • S1 (male): “The feeling of being watched by a camera makes me anxious and nervous”.

  • S2 (male): “I can’t force my family to stop moving or making noises”.

  • S5 (female): “My reserved family did not accept this type of exam”.

Almost all students agreed regarding their social, psychological, and cultural concerns about e-proctoring.

Technological difficulties while navigating the exam questions also arose during the session. S6 (male) remarked that “the signal of my wifi was very poor sometimes. It affected my exams”.

Overall, students were not in favor of e-proctoring tools, mostly due to privacy concerns. S8 (male) stated: “I wouldn’t let a stranger into my house”. Furthermore, doubts were raised regarding the tool’s value in preserving academic integrity and preventing cheating, and students admitted that occasional cheating did occur. As S11 (female) commented: “We understand the importance of academic integrity, but there should be a balance.” While it is difficult to cheat in proctored exams, students still have many options, such as smartphones, advanced scientific calculators, or even traditional methods.

Four factors and indicators identified by students as being the most influential on their experiences included: e-proctored exam concerns, technical readiness, attitudes towards integrity, and satisfaction with the e-proctoring experience.

Survey statistical analysis results

Out of the 350 students who attempted online exams during the COVID-19 pandemic, 106 responded to the survey. All were eligible for analysis. The internal consistency of the collected data was calculated using Cronbach’s alpha, which was α = 0.79. This result is considered satisfactory and indicates a good level of internal consistency.
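Cronbach’s alpha can be computed directly from the item variances and the variance of the summed scale. The sketch below illustrates the calculation with made-up Likert responses, since the study’s actual survey data are not published:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x questions matrix of Likert scores."""
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative data: 5 respondents answering 4 three-point Likert items
scores = np.array([
    [3, 3, 2, 3],
    [2, 2, 2, 1],
    [3, 2, 3, 3],
    [1, 1, 2, 1],
    [2, 3, 2, 2],
])
print(round(cronbach_alpha(scores), 2))  # → 0.82
```

Values of alpha around 0.7 or higher are conventionally read as acceptable internal consistency, which is why the reported α = 0.79 is described as satisfactory.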

The descriptive statistics obtained from the first section showed that 64% (n = 68) of the sample were male, 85% (n = 90) of students were aged 27 years and below, 83% (n = 88) of students were single, and 39% (n = 42) were in their first academic year (Tables 1 and 2).

Table 1 The analysis of environmental, psychological, cultural, and privacy concerns
Table 2 The descriptive analysis for all the exams’ attempts

In the second section, the participants (n = 106) reported their attitudes and experiences with the four indicators pertaining to the e-proctoring tool. In keeping with Laan et al. (2017), the simple averaging test for a pool of attitudes was used as an ideal way to extract knowledge. One-sample t-tests compare the mean of a single sample against a predetermined value to determine whether the sample mean is significantly greater or less than that value. Since the survey questions used a three-point Likert scale, from Disagree (1) to Agree (3), the predetermined value was the midpoint of the scale, which was 2. Hence, one-sample t-tests were conducted to determine if the average responses for students were significantly greater or less than 2 with p < 0.05. Below is the analysis of the first four research questions:
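The midpoint-comparison procedure described above can be sketched with SciPy’s one-sample t-test. The response vector below is purely illustrative, since the actual survey responses are not public:

```python
import numpy as np
from scipy import stats

# Illustrative responses on the three-point scale (not the study's data):
# 1 = disagree, 2 = neutral, 3 = agree
responses = np.array([3, 3, 2, 3, 1, 3, 2, 3, 3, 2, 3, 3])

# Test whether the mean response differs from the scale midpoint of 2
t_stat, p_value = stats.ttest_1samp(responses, popmean=2)
print(f"mean = {responses.mean():.2f}, t = {t_stat:.3f}, p = {p_value:.4f}")
```

A p-value below 0.05 with a positive t-statistic would indicate agreement significantly above the neutral midpoint, mirroring how the study interprets its per-indicator averages.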

  1. RQ1:

    Were students well-prepared technologically for online exams?

The average of the students’ responses was 3.08 (SD = 0.60), which was significant, t(105) = 18.524, p = 0.00; this indicated that students believed they had good to excellent technical skills. Approximately 77% (n = 81) of the students believed that they were prepared and had the necessary technical skills to take the online exams. As such, the majority of students did not face technical problems, and these issues did not negatively affect their performance or experiences.

In line with the survey statements, the majority (88.7%, n = 94) of students responded to the statement: “My computer skills are good/excellent” (M = 3.29, SD = 0.717), while 77.4% (n = 82) rated their “Internet connection required for the online exams as good/excellent” (M = 3.00, SD = 0.78). Furthermore, 74.6% (n = 79) of students indicated that “getting support, guidance, and training before the e-exam was good/excellent” (M = 2.94, SD = 0.82).

  2. RQ2:

    What are the concerns of students while using the e-proctoring tool?

Four types of concerns were analyzed (environmental, psychological, cultural, and privacy) to address the research question:

  • Environmental concerns

Only about one-third (38.7%, n = 41) of the students agreed that the place where they conducted their exams was comfortable and free from distractions (M = 2.25, SD = 0.68). However, 79.2% (n = 84) of students indicated that their small apartments prevented their family from moving or making noise during the exam (M = 2.14, SD = 0.73), and 81.1% (n = 86) of students agreed that it was too difficult to manage between studying and taking care of themselves and their family during the COVID-19 pandemic (M = 2.14, SD = 0.71).

  • Psychological concerns

The majority of students (77.4%, n = 82) agreed that “fears from COVID-19 reflect negatively on my academic performance”, (M = 2.05, SD = 0.70). Furthermore, 85.8% (n = 91) of students stated that the “E-exam makes me feel more stressed than the paper-based exam” (M = 2.34, SD = 0.71), and 91.6% (n = 97) agreed that “Being watched through the webcam makes me anxious and causes poor performance”, (M = 2.41, SD = 0.64).

  • Cultural concerns

Only 23.6% (n = 25) of students agreed that “using the e-proctoring tool is not acceptable by my culture and family” (M = 1.9, SD = 0.75), with the majority (76.4%, n = 81) disagreeing. The statement, “I think an open webcam during the online exam is insensitive to Islamic traditions and culture”, was equally as divided, with identical percentages of students agreeing and disagreeing with it as in the statement above (M = 1.9, SD = 0.75). Finally, about 40% (n = 43) agreed that their negative attitudes toward the acceptance of webcams were attributable to their reserved families (M = 2.25, SD = 0.71).

  • Privacy concerns

More than 86% of students (n = 92) agreed with the statement, “I feel that opening the webcam during online exams is impractical, and would breach my privacy” (M = 2.29, SD = 0.69). Moreover, 89.6% (n = 95) agreed that “the main concern of using an e-proctoring tool for me was privacy”, (M = 2.25, SD = 0.633), 85.9% (n = 91) agreed that they have “some concerns regarding the recorded videos and pictures of me during my exams” (M = 2.28, SD = 0.70), and 89.6% (n = 95) agreed that they “feel e-proctoring tools are invading my personal life and reducing my learning satisfaction” (M = 2.25, SD = 0.63).

Table 1 indicates that, on average, students had significant concerns over privacy, psychological factors, and environmental indicators. However, no statistically significant difference was noted regarding cultural concerns (p = 0.74). The students’ average response regarding environmental concerns was 2.18 (SD = 0.45), which was significant, t(105) = 4.018, p = 0.00; this indicated that 72% of students (n = 76) believed that their environment affected their testing experience while using e-proctoring tools. Across all participants, around 75% of students (n = 79) expressed privacy concerns over using the e-proctoring tool (M = 2.26, SD = 0.52), t(105) = 5.271, p = 0.00, and the same percentage of students had psychological concerns as well (M = 2.26, SD = 0.50), t(105) = 5.380, p = 0.00.

  3. RQ3:

    Do students believe that using an e-proctoring tool is important for academic integrity?

The average of the students’ responses was 2.28 (SD = 0.45), which was significant, t(105) = 6.55, p = 0.00; this indicated that 76% of students (n = 80) believed in the importance of e-proctoring tools in maintaining academic integrity. Nearly 98% of students (n = 103) agreed on the importance of integrity (M = 2.55, SD = 0.53). Over 90% of students (n = 95) also agreed that using the e-proctoring tool can maintain integrity (M = 2.08, SD = 2.24), and about 80% (n = 85) had a positive attitude towards the e-proctoring tool’s ability to prevent cheating (M = 2.08, SD = 0.67).

  4. RQ4:

    Are students satisfied with the overall experience of using e-proctoring tools in exams?

In general, students’ satisfaction did not show significantly positive attitudes towards the e-proctoring experience (M = 2.00, SD = 0.58), t(105) = 0.083, p = 0.93. Only 21.7% of the students (n = 23) agreed that they were satisfied, while 78.4% (n = 83) disagreed with the statement: “Overall, I am satisfied with the e-proctoring experience”. Regarding the statement, “If using the e-proctoring tool was optional, I would still choose to use it”, students displayed a similarly strong reluctance (77.3%, n = 82; M = 1.93, SD = 0.72).

Exam attempts’ characteristics analysis

In order to understand to what extent using an e-proctoring tool affects students’ academic performance, descriptive data were aggregated and examined for the attempts taken by 350 students in 21 classes across the two universities. Some students took more than one course in the semester; therefore, the number of attempts was 815. These attempts were conducted under the invigilation of an e-proctoring tool in Spring 2020, as shown in the table below. A total of 227 attempts were excluded from the analysis because some data were missing, such as the attempt score, exam duration, or the number of automated reported incidents. Therefore, only 588 attempts were eligible for analysis.

  5. RQ5:

    Does using an e-proctoring tool affect students’ academic performance?

To follow the previous procedure for examining the data, a predetermined value should be designated for testing. Since the full mark for the final exam was 20, the predetermined value for the score was designated to be the value of 10. To answer this research question, two dimensions were examined as follows:

  • Average student scores in online exams invigilated by the e-proctoring tool

The mean exam score for all students was 12.09 (SD = 3.99). A one-sample t-test was conducted to determine the significance of the difference between the predetermined value and the current score average. The overall finding for all courses showed that there was a positively significant difference between the predetermined value and students’ scores. In fact, 49.6% of students’ attempts (n = 291) earned a score above 12.0, which was significant; t(587) = 12.709, p = 0.00.

We also computed the comprehensive mean scores for items aligned with the online exams for each course separately. T-tests revealed positively significant differences for 11 of the courses (p < 0.05), two courses had scores significantly below the average, and eight courses showed insignificant differences (p ≥ 0.05). In general, students’ academic performances seemed acceptable in terms of the pass/no pass criteria followed in all UAE universities during the pandemic. The average scores for each course are listed in Table 3.

Table 3 The students’ average scores in each course separately
  • The relationship between academic performance and the number of automated reported incidents, exam duration, and gender

To further investigate the exam characteristics, a two-way ANOVA test was conducted to investigate the interaction between gender and the number of automated reported incidents with exam score. There was insufficient evidence to support an interaction between gender and the number of automated reported incidents with scoring, F(15) = 0.91, p = .55. However, gender showed significant differences in exam scores, F(1) = 4.00, p = .05. The female students’ attempts (n = 200) showed a better performance indicator (M = 12.507, SD = 3.89) than the male students’ attempts (n = 345, M = 11.898, SD = 4.01). No statistically significant differences were noted between the number of automated reported incidents (F(1, 543) = 0.249, p = 0.62, R2 = 0.00) and exam duration (F(1, 572) = 1.032, p = 0.31, R2 = .002) toward the exam score.

Discussion

Findings from this study have been derived from three investigation tools aimed at a deeper understanding of the factors associated with e-proctored exams. This broad range of tools, along with appropriate design and analytic procedures, provides meaningful knowledge that can lead to better understanding and addressing of these important academic issues. The factors influencing student experience come with a particular emphasis on socio-environmental dimensions, including environmental, psychological, cultural, and privacy concerns.

Responses from students indicated that they were well-prepared technologically for an online learning experience, including the e-proctored exams. This is perhaps unsurprising given that the participants came from technology-related majors, which may have mitigated any negative effect of this factor. However, technical difficulties may still arise regardless of academic background, such as Wi-Fi/connectivity issues or device status, as in the study by Milone et al. (2017). This factor may also still pose an issue for students from other majors.

Considering the second research question, students raised several concerns about their experience with the e-proctoring tool. Environmental and psychological factors were particularly serious concerns for students, in line with previous research (Palmer 2007; Williams and Williams 2011). Feelings of stress and anxiety during the exam, especially given living conditions during the shutdown period, also affected students. This may be because the majority of UAE residents are expatriates who spent the shutdown in various types of apartments rather than being able to take their exams in university facilities. Williams and Williams (2011) stated that the “more comfortable students feel in their environment and learning process, the easier it is to concentrate and achieve good performance”. Therefore, the fewer concerns students have, the more positive the impact on their academic performance and learning effectiveness.

Results from this study have shown that students are mature in how they distinguish between privacy issues and cultural concerns. They also made it clear that using e-proctoring tools does not breach their culture. This is in line with the study by Gómez-Rey et al. (2016), who concluded that culture has an unnoticeable effect on students from different nations. However, students expressed significant concern about privacy invasion, a finding which corroborates Gogus and Saygın (2019). Students’ concerns centered on being monitored via webcam during the exam and on the destination of the recorded videos and pictures. These concerns escalate further given that the e-proctoring tool can seize control of the computer and can not only peer into students’ homes but also observe and interpret their movements throughout the exam. Ultimately, the corresponding psychological concerns over being watched by a webcam contribute to students’ feelings of fear and stress.

Unfortunately, the COVID-19 pandemic mandated the use of e-proctoring technology, causing different problems for both academics and students. One of the most serious concerns for educational institutions is the possibility of cheating. Since it is the university’s responsibility to promote academic integrity (McCabe 2005), schools have resorted to e-proctoring tools to have extra control over online exams. The results of the current study revealed that students believe that academic integrity is an important issue. Previous studies have also stated that student cheating might be associated with beliefs and values more than with situational factors (Roth and McCabe 1995). As the use of e-proctoring tools has been shown to deter cheating (Hylton et al. 2016), the students’ responses were in line with the existing literature.

Furthermore, the current study has shown that students’ overall satisfaction with e-proctoring fell below their expectations. Given that the pandemic itself has disrupted many aspects of people’s lives, this finding is not surprising. Mandatory choices can negatively affect student satisfaction (Owen 2015; Daffin Jr and Jones 2018), so the involuntary switch to e-proctoring was met with resistance. This is consistent with the theory of psychological reactance (Brehm and Brehm 2013), which states that people resist what they perceive as eliminating their choice or freedom, even when what they are reacting against is for their own benefit.

Another aspect of the present investigation was to examine the aggregated data from the exam attempts to explore how gender, exam duration, and the number of automated reported incidents affected students’ scores (i.e., performance). Using the e-proctoring tool did not negatively affect students’ performance overall, with female students achieving statistically higher results than their male counterparts. Although no comparable analysis between e-proctored and un-proctored exams was available, the results contradicted the findings of Hylton et al. (2016) and Milone et al. (2017), who stated that proctoring tools adversely impacted how students performed on tests. In general, the results indicated that students’ academic performance was not undermined by the use of the e-proctoring tool.

This study had several limitations and involved complex linkages between students, universities, and policymakers. The study was not designed to compare results and students’ perceptions across different situations; its cross-sectional nature prevented any comparison between e-proctored and conventionally proctored exams in terms of inputs and outputs. Despite an appropriate sampling plan, the sample was limited to technology-related majors at only two universities. Many factors that interfered with the exams during the pandemic were beyond the control of this study and therefore could not be addressed within it.

Conclusion and recommendations

E-proctoring tools already exist to allow fully remote proctoring without the physical co-presence of students and examiners. However, the need to use e-proctored exams during the COVID-19 pandemic presented an unprecedented challenge to the entire community. Educational institutions were forced to cope with the pandemic’s requirements while balancing the quality of teaching with maintaining the necessary educational processes. Thus, this study has sought to determine the effect of related e-proctoring factors in shaping students’ experiences with this new automated process. These factors encompass technical, environmental, psychological, cultural, and privacy concerns, as well as other academic issues. By considering this list of influencing factors before adopting e-proctoring tools, education systems will enhance the likelihood of success with this new form of technology.

Despite reporting serious concerns about their overall experience with e-proctoring tools (e.g., privacy, environmental, and psychological concerns), the majority of students scored above average on their online exams. Academic integrity also seemed to matter to students as well as to academics. However, most students were not satisfied with the e-proctoring experience and would not continue with it if given the option. Based on their performance, it seems that students coped with the negative side of the experience because of the benefits it offered during the pandemic. Finally, this study contributes important evidence to the academic body of knowledge by highlighting the most prominent student concerns about the use of e-proctored exams. The insights from this study can help minimize difficulties and relieve students’ concerns about the technology, highlighting the need to weigh the benefits for integrity against the convenience of implementation, as well as enhancing the academic understanding of the challenges students face.

The COVID-19 pandemic has tested the readiness of educational institutions to cope with extraordinary health and social conditions. Several critical observations about e-proctoring tools can be made. First, e-proctoring cannot fully replace the traditional, in-person proctoring experience; as such, these online systems can serve as a supplementary, short-term option for schools during sudden, critical situations. Second, the COVID-19 pandemic forced universities to modify their assessments and communication tools to counter the crisis. Third, universities should exploit the capabilities gained during this technological transition to move toward new horizons of learning and education. To that end, assessments, including online exams, should be designed so that they both challenge students and inspire curiosity and creativity going forward.

References

  1. Althubaiti, A. (2016). Information bias in health research: Definition, pitfalls, and adjustment methods. Journal of Multidisciplinary Healthcare, 211. https://doi.org/10.2147/JMDH.S104807.

  2. Aparicio, M., Bacao, F., & Oliveira, T. (2016). Cultural impacts on e-learning systems’ success. The Internet and Higher Education, 31, 58–70. https://doi.org/10.1016/j.iheduc.2016.06.003.

  3. Arnold, I. J. M. (2016). Cheating at online formative tests: Does it pay off? The Internet and Higher Education, 29, 98–106. https://doi.org/10.1016/j.iheduc.2016.02.001.

  4. Billups, F. D. (2012). Conducting focus groups with college students: Strategies to ensure success. Association for Institutional Research-Professional File, 127, 1–12. Retrieved from https://scholarsarchive.jwu.edu/research_methodology/2

  5. Boitshwarelo, B., Reedy, A. K., & Billany, T. (2017). Envisioning the use of online tests in assessing twenty-first century learning: A literature review. Research and Practice in Technology Enhanced Learning, 12, 16. https://doi.org/10.1186/s41039-017-0055-7.

  6. Bovermann, K., & Bastiaens, T. J. (2020). Towards a motivational design? Connecting gamification user types and online learning activities. Research and Practice in Technology Enhanced Learning, 15, 1. https://doi.org/10.1186/s41039-019-0121-4.

  7. Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3, 77–101. https://doi.org/10.1191/1478088706qp063oa.

  8. Brehm, S. S., & Brehm, J. W. (2013). Psychological reactance: A theory of freedom and control. New York: Academic Press.

  9. Burgess, S., & Sievertsen, H. H. (2020). Schools, skills, and learning: The impact of COVID-19 on education. VoxEu.org. https://voxeu.org/article/impact-covid-19-education. Accessed 12 Feb 2021.

  10. Daffin Jr., L. W., & Jones, A. A. (2018). Comparing student performance on proctored and non-proctored exams in online psychology courses. Online Learning, 22, 131–145. https://doi.org/10.24059/olj.v22i1.1079.

  11. Drew, H. (2020). Mass school closures in the wake of the coronavirus are driving a new wave of student surveillance. Washington Post. https://www.washingtonpost.com/technology/2020/04/01/online-proctoring-college-exams-coronavirus. Accessed 15 Jun 2020.

  12. ECT. (2020). ECT Implementation of MOE Ministerial Decree (237) 07 04 2020 | Emirates College of Technology. https://ect.ac.ae/en/professor-sabouni-president-weekly-message-sunday-april-05-2020-week-3-arabic-version-2-2/. Accessed 23 Aug 2020.

  13. Gloria, A. M., & Ho, T. A. (2003). Environmental, social, and psychological experiences of Asian American undergraduates: Examining issues of academic persistence. Journal of Counseling and Development, 81, 93–105. https://doi.org/10.1002/j.1556-6678.2003.tb00230.x.

  14. Godinho, V. (2020). All UAE schools, universities to extend e-learning programmes until June. Gulf Business. https://gulfbusiness.com/uae-schools-universities-extend-e-learning-programmes-june/. Accessed 12 Feb 2021.

  15. Gogus, A., & Saygın, Y. (2019). Privacy perception and information technology utilization of high school students. Heliyon, 5, e01614. https://doi.org/10.1016/j.heliyon.2019.e01614.

  16. Gómez-Rey, P., Barbera, E., & Fernández-Navarro, F. (2016). The impact of cultural dimensions on online learning. Educational Technology & Society, 19, 225–238.

  17. González-González, C. S., Infante-Moro, A., & Infante-Moro, J. C. (2020). Implementation of E-proctoring in online teaching: A study about motivational factors. Sustainability, 12, 3488. https://doi.org/10.3390/su12083488.

  18. Hollister, K. K., & Berenson, M. L. (2009). Proctored versus unproctored online exams: Studying the impact of exam environment on student performance. Decision Sciences Journal of Innovative Education, 7, 271–294. https://doi.org/10.1111/j.1540-4609.2008.00220.x.

  19. Hylton, K., Levy, Y., & Dringus, L. P. (2016). Utilizing webcam-based proctoring to deter misconduct in online exams. Computers in Education, 92–93, 53–63. https://doi.org/10.1016/j.compedu.2015.10.002.

  20. Janson, A., Söllner, M., & Leimeister, J. M. (2017). Individual appropriation of learning management systems—Antecedents and consequences. AIS Transactions on Human-Computer Interaction, 9, 173–201.

  21. King, C., Guyette, R., & Piotrowski, C. (2009). Online exams and cheating: An empirical analysis of business students’ views. Journal of Education, 6, n1. https://doi.org/10.9743/JEO.2009.1.5.

  22. Kirschner, P. A., & Karpinski, A. C. (2010). Facebook® and academic performance. Computers in Human Behavior, 26, 1237–1245. https://doi.org/10.1016/j.chb.2010.03.024.

  23. Kumi-Yeboah, A., Dogbey, J., & Yuan, G. (2017). Online collaborative learning activities: The perspectives of minority graduate students. Online Learning, 21. https://doi.org/10.24059/olj.v21i4.1277

  24. Laan, A., Madirolas, G., & de Polavieja, G. G. (2017). Rescuing collective wisdom when the average group opinion is wrong. Frontiers in Robotics and AI, 4, 56. https://doi.org/10.3389/frobt.2017.00056.

  25. Majeed, A., Baadel, S., & Haq, A. U. (2016). Global triumph or exploitation of security and privacy concerns in e-learning systems. In H. Jahankhani et al. (Eds.), Global Security, Safety and Sustainability - The Security Challenges of the Connected World. ICGS3 2017. Communications in Computer and Information Science, vol 630. Springer, Cham. https://doi.org/10.1007/978-3-319-51064-4_28.

  26. McCabe, D. L. (2005). Cheating among college and university students: A North American perspective. International Journal for Educational Integrity, 1. https://doi.org/10.21913/IJEI.v1i1.14

  27. Milone, A. S., Cortese, A. M., Balestrieri, R. L., & Pittenger, A. L. (2017). The impact of proctored online exams on the educational experience. Currents in Pharmacy Teaching & Learning, 9, 108–114. https://doi.org/10.1016/j.cptl.2016.08.037.

  28. Morgan, D. L., Krueger, R. A., & Scannell, A. U. (1998). Planning focus groups. Thousand Oaks: Sage Publications.

  29. Owen, P. M. (2015). Maximizing student motivation: A course redesign. Procedia - Social and Behavioral Sciences, 186, 656–659. https://doi.org/10.1016/j.sbspro.2015.04.097.

  30. Palmer, D. (2007). What is the best way to motivate students in science? Teaching Science, 53, 38–42.

  31. Roth, N. L., & McCabe, D. L. (1995). Communication strategies for addressing academic dishonesty. Journal of College Student Development, 36, 531–541.

  32. Seaman, J. E., Allen, I. E., & Seaman, J. (2018). Grade increase: Tracking distance education in the United States. Babson Survey Research Group, MA, USA.

  33. Suh, J. (2002). Estimation of non-market forest benefits using choice modelling. In S. R. Harrison, J. L. Herbohn, & E. O. Mangaoang (Eds.), Socio-Economic Research Methods in Forestry: A Training Manual (pp. 149–164). Leyte State University, ViSCA, Baybay, The Philippines.

  34. The National. (2020). Coronavirus: UAE schools to close for a month. https://www.thenational.ae/uae/education/coronavirus-uae-schools-to-close-for-a-month-1.987668. Accessed 23 Oct 2020.

  35. Vayre, E., & Vonthron, A.-M. (2019). Relational and psychological factors affecting exam participation and student achievement in online college courses. The Internet and Higher Education, 43, 100671. https://doi.org/10.1016/j.iheduc.2018.07.001.

  36. Wesley Schultz, P. (2001). The structure of environmental concern: Concern for self, other people, and the biosphere. Journal of Environmental Psychology, 21, 327–339. https://doi.org/10.1006/jevp.2001.0227.

  37. Williams, K., & Williams, C. (2011). Motivation, five key ingredients for improving student. Research in Higher Education Journal, 12, 104–122.

  38. Yeh, Y.-C., Kwok, O.-M., Chien, H.-Y., et al. (2019). How college students’ achievement goal orientations predict their expected online learning outcome: The mediation roles of self-regulated learning strategies and supportive online learning behaviors. Online Learning, 23, 23–41. https://doi.org/10.24059/olj.v23i4.2076.

Availability of data and material

The data are confidential to the participating educational institutions. The authors sought approval to use the data without revealing it to the public.

Code availability

All analyses were performed with SPSS V14. No custom code was used.

Funding

This work was not supported by any funding.

Author information

Affiliations

Authors

Corresponding author

Correspondence to Faten F. Kharbat.

Ethics declarations

Conflicts of interest/competing interests

The authors report no conflicting or competing interests in relation to this work.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Kharbat, F.F., Abu Daabes, A.S. E-proctored exams during the COVID-19 pandemic: A close understanding. Educ Inf Technol (2021). https://doi.org/10.1007/s10639-021-10458-7

Keywords

  • E-proctoring
  • Online exams
  • COVID-19
  • Environmental and psychological factors