
Research articles

An Evaluation of Online Proctoring Tools

Authors:

Mohammed Juned Hussein, The University of the South Pacific, FJ
Javed Yusuf, The University of the South Pacific, FJ
Arpana Sandhya Deb, The University of the South Pacific, FJ
Letila Fong, The University of the South Pacific, FJ
Som Naidu, The University of the South Pacific, FJ

Abstract

COVID-19 is hastening the adoption of online learning and teaching worldwide, and across all levels of education. While many typical learning and teaching transactions, such as lecturing and communicating, are easily handled by contemporary online learning technologies, others, such as assessing learning outcomes with closed-book examinations, are fraught with challenges. Chief among these is the ability of teachers and educational organizations to ensure academic integrity in the absence of a live proctor when an examination is being taken remotely from a private location. A number of online proctoring tools are appearing on the market that purport to offer solutions to some of the major challenges. For the moment, however, they remain largely untried and untested at scale, and questions remain about the cost of the service and their technical requirements. This paper reports on one of the first attempts to properly evaluate a selection of these tools and offers recommendations for educational institutions. The investigation, which was carried out at the University of the South Pacific, comprised a four-phased approach, starting with desk research followed by pilot testing with a group of experts as well as students. The elimination of a tool in each phase was based on a ‘survival of the fittest’ approach, with each phase building upon the milestones and deliverables of the previous phase. This paper presents the results of this investigation and discusses its key findings.

How to Cite: Hussein, M. J., Yusuf, J., Deb, A. S., Fong, L., & Naidu, S. (2020). An Evaluation of Online Proctoring Tools. Open Praxis, 12(4), 509–525. DOI: http://doi.org/10.5944/openpraxis.12.4.1113
Submitted on 24 Apr 2020 | Accepted on 30 Oct 2020 | Published on 31 Dec 2020

Introduction

Interest in online learning and teaching has been on the rise for some time and has been accelerated by the COVID-19 pandemic. Many of these online courses also include online assessment activities, which raise a number of issues and challenges in relation to plagiarism and academic integrity as a whole. One way of coping with some of these challenges is the adoption of online proctoring tools for online assessments.

Online proctoring involves the use of virtual tools for monitoring student activities during assessment activities. These tools (as they continue to overcome their limitations) have the potential to enable students to take an online exam at a remote location while ensuring the integrity (security and trustworthiness) and reliability of the online exam. This includes authenticating the student’s identity to secure and maintain the integrity of an exam and its administration (Foster & Layman, 2013).

Online proctoring has two major components. First, a web camera on the student’s computing device is activated to video-record the physical learning space and everything the student does during the examination period. The examiner or proctor is able to remotely monitor this video recording and to identify potential cheating and suspicious movements or postures, such as talking to someone in the room, or looking at a book, mobile device, or other printed media for answers. Second is lockdown, which prevents students from using any other computer applications, including the Internet browser, and blocks user-computing operations (such as copying, pasting or printing) that could lead to cheating during the exam. This is commonly referred to as “computer or browser lockdown” (Alessio et al., 2017). The proctoring system also records all the student’s Internet activities during the exam, such as websites the student tried to access. The video recording of the entire exam is made available for review by the instructors or examiners, either in real time or afterwards.
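To make the lockdown component concrete, here is a minimal, illustrative sketch in TypeScript of the page-level restrictions a lockdown component might apply. The event names are standard Web APIs, but the overall design is an assumption for illustration, not any vendor’s implementation.

```typescript
// Illustrative sketch only: a few browser-side restrictions of the kind a
// "browser lockdown" component might apply during an exam.

function applyBasicLockdown(): void {
  // Block copy, paste, and cut inside the exam page.
  for (const evt of ["copy", "paste", "cut"] as const) {
    document.addEventListener(evt, (e) => e.preventDefault());
  }

  // Disable the right-click context menu.
  document.addEventListener("contextmenu", (e) => e.preventDefault());

  // Log an incident whenever the exam tab loses focus (e.g. the student
  // switches to another application or browser tab).
  document.addEventListener("visibilitychange", () => {
    if (document.visibilityState === "hidden") {
      console.warn(`Incident: exam tab hidden at ${new Date().toISOString()}`);
    }
  });
}

applyBasicLockdown();
```

Page-level restrictions such as these are easy to defeat, which is one reason commercial tools ship as browser extensions (as Proctorio does, via its Chrome Extension) or as native applications with deeper system hooks.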

There are four major features of online proctoring systems:

  1. Authentication: ensuring that the registered student is the valid student taking an online proctored exam;
  2. Browsing tolerance: setting limits on the student’s ability to use their computer for other tasks;
  3. Remote authorization and control: enabling the proctor to start, pause and end an online proctored exam, as well as flag any suspicious student behaviours; and
  4. Report generation: creating reports of a student’s activities during a proctored exam.
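As a rough illustration of how these four features map onto the data a proctoring system might carry, the following is a hypothetical TypeScript shape; every name in it is invented for this sketch and describes no particular product.

```typescript
// Hypothetical data shape for a proctoring session, mirroring the four
// features above. All field names are invented for illustration.

interface Incident {
  timestamp: string;          // ISO 8601
  kind: string;               // e.g. "tab-switch", "second-face"
  flaggedByProctor: boolean;
}

interface ProctoringSession {
  authentication: { studentId: string; verified: boolean };                 // feature 1
  browsingTolerance: { browserLockdown: boolean };                          // feature 2
  remoteControl: { canStart: boolean; canPause: boolean; canEnd: boolean }; // feature 3
  incidents: Incident[];                                                    // feeds feature 4
}

// Report generation (feature 4) then reduces to serialising the incident log.
function generateReport(session: ProctoringSession): string {
  return session.incidents
    .map((i) => `${i.timestamp} ${i.kind}${i.flaggedByProctor ? " [flagged]" : ""}`)
    .join("\n");
}
```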

Generally, there are three types of online exam proctoring:

  1. Live proctoring: This is real-time proctoring taking place during the exam, with a human proctor monitoring/supervising the exam virtually, online. The human proctors are usually trained professionals who verify the authenticity of the student and look for any red flags, such as suspicious eye or facial movements or the appearance of any unverified device, that could indicate possible cheating (Gautam, 2017). This requires exams to be scheduled at a specific time, depending on the availability of the proctor on a given date and time. It involves as much human effort as traditional in-person exam supervision. However, unlike in-person supervision, online proctoring requires competence in the use of technology, and as such much closer vigilance over the approaches of online proctors is required (Mitra & Gofman, 2016).
  2. Recorded proctoring: This involves video recording of camera images and logs of the student taking an online proctored exam; the proctor reviews the recording at a later time and assesses the integrity of the exam (i.e. whether or not any fraud/cheating was committed during the exam by the examinee). This allows students to take an exam at any time, and hence allows multiple exams to take place simultaneously. However, this too requires human intervention for reviewing the recordings, which can be expensive and difficult to scale.
  3. Automated proctoring: In automated proctoring, human proctors do not monitor (or review) the entire exam; instead, the proctoring system identifies key events of possible fraud or cheating. The proctor is alerted to review these events and determine whether fraud or cheating has been committed by the student (Sietses, 2016). This form of online proctoring is generally considered more convenient for students, as there are no schedule, location or human-proctor constraints. It is also very scalable, as the human component is replaced by artificial intelligence or algorithms, and is hence considered more cost-effective (Jose, 2016). However, students’ familiarity with this kind of system may spawn strategies for evading its fraud detection. This form of proctoring can also easily produce false positives, flagging innocent events as potential fraud (Sietses, 2016).
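The following is a minimal sketch, in TypeScript, of the kind of rule an automated proctor might apply: flag any interval in which no face was detected for longer than a threshold. The event shape and the ten-second default are assumptions for illustration, not a description of any product’s algorithm.

```typescript
// Flag intervals during which no face was detected for longer than a
// threshold. Event shape and threshold are illustrative assumptions.

interface FrameEvent {
  timestampMs: number;
  faceDetected: boolean;
}

function flagFaceAbsences(
  events: FrameEvent[],
  thresholdMs = 10_000
): Array<[number, number]> {
  const flags: Array<[number, number]> = [];
  let absentSince: number | null = null;

  for (const e of events) {
    if (!e.faceDetected && absentSince === null) {
      absentSince = e.timestampMs; // absence begins
    } else if (e.faceDetected && absentSince !== null) {
      if (e.timestampMs - absentSince >= thresholdMs) {
        flags.push([absentSince, e.timestampMs]); // long enough to review
      }
      absentSince = null;
    }
  }
  return flags;
}
```

A rule this blunt also shows where false positives come from: a student leaning out of frame to pick up a pen is flagged exactly like one leaving the room.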

There are many online proctoring systems available that offer the three types of online exam proctoring services mentioned above. However, institutions choosing and implementing an online exam proctoring system need to consider several factors first. These include (but are not limited to): ease and flexibility of integration with the existing institutional learning management system; technical performance and robustness of the proctoring system (sometimes over low Internet bandwidth, poor hardware capabilities or electrical power failures); level of efficient task automation; and reporting capabilities. Privacy protection and management, security and anti-fraud measures, and the associated costs are other key issues that need to be examined when considering an online proctoring system (Sietses, 2016).

The research that is reported in this paper sought to:

  1. Identify online proctoring systems;
  2. Test and evaluate selected online proctoring systems;
  3. Select and run a mock trial of the preferred online proctoring system; and
  4. Develop procedures and guidelines for online exam proctoring.

Literature Review

Interest in the affordances of technology for learning and teaching is on the rise, leading to a growing interest in online learning and teaching. When used effectively, online learning can provide higher education institutions with flexible options to expand their offerings into the global market (Casey, 2008). However, as institutions continue to grow their online education, there is a commensurate rise in concerns about how best to ensure academic integrity (Barnes & Paris, 2013). The distance, or flexibility, between students and instructors in an online learning environment may, in fact, contribute to the challenges of maintaining the integrity of online assessment. As Hollister and Berenson (2009) highlighted, “the most commonly reported challenge in online assessment is how to maintain academic integrity”. While proctored exams remain a common tool for assessing student learning, ways of facilitating them continue to evolve, from online exams facilitated via learning management systems (LMS) to other online testing platforms (Prisacari & Danielson, 2017). This has raised both academic and non-academic issues, such as designing and administering online exams and monitoring students’ behaviour during exams (Cramp et al., 2019). These behaviours include dishonest and unethical practices by students, such as cheating and fraud.

In their study, King et al. (2009) reported that the majority of students surveyed felt that cheating was easier in an online environment than in a traditional face-to-face classroom. Similarly, Berkey and Halfond (2015) reported that 84% of the students surveyed in their study agreed that student dishonesty in online test-taking was a significant issue. In a study of 635 students, Watson and Sottile (2010) noted that students indicated they would be more than four times as likely to cheat in an online class. Several other studies also found higher rates of cheating online (Lanier, 2006; Harmon & Lambrinos, 2008; Grijalva et al., 2006) and a greater prevalence of cheating online than in a face-to-face environment (Etter et al., 2006; Watson & Sottile, 2010).

Ensuring and maintaining academic honesty and integrity in any learning environment is vital. In the context of an online learning environment, Moten et al. (2013) explained that students work independently with relative autonomy and anonymity, and instructors may be uncertain who is taking exams or how best to validate student learning. Online learning must therefore address issues and challenges of honesty and integrity in student assessment and evaluation. Online proctoring is one way to address this challenge. With technology-based aides such as computer/system lockdowns, keystroke monitoring, the ability to stop/start a test, and many other assistive proctoring processes (Foster & Layman, 2013) now easily integrated into the monitoring process, online proctoring has become a viable solution.

Moreover, online proctoring offers both instructors and students other significant advantages. Kinney (2001) noted that online proctoring is a valuable option for students who are geographically dispersed across time zones. Several studies (such as Bedford et al., 2009; Harmon et al., 2010; Rose, 2009; Watson & Sottile, 2010) found that, compared with traditional face-to-face settings, the technologies associated with monitoring online examinations can provide better exam security and integrity. Karim et al. (2014) found that the use of remote online proctoring decreases instances of student cheating. Similarly, Kolski and Weible (2019) posited that the importance of academic integrity can be reinforced when students are aware that instructors review their recorded exam sessions. Likewise, Tao and Li (2012) highlighted that online proctoring reduces the instructional time dedicated to testing, allowing instructors and students to engage more with the course content.

However, there are mixed findings in terms of student performance in online-proctored exams. Schultz et al. (2007) reported that students who took non-proctored online exams scored significantly higher than those in proctored settings. Alessio et al. (2017), Richardson and North (2013), Wellman and Marcinkiewicz (2004) and Carstairs and Myors (2009) reported the same pattern, with non-proctored test scores significantly better than proctored test scores in their respective studies. However, other studies (such as Ladyshewsky, 2015; Yates & Beaudrie, 2009; Beck, 2014) found no significant difference between test scores in proctored versus non-proctored online tests.

For institutions, selecting a fit-for-purpose online exam proctoring technology can be challenging. While there are not many studies on how institutions have selected and integrated online proctoring systems, Brown (2018) describes three factors that can affect the selection of an online exam proctoring solution: cost, security, and instructor and student comfort with the use of the technology, highlighting that involving faculty in the selection of the online proctoring technology would be beneficial. She further identifies technology support staff, teaching staff and students as the three most important stakeholders in the selection of an institution’s fit-for-purpose online exam proctoring technology (Brown, 2018).

Moreover, Foster and Layman (2013) developed a comparison matrix that describes online proctoring functionality, and compares that functionality across various online proctoring services/products such as proctoring features (human-proctor availability, data transfer encryption, proctor management, recorded review, automated proctoring, incident logs, etc.), lockdown features (browser lockdown, computer operations lockdown, keystroke alerts, etc.), authentication options (facial recognition, photo comparison, keystroke analytics, biometrics, etc.) and webcam features (camera view angles, panning, etc.). This matrix could be useful for institutions in the process of identifying and selecting the right online exam proctoring system.
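To illustrate how such a comparison matrix could be put to work in a selection process, here is a small, hypothetical TypeScript encoding with a filter for must-have features; the tool names and feature values are placeholders rather than data from Foster and Layman (2013).

```typescript
// Hypothetical encoding of a feature-comparison matrix. Tool names and
// values are placeholders for illustration.

type Feature =
  | "browserLockdown"
  | "liveProctors"
  | "facialRecognition"
  | "moodleIntegration";

const matrix: Record<string, Record<Feature, boolean>> = {
  ToolA: { browserLockdown: true, liveProctors: true, facialRecognition: false, moodleIntegration: true },
  ToolB: { browserLockdown: true, liveProctors: false, facialRecognition: true, moodleIntegration: true },
};

// Shortlist the tools that satisfy every must-have feature.
function shortlist(mustHave: Feature[]): string[] {
  return Object.entries(matrix)
    .filter(([, features]) => mustHave.every((f) => features[f]))
    .map(([name]) => name);
}

console.log(shortlist(["browserLockdown", "moodleIntegration"])); // ["ToolA", "ToolB"]
```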

The purpose of this investigation has been to add to this body of literature with a preliminary investigation, identification, and selection of an online proctoring solution, specifically addressing the following research questions:

  1. Which are the most prominent online proctoring systems?
  2. How effective and efficient are they for wide-scale adoption in higher education settings?
  3. What are the recommended procedures and guidelines for online exam proctoring?

Methodology

This investigation was carried out at USP (The University of the South Pacific), a regional university that is owned and governed by twelve nations of the southwest Pacific region: the Cook Islands, the Republic of Fiji, Kiribati, Marshall Islands, Nauru, Niue, Solomon Islands, Tokelau, Tonga, Tuvalu, Vanuatu and Samoa. The University has campuses in all of the member countries. Its main campus is located in Suva, the Republic of Fiji, where the majority of its academic Schools are based, with two exceptions: the School of Agriculture and Food Technology, which is situated at the Alafua Campus in Samoa, and the School of Law at the Emalus Campus in Vanuatu. The USP region spreads across 33 million square kilometres of ocean, an area three times the size of Europe, with a total land mass about the size of Denmark. Populations in the region vary from 2,000 in Tokelau to more than 800,000 in the Republic of Fiji. For a region so widely spread and sparsely populated, online learning and teaching methods, including flexible approaches to the assessment of learning, have had to feature prominently in the University’s educational operations.

The adoption of flexible approaches to the assessment of learning has required a thorough investigation of contemporary online proctoring tools. A four-phased approach was adopted as part of this process. The elimination of a system in each phase was based on a ‘survival of the fittest’ approach, with each phase building upon the milestones and deliverables of the previous phase, as shown in Table 1.

Table 1

Project phases 1 to 4

Phase 1: Identify popular online proctoring systems.
Activities: (i) Desk-based research of popular online proctoring tools; (ii) review existing research on online exam proctoring; (iii) evaluate selected systems for further review and evaluation.
Milestones/Deliverables: Research on possible systems; 3 systems selected for further review and evaluation.

Phase 2: Evaluate selected systems.
Activities: (i) Develop requirements and a matrix for evaluation; (ii) trial and evaluate the 3 systems as per the requirements and evaluation matrix; (iii) select one system for the mock trial.
Milestones/Deliverables: Requirements and evaluation matrix completed; one system selected for the mock trial.

Phase 3: Further understand the functionalities of the selected system and prepare for the mock trial.
Activities: (i) Buy licences to use the system; (ii) develop quick guides for students and teachers covering the key functionalities of the system; (iii) develop tests, identify mock-trial students and train them to use the system as exam-takers; (iv) make further use of the system and note how key functionalities operate.
Milestones/Deliverables: Quick guides developed; mock-trial students identified and trained; mock tests and hacks developed.

Phase 4: Mock trial of the selected system with the identified students.
Activities: (i) Carry out the mock trial and evaluate the results and experiences; (ii) discuss student feedback; (iii) further review the selected system after the mock trial; (iv) if necessary, undertake a second mock trial; (v) develop guidelines; (vi) develop the final report.
Milestones/Deliverables: Mock trial completed; final report and guidelines developed.

Phase 1 – Desk Research

Phase 1 comprised rigorous desk-based research into possible online exam proctoring systems. The systems were reviewed, and popular online proctoring systems in use at other universities were selected. Phase 1’s elimination criteria were the following:

  1. Moodle LMS integration capability.
  2. Frequency of security updates (by the system/service provider).
  3. Costing (what type of costing model the system/service uses).
  4. Whether the system is cloud-based or needs physical servers, etc.
  5. Proprietary or open-source system/service.
  6. Proctoring type (live/recorded/AI-automated).
  7. How the system handles privacy issues.
  8. Peripheral requirements (hardware, etc.).

After the desk-based research and review, the following eight systems were identified for further reviewing/testing: ProctorU, Kryterion, Respondus, BVirtual, AIProctor, ProctorU Open Source, Examity and Proctorio.

Phase 2: Evaluation

The selected systems went through a thorough evaluation process. The primary considerations were: the infrastructure the system uses, licensing, end-user support, user verification, frequency of updates, costing models, the privacy policy around recordings, the type of proctoring services offered, and integration with Moodle. From the outset, the capabilities of each of these systems were as follows:

  1. ProctorU (cloud-based, proprietary licence, live proctoring, authentication needed).
  2. Kryterion (cloud-based, proprietary licence, live proctoring, authentication needed).
  3. Respondus (cloud-based, automated proctoring, 1,000 seats/USD 4,000).
  4. BVirtual (cloud-based, live/recorded/automated proctoring).
  5. AIProctor (cloud-based, artificial intelligence (AI) proctoring).
  6. ProctorU Open Source (based on ProctorU).
  7. Examity (cloud-based, live/recorded/automated proctoring, regular updates).
  8. Proctorio (cloud-based, recorded/automated proctoring, can be integrated with Moodle).

Although the plan in Phase 2 was to select the three best systems, we ended up with five equally strong systems for further review and evaluation: ProctorU, Respondus, AIProctor, ProctorU Open Source and Proctorio. The evaluation in Phase 2 was based on licences, functionalities, the types of proctoring services offered, and integration capabilities with Moodle as the learning management system. A sketch of how such criteria can be weighted and scored appears below.
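As an illustration of how an evaluation matrix of this kind can be turned into a ranking, the following TypeScript sketch computes a weighted score per system; the criteria, weights and ratings are invented for the example and are not the values the team used.

```typescript
// Hypothetical weighted-criteria scoring behind an evaluation matrix.
// Weights and ratings are made up for illustration.

type Criterion = "moodleIntegration" | "cost" | "privacy" | "proctoringTypes";

const weights: Record<Criterion, number> = {
  moodleIntegration: 0.4,
  cost: 0.3,
  privacy: 0.2,
  proctoringTypes: 0.1,
};

// Each rating is on a 1-5 scale; the result is a weighted average.
function score(ratings: Record<Criterion, number>): number {
  return (Object.keys(weights) as Criterion[]).reduce(
    (sum, c) => sum + weights[c] * ratings[c],
    0
  );
}

console.log(score({ moodleIntegration: 5, cost: 4, privacy: 3, proctoringTypes: 4 })); // 4.2
```

Weighting makes trade-offs explicit: a system weak on a heavily weighted criterion such as Moodle integration cannot be rescued by strengths elsewhere.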

Phase 3: Further Evaluation

In-depth research and review were carried out for each of the five selected systems. Since ProctorU Open Source required more time to set up and test than our project/research timeline allowed, the team decided to drop it from further testing (Table 2).

Table 2

Evaluation matrix used in Phase 3

Proctoring Features ProctorU Respondus Proctorio AIProctor
Live human proctors available Yes No No No
Internet required Yes Yes Yes Yes
Secure/encrypted transferring of data Yes Yes Yes Yes
Student able to book exam time Yes Yes No Yes
Training provided Yes n/a Yes Yes
Proctoring provider certified Yes n/a Yes Yes
Students can interact with proctors Yes n/a Yes Yes
Student can message issues to proctors Yes n/a Yes No
Students get live exam instructions Yes n/a Yes No
Proctor able to see student’s screen Yes n/a Yes Yes
Option to stop proctor viewing student’s screen No Yes n/a No
Recorded video reviewing option No Yes Yes No
Pause test/ cancel test No n/a Yes No
Automated proctoring No Yes Yes No
Keystroke checking No Yes Yes No
Audio recording No No Yes No
Browser lockdown No Yes Yes No
Authentication option Yes Yes Yes Yes
Web camera needed Yes Yes Yes Yes
Log reports No Yes Yes No
Recording storage option Yes Yes Yes Yes
Test review option No Yes Yes No
Incident logs with date & time No Yes Yes No
Customising options for institution No Yes Yes No
Lockdown Features
Available on both Windows and Mac Yes Yes Yes Yes
Plugin for browser No Yes Yes No
Disables control options in the browser No Yes Yes No
Stops navigation (forward/back) No Yes Yes No
Stops concurrent tests No Yes Yes No
Stops right clicks using mouse No Yes Yes No
Stops printing No Yes Yes No
Hides taskbar No Yes Yes No
Hides desktop No Yes Yes No
Stops minimising window No Yes Yes No
Stops maximising window No Yes Yes No
Stops copying & pasting No Yes Yes No
Stops other applications No Yes Yes No
Stops starting of other applications No Yes Yes No
Authentication options
User required to authenticate Yes Yes Yes Yes
Username provided/required Yes Yes No Yes
Password provided/required Yes Yes No Yes
Student ID required Yes Yes Yes Yes
Keystroke analytics No No Yes No
Ability to do facial recognition No No Yes No
Ability to do voice recognition No No Yes No
Fingerprint scanning required No No No No
Iris scanner required/available No No No No
Webcam Features
Web camera required Yes Yes Yes Yes
Room panning allowed Yes Yes Yes Yes

Selections for mock trial

In the end, Proctorio appeared more favourable than ProctorU, and its costing model was also better: ProctorU charged an hourly rate for each exam, whereas Proctorio charges an annual fee per student with an unlimited number of online exams. Hence, Proctorio (2019) was selected for the proctoring trials.

Phase 4: Mock trials

i. Mock trials with staff

A mock-proctored online test was prepared, and the research team members attempted the test and tried cheating, for example by using a mobile phone, opening a new browser tab, talking to someone in the room, looking at notes in a book, and looking away from the screen. The incident reports were recorded and discussed with the experts from Proctorio via a Zoom meeting. The first mock-trial team also included two staff from the Learning Systems team at USP to examine the technical aspects of the testing.

Using the convenience sampling method, another proctored online test was prepared, and Learning Designers, Educational Technologists, Electronic Publishers, Lecturers and Tutors based at the University’s regional campuses in Lautoka, Labasa, Samoa and Tonga were requested to attempt the test (n=34). This was a voluntary activity. After the test was attempted, the team held discussions with the participants, who were asked to share their experiences. This gave the team a starting point for the mock trials with students, surfacing issues such as how to install the Proctorio plugin, using nComputing machines, how to read an incident report, how to restart a test, and how to get technical support from Proctorio.

During the regional testing, the staff (Lecturers and Tutors) took part in a face-to-face focus group discussion where they shared their experiences and what they felt about the examination proctoring system being tested.

ii. Mock trials with students

Using the convenience sampling method, mock trials were carried out at the following regional campuses of the University: Lautoka, Labasa, Samoa and Tonga (n=128). These campuses had summer classes running at the time the mock trials were being conducted, so students were available for the trials. After the students took the test, they were given a set of questions and requested to rate their experiences. These covered: their experience with Proctorio as a proctoring tool; whether they were able to complete the test; their ability to navigate easily through the system; the clarity of instructions within the system; and whether they were comfortable taking the proctored test.

After the students took the mock-proctored test, focus group discussions were conducted. The students ranged from pre-degree to postgraduate levels and came from mixed ethnic and socio-economic backgrounds. The team observed strong enthusiasm at the regional campuses for the proctoring system to be implemented. During the face-to-face focus group discussions, the students thanked the team for trialling such a system, as it would eliminate most of their travel expenses for tests. The students also liked the idea of being able to take the test at any time within the timeframe provided for the online tests.

The teaching staff at these campuses were also given a chance to attempt a separate mock test. The students’ mock test incident reports were then discussed with the teaching staff, which helped them better understand why some students’ reports flagged many incidents.

Results from Mock-trials

Results in Figure 1 illustrate that the user experience with Proctorio was positive for students from the respective campuses. The Samoan students did report a slightly less enjoyable experience, and the reason became clearer when their comfort with the application was examined (see Figure 5).

Table 3

Did you enjoy your experience with Proctorio?

Campus 1 (Not at all) 2 3 4 5 (Very Much)
Labasa 0% 6% 6% 27% 61%
Tonga 0% 0% 5% 5% 90%
Lautoka 0% 3% 5% 28% 64%
Samoa 7% 7% 7% 0% 79%

Table 4

Were you successful in completing the test with Proctorio?

Campus 1 (Not at all) 2 3 4 5 (Very Much)
Labasa 12% 0% 3% 9% 76%
Tonga 0% 0% 5% 15% 80%
Lautoka 7% 0% 2% 13% 79%
Samoa 7% 7% 0% 0% 86%
Figure 1 

Proctorio User Experience.

Results in Figure 2 illustrate that test completion with Proctorio was positive for students from the respective campuses. However, 14% of Samoan students reported that they were unsuccessful in completing the test, largely due to connectivity issues they faced during the test.

Table 5

Were you able to control the system (E.g. able to navigate throughout the quiz)?

Campus 1 (Not at all) 2 3 4 5 (Very Much)
Labasa 6% 6% 6% 18% 64%
Tonga 0% 0% 0% 25% 75%
Lautoka 3% 3% 3% 16% 74%
Samoa 7% 7% 0% 14% 71%
Figure 2 

Completion of Test with Proctorio.

Figure 3 

Navigation throughout the Quiz.

The majority of students from the four campuses had few to no navigation issues throughout their quiz attempt. It was, however, noted that there were some challenges (12% and 14% at the Labasa and Samoa campuses respectively), and these were evident during ID verification and when adding the Proctorio Chrome Extension.

Table 6

Were the instructions provided by Proctorio clear?

Campus 1 (Not at all) 2 3 4 5 (Very Much)
Labasa 9% 0% 9% 6% 76%
Tonga 0% 0% 0% 10% 90%
Lautoka 5% 0% 5% 8% 82%
Samoa 7% 0% 7% 0% 86%

The high number of positive responses across all four campuses in Figure 4 signifies that the instructions students followed before the actual test commenced were stated clearly and in detail. Students are quite familiar with doing online quizzes on Moodle, and with Proctorio embedded in the quiz there are hardly any major changes, except for the ID verification that students have to undergo before they attempt the quiz.

Table 7

Did you feel uncomfortable while doing your quiz using Proctorio?

Campus 1 (Not at all) 2 3 4 5 (Very Much)
Labasa 30% 6% 6% 12% 45%
Tonga 45% 10% 10% 5% 30%
Lautoka 36% 5% 7% 11% 41%
Samoa 36% 7% 7% 29% 21%
Figure 4 

Proctorio Instructions Clarity.

Figure 5 

Uncomfortable Experience while doing Quiz.

It can be seen that students’ overall experience of doing a proctored quiz was challenging and uncomfortable. This is quite understandable, as this was the first time they had to verify themselves using web cameras and their identification cards. Most of the students indicated during the focus group discussions that they had never used a web camera before, and they faced problems trying to verify their ID cards, as they had to hold the card steady before a picture of it could be captured. This process caused discomfort for most of the students across the campuses.

Key Findings

  1. Students incur a great deal of time and money travelling to campuses to sit tests.
  2. This cost can be eliminated with online proctoring. The team tested the proctoring system over a very low Internet speed and received positive results.
  3. Online proctoring can easily be integrated into Moodle without additional infrastructure.
  4. Students are generally positively disposed towards the use of online proctoring.

Discussion of the Findings

How students felt (and what they requested)

The data gathered clearly shows the enthusiasm of our regional students for the implementation of the new system. A large number of students welcomed the idea of having online tests, but were concerned about the extra user-verification step before the start of the examination.

During the setup and user authentication period, students were anxious, but they were later at ease. However, some students shared that they felt uncomfortable knowing the camera was recording their every movement. Using an automated system would make students a little more relaxed, knowing that there is no one on the other side of the camera watching. This could also be a cheaper option, as it does not require lecturers and students to book examination proctors/invigilators. Students were also concerned about the privacy of the videos and their use. Even though an automated system was used for the mock trials, the recordings remain available for lecturers to review if there is a need.

Concerns from the teaching staff

At first, the teaching staff were concerned that this would require a lot more work on their end, but once the system is integrated with Moodle, it only requires lecturers to select the proctored option when creating a quiz (the quiz creation process remains the same on Moodle). The rest of the settings are set as defaults by the administration team. After the mock trials, the teaching staff were convinced that setting up and monitoring the system is easier than they had initially thought.

General global observations (COVID-19)

The COVID-19 pandemic is causing a great deal of disruption across the globe. At a time like this, there is likely to be a greater need for educational institutions to rethink their approaches to learning, teaching and its accreditation, including the adoption of online technologies (CAUDIT/ACODE, 2020).

However, learning and teaching online is not without its challenges. Luxuries such as access to a personal computer or laptop, a neat and tidy room with sufficient lighting, Internet connectivity, food, water, or even basic necessities may not be readily available to many of our students. Students in many developing contexts, such as in the South Pacific region, may be looking after a whole family (minding siblings or babies while studying) or caring for ill family members. Such global observations need to be taken into account when considering a system that will be used for taking exams online.

Pedagogical consideration

As the subject matter experts, the course coordinators should be involved in the entire design of the assessments (CAUDIT/ACODE, 2020). We are not looking for short-term solutions achieved by just adding a tool. The school, the subject matter expert, and the learning experience designers should look for alternative assessment strategies that could be employed for learning outcomes that were previously tested purely through tests and exams.

We can split the exam into parts where some learning outcomes are tested with proctored exams and others through written assessment. It does not matter which tool we use to assess a learning outcome; what matters is how it is used and its effectiveness in achieving the learning outcomes. With COVID-19, we should consider giving students the option to opt out of exams while still being able to complete the course purely through coursework.

Technological consideration

Before jumping into the expensive option of an online proctoring system, we should consider using existing technologies that an institution might already have (CAUDIT/ACODE, 2020); these could be useful as a temporary solution for a proctored exam. They include, but are not limited to, video conferencing tools such as the Remote Conferencing Tool for Teaching (REACT), Viber on a computer, or Zoom, whereby students are connected virtually to teachers and watched throughout the examination period, and the examiner can require a particular student to share their desktop if suspicious.

It is also possible to allow students to write answers to examination questions on a blank piece of paper and have a picture of the paper taken and uploaded for marking. All of this could be done during the proctored video conferencing session; however, it would mean that everyone has to come online at once or in cohorts. This approach also allows students to do all the necessary calculations and show their working for each of the questions they attempt.

Handwritten examination scripts can be marked using plugins such as Crowdmark, which can be fully integrated with learning management systems including Moodle, Canvas, Brightspace, Blackboard and Sakai (Crowdmark, 2020).

If we are going to use online quiz modules for examinations, we should look at options where students are able to write complex formulas using a computer mouse. These could be available within the LMS or installed as an external plugin. An example is WIRIS, proprietary software that allows students to write symbols and equations using a computer’s mouse and converts the input into a properly formatted fraction, equation or symbol.

Conclusion and Recommendations

Online proctoring has its challenges. Unlike an examination with a live, in-person invigilator, online proctoring requires students to have access to suitable technological infrastructure, without which the option will not work reliably. Naturally, this creates a divide between those with, and those without, access to this infrastructure. Then there are students with disabilities, who may require more assistance than is possible while taking online-proctored exams. There are also concerns around how the recorded video is interpreted and used by others. These issues are not likely to go away, which means that online proctoring can only be offered as one solution alongside other options. It ought not to be promoted as the only solution, and it should be adopted and used carefully and selectively, in contexts and situations where it is the best solution.

In light of these concerns and considerations, the following recommendations ought to be considered in the adoption of online proctoring as part of examination processes.

  1. Prepare recommended online examination procedures. A university-wide recommended online examination procedure would help lecturers facilitate online tests in a uniform manner. It would also provide clarity around the roles and responsibilities of the lecturer and of the student.
  2. Trial the proctoring system with live courses that have large regional student numbers. At the moment we have results from mock trials, but it would be beneficial to have test results from live courses (and during peak periods).
  3. Ensure a computer lab (equipped to the hardware and software requirements) is designated for students who do not have their own laptops. Not all students are able to find a quiet, well-lit room in which to sit the examination.
  4. Ensure that the hardware and software requirements for the proctoring system are met. These include a web camera and browser plugins, among other things; a minimal pre-exam check of this kind is sketched below.
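As an example of checking the web camera requirement before an exam starts, the following is a minimal browser-side sketch in TypeScript using the standard getUserMedia API; treating this as a pre-exam “system check” step is our assumption, not a described feature of any of the evaluated tools.

```typescript
// Verify that a webcam is present and accessible before the student is
// allowed to start a proctored test. getUserMedia is a standard Web API.

async function checkWebcam(): Promise<boolean> {
  try {
    const stream = await navigator.mediaDevices.getUserMedia({ video: true });
    stream.getTracks().forEach((t) => t.stop()); // release the camera
    return true;
  } catch {
    return false; // no camera, or permission denied
  }
}

checkWebcam().then((ok) =>
  console.log(ok ? "Webcam available." : "Webcam missing or blocked.")
);
```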

Meeting these expectations ought not to be seen as a one-time fix; all of these recommendations will require ongoing monitoring and maintenance. Students with different types of disabilities will require additional assistance when taking online-proctored exams. Furthermore, with online exams, we will be pushing students to procure tools that they would not need if they simply sat a paper-based exam, including such things as digital cameras, headphones, extra lighting in the room, and laptops or desktop computers. We must ensure that the end-users of the system know how to use it, and are comfortable doing so, before rolling it out.

Acknowledgement

The team would like to acknowledge the contributions of the following people, who helped in the evaluation and testing phases of the research:

  1. Ms. Vasiti Delana
  2. Mr. Sanjeet Chand
  3. Mr. Daryl Abel

References

  1. Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1), 146–161. http://dx.doi.org/10.24059/olj.v21i1.885 

  2. Barnes, C., & Paris, B. L. (2013). An analysis of academic integrity techniques used in online courses at a southern university. In Northwest Decision Sciences Institute Annual Meeting Proceedings. 

  3. Beck, V. (2014). Testing a model to predict online cheating—Much ado about nothing. Active learning in higher education, 15(1), 65–75. https://doi.org/10.1177%2F1469787413514646 

  4. Bedford, W., Gregg, J., & Clinton, S. (2009). Implementing technology to prevent online cheating: A case study at a small southern regional university (SSRU). MERLOT Journal of Online Learning and Teaching, 5(2), 230–238. 

  5. Berkey, D., & Halfond, J. (2015, July 20). Cheating, Student Authentication and Proctoring in Online Programs. New England Journal of Higher Education. Retrieved from https://nebhe.org/journal/cheating-student-authentication-and-proctoring-in-online-programs/ 

  6. Brown, V. (2018). Evaluating technology to prevent academic integrity violations in online environments. Online Journal of Distance Learning Administration, 21(1). Retrieved from https://www.westga.edu/~distance/ojdla/spring211/brown211.html 

  7. Carstairs, J., & Myors, B. (2009). Internet testing: A natural experiment reveals test score inflation on a high-stakes, unproctored cognitive test. Computers in Human Behavior, 25(3), 738–742. https://doi.org/10.1016/j.chb.2009.01.011 

  8. Casey, D. M. (2008). A journey to legitimacy: The historical development of distance education through technology. TechTrends, 52(2), 45. https://doi.org/10.1007/s11528-008-0135-z 

  9. CAUDIT/ACODE (2020). CAUDIT/ACODE Forum on e-exams. Retrieved April 14, 2020, from https://www.acode.edu.au/ 

  10. Cramp, J., Medlin, J. F., Lake, P., & Sharp, C. (2019). Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching & Learning Practice, 16(1), 10. Retrieved from https://ro.uow.edu.au/jutlp/vol16/iss1/10/ 

  11. Crowdmark (2020). Crowdmark is a collaborative online grading and analytics platform. Retrieved April 14, 2020, from https://crowdmark.com/ 

  12. Etter, S., Cramer, J. J., & Finn, S. (2006). Origins of academic dishonesty: Ethical orientations and personality factors associated with attitudes about cheating with information technology. Journal of Research on Technology in Education, 39(2), 133–155. https://doi.org/10.1080/15391523.2006.10782477 

  13. Foster, D., & Layman, H. (2013). Online proctoring systems compared. Retrieved from https://caveon.com/wp-content/uploads/2013/03/Online-Proctoring-Systems-Compared-Mar-13-2013.pdf 

  14. Gautam, M. (2017, December 20). 3 types of online proctoring services and how to select the best for hiring. Hackerearth. Retrieved from https://www.hackerearth.com/blog/talent-assessment/online-proctoring-for-hiring-developer 

  15. Grijalva, T. C., Kerkvliet, J., & Nowell, C. (2006). Academic honesty and online courses. College Student Journal, 40(1), 180–185. 

  16. Harmon, O. R., & Lambrinos, J. (2008). Are online exams an invitation to cheat? The Journal of Economic Education, 39(2), 116–125. https://doi.org/10.3200/JECE.39.2.116-125 

  17. Harmon, O. R., Lambrinos, J., & Buffolino, J. (2010). Assessment design and cheating risk in online instruction. Online Journal of Distance Learning Administration, 13(3). Retrieved from https://www.westga.edu/~distance/ojdla/Fall133/harmon_lambrinos_buffolino133.html 

  18. Hollister, K. K., & Berenson, M. L. (2009). Proctored versus unproctored online exams: Studying the impact of exam environment on student performance. Decision Sciences Journal of Innovative Education, 7(1), 271–294. https://doi.org/10.1111/j.1540-4609.2008.00220.x 

  19. Jose, S. (2016, December 15). Online proctoring is trending: Here is all you must know. Talview. Retrieved from https://blog.talview.com/a-complete-guide-to-online-remote-proctoring 

  20. Karim, M. N., Kaminsky, S. E., & Behrend, T. S. (2014). Cheating, reactions, and performance in remotely proctored testing: An exploratory experimental study. Journal of Business and Psychology, 29(4), 555–572. https://doi.org/10.1007/s10869-014-9343-z 

  21. King, C. G., Guyette Jr, R. W., & Piotrowski, C. (2009). Online exams and cheating: An empirical analysis of business students’ views. Journal of Educators Online, 6(1), 1–11. 

  22. Kinney, N. E. (2001). A guide to design and testing in online psychology courses. Psychology Learning & Teaching, 1(1), 16–20. 

  23. Kolski, T., & Weible, J. L. (2019). Do Community College Students Demonstrate Different Behaviors from Four-Year University Students on Virtual Proctored Exams? Community College Journal of Research and Practice, 43(10–11), 690–701. https://doi.org/10.1080/10668926.2019.1600615 

  24. Ladyshewsky, R. K. (2015). Post-graduate student performance in ‘supervised in-class’ vs. ‘unsupervised online’ multiple choice tests: implications for cheating and test security. Assessment & Evaluation in Higher Education, 40(7), 883–897. https://doi.org/10.1080/02602938.2014.956683 

  25. Lanier, M. M. (2006). Academic integrity and distance learning. Journal of criminal justice education, 17(2), 244–261. https://doi.org/10.1080/10511250600866166 

  26. Mitra, S., & Gofman, M. I. (2016). Towards Greater Integrity in Online Exams. In Americas Conference on Information Systems (AMCIS). Association For Information Systems. 

  27. Moten Jr, J., Fitterer, A., Brazier, E., Leonard, J., & Brown, A. (2013). Examining online college cyber cheating methods and prevention measures. Electronic Journal of E-learning, 11(2), 139–146. 

  28. Prisacari, A. A., & Danielson, J. (2017). Computer-based versus paper-based testing: Investigating testing mode with cognitive load and scratch paper use. Computers in Human Behavior, 77, 1–10. https://doi.org/10.1016/j.chb.2017.07.044 

  29. Proctorio (2019). A Comprehensive Learning Integrity Platform. Retrieved from https://www.proctorio.com/ 

  30. Richardson, R., & North, M. (2013). Strengthening the trust in online courses: a common sense approach. Journal of Computing Sciences in Colleges, 28(5), 266–272. 

  31. Rose, C. (2009). Virtual proctoring in distance education: An open-source solution. American Journal of Business Education, 2(2), 81–88. 

  32. Schultz, M. C., Schultz, J. T., & Gallogly, J. (2007). The management of testing in distance learning environments. Journal of College Teaching & Learning, 4(9), 19–26. https://doi.org/10.19030/tlc.v4i9.1543 

  33. Sietses, L. (2016). White Paper Online Proctoring. Questions and answers about remote proctoring. SURFnet. Retrieved from https://www.surf.nl/files/2019-04/whitepaper-online-proctoring_en.pdf 

  34. Tao, J., & Li, Z. (2012). A Case Study on Computerized Take-Home Testing: Benefits and Pitfalls. International Journal of Technology in Teaching & Learning, 8(1), 33–43. 

  35. Watson, G., & Sottile, J. (2010). Cheating in the digital age: Do students cheat more in online courses? Online Journal of Distance Learning Administration, 13(1). Retrieved from https://www.westga.edu/~distance/ojdla/spring131/watson131.html 

  36. Wellman, G. S., & Marcinkiewicz, H. (2004). Online learning and time-on-task: Impact of proctored vs. un-proctored testing. Journal of Asynchronous Learning Networks, 8(4), 93–104. 

  37. Yates, R.W., & Beaudrie, B. (2009). The impact of online assessment on grades in community college distance education mathematics courses. American Journal of Distance Education, 23(2), 62–70. https://doi.org/10.1080/08923640902850601 
