Interest in online learning and teaching has been on the rise for some time and was accelerated by the COVID-19 pandemic. Many of these online courses also include online assessment activities, which raise a number of issues and challenges in relation to plagiarism and academic integrity more broadly. One way of coping with some of these challenges is the adoption of online proctoring tools for online assessments.
Online proctoring involves the use of virtual tools for monitoring student activities during an assessment activity. These tools (as they continue to overcome their limitations) have the potential to allow students to take an online exam at a remote location while ensuring the integrity (security and trustworthiness) and reliability of the online exam. This includes the authentication of the student and their identity to secure and maintain the integrity of an exam and its administration (Foster & Layman, 2013).
Online proctoring has two major components. First, a web camera on the student’s computing device is activated to video-record the physical learning space and everything the student does during the examination period. The examiner or proctor is able to remotely monitor this video recording and to identify potential cheating, suspicious movements, and postures such as talking to someone in the room, or looking at a book, mobile device, or other printed media for answers. Second is lockdown, which prevents the student from using any other computer applications, including the Internet browser, and user-computing processes (such as copying, pasting or printing) that could enable cheating during the exam. This is commonly referred to as “computer or browser lockdown” (Alessio et al., 2017). The proctoring system also records all of the student’s Internet activity during the exam, such as websites the student tried to access. The video recording of the entire exam is made available for review by the instructors or examiners, either in real time or afterwards.
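To make these two components concrete, the sketch below shows, in simplified form, how a browser-based proctoring client might log and block the kinds of activities described above, using only standard browser events. It is purely illustrative, not the implementation of any product reviewed in this paper; the event names and log format are our own.

```typescript
// Illustrative sketch only: a browser-side proctoring client that logs and
// blocks the activities described above. Event names and log format are
// hypothetical, not those of any reviewed product.

interface IncidentEvent {
  type: string;      // e.g. "tab-switch", "copy-attempt"
  timestamp: string; // ISO 8601 time of the incident
}

const incidentLog: IncidentEvent[] = [];

function flag(type: string): void {
  incidentLog.push({ type, timestamp: new Date().toISOString() });
}

// Detect the student leaving the exam tab or window.
document.addEventListener('visibilitychange', () => {
  if (document.visibilityState === 'hidden') flag('tab-switch');
});
window.addEventListener('blur', () => flag('window-blur'));

// A crude form of "browser lockdown": block copy, paste and right-click,
// logging each attempt for later review.
for (const evt of ['copy', 'paste', 'contextmenu']) {
  document.addEventListener(evt, (e) => {
    e.preventDefault();
    flag(`${evt}-attempt`);
  });
}

// At submission time the log would be sent to the server for the
// instructor or proctor to review alongside the video recording.
console.log(JSON.stringify(incidentLog, null, 2));
```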
There are four major features of online proctoring systems: (i) authentication: ensuring that the registered student is the one taking the online proctored exam; (ii) browsing tolerance: setting limits on the student’s ability to use their computer for other tasks; (iii) remote authorisation and control: enabling the proctor to start, pause and end an online proctored exam, as well as flag any suspicious student behaviours; and (iv) report generation: creating reports of a student’s activities during a proctored exam.
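These four feature areas can be thought of as a configuration profile for an exam. The sketch below is purely illustrative: the field names are hypothetical and do not correspond to any particular product’s settings.

```typescript
// Hypothetical configuration profile covering the four feature areas above.
// Field names are invented for illustration only.

interface ProctoringConfig {
  authentication: {
    requirePhotoId: boolean;          // (i) verify the registered student
    faceMatchAgainstProfile: boolean;
  };
  browsingTolerance: {
    allowOtherApplications: boolean;  // (ii) limit other computer tasks
    allowedUrls: string[];            // empty list = full lockdown
  };
  remoteControl: {
    proctorCanStartPauseEnd: boolean; // (iii) remote exam control
    autoFlagSuspiciousBehaviour: boolean;
  };
  reporting: {
    generateIncidentReport: boolean;  // (iv) per-student activity reports
    retainRecordingDays: number;
  };
}

// A strict exam profile under these assumptions:
const strictExam: ProctoringConfig = {
  authentication: { requirePhotoId: true, faceMatchAgainstProfile: true },
  browsingTolerance: { allowOtherApplications: false, allowedUrls: [] },
  remoteControl: { proctorCanStartPauseEnd: true, autoFlagSuspiciousBehaviour: true },
  reporting: { generateIncidentReport: true, retainRecordingDays: 90 },
};
```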
Generally, there are three types of online exam proctoring: live proctoring, where a human proctor monitors students in real time; recorded proctoring, where the exam session is recorded and reviewed afterwards; and automated proctoring, where software monitors the session and automatically flags suspicious behaviour (Gautam, 2017; Jose, 2016).
There are many online proctoring systems available that offer the three types of online exam proctoring services mentioned earlier. However, institutions choosing and implementing an online exam proctoring system need to consider several factors first. These include (but are not limited to): ease and flexibility of integration with the existing institutional learning management system, technical performance and robustness of the proctoring system (sometimes over low internet bandwidth, poor hardware capabilities or electrical power failures), level of efficient task automation, and reporting capabilities. Privacy protection and management, security and anti-fraud measures, and associated costs are other key issues that need to be examined when considering an online proctoring system (Sietses, 2016).
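One simple way to operationalise such a multi-criteria comparison is a weighted decision matrix. The sketch below illustrates the idea only; the criteria weights and scores are invented placeholders, and an institution would substitute its own.

```typescript
// Minimal weighted-scoring sketch for comparing candidate proctoring
// systems against the criteria listed above. Weights and scores are
// invented placeholders; each institution would set its own.

type Criterion =
  | 'lmsIntegration' | 'robustness' | 'automation'
  | 'reporting' | 'privacy' | 'security' | 'cost';

const weights: Record<Criterion, number> = {
  lmsIntegration: 0.2, robustness: 0.2, automation: 0.1,
  reporting: 0.1, privacy: 0.15, security: 0.15, cost: 0.1,
};

// Each candidate is scored 1-5 per criterion by the evaluation team.
function weightedScore(scores: Record<Criterion, number>): number {
  return (Object.keys(weights) as Criterion[])
    .reduce((sum, c) => sum + weights[c] * scores[c], 0);
}

const candidateA = weightedScore({
  lmsIntegration: 5, robustness: 4, automation: 4,
  reporting: 4, privacy: 3, security: 4, cost: 3,
});
console.log(candidateA); // ≈ 3.95 — a single figure comparable across systems
```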
The research that is reported in this paper sought to:
Interest in the affordances of technology for learning and teaching is on the rise, leading to a growing interest in online learning and teaching. When used effectively, online learning can provide higher education institutions with flexible options to expand their offerings into the global market (Casey, 2008). However, as institutions continue to grow their online education, there is a commensurate rise in concerns about how best to ensure academic integrity (Barnes & Paris, 2013). The distance, or flexibility, between students and instructors in an online learning environment may in fact add to the challenges of maintaining the integrity of online assessment. As Hollister and Berenson (2009) highlighted, “the most commonly reported challenge in online assessment is how to maintain academic integrity”. While proctored exams remain a common tool for assessing student learning, ways of facilitating them continue to evolve, from online exams facilitated via learning management systems (LMS) to other online testing platforms (Prisacari & Danielson, 2017). This has raised both academic and non-academic issues, such as designing and administering online exams and monitoring students’ behaviour during exams (Cramp et al., 2019). These behaviours include dishonest and unethical practices by students, such as cheating and fraud.
In their study, King et al. (2009) reported that the majority of students surveyed felt that cheating was easier in an online environment than in a traditional face-to-face classroom. Similarly, Berkey and Halfond (2015) reported that 84% of the students surveyed in their study agreed that student dishonesty in online test-taking was a significant issue. In a study of 635 students, Watson and Sottile (2010) also noted that students indicated they would be more than four times as likely to cheat in an online class. Several other studies likewise found higher rates of cheating online (Lanier, 2006; Harmon & Lambrinos, 2008; Grijalva et al., 2006) and a greater prevalence of cheating online than in face-to-face environments (Etter et al., 2006; Watson & Sottile, 2010).
Ensuring and maintaining academic honesty and integrity in any learning environment is vital. In the context of an online learning environment, Moten et al. (2013) explained that students work independently with relative autonomy and anonymity, and instructors may be uncertain who is taking exams or how best to validate student learning. Online learning must therefore address the challenges of honesty and integrity in student assessment and evaluation. Online proctoring is one way to address this challenge. With technology-based aids, such as computer/system lockdowns, keystroke monitoring, the ability to stop/start a test, and many other assistive proctoring processes (Foster & Layman, 2013) now easily integrated into the monitoring process, online proctoring has become a viable solution.
Moreover, online proctoring offers both instructors and students other significant advantages. Kinney (2001) noted that online proctoring is a valuable option for students who are geographically dispersed across time zones. Several studies (such as Bedford et al., 2009; Harmon et al., 2010; Rose, 2009; Watson & Sottile, 2010) found that, compared with traditional face-to-face settings, the technologies associated with monitoring online examinations can provide better exam security and integrity. Karim et al. (2014) found that the use of remote online proctoring decreases instances of student cheating. Similarly, Kolski and Weible (2019) posited that the importance of academic integrity could be reinforced when students are aware of instructors reviewing their recorded exam sessions. Likewise, Tao and Li (2012) highlighted that online proctoring reduces instructional time dedicated to testing, allowing instructors and students to engage more with the course content.
However, findings on student performance in online-proctored exams are mixed. Schultz et al. (2007) reported that students who took non-proctored online exams performed significantly better than those in proctored settings. Alessio et al. (2017), Richardson and North (2013), Wellman and Marcinkiewicz (2004) and Carstairs and Myors (2009) reported the same pattern, with non-proctored test scores significantly better than proctored test scores in their respective studies. Other studies (such as Ladyshewsky, 2015; Yates & Beaudrie, 2009; Beck, 2014), however, found no significant difference between test scores in proctored versus non-proctored online tests.
For institutions, selecting a fit-for-purpose online exam proctoring technology can be challenging. While there are few studies on how institutions select and integrate online proctoring systems, Brown (2018) describes three factors that can impact the selection of an online exam proctoring solution: cost, security, and instructor and student comfort with the technology, highlighting that involving faculty in the selection of the online proctoring technology would be beneficial. She further identifies technology support staff, teaching staff and students as the three most important stakeholders in an institution’s selection of a fit-for-purpose online exam proctoring technology (Brown, 2018).
Moreover, Foster and Layman (2013) developed a comparison matrix that describes online proctoring functionality and compares it across various online proctoring services/products: proctoring features (human-proctor availability, data transfer encryption, proctor management, recorded review, automated proctoring, incident logs, etc.), lockdown features (browser lockdown, computer operations lockdown, keystroke alerts, etc.), authentication options (facial recognition, photo comparison, keystroke analytics, biometrics, etc.) and webcam features (camera view angles, panning, etc.). This matrix could be useful for institutions identifying and selecting the right online exam proctoring system.
The purpose of this investigation has been to add to this body of literature with a preliminary investigation, identification, and selection of an online proctoring solution, specifically addressing the following research questions:
This investigation was carried out at USP (the University of the South Pacific), a regional university owned and governed by twelve nations of the southwest Pacific region: the Cook Islands, the Republic of Fiji, Kiribati, the Marshall Islands, Nauru, Niue, the Solomon Islands, Tokelau, Tonga, Tuvalu, Vanuatu and Samoa. The University has campuses in all of the member countries. Its main campus is located in Suva, the Republic of Fiji, where the majority of its academic Schools are based, with two exceptions: the School of Agriculture and Food Technology, situated at the Alafua Campus in Samoa, and the School of Law, at the Emalus Campus in Vanuatu. The USP region spreads across 33 million square kilometres of ocean, an area three times the size of Europe, with a total land mass about the size of Denmark. Populations in the region vary from 2,000 in Tokelau to more than 800,000 in the Republic of Fiji. For a university serving island nations so widely spread and sparsely populated, online learning and teaching methods, including flexible approaches to the assessment of learning, have had to feature prominently in its educational operations.
The adoption of flexible approaches to the assessment of learning required a thorough investigation of contemporary online proctoring tools. A four-phase approach was adopted as part of this process. The elimination of systems in each phase followed a ‘survival of the fittest’ approach, with each phase building upon the milestones and deliverables of the previous phase, as shown in Table 1.
Table 1
Project phases 1 to 4
Activity | Activity Description | Milestone/Deliverable |
---|---|---|
1. Identify popular online proctoring systems | i. Desk-based research of popular online proctoring tools. ii. Review existing research in online exam proctoring. iii. Select systems for further review and evaluation. | Research on possible systems completed. Three systems selected for further review and evaluation. |
2. Evaluate selected systems | i. Develop requirements and a matrix for evaluation. ii. Trial and evaluate the three systems as per the requirements and evaluation matrix. iii. Select one system for mock trial. | Requirements and evaluation matrix completed. One system selected for mock trial. |
3. Further understand the functionalities of the selected system and prepare for the mock trial | i. Buy licences to use the system. ii. Develop quick guides for students and teachers covering key functionalities of the system. iii. Develop tests, identify mock trial students and train them to use the system as exam-takers. iv. Use the system further and note how key functionalities operate. | Quick guides developed. Mock trial students identified and trained. Mock tests and hacks developed. |
4. Mock trial of the selected system with the identified students | i. Carry out the mock trial and evaluate the results and experiences. ii. Discuss student feedback. iii. Review the selected system further after the mock trial. iv. If necessary, undertake a second mock trial. v. Develop guidelines. vi. Develop the final report. | Mock trial completed. Final Report and Guidelines developed. |
Phase 1 comprised rigorous desk-based research into possible online exam proctoring systems. The systems were reviewed, and popular online proctoring systems used by other universities were selected. Phase 1’s elimination criteria were based on the following:
After the desk-based research and review, the following eight systems were identified for further review/testing: ProctorU, Kryterion, Respondus, BVirtual, AIProctor, ProctorU Open Source, Examity and Proctorio.
The selected systems went through a thorough evaluation process. The primary considerations were: the infrastructure the system uses, licensing, end-user support, user verification, frequency of updates, costing models, the privacy policy around recordings, the type of proctoring services offered, and integration with Moodle. From the outset, the capabilities of each of these systems were as follows:
In Phase 2, even though the plan was to select the three best systems, we ended up with five equally strong systems for further review and evaluation: ProctorU, Respondus, AIProctor, ProctorU Open Source and Proctorio. The evaluation in Phase 2 was based on licences, functionalities, types of proctoring services, and integration capabilities with Moodle as the learning management system.
In-depth research and review was carried out for each of the five selected systems. Since ProctorU Open Source required more time to set up and test than our project/research timeline allowed, the team decided to drop ProctorU Open Source from further testing (Table 2).
Table 2
Evaluation matrix used in Phase 3
Proctoring Features | ProctorU | Respondus | Proctorio | AIProctor |
---|---|---|---|---|
Live human proctors available | Yes | No | No | No |
Internet required | Yes | Yes | Yes | Yes |
Secure/encrypted transferring of data | Yes | Yes | Yes | Yes |
Student able to book exam time | Yes | Yes | No | Yes |
Training provided | Yes | n/a | Yes | Yes |
Proctoring provider certified | Yes | n/a | Yes | Yes |
Students can interact with proctors | Yes | n/a | Yes | Yes |
Student can message issues to proctors | Yes | n/a | Yes | No |
Students get live exam instructions | Yes | n/a | Yes | No |
Proctor able to see student’s screen | Yes | n/a | Yes | Yes |
Can stop proctor from viewing student’s screen | No | Yes | n/a | No |
Recorded video reviewing option | No | Yes | Yes | No |
Pause/cancel test | No | n/a | Yes | No |
Automated proctoring | No | Yes | Yes | No |
Keystroke checking | No | Yes | Yes | No |
Audio recording | No | No | Yes | No |
Browser lockdown | No | Yes | Yes | No |
Authentication option | Yes | Yes | Yes | Yes |
Web camera needed | Yes | Yes | Yes | Yes |
Log reports | No | Yes | Yes | No |
Recording storage option | Yes | Yes | Yes | Yes |
Test review option | No | Yes | Yes | No |
Incident logs with date & time | No | Yes | Yes | No |
Customising options for institution | No | Yes | Yes | No |
Lockdown Features | ||||
Available on both Windows and Mac | Yes | Yes | Yes | Yes |
Plugin for browser | No | Yes | Yes | No |
Avoids control options on the browser | No | Yes | Yes | No |
Stops navigation (forward/back) | No | Yes | Yes | No |
Stops concurrent tests | No | Yes | Yes | No |
Stops right clicks using mouse | No | Yes | Yes | No |
Stops printing | No | Yes | Yes | No |
Hides taskbar | No | Yes | Yes | No |
Hides desktop | No | Yes | Yes | No |
Stops minimising window | No | Yes | Yes | No |
Stops maximising window | No | Yes | Yes | No |
Stops copying & pasting | No | Yes | Yes | No |
Stops other applications | No | Yes | Yes | No |
Stops starting of other applications | No | Yes | Yes | No |
Authentication options | ||||
User required to authenticate | Yes | Yes | Yes | Yes |
Username provided/required | Yes | Yes | No | Yes |
Password provided/required | Yes | Yes | No | Yes |
Student ID required | Yes | Yes | Yes | Yes |
Keystroke analytics | No | No | Yes | No |
Ability to do facial recognition | No | No | Yes | No |
Ability to do voice recognition | No | No | Yes | No |
Fingerprint scanning required | No | No | No | No |
Iris scanner required/available | No | No | No | No |
Webcam Features | ||||
Web camera required | Yes | Yes | Yes | Yes |
Room panning allowed | Yes | Yes | Yes | Yes |
In the end, Proctorio appeared more favourable than ProctorU, and its costing model was also better: ProctorU charged an hourly rate for each exam, whereas Proctorio charges an annual fee per student with an unlimited number of online exams. Hence, Proctorio (2019) was selected for the proctoring trials.
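The difference between the two costing models can be illustrated with simple arithmetic. The figures below are entirely hypothetical, since the actual quoted prices are not reported here; the point is only that a flat annual fee per student overtakes an hourly per-exam rate once students sit more than a single proctored exam per year.

```typescript
// Break-even sketch for the two costing models described above, using
// entirely hypothetical prices (actual quotes are not reported here).

const hourlyRate = 15;           // $/hour per exam under the hourly model
const examHours = 2;             // assumed exam length
const annualFeePerStudent = 30;  // $/student/year, unlimited exams

const hourlyCost = (examsPerYear: number) => examsPerYear * examHours * hourlyRate;
const flatCost = () => annualFeePerStudent;

for (let n = 1; n <= 4; n++) {
  console.log(`${n} exam(s)/year: hourly=$${hourlyCost(n)}, flat=$${flatCost()}`);
}
// 1 exam(s)/year: hourly=$30, flat=$30   <- models tie at a single exam
// 2 exam(s)/year: hourly=$60, flat=$30   <- flat annual fee wins thereafter
```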
A mock-proctored online test was prepared, and the research team members attempted the test and tried cheating in various ways, for example: using a mobile phone, opening a new browser tab, talking to someone in the room, looking at notes in a book, and looking away from the screen. The incident reports were recorded and discussed with experts from Proctorio via a Zoom meeting. The first mock-trial team also included two staff from the Learning Systems team at USP, who looked at the technical aspects of the testing.
Using convenience sampling, another proctored online test was prepared, and Learning Designers, Educational Technologists, Electronic Publishers, Lecturers and Tutors based at the University’s regional Lautoka, Labasa, Samoa and Tonga campuses were requested to attempt the test (n=34). This was a voluntary activity. After the test was attempted, the team had discussions with the participants, who were asked to share their experiences. This gave the team a starting point for the mock trials with students, surfacing issues such as: how to install the Proctorio plugin, using nComputing machines, how to read an incident report, how to restart a test, and how to get technical support from Proctorio.
During the regional testing, the staff (Lecturers and Tutors) had a face-to-face focus group discussion where they shared their experiences and what they felt about the examination proctoring system being tested.
Using convenience sampling, mock trials were carried out at the following regional campuses of the University: Lautoka, Labasa, Samoa and Tonga (n=128). These campuses had summer classes running at the time, so students were available for the mock trials. After taking the test, students were given a set of questions and asked to rate their experiences. These covered: Proctorio as a proctoring tool; whether they were able to complete the test; their ability to easily navigate through the system; the clarity of instructions within the system; and whether they were comfortable taking the proctored test.
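For transparency, per-campus percentage distributions of the kind reported in Tables 3-7 can be derived from raw 5-point responses as sketched below; the sample responses shown are invented for illustration.

```typescript
// Sketch of how the per-campus percentage distributions in Tables 3-7 can
// be derived from raw 5-point responses. The sample responses are invented.

type Rating = 1 | 2 | 3 | 4 | 5;

function distribution(ratings: Rating[]): Record<Rating, string> {
  const counts: Record<Rating, number> = { 1: 0, 2: 0, 3: 0, 4: 0, 5: 0 };
  for (const r of ratings) counts[r]++;
  const pct = (c: number) => `${Math.round((c / ratings.length) * 100)}%`;
  return { 1: pct(counts[1]), 2: pct(counts[2]), 3: pct(counts[3]),
           4: pct(counts[4]), 5: pct(counts[5]) };
}

// Twenty hypothetical responses from one campus:
const sample: Rating[] = [5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,3,4];
console.log(distribution(sample));
// { '1': '0%', '2': '0%', '3': '5%', '4': '5%', '5': '90%' }
```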
After the students took the mock-proctored test, focus group discussions were conducted. The students ranged from pre-degree to postgraduate levels and came from mixed ethnic and socio-economic backgrounds. A very positive reception for implementing the proctoring system was evident at the regional campuses. During the face-to-face focus group discussions, the students thanked the team for trialling such a system, as it would eliminate most of their travel expenses for tests. The students also liked the idea of taking the test at any time within the timeframe provided for the online tests.
The teaching staff at these campuses were also given a chance to attempt a separate mock test. The student mock test incident reports were then discussed with the teaching staff, which helped them better understand why some students’ attempts generated high numbers of flagged incidents.
Results in Figure 1 illustrate that the user experience with Proctorio was positive for students across the campuses. The Samoan students did report a slightly less enjoyable experience, and the reason became clearer when their understanding of the application was examined in Figure 5.
Table 3
Did you enjoy your experience with Proctorio?
Campus | 1 (Not at all) | 2 | 3 | 4 | 5 (Very Much) |
---|---|---|---|---|---|
Labasa | 0% | 6% | 6% | 27% | 61% |
Tonga | 0% | 0% | 5% | 5% | 90% |
Lautoka | 0% | 3% | 5% | 28% | 64% |
Samoa | 7% | 7% | 7% | 0% | 79% |
Table 4
Were you successful in completing the test with Proctorio?
Campus | 1 (Not at all) | 2 | 3 | 4 | 5 (Very Much) |
---|---|---|---|---|---|
Labasa | 12% | 0% | 3% | 9% | 76% |
Tonga | 0% | 0% | 5% | 15% | 80% |
Lautoka | 7% | 0% | 2% | 13% | 79% |
Samoa | 7% | 7% | 0% | 0% | 86% |
Figure 1. Proctorio User Experience.
Results in Figure 2 illustrate that completion of the test with Proctorio was positive for students across the campuses. However, 14% of Samoan students reported that they were unsuccessful in completing the test, largely due to connectivity issues they faced during the test.
Table 5
Were you able to control the system (E.g. able to navigate throughout the quiz)?
Campus | 1 (Not at all) | 2 | 3 | 4 | 5 (Very Much) |
---|---|---|---|---|---|
Labasa | 6% | 6% | 6% | 18% | 64% |
Tonga | 0% | 0% | 0% | 25% | 75% |
Lautoka | 3% | 3% | 3% | 16% | 74% |
Samoa | 7% | 7% | 0% | 14% | 71% |
Figure 2. Completion of Test with Proctorio.
Figure 3. Navigation throughout the Quiz.
A majority of the students from the four campuses had little to no navigation issues throughout their quiz attempt. A few challenges were, however, noted (12% and 14% at the Labasa and Samoa campuses respectively), and these were evident during ID verification and when adding the Proctorio Chrome Extension.
Table 6
Were the instructions provided by Proctorio clear?
Campus | 1 (Not at all) | 2 | 3 | 4 | 5 (Very Much) |
---|---|---|---|---|---|
Labasa | 9% | 0% | 9% | 6% | 76% |
Tonga | 0% | 0% | 0% | 10% | 90% |
Lautoka | 5% | 0% | 5% | 8% | 82% |
Samoa | 7% | 0% | 7% | 0% | 86% |
The high number of positive responses across all four campuses in Figure 4 signifies that the instructions provided to students before the actual test commenced were clear and detailed. Students are quite familiar with doing online quizzes on Moodle, and with Proctorio embedded into a quiz there are hardly any major changes, except for the ID verification that students must undergo before they attempt the quiz.
Table 7
Did you feel uncomfortable while doing your quiz using Proctorio?
Campus | 1 (Not at all) | 2 | 3 | 4 | 5 (Very Much) |
---|---|---|---|---|---|
Labasa | 30% | 6% | 6% | 12% | 45% |
Tonga | 45% | 10% | 10% | 5% | 30% |
Lautoka | 36% | 5% | 7% | 11% | 41% |
Samoa | 36% | 7% | 7% | 29% | 21% |
Figure 4. Proctorio Instructions Clarity.
Figure 5. Uncomfortable Experience while doing Quiz.
It can be seen that students’ overall experience of doing a proctored quiz was challenging and uncomfortable. This is quite understandable, as it was their first time verifying themselves using web cameras and their identification cards. Most of the students indicated during the focus group discussions that they had never used a web camera before and that they faced problems verifying their ID cards, as the card had to be held steady before its picture could be taken. This process caused discomfort for most of the students across the campuses.
The data gathered clearly displays the enthusiasm of our regional students for the implementation of the new system. A large number of students supported the idea of having online tests, but were concerned about the extra user-verification step before the start of the examination.
During the setup and user-authentication period students were anxious, but they were later at ease. However, some students shared that they felt uncomfortable knowing the camera was recording their every movement. Using an automated system would make students a little more relaxed, knowing that there is no one on the other side of the camera watching. This could also be a cheaper option, as it does not require lecturers and students to book examination proctors/invigilators. Students were also concerned about the privacy of the videos and their use. Even though an automated system was used for the mock trials, the recordings remain available for lecturers to review if needed.
At first the teaching staff were concerned that this would require a lot more work on their end, but once the system is integrated with Moodle, it only requires lecturers to select the proctored option when creating a quiz (the quiz-creation process remains the same in Moodle). The rest of the settings are set as defaults by the administration team. After the mock trials, the teaching staff were convinced that setting up and monitoring the system was easier than they had initially thought.
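The “one switch for lecturers, defaults from the administration team” arrangement described above can be sketched as follows. The field names are hypothetical and are not Proctorio’s actual settings.

```typescript
// Illustrative only: institution-wide defaults merged with the lecturer's
// single choice ("proctored or not") at quiz-creation time. Field names
// are hypothetical, not Proctorio's actual settings.

const adminDefaults = {
  recordVideo: true,
  recordScreen: true,
  browserLockdown: true,
  idVerification: true,
  retainRecordingDays: 90,
};

function buildQuizConfig(proctored: boolean) {
  // Lecturers flip one switch; everything else comes from the defaults
  // maintained by the administration team.
  return proctored ? { enabled: true, ...adminDefaults } : { enabled: false };
}

console.log(buildQuizConfig(true));
```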
The COVID-19 pandemic is causing a great deal of disruption across the globe. At a time like this, there is likely to be a greater need for educational institutions to rethink their approaches to learning, teaching and accreditation, including the adoption of online technologies (CAUDIT/ACODE, 2020).
However, learning and teaching online is not without its challenges. Luxuries such as access to a personal computer or laptop, a neat and tidy room with sufficient lighting, internet connectivity, food, water or even basic necessities may not be readily available to many of our students. Students in many developing contexts, such as the South Pacific region, may be looking after a whole family (minding siblings or babies while studying) or caring for ill family members. These realities need to be taken into account when considering a system that will be used for taking exams online.
As the subject matter experts, course coordinators should be involved in the entire design of the assessments (CAUDIT/ACODE, 2020). We are not looking for short-term fixes achieved by simply adding a tool. The school, the subject matter expert and the learning experience designers should look for alternative assessment strategies for learning outcomes that were previously assessed purely through tests and exams.
Exams can be split into parts, with some learning outcomes tested through proctored exams and others through written assessment. It does not matter which tool is used to assess a learning outcome; what matters is how it is used and how effective it is in achieving the learning outcomes. With COVID-19, we should consider allowing students the option to opt out of exams and still be given a chance to complete the course purely through coursework.
Before jumping into the expensive option of an online proctoring system, we should consider using existing technologies that an institution may already have (CAUDIT/ACODE, 2020); these could serve as a temporary solution for proctored exams. Options include (but are not limited to) video conferencing tools such as the Remote Conferencing Tool for Teaching (REACT), Viber on a computer, or Zoom, with students connected virtually to teachers and watched throughout the examination period, and with the examiner able to ask a particular student to share their desktop if anything looks suspicious.
It is also possible to allow students to write answers to examination questions on blank paper, then photograph and upload their pages for marking. All of this could be done during a proctored video conferencing session. However, this would require everyone to come online at once, or in cohorts. This approach lets students do all the necessary calculations and show their working for each question they attempt.
Handwritten examination scripts can be marked using plugins such as Crowdmark, which can be fully integrated with learning management systems including Moodle, Canvas, Brightspace, Blackboard and Sakai (Crowdmark, 2020).
If online quiz modules are to be used for examinations, we should look at options that allow students to write complex formulas using a computer mouse. These could be available within the LMS or installed as an external plugin. An example is WIRIS, proprietary software that lets students draw symbols and equations with the mouse and converts the input into a properly formatted fraction, equation or symbol.
Online proctoring has its challenges. Unlike a live examination, online proctoring requires students to have access to suitable technological infrastructure, without which the option will not work reliably. Naturally, this creates a divide between those with, and without access to this technological infrastructure. Then there are those students with disabilities who may require a lot more assistance than is possible while taking online-proctored exams. There are also concerns around how the recorded video is interpreted, and used by others. These issues are not likely to go away, which means that online proctoring can only be offered as just another solution alongside other options. It ought not to be promoted as the only solution and should be adopted and used carefully and selectively in contexts and situations where it is the best solution.
In light of these concerns and considerations, the following recommendations ought to be considered in the adoption of online proctoring as part of examination processes.
Furthermore, meeting these expectations ought not to be seen as a one-time fix. All of these recommendations will require ongoing monitoring and maintenance. Students with different types of disabilities will require additional assistance in taking online-proctored exams. Furthermore, with online exams we will be pushing students to procure tools that they would not need if they simply sat a paper-based exam, such as digital cameras, headphones, extra lighting in the room, and laptops or desktop computers. We must ensure that the end-users of the system know how to use it, and are comfortable doing so, before rolling it out.
The team would like to acknowledge the contributions of the following people, who helped in the evaluation and testing phases of the research:
Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1), 146–161. http://dx.doi.org/10.24059/olj.v21i1.885
Barnes, C., & Paris, B. L. (2013). An analysis of academic integrity techniques used in online courses at a southern university. In Northwest Decision Sciences Institute Annual Meeting Proceedings.
Beck, V. (2014). Testing a model to predict online cheating—Much ado about nothing. Active Learning in Higher Education, 15(1), 65–75. https://doi.org/10.1177%2F1469787413514646
Bedford, W., Gregg, J., & Clinton, S. (2009). Implementing technology to prevent online cheating: A case study at a small southern regional university (SSRU). MERLOT Journal of Online Learning and Teaching, 5(2), 230–238.
Berkey, D., & Halfond, J. (2015, July 20). Cheating, Student Authentication and Proctoring in Online Programs. New England Journal of Higher Education. Retrieved from https://nebhe.org/journal/cheating-student-authentication-and-proctoring-in-online-programs/
Brown, V. (2018). Evaluating technology to prevent academic integrity violations in online environments. Online Journal of Distance Learning Administration, 21(1). Retrieved from https://www.westga.edu/~distance/ojdla/spring211/brown211.html
Carstairs, J., & Myors, B. (2009). Internet testing: A natural experiment reveals test score inflation on a high-stakes, unproctored cognitive test. Computers in Human Behavior, 25(3), 738–742. https://doi.org/10.1016/j.chb.2009.01.011
Casey, D. M. (2008). A journey to legitimacy: The historical development of distance education through technology. TechTrends, 52(2), 45. https://doi.org/10.1007/s11528-008-0135-z
CAUDIT/ACODE (2020). CAUDIT/ACODE Forum on e-exams. Retrieved April 14, 2020, from https://www.acode.edu.au/
Cramp, J., Medlin, J. F., Lake, P., & Sharp, C. (2019). Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching & Learning Practice, 16(1), 10. Retrieved from https://ro.uow.edu.au/jutlp/vol16/iss1/10/
Crowdmark (2020). Crowdmark is a collaborative online grading and analytics platform. Retrieved April 14, 2020, from https://crowdmark.com/
Etter, S., Cramer, J. J., & Finn, S. (2006). Origins of academic dishonesty: Ethical orientations and personality factors associated with attitudes about cheating with information technology. Journal of Research on Technology in Education, 39(2), 133–155. https://doi.org/10.1080/15391523.2006.10782477
Foster, D., & Layman, H. (2013). Online proctoring systems compared. Retrieved from https://caveon.com/wp-content/uploads/2013/03/Online-Proctoring-Systems-Compared-Mar-13-2013.pdf
Gautam, M. (2017, December 20). 3 types of online proctoring services and how to select the best for hiring. Hackerearth. Retrieved from https://www.hackerearth.com/blog/talent-assessment/online-proctoring-for-hiring-developer
Grijalva, T. C., Kerkvliet, J., & Nowell, C. (2006). Academic honesty and online courses. College Student Journal, 40(1), 180–185.
Harmon, O. R., & Lambrinos, J. (2008). Are online exams an invitation to cheat? The Journal of Economic Education, 39(2), 116–125. https://doi.org/10.3200/JECE.39.2.116-125
Harmon, O. R., Lambrinos, J., & Buffolino, J. (2010). Assessment design and cheating risk in online instruction. Online Journal of Distance Learning Administration, 13(3). Retrieved from https://www.westga.edu/~distance/ojdla/Fall133/harmon_lambrinos_buffolino133.html
Hollister, K. K., & Berenson, M. L. (2009). Proctored versus unproctored online exams: Studying the impact of exam environment on student performance. Decision Sciences Journal of Innovative Education, 7(1), 271–294. https://doi.org/10.1111/j.1540-4609.2008.00220.x
Jose, S. (2016, December 15). Online proctoring is trending: Here is all you must know. Talview. Retrieved from https://blog.talview.com/a-complete-guide-to-online-remote-proctoring
Karim, M. N., Kaminsky, S. E., & Behrend, T. S. (2014). Cheating, reactions, and performance in remotely proctored testing: An exploratory experimental study. Journal of Business and Psychology, 29(4), 555–572. https://doi.org/10.1007/s10869-014-9343-z
King, C. G., Guyette Jr, R. W., & Piotrowski, C. (2009). Online exams and cheating: An empirical analysis of business students’ views. Journal of Educators Online, 6(1), 1–11.
Kinney, N. E. (2001). A guide to design and testing in online psychology courses. Psychology Learning & Teaching, 1(1), 16–20.
Kolski, T., & Weible, J. L. (2019). Do Community College Students Demonstrate Different Behaviors from Four-Year University Students on Virtual Proctored Exams? Community College Journal of Research and Practice, 43(10–11), 690–701. https://doi.org/10.1080/10668926.2019.1600615
Ladyshewsky, R. K. (2015). Post-graduate student performance in ‘supervised in-class’ vs. ‘unsupervised online’ multiple choice tests: implications for cheating and test security. Assessment & Evaluation in Higher Education, 40(7), 883–897. https://doi.org/10.1080/02602938.2014.956683
Lanier, M. M. (2006). Academic integrity and distance learning. Journal of Criminal Justice Education, 17(2), 244–261. https://doi.org/10.1080/10511250600866166
Mitra, S., & Gofman, M. I. (2016). Towards Greater Integrity in Online Exams. In Americas Conference on Information Systems (AMCIS). Association For Information Systems.
Moten Jr, J., Fitterer, A., Brazier, E., Leonard, J., & Brown, A. (2013). Examining online college cyber cheating methods and prevention measures. Electronic Journal of E-learning, 11(2), 139–146.
Prisacari, A. A., & Danielson, J. (2017). Computer-based versus paper-based testing: Investigating testing mode with cognitive load and scratch paper use. Computers in Human Behavior, 77, 1–10. https://doi.org/10.1016/j.chb.2017.07.044
Proctorio (2019). A Comprehensive Learning Integrity Platform. Retrieved from https://www.proctorio.com/
Richardson, R., & North, M. (2013). Strengthening the trust in online courses: a common sense approach. Journal of Computing Sciences in Colleges, 28(5), 266–272.
Rose, C. (2009). Virtual proctoring in distance education: An open-source solution. American Journal of Business Education, 2(2), 81–88.
Schultz, M. C., Schultz, J. T., & Gallogly, J. (2007). The management of testing in distance learning environments. Journal of College Teaching & Learning, 4(9), 19–26. https://doi.org/10.19030/tlc.v4i9.1543
Sietses, L. (2016). White Paper Online Proctoring. Questions and answers about remote proctoring. SURFnet. Retrieved from https://www.surf.nl/files/2019-04/whitepaper-online-proctoring_en.pdf
Tao, J., & Li, Z. (2012). A Case Study on Computerized Take-Home Testing: Benefits and Pitfalls. International Journal of Technology in Teaching & Learning, 8(1), 33–43.
Watson, G., & Sottile, J. (2010). Cheating in the digital age: Do students cheat more in online courses? Online Journal of Distance Learning Administration, 13(1). Retrieved from https://www.westga.edu/~distance/ojdla/spring131/watson131.html
Wellman, G. S., & Marcinkiewicz, H. (2004). Online learning and time-on-task: Impact of proctored vs. un-proctored testing. Journal of Asynchronous Learning Networks, 8(4), 93–104.
Yates, R.W., & Beaudrie, B. (2009). The impact of online assessment on grades in community college distance education mathematics courses. American Journal of Distance Education, 23(2), 62–70. https://doi.org/10.1080/08923640902850601