This paper was presented at the ICDE Virtual Global Conference Week 2021: Upskilling and upscaling for quality Open, Flexible and Distance Learning (OFDL) during the week of 25 – 29 October 2021, and the contribution was preselected for publication in Open Praxis.
The shift to online assessment in higher education has generated debates on academic integrity (Farrell 2020), also highlighting good practice in safeguarding the assessment process. Academic integrity is a commitment, even in the face of adversity, to fundamental values: (1) honesty, (2) trust, (3) fairness, (4) respect, (5) responsibility, and (6) courage. Academic misconduct refers to practices that are not in alignment with these values and this commitment. Academic integrity is often referred to in the more specific context of supporting students to avoid academic misconduct (QAA 2020a; 2020b). There seem to be two dominant threads in such debates, which are far from complementary. One involves promoting creative design of assessment to support academic integrity by employing the fundamental assessment principles, i.e., authenticity, inclusivity, validity and reliability. It also involves clear guidelines to students about institutional expectations around referencing, plagiarism and collusion. The other thread provides technological and practical safeguards to protect academic integrity, such as online invigilation software (proctoring systems), text-matching software which identifies similar or exact matches between submitted student work and other digital material, moderation of marking, and the use of mechanisms such as vivas to verify student academic work.
Research on academic offences has focused mainly on technical challenges rather than ethical and social issues (the latter have been researched in more depth for traditional assessment methods, e.g., Wright 2015). Some scholars seem to combine both threads above by recommending online synchronous assessments as an alternative to traditional proctored examinations, while still maintaining the ability to authenticate manually (Chao et al. 2012). In these assessments, different formats are designed to test different types of knowledge: quizzes test factual knowledge, practical tasks test procedural knowledge, essays assess conceptual knowledge, and oral assessments address metacognitive knowledge. Additional measures that have supported academic integrity include offering randomised access to multiple question banks or essay questions as a mechanism to reduce the propensity to cheat, by lowering the stakes through multiple delivery attempts (Sullivan 2016).
Pre-pandemic, the delivery of online assessments and examinations was via options such as bring-your-own-device models, where laptops are brought to traditional lecture theatres or examination centres; the use of online invigilation software on personal devices in locations selected for online assessment; or the use of prescribed devices in a classroom setting. The primary goal of each option has been to balance the authentication of students against maintaining the integrity and value of achieving learning outcomes.
According to Butler-Henderson and Crawford (2020), as institutions begin to provide higher learning flexibility to students with digital and blended offerings, there is a scholarly need to consider the efficacy of the examination delivery mode options. In the case of assessment via invigilated examinations, universities have been far slower to progress innovative solutions despite growing evidence that students prefer the flexibility and opportunities afforded by digitalising exams (ibid.).
With the onset of the pandemic, the options for the delivery of timed assessments narrowed significantly across the globe. This paper explores the outcomes of a project at the University of London that evaluated the pivot to online assessment. Students at the University of London are distributed across more than 180 countries, studying at a distance. During the Covid-19 pandemic, students had to move to online assessment in place of conventional examinations, and approximately 110,000 exam events were affected by the closure of exam centres. The institutional assessments delivered online in 2020 had, prior to March 2020, been designed as pen-and-paper fixed-time examinations taken in local examination centres all over the world (the university’s established examinations standard practice). In the pivot to online assessment, three overall types of online examination were used: proctored exams, fixed-time unseen closed book style exams, and unseen but open book exams with a longer response time. An ‘open-book timed examination’ is an assessment method designed to allow students to refer to class notes and summaries or a ‘memory aid’, textbooks, or other approved material while answering questions under controlled conditions (Bristol Institute for Learning and Teaching, n.d.). The response time referred to the submission window, i.e., the time students had to complete and submit their answers; this varied from 24 hours to several days. In terms of content, changes in most cases consisted of moving to open book exams and redesigning questions to discourage plagiarism (including self-plagiarism from students’ previous assessed work). The rationale was to reduce reliance on rote learning, introduce an element of flexibility, and ease anxiety by establishing submission windows of variable length.
The project aimed to collect data about, and generate understanding of, this transition to online assessment at the University of London, primarily from the perspective of the students who were affected and those who examined them. The primary aim was to answer a fundamental question: What was the impact of the transition to online assessment on the experience of students and student outcomes? To this purpose, our methodological approach comprised (Hatzipanagos et al. 2020):
The quantitative data collected via the surveys of students and examiners were analysed and dominant themes were identified. The qualitative data based on interviews with students were also transcribed and analysed. The interviews offered the opportunity to explore in depth student attitudes to online assessment and elaborate on emergent themes in the survey. This paper focuses on supporting academic integrity in distance learning environments by exploring the key themes in student and staff perceptions about integrity, academic offences, and related pedagogical issues. The following sections describe the findings of our investigation.
66% of students reported in the survey that they preferred to continue with online assessments. Overall, students indicated they had a positive experience with the online assessment. Advantages were mainly perceived to be flexibility and lower cost (including savings on travel and accommodation to attend examination centres). The added advantage of moving to online assessment for most students was having extra time to complete their exam, where open book exams allowed a longer response time.
Assessment timeframes ranged from one to four hours for the original examination-hall papers but shifted to between one hour and seven days for completion of the online assessments. Responses to the extension of the submission windows were mixed. Whereas for some students the longer submission window was an advantage, for others the additional time was not viewed as a positive change: they found it anxiety-inducing and thought it might encourage academic offences.
Students who received a long submission period were significantly more likely to suggest the online format makes no difference to cheating risks, and unlikely to believe it increased them. However, there was clearly a perception amongst some students that extended time to complete an exam (i.e., more time than is ordinarily given in a written exam at an examination centre) could increase the risk of cheating. Students suggested the extra time might allow candidates to look up model answers or communicate with others. Responses also provided insight into students’ views on academic offences: students who received a short submission period were the group most likely to suggest it decreased risk (8%), while students receiving a medium submission period were significantly more likely to believe online assessment increased risk.
A random open sample of 1750 responses was thematically coded to understand students’ experiences and perceptions. Of these responses, 3% suggested concerns with cheating or a lack of invigilation, and 3% indicated concerns with plagiarism. 4% of this coded sample suggested satisfaction with invigilation or a lack of cheating concerns. The findings are based only on the small proportion of students who expressed concerns in this area and are not representative of the whole sample.
Some students thought the open book exam format could potentially encourage cheating. In addition, some students were confused about how they could use materials during their exam. They felt uncertain about what was and was not permitted in order to avoid plagiarism, and about the ethical code around cheating, which they felt should be articulated, discussed and embraced by students. One comment illustrating these views was: “The University never actually made it clear what ‘cheating’ was…. A lot of my worry about these exams is that I don’t know if my code of ethics is the same as other students”.
Six interviewees reported cheating concerns in relation to students seeking help from either tutors or peers. Ultimately, students were concerned about how others’ cheating would affect their grade. Many comments suggested a need for more invigilation, simply because there appeared to be a lack of processes in place to ease concerns.
Overall, students agreed that the recognition of their qualification would not be negatively affected by the online examinations, as universities across the world responded to the pandemic with the pivot to online assessment and the requirement to embrace alternative forms of assessment. Open comments suggested that a minority of students recognised that the potential for increased cheating risks might negatively affect recognition of their qualification, or impact grades. A number of students stressed the need to maintain the credibility of their programme if online assessment were to continue, and underlined the importance of making academic integrity a key issue in online examinations in order to retain credibility for their qualification. In this respect, it was agreed that continued reassurances around the security and invigilation process could be beneficial.
A majority of examiners seemed to welcome the move to marking typed examination scripts rather than handwritten scripts, as typed scripts tend to be more legible and easier to access. Over half (51%) of examiners said the move to online assessment helped students achieve higher academic standards in submitted work than previously.
We asked examiners to identify the top adjustments they would recommend the University make for future online assessment to minimise academic offences. The dominant responses were for all exams to be converted into an open book format, which, beyond rewriting the content of the exams, would make use of text-matching software during submission, and for exams to be invigilated if necessary. The open text comments in the survey called for greater levels of (a) student training on academic integrity to avoid plagiarism and collusion and (b) staff training on identifying plagiarism. An area identified as requiring improvement, if online assessment is used by the university in the future, was the flagging of answers for possible plagiarism.
In addition to reflecting many natural situations, where professionals may have access to and be expected to use diverse reference sources, open book exams seem to support students in avoiding academic misconduct. Developing assessment activities that allow students to demonstrate the application of their knowledge in relevant contexts can be effective in reducing collusion and cheating.
The major solutions to work towards in the transition to online assessment include either an infrastructure with online invigilation software that controls identity and proctoring processes, along with communication, and/or a partnership with programme teams in (re)designing assessment to support academic integrity. The use of proctoring systems appears to receive a mixed reaction from students (De Santis et al. 2020) and presents significant hurdles as far as privacy and identity control are concerned. The adoption of open book exams has to take into account disciplinary differences (open book exams are more suitable in some subject areas than others) and professional accreditation regulation, in particular in disciplines such as Law and Accountancy.
Supporting students to understand the ways in which they should approach the inclusion (or not) of words and work that are not their own has been important in preparing them for online assessment. This includes redefining the regulations institutions have in place. For example, where an institution may have clear rules about the equipment and resources a student can take into an examination hall, rules for online assessments may additionally include the use of text-matching software for all submissions and requirements for clear signposting by the student to indicate where others’ words are being quoted.
Online assessments over extended time periods, for example 24 hours or more, can be challenging for students with little experience of this format. Some students in our study indicated that they needed guidance on how to utilise this time effectively. To support this, a variety of innovative resources and adaptations to the assessments were introduced. For example, word counts per question or per exam give an indication of expectations, and resources that help students develop approaches to time management, concept mapping and managing wellbeing during assessment are useful student supports.
Clear communication, particularly around academic integrity expectations, is required. It is useful to illustrate how academic integrity is maintained, for example by using interactive quizzes, providing opportunities for questions and answers about online assessment, and providing practice opportunities. Giving students an opportunity to try out the digital system, as well as examples of the new assessment question types, is important in scaffolding student opportunities for success and in reducing stress. Helplines, enquiry systems and backup routes to submit work before, during and after the assessment are important, as students may face technical issues including electrical outages or loss of Wi-Fi.
Table 1 presents a summary of actions taken by the university in response to the recommendations of the project on the evaluation of online timed assessments to support academic integrity.
|RECOMMENDATIONS FROM EVALUATION||ACTIONS TO SUPPORT ACADEMIC INTEGRITY|
|Communication strategy is agreed early and a detailed and coordinated communications plan is developed, which includes timely and relevant information for students, examiners and other internal stakeholders.||Communications were coordinated with programme teams to provide timely, accurate and clear communication. Students who requested special exam arrangements received separate communications, and adjustments to submission timelines were facilitated.|
|A review into best practice in how students are inducted into the online examinations process is carried out and shared with Programme Directors.||A short course for students has been developed, covering referencing requirements, a quiz on plagiarism, and a quiz on what Turnitin can do and how it is used, complemented with wellbeing support for students preparing for assessments. Student experience communications directed students to the course.|
|Student training resources are developed. The use of webinars as a vehicle for delivering this training should be maximised.||Considerable and careful redevelopment took place of the General Regulations, Admissions Notices and other student-focused information, including programme-specific communication in addition to general communication. Programme-specific videos were produced by several programme teams, and webinars were used with students. Virtual Learning Environment (VLE) drop-in and Q&A sessions were run by programme teams. A short course was developed to support student use of speech-to-text/text-to-speech software (relevant for some students with special exam arrangements, but also for the wider student population).|
|Identification of examination format and timing.||A wide range of approaches was identified, from short fixed times (1 hour) to longer timings (24 hours); in some instances the assessment was available for 24 hours but students were given a fixed time (e.g., 3 hours) from opening of the assessment. An open book/open world format was established.|
|For those examinations requiring invigilation, the institution engages online examination software that assures examinee identity and examination proctoring.||No proctoring was used in summer 2021. All online timed assessments were delivered on the VLE.|
|The institution reviews the online marking platform to be used and considers the examiner voice in designing the marking process, including an online induction for examiners and, at minimum, a ‘getting started with online exams’ pack of information.||Extensive work was undertaken to ensure a more efficient and user-friendly examiner experience. Guidance documentation to help examiners on the VLE was redesigned to enable much easier navigation and included a ‘quick guide’ for each individual programme. Training materials for examiners were developed, for example to support understanding of how to use Turnitin.|
|Professional development on assessment design to support academic integrity is offered.||On-going. In addition, a resource for examiners on how to use Turnitin was developed.|
|Continue to work with international regulators in understanding local expectations and requirements regarding face-to-face or online assessments.||On-going.|
In conclusion, our investigation gathered data to assess the success of alternative assessments and the future of assessment in distance learning environments, particularly where online exams are employed to assess student learning. The move to online assessment instigated an urgent review of assessment practices in distance learning environments. In these environments, the standard pre-pandemic practice, i.e., the invigilated unseen exams that were the mainstay of distance learning providers, needed urgent measures so that students were not disadvantaged in the unprecedented circumstances of the pandemic.
Our evidence indicated that the transition to online assessment has facilitated changes and innovative practice in approaches to assessment through the increased use of online exams. This has implications for how institutions approach academic integrity in assessment practice going forward. The move to online assessment requires:
The introduction of alternative forms of assessment created opportunities to redesign assessment and the institutional frameworks within which students succeed and progress. Positive attitudes to online assessment are highly likely to increase worldwide as network infrastructure improves, along with student and staff confidence in virtual working. Higher education institutions are presented with the opportunity to move examinations online in a planned way and on a permanent basis. This has to be done, however, with consideration and mitigation of the risk to academic integrity. Future research and development should consider further pedagogical innovations for responding to academic offences, rather than institutional punitive arrangements and overreliance on technological solutions for authenticating and monitoring online assessment environments.
The authors would like to thank the members of the 2020 evaluation team at the University of London, James Berry, Huw Morgan-Jones, Ellen Hauff, Amardeep Sanghera, Michael Sawyer, Hannah Dorothy Mary Shekhawat and the education market research agency Shift Learning for their contributions to the collection and analysis of the project data on which this paper was based.
The authors have no competing interests to declare.
Bristol Institute for Learning and Teaching. (n.d.). Online open-book exams. Retrieved from https://www.bristol.ac.uk/bilt/resources/blended-teaching-guidance/information-for-staff/online-open-book-exams/#a
Butler-Henderson, K., & Crawford, J. (2020). A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Computers & Education, 159, 104024. DOI: https://doi.org/10.1016/j.compedu.2020.104024
Chao, K. J., Hung, I. C., & Chen, N. S. (2012). On the design of online synchronous assessments in a synchronous cyber classroom. Journal of Computer Assisted Learning, 28(4), 379–395. DOI: https://doi.org/10.1111/j.1365-2729.2011.00463.x
De Santis, A., Bellini, C., Sannicandro, K., & Minerva, T. (2020). Students’ Perception on E-Proctoring System for Online Assessment. Enhancing the Human Experience of Learning with Technology: New challenges for research into digital, open, distance & networked education. European Distance and E-Learning Network (EDEN). Proceedings 2020 Research Workshop | Lisbon, 21–23 October 2020.
Hatzipanagos, S., Tait, A., & Amrane-Cooper, L. (2020). Towards a post covid-19 digital authentic assessment practice: when radical changes enhance the student experience. Enhancing the Human Experience of Learning with Technology: New challenges for research into digital, open, distance & networked education. European Distance and E-Learning Network (EDEN). Proceedings 2020 Research Workshop | Lisbon, 21–23 October 2020. DOI: https://doi.org/10.38069/edenconf-2020-rw-0007
QAA (2020a). Assessing with integrity in digital delivery. Retrieved from https://www.qaa.ac.uk/docs/qaa/guidance/assessing-with-integrity-in-digital-delivery.pdf?sfvrsn=d629cd81_6
QAA (2020b). Academic Integrity Charter for UK Higher Education. Retrieved from https://www.qaa.ac.uk//en/about-us/what-we-do/academic-integrity/charter
Sullivan, D. P. (2016). An integrated approach to pre-empt cheating on asynchronous, objective, online assessments in graduate business classes. Online Learning, 20(3), 195–209. DOI: https://doi.org/10.24059/olj.v20i3.650
Wright, T. A. (2015). Distinguished scholar invited essay: Reflections on the role of character in business education and student leadership development. Journal of Leadership & Organizational Studies, 22(3), 253–264. DOI: https://doi.org/10.1177/1548051815578950