The reaction by educators to the COVID-19 pandemic is producing a tectonic shift in the mix of assessment tasks in units at many universities around the world. The largest, and perhaps the most jarring for many, is the move away from invigilated, high-stakes, on-campus exams. The demands of physical distancing, scalability, and organisational and personal readiness have led to decisions to halt the use of face-to-face invigilated assessments such as practical and written examinations.

One early hope was that remotely staffed online invigilation (proctoring) services might come to the rescue. However, this hope has been dispelled in recent weeks. A recent article by Professor Michael Sankey, President of ACODE, in Campus Morning Mail summarised the position that an increasingly large number of university teaching and learning leaders have come to realise: online invigilated exams are not the right fit for right now, and therefore online proctoring won’t make up the difference. He labelled such moves as “driving headlong for a cliff”. The reasons include that the technology cannot scale quickly enough; that it is seen as complicated, radical new technology for both students (who at other universities have made known their displeasure at the idea of outsourced spies peering at them through their webcams) and academics to adopt in the time available; and that it is seen as too expensive at a time of crashing incomes. Further, the service providers themselves are likely to be overwhelmed. In the last couple of weeks we have already seen and heard of senior teaching leaders at several universities severely limiting or prohibiting the use of online proctoring via service providers. Our own Macquarie University has similarly agreed. Where remote invigilation is being considered, it is limited to cases where accreditation bodies stand steadfast, or to final-year finishing students. Where online proctoring is to be used, it will require significant resources and support for the community of academics, administrators and students to achieve acceptable outcomes. Those such as the University of New England who have made the move to online invigilated exams have done so over a period of many years, through careful planning and support. Kylie Day and Jennifer Lawrence spoke about their experience of implementing remotely invigilated online exams at UNE in a recent Transforming Assessment webinar on 25 March.

As a consequence, we are now seeing a large increase in the proportion of online, non-supervised assessments replacing written examinations. These include regular assignments and projects, as well as take-home time-limited tasks and unsupervised online quizzes. A smaller number of units are planning measures such as online viva voce exams or online presentations. These decisions mean that, as the current semester progresses, there is potential for gaps to appear in the overall academic integrity landscape. In a normal semester, some may perceive that invigilated face-to-face exams served as a back-stop in the overall assessment integrity framework of their unit; in essence, they were seen as a balancing element in the overall checks and balances that maintain a respectable environment of academic integrity.

It is recognised that the shift in the mix of assessment away from invigilated formats provides different opportunities for cheating, especially the use of commercial contract cheating services. Previous research into contract cheating by Bretag et al. has shown that a persistent portion of cheating is present (around 6% of students in a recent national survey admitted as much – ref) and is influenced by a variety of factors, including perceived opportunities to cheat, the threat of detection, the perceived value of assessment tasks and the quality of student support mechanisms. While a holistic approach to academic integrity that includes a toolkit of support, education and detection measures is certainly advocated, the past balance of measures may no longer cope with the shift. Text matching tools have served us well in light of the huge increase in student numbers in recent years; however, such tools are of little use in detecting custom-written work produced via contract cheating. It could be argued that widespread use of text matching has pushed the cheating game into the hands of contract cheating services. Technology tools that would enable automated detection of outsourced custom work are not yet in widespread use, and there is limited evidence so far about their effectiveness and accuracy. Given that no realistic digital detection is available over the short to medium term, the job of addressing contract cheating largely falls on front-line academic staff – but can they actually detect contract cheating? Where does this leave us in terms of maintaining academic integrity standards?

There is light on the horizon: viable ways forward do exist within the time scales we are now dealing with, given the rapid pivot to online teaching in the time of COVID-19.

The first is work by Phillip Dawson and colleagues at Deakin University, who have conducted practical research into the viability of equipping academic teaching staff with the skills required to successfully detect and evidence instances of contract cheating [pdf]. While detection is not proposed as ‘the sole answer’, it will play an important role in maintaining the overall framework of academic integrity in a world where invigilation is no longer playing the role it once did. Phill presented a Transforming Assessment webinar, “Detecting and addressing contract cheating in online assessment”, on 29 April 2020. The session recording is available.

The second is to adopt the old and proven method of oral (viva voce) assessment, but in a format built for the contemporary online world. A previous Teche post explored the idea of oral exams, possibly as short recorded responses to iLearn (Moodle) quiz questions. The Griffith model takes a more authentic approach: students select time slots via online booking tools and undertake short oral presentations or talks live, via online video conferencing tools, in the style of authentic tasks. These could include a mock job interview, a presentation to the board or a media briefing, but in this case the discussion is with the assessor, who marks on the go using a rubric. The team from Griffith University Business School have put together a guide to what they call ‘interactive oral assessment’, which they claim can scale to large numbers; it has been tried with classes of over 300 students. The Griffith team presented a Transforming Assessment webinar, “Authentic online oral assessment – an exam replacement”, on 30 April 2020. The session recording is available.

I would encourage readers to join the free-to-all ASCILITE e-Assessment SIG sessions focused on addressing “Teaching in the time of COVID-19”. Session recordings will be published following each event.

Your thoughts on the impact of the recent changes to assessment upon academic integrity are welcome below. We would love to hear about the measures you are taking in your discipline to address it.

Feature image via Freepik

Posted by Mathew Hillier

Mathew has been engaged by Macquarie University as an e-Assessment Academic in residence and is available to answer questions from MQ staff. Mathew specialises in digital assessment (e-Assessment) in higher education. He has held positions as an advisor and academic developer at the University of New South Wales, University of Queensland, Monash University and University of Adelaide. He has also held academic teaching roles in areas such as business information systems, multimedia arts and engineering project management. Mathew recently led a half-million-dollar Federal government funded grant on e-Exams across ten university partners and is co-chair of the international ‘Transforming Assessment’ webinar series, run as the e-Assessment special interest group of the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE). He is also an honorary academic at the University of Canberra.
