Following on from this introduction to alternatives to campus-based invigilated examinations, this post takes a closer look at one of the suggested approaches.  COVID-19 shows us that sometimes an old solution might just do the trick!

If we did not already know it, this COVID-19 Teaching Session has revealed to many of us that the online learning environment creates new demands when it comes to finding effective ways to enable student learning.  Examinations are one area that has gained much attention around the University, across Australia and around the world in recent weeks.  Colleagues should refer to this recent post on the issue, which offers much food for thought.

At the start of semester, 482 units were scheduled to deliver an exam in the formal examination period.  On 8 April, after an examination of the issue and some sector benchmarking, the University Executive Group resolved that the University, for reasons of time, cost and technology, could not deliver a centrally supported online invigilated examination system through a third-party provider such as ProctorU or Examity.

Of the 482 units that had planned exams, about 166 are still deciding how they will deliver an exam or an alternative to an exam, while others are finalising their approach.  Mathew Hillier’s post offers some ways to think about alternatives to exams that are authentic and meet student needs.  This post and others, however, will focus on ways colleagues can still deliver their own exam without central support.

Recently I read an article on using video or voice recordings of the student to conduct an exam.

In 2020 Akimov and Malin from the Griffith University Business School produced a timely study titled “When old becomes new: a case study of oral examination as an online assessment tool” in the journal Assessment & Evaluation in Higher Education.

Some of us are old enough to remember a time when the ‘viva’ (or ‘viva voce’) was still a part of University teaching, especially in the research space (the so-called ‘oral defence’), and some disciplines have persisted with this technique in various ways to this day.  Vivas lost support as an assessment tool in the 1990s not because they were ineffective but because they were costly.

In an online environment, Akimov and Malin have returned to this type of oral examination.  They argue that new technology provides new opportunities for oral examination and offers a way to meet the challenges of “accessibility and legality, identity and security and academic dishonesty”.  Their case study is an online postgraduate finance unit that delivers an online oral examination.

Akimov and Malin found that the online examination achieved the following:

  1. Was a good means through which to test whether students had acquired in-depth knowledge of theoretical concepts
  2. Complemented other assessment activities in the unit

Students book a 30-minute block for the oral exam, which is worth 40% of the final grade.  The exam is broken up into three activities in which the students have to provide a verbal response; one activity relates to their earlier assessment project.  The academic conducts the exam in real time and then completes a marking rubric at the end of the exam.  Students have to present their driver’s licence at the commencement of the activity to prove their identity (a technique common to the big third-party online proctoring platforms).

As noted, Akimov and Malin’s approach was conducted synchronously.  The academics were online live with each candidate over the two days the exam ran.

I was left wondering: could this activity be delivered synchronously but at mass scale?  At a prescribed date and time, students log into iLearn, the academic releases the questions, and the students record their answers and upload the results to iLearn within a set time frame.

Talking to Jeremy Hind from the Learning and Innovation Hub, I learned there are a few different ways students can record oral responses in iLearn.  There is an audio recording function in iLearn quizzes, and Jeremy and the team have recently made some changes in iLearn in anticipation of some of these requests.  The current limitations are that you can only record audio or video for three minutes per recording and, alas, it doesn’t work on iOS devices like iPads and iPhones.  This said, Jeremy and the team can provide alternatives for iOS, though it is a little more complicated.  This post on exams mentions that the phone could work in this environment.

The advantage of doing this through the quiz function is that you can conduct the marking through the normal iLearn workflow.

Our colleagues in Medicine, Health and Human Sciences have used this functionality before.  See their very useful video on the subject: https://vimeo.com/334329857/6e8aaf0d0b

Speak to the LIH team or your Faculty Learning Designers if you want to explore this exam option.

Further reading:

Akimov, Alexandr, and Mirela Malin. “When old becomes new: a case study of oral examination as an online assessment tool.” Assessment & Evaluation in Higher Education (2020): 1-17.

Posted by Sean Brawley

5 Comments

  1. Mathew Hillier 9 April, 2020 at 7:06 pm

    Thanks Sean – I like this a lot 🙂

    I would like to offer the following as a starter in terms of a possible ‘how-to’ guide for Viva Voce exams. I have two main suggestions here: using Moodle quiz or using Zoom. There is a trade-off between scalability, integrity and task authenticity, and this has certainly come into stark relief in the current COVID-19-induced rapid pivot online.

    Quiz:
    Sean, as you pointed out, the Atto editor in Moodle quiz now allows the student to respond with a 3-minute video or audio recording when using the essay question in the quiz. Many of the quiz settings I suggested in the attachment to the previous post on examination alternatives are applicable here too.
    The quiz could be mixed mode with some written and some spoken responses.
    If the quiz is purely made up of audio or video responses we could use ‘one question per page’ with ‘no backtracking’. While this goes against the principle of allowing students to check over their work, that is not the way a Viva exam works. Therefore, in this instance it is justified as a way of replicating the Viva Voce exam style, where someone asks questions and receives the answers verbally.
    We could consider embedding audio or video files in place of the ‘question stem’ too. The student plays the question and then records their response. This may be useful to present a short case study, demonstration, patient or scenario and then have the student verbally respond. For shorter questions this may be overkill, where a text-based question stem is going to be easier to create and easier for the student to read.
    We could consider randomising questions from a pool, as per the quiz setting suggestions in my previous Teche post attachment. The extra questions can also serve as back-up questions for any repeat run in case a student encounters technical issues in their first attempt.

    However, we must acknowledge that some properties of invigilation present in live Viva Voce exams are lost using this quiz technique, in that it will be harder to detect whether the student was coached or did web searching between the time they read the question and the time they made their recording. While we could use a tight timeline for distribution and submission, we also need to provide adequate time that takes into consideration the technical elements of the task, including some buffer to cater for technical difficulties.

    Using a video-recording question would provide some degree of identity verification. We could have the student record a video of themselves showing their ID card as the response to a first question.

    At the end of the day we are faced with a trade-off between scalability and the highest levels of task integrity monitoring in our current circumstances. The quiz approach above may be a reasonable halfway point for running large-scale viva-like assessment over a short time period.

    Zoom:
    Getting closer to a live Viva Voce could be achieved using Zoom (or similar) via video and screen share with an individual student and an examiner. This allows the examiner to ask the student to do a 360-degree room scan, giving a greater level of assurance that they are not being coached. Similarly, the live video enables monitoring for the duration of the Q&A. The recording provides an audit trail and allows for moderation between student sessions and across examiners. However, it is acknowledged that individual Zoom appointments place a ceiling on the scalability of this exercise. To enable some scalability, the viva task could be allocated to class tutors, allowing a larger number of parallel exams to happen at one time. Some training will probably be required, and perhaps a pre-exam moderation session (webinar) with the tutors to get everyone on the same page.
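
    To put a rough number on that scalability ceiling, here is a minimal back-of-envelope sketch in Python. The figures used (number of examiners, examining hours, exam window, session length and changeover buffer) are purely illustrative assumptions rather than numbers from this post, and the `viva_capacity` helper is just a name invented for the sketch.

    ```python
    # Back-of-envelope capacity estimate for live Zoom viva sessions.
    # All input figures are illustrative assumptions only.

    def viva_capacity(examiners: int, hours_per_day: float, days: int,
                      session_minutes: int, buffer_minutes: int) -> int:
        """Estimate how many students can be examined live in the window.

        examiners       -- staff running parallel Zoom sessions
        hours_per_day   -- examining hours available per examiner per day
        days            -- length of the exam window in days
        session_minutes -- length of each viva (Akimov and Malin used 30 minutes)
        buffer_minutes  -- changeover and rubric-completion time between candidates
        """
        per_examiner_per_day = int(hours_per_day * 60) // (session_minutes + buffer_minutes)
        return examiners * per_examiner_per_day * days


    if __name__ == "__main__":
        # e.g. 10 tutors, 5 examining hours a day, a 5-day window,
        # 30-minute vivas with a 10-minute changeover -> 350 students.
        print(viva_capacity(examiners=10, hours_per_day=5, days=5,
                            session_minutes=30, buffer_minutes=10))
    ```

    Even small changes to the session length or the changeover buffer shift the total noticeably, which is why the recorded-quiz approach above may be the more realistic option for very large cohorts.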

    In the case of senior and postgraduate units a deeper approach may be desired. In this case it would be best for topic experts to ask the questions, because a good Viva Voce includes follow-up questioning that leads on from a prior statement by the student: a probing conversation in the Socratic tradition.

    However, in the case of very large undergraduate units with no tutorial team, some of the rigour may need to be sacrificed at the altar of scalability. The questioning could then be done as a surface-level activity, i.e. for the primary purpose of maintaining academic integrity, with the viva made up of a series of questions and one-step answers. In that case the person asking the questions need not be a topic expert, so we could draw upon other staff from across the university to act as the questioners. The session recordings would then need to be marked by topic experts, so it may not save time overall. It is unknown whether reviewing a large number of short videos will take longer than marking the equivalent handwritten essay responses on paper (messy handwriting included!). The marking has to be done regardless, so if decent rubrics are created then perhaps it could be done with some efficiency.

    About telephone use:
    In my prior article I suggested the use of a phone call, but to clarify, the intention is to use it as a back-up rather than as the default in a large-scale exercise. If planning to use Zoom, we should ask students to let us know in advance if they are not able to meet the technical requirements (i.e. a laptop plus a decent internet connection capable of sustaining a webcam-enabled Zoom session). We can ask for their phone number and we could still book times. We may choose to send the questions to the student a couple of minutes prior to the phone call, or we could just ask the questions verbally. We could ask the student to use speaker phone (if possible) so that we also have a chance of detecting any ‘coaching’ that may be occurring.

    In all cases above we should request that the student is located in a room on their own for the duration of the exam event.

    This comment is way longer than I expected, but I will leave you with one further item of reading. A very good former colleague of mine, Gordon Joughin, is an expert in oral forms of assessment. He wrote a “Short guide to oral assessment” that is well worth looking at – grab it here https://teachlearn.leedsbeckett.ac.uk/-/media/files/clt/a-short-guide-to-oral-assessment-by-gordon-joughin.pdf

    I will also ‘leave my door open’ to Macquarie University folks who are interested in taking the conversation further – I am certainly keen to engage with you!


  2. Matt,

    Thanks for your engagement on this piece and your general assistance to colleagues at this time of emergency; I encourage colleagues to take advantage of Matt’s willingness to flesh out ideas into more “How To’s”. You raise some important considerations here. For me the conundrum at present is scalability: developing a mass synchronous activity that does not require any form of invigilation but affords a level of security and deters dishonesty. We still have about 165 units seeking a solution that has the reassurance of campus-based invigilation. These 165 units have a staggering 22k enrolments! This is my scalability issue. If an approach like this could be delivered at scale, that would be an achievement. I’m still thinking about the formal exam period with its two-hour blocks. Could we still deliver this type of exam to hundreds of students at once? It’s not like a take-home: the exam is scheduled in the exam period for a set period of time. The unit convenor opens the exam, the students complete and then upload within the specified time limit (e.g. two hours), and then the convenor switches things off. Maybe someone should tell me I’m dreamin’!

    The video idea with the first question showing your ID seems a good one, and it’s one the big third parties use in real-time online invigilation. Zoom could work, though I guess you would have to test what number is manageable.

    I thought your suggestion about a mixed mode, with perhaps a quiz and then an oral component, was also smart.

    Cheers,
    Sean


  3. P.S. An additional resource on oral assessment has come to light, this time from the folks at Griffith University, who have put together some guidance on what they refer to as “Interactive Oral assessment”, where the approach is to run 10-to-15-minute authentically framed tasks. It has been used in classes of 300+ students. They give examples from their undergraduate Business programme such as job interviews, a briefing to a board, media interviews and briefings, short presentations with questions, explaining concepts or projects to stakeholders, or retrospectives covering prior assignment work: https://sway.office.com/yQ2s0Bm3ILkWtGll?ref=Link. A discussion on LinkedIn provides some additional context and outlines the tools used: https://www.linkedin.com/posts/amandashelldaly_interactive-oral-assessment-an-authentic-activity-6654482877486600192-hEPx
    Hope this is helpful!


  4. […] mode assessments, but in a format built for the contemporary online world. A previous Teche post explored the idea of oral exams, possibly as short recorded responses to iLearn (Moodle) quiz questions. The Griffith model has […]


  5. […] Online oral examination […]

