Oral assessment, also called viva voce assessment (Latin for 'with the living voice'), provides an opportunity for authentic, interactive assessment that supports academic integrity. This post outlines differing approaches to oral assessment.

Oral assessment by many names

Oral assessment can be used as a replacement for or in connection with written assessment to enable judgement of student knowledge, reasoning, problem solving and skills. Some variations include:

  • Individual or group oral presentations of content or processes, sometimes followed by questioning.
  • Viva voce exams, often made up of a series of questions and responses. These are commonly used for thesis defence, and an interview can also be used to confirm student authorship.
  • Moot courts and mock trials, a form of role play used in law studies to assess understanding of law, document preparation, evidence, argument and court processes (see Bending et al., 2023).
  • Objective Structured Clinical Exams (OSCEs), common in health disciplines, in which the student provides oral responses or takes physical action in response to problems encountered. Multiple stations can be used, each with a different mock patient, role or clinical situation.
  • Online Interactive Oral Assessment (IOA), which uses online meeting technology with the session centred on an authentic scenario or role play that links to discipline or future professional practice. Examples include a media interview, a report to a board, a pitch to a client, a response to a crisis, or the presentation of an artefact. Griffith University uses IOAs as an exam replacement for up to 750 students (see Logan et al., 2021).


Benefits

Oral assessment allows assessors to dynamically explore student understanding using both verbal and non-verbal signals that convey meaning. The format supports integrity given it is in-person, dynamic and visual, and it can be recorded for moderation or auditing.


Limitations

Time needs to be limited due to scale and assessor stamina, and multiple assessors are needed to manage larger groups of students. Students from a second-language background may struggle to express themselves orally.

Implementing Oral Assessment


Focus on the intended learning outcomes for the unit. Design questions and scenarios to be authentic, using discipline-specific contexts and vocabulary. Prepare open-ended 'how' and 'why' questions or prompts to probe understanding of processes and rationale. Plan follow-up questions to explore student knowledge and to increase integrity. Peer review the assessment design for quality.

Assessor training, the use of rubrics and moderation across teaching teams (assessors) can enable scale with consistency. In large units, members of the teaching team (e.g. the tutors) will typically take on the role of assessors, and so these people need training. Conduct moderation meetings focused on techniques, standards and expectations both before and after the assessment event. Rubrics enable rapid marking during and between each student session; be sure everyone understands these rubrics and the time available for marking. Also, agree on a private communication channel for use by the team during the event so that any problems can be communicated and managed quickly.

Technology tools can help with organisation and scale. For example, an online booking tool (e.g. MS Bookings) allows scheduling of assessors and lets students select suitable times. Using online meeting software (e.g. Zoom or Teams) means that students can participate wherever they are located. The online meeting system also enables recording of the sessions for moderation, academic integrity, auditing, marking finalisation and feedback.

When using Zoom, either plan to use one Zoom link with a waiting room and a breakout room for each examiner-and-student pair (however, recording must then be done locally by each examiner), or use separate Zoom links for each assessor so that cloud recording can be enabled. The link to the meeting system can be placed within the booking system so that students can identify where they need to go and when. When scheduling, ensure there is a break after each student to allow marking against the rubric and time for changeover.
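As a rough illustration, the slot arithmetic above (a fixed exam time plus a marking and changeover break per student) can be sketched in Python. The start time, durations and student names here are arbitrary examples, not recommendations:

```python
from datetime import datetime, timedelta

def build_schedule(start, students, exam_minutes=20, break_minutes=10):
    """Generate (student, start, end) slots, inserting a marking and
    changeover break after each oral exam. Durations are illustrative."""
    slots = []
    current = start
    for student in students:
        end = current + timedelta(minutes=exam_minutes)
        slots.append((student, current, end))
        # Leave time for rubric marking and changeover before the next student.
        current = end + timedelta(minutes=break_minutes)
    return slots

# Example: three students from 9:00 am, 20-minute exams, 10-minute breaks.
schedule = build_schedule(datetime(2023, 10, 30, 9, 0), ["A", "B", "C"])
for student, s, e in schedule:
    print(student, s.strftime("%H:%M"), "-", e.strftime("%H:%M"))
```

A spreadsheet or the booking tool itself can of course do the same job; the point is simply to budget the break into every slot rather than scheduling students back to back.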

Provide students with details of the process, an example (ideally a video), and ample zero-stakes opportunities to practise and, if online, to test their equipment (e.g. Zoom, microphone, camera). This reduces stress and problems on the day. Provide students with the expectations (attire, conditions) and clearly state the time limit (e.g. 20 minutes each).

On the day

Send students a reminder with instructions and expectations. Upon arrival, have each student show their ID, then explain the task and ensure the student understands the expectations. If online, record each session with the student's consent (recording using a phone may be an option for face-to-face oral assessment events). Recording helps support integrity, appeals and grade moderation. If online, check that students can clearly see images or screen shares as appropriate. You may want to have students do a room sweep using their webcam to ensure integrity is being maintained.


After the event

Conduct moderation processes between assessors. Store recordings in a secure location (e.g. a private Echo360 library or OneDrive). Collate, moderate and upload student grades into the unit site gradebook.

Provide students with feedback and an opportunity to debrief to close the learning loop. Debriefing can be done individually or as a class; in the latter case, general observations may be discussed. Giving students an opportunity to rate their own performance against the rubric and discuss ways they could improve next time helps develop students' evaluative judgement skills.

Download this as a one-page quick guide

This article is available in summary form as a two-page LT Quick Guide.

References and further resources



MQ L&T Staff development team. Prepared by Dr Mathew Hillier, Macquarie University, 30 Oct 2023

Posted by Mathew Hillier

Mathew has been engaged by Macquarie University as an e-Assessment Academic in residence and is available to answer questions from MQ staff. Mathew specialises in digital assessment (e-assessment) in higher education. He has held positions as an advisor and academic developer at the University of New South Wales, University of Queensland, Monash University and University of Adelaide. He has also held academic teaching roles in areas such as business information systems, multimedia arts and engineering project management. Mathew recently led a half-million-dollar Federal government funded grant on e-Exams across ten university partners and is co-chair of the international 'Transforming Assessment' webinar series, run as the e-assessment special interest group of the Australasian Society for Computers in Learning in Tertiary Education. He is also an honorary academic at the University of Canberra.
