It is exam period here at Macquarie right now – probably the most intense time for students and instructors. The former are relentlessly cramming, while the latter drink copious amounts of coffee and/or consume lots of chocolate to mark handwritten papers in a short space of time.

I am not a fan of exams. I didn’t enjoy taking them as a student, and I enjoy them even less now that I need to mark them. If it were up to me, there would be other types of assessment, but I recognize that exams are well-entrenched, and that some disciplines do need to check students’ performance under pressure.

Fair enough.

My biggest problem with exams is the fact that they often focus on ‘recall’ (students, prove to us that you remember facts/names/details) as opposed to more ‘life-like’ tasks of making sense of and using particular information.

So the first thing to ask yourself when putting together an exam is ‘what are my tasks really testing?’ Am I giving students an opportunity to analyse, synthesise, create or apply their knowledge in this exam? How similar are the exam tasks to ‘real-life’ scenarios? These considerations will help you develop more pedagogically sound exams.

Are there other considerations to help make exams more ‘pedagogically sound’?

Here are my top 3 picks:

  1. Consider an ‘open-book’ exam

Closed-book exams tend to promote rote learning and are often about regurgitating facts.

An ‘open-book’ exam, on the other hand, often presents a ‘real-world’ scenario or some data and asks students to analyse it, evaluate it, suggest a possible solution, and so on. You can do this in a ‘closed-book’ exam too, but an ‘open-book’ exam potentially opens up (pardon the play on words!) more question types and more possibilities for what you can include in the exam.

There is a growing body of research suggesting that open-book exams do a better job of equipping students for their future professional lives and enhance learning (see Green, Ferrante and Heppard, 2016, for example). Studies also show that open-book exams do not lead to ‘score inflation’ (see, for example, this study that compared closed-book and open-book exams). And, frankly, in my opinion, answers to open-book exams are usually a lot more interesting to read and mark than answers to ‘closed-book’ ones.

  2. Have an ‘exam debriefing’

Exams can be a valuable learning opportunity, but, more often than not, students hand in the papers, get a grade… and that’s it.

Wouldn’t it be better to have at least a brief discussion about the exam?

The easiest way is to arrange a 15-30 minute Q&A session straight after the exam (e.g. 10 minutes after the papers are handed in), ideally to foster peer-to-peer discussion.

Another idea is to have a ‘debrief’ (face-to-face or via webinar) after the papers have been marked and before the grades are released. This is probably the best time for the debrief, as the markers have a good idea of the typical issues and students will be ‘all ears’ at this point. It is a really good time to discuss key issues and suggest ways students could improve in the future. Feedback sessions of this nature are important for student growth: if students receive a final mark but don’t understand why they achieved it, their ‘learning loop’ isn’t complete.

  3. Get students to self-correct

Several studies (e.g. Gruhn and Cheng, 2014; Montepare, 2005) advocate allowing students to keep their exam tasks. Students in these studies submitted their exam answer sheet, then took a copy of the exam paper home and prepared another version of their answers. Both the ‘in-class’ and ‘at-home’ exams were scored, and a cumulative mark was awarded.

I like the idea of self-correction, but I have my doubts about double-marking and a cumulative mark. Personally, I would approach it a bit differently: I’d let students hand in their answers, take the exam tasks home, identify 2-3 things they wish they’d done differently, explain what they would do differently and why, and submit this ‘exam reflection’ via Turnitin. I’d then award additional marks (e.g. up to 5 points) to students who completed the ‘exam reflection’, with higher marks for deeper reflections.

So, there you go. Just a few ideas on how to make exams more ‘learning-friendly’.

Do you have any other ideas on how to get a bigger ‘learning bang’ for your buck when it comes to exams? Share them in the comments below, or drop me a line.


Green, S. G., Ferrante, C. J., & Heppard, K. A. (2016). Using open-book exams to enhance student learning, performance, and motivation. Journal of Effective Teaching, 16(1), 19-35.

Gruhn, D., & Cheng, Y. (2014). A self-correcting approach to multiple-choice exams improves students’ learning. Teaching of Psychology, 41(4), 335-339.

Montepare, J. M. (2005). A self-correcting approach to multiple-choice tests. APS Observer, 18(10), 35-36.

Posted by Olga Kozar

I'm a 'long-term' Mq girl. I did my PhD here and taught on different courses, ranging from 1st year to PhD students. I now work in Learning and Teaching, which I love. I have 2 young kids and a dog, and I love meeting other Mq people, so give me a shout if you'd like to talk 'learning and teaching' or would like to brainstorm together.
