iLearn Insights offers two features for analysing quiz question responses. These are especially helpful when your quiz randomises its selection of questions from the question pool in the iLearn question bank.

Quiz analysis – question by question

Quiz analyser is particularly useful at the end of the session for identifying which questions most students answered correctly and which they answered incorrectly. A question that most students answered correctly might be too easy. Conversely, a question that most students answered incorrectly might be too hard, might not have been adequately supported by the reading materials, or might itself be faulty and in need of rewording.

Quiz analyser can be accessed from the Activity > Activity details menu by clicking the ‘Quiz analyser’ icon within the quiz you want to analyse.

Quiz analyser provides the following information for each question within the selected quiz:

  • Gave up: Number of students who did not answer
  • Graded right: Number of students who answered correctly
  • Graded wrong: Number of students who answered incorrectly
  • Graded right (%): Percentage of students who answered correctly. This column can be used to sort the report from most to least correctly answered questions, or vice versa.
  • Preview: Direct link to iLearn to view the full question
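In essence, the report columns above are tallies of response outcomes per question. As a rough illustration only (not the tool’s actual implementation), here is a minimal Python sketch, assuming hypothetical response records of the form (student, question, outcome):

```python
from collections import Counter, defaultdict

# Hypothetical response records: (student_id, question_id, outcome),
# where outcome is "right", "wrong", or "gave_up".
responses = [
    ("s1", "q1", "right"), ("s2", "q1", "right"), ("s3", "q1", "wrong"),
    ("s1", "q2", "wrong"), ("s2", "q2", "gave_up"), ("s3", "q2", "wrong"),
]

def question_report(responses):
    """Tally outcomes per question and derive a 'Graded right (%)' column."""
    tallies = defaultdict(Counter)
    for _student, question, outcome in responses:
        tallies[question][outcome] += 1
    report = {}
    for question, counts in tallies.items():
        total = sum(counts.values())
        report[question] = {
            "gave_up": counts["gave_up"],
            "graded_right": counts["right"],
            "graded_wrong": counts["wrong"],
            "graded_right_pct": round(100 * counts["right"] / total, 1),
        }
    return report

report = question_report(responses)
# Sorting from least to most correct surfaces likely problem questions first.
hardest_first = sorted(report, key=lambda q: report[q]["graded_right_pct"])
```

Sorting on the percentage column, as in the last line, is exactly how you would surface questions that may be too hard or badly worded.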

Quiz analysis – student by student

‘Quiz analysis – student by student’ can be accessed from the Activity > Activity details menu by clicking the ‘Quiz question analysis – student by student’ icon within the quiz you want to analyse.

This report is useful for Unit Convenors during the session to identify students who answered a particular question correctly, answered it incorrectly, or did not attempt it (gave up). The report also provides information on partially graded questions and questions still awaiting grading. Sorting by the total ‘graded right’, ‘graded wrong’ and ‘gave up’ columns helps identify groups of students who answered most questions correctly, answered most incorrectly, or did not answer.
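The student-by-student view is the same tally taken along the other axis: totals per student rather than per question. A hedged sketch, using the same hypothetical (student, question, outcome) records as above:

```python
from collections import Counter, defaultdict

# Hypothetical response records: (student_id, question_id, outcome).
responses = [
    ("s1", "q1", "right"), ("s1", "q2", "wrong"),
    ("s2", "q1", "gave_up"), ("s2", "q2", "gave_up"),
    ("s3", "q1", "right"), ("s3", "q2", "right"),
]

def student_report(responses):
    """Count 'right', 'wrong' and 'gave_up' totals per student."""
    totals = defaultdict(Counter)
    for student, _question, outcome in responses:
        totals[student][outcome] += 1
    return totals

totals = student_report(responses)
# Sorting by 'gave_up' descending surfaces students who skipped the most
# questions, mirroring the column sorting described in the report.
most_gave_up = sorted(totals, key=lambda s: totals[s]["gave_up"], reverse=True)
```

The same one-line sort on the ‘graded right’ or ‘graded wrong’ totals would pick out high- and low-performing groups of students.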

Initiative and thanks to Titia Benders, Senior Lecturer, Linguistics

Like many colleagues in Linguistics (and probably elsewhere!), I use online multiple-choice quizzes as low-risk assessments during the semester. Students appreciate the clear reason to study and the quick turnaround of marks. But while the automatic marking is great, I also want to analyse responses per question, to ensure that each question contributes to a reliable assessment. This used to be near-impossible for questions that had been presented in randomized order, as iLearn showed students’ scores on the first, second, and third question they had answered and didn’t group the scores by the actual questions I had written. This issue is now solved by the student-by-student analysis of quiz questions. It helps me identify the best and worst questions from each quiz, which means that next year’s assessment is an even more reliable reflection of students’ abilities.

Titia Benders | Department of Linguistics | Faculty of Medicine, Health and Human Sciences

Need more help?

Please contact Shamim Joarder or Jeremy Hind for more information. Further information can be found on the iLearn Insights website.


Posted by Shamim Joarder

Shamim Joarder has worked for various educational institutions since 2001. His eLearning journey began in 2009 as an eLearning Project Manager at the University of Adelaide. Mr. Joarder currently works at Macquarie University, Sydney, as Senior Learning Analytics Adviser, where he has designed and developed two unique learning analytics applications, 'iLearn Insights' and 'MyLearn'. He has also authored the book 'Moodle eLearning Course Design: A Flexible Approach'. Mr. Joarder has received numerous accolades, including silver at the QS Reimagine Education Awards 2023 in Abu Dhabi in the Regional (Oceania) category and the 2022 ACODE and CAUDIT Award for Innovation in Teaching and Learning, and he was a finalist in the 2022 AFR Higher Education Excellence Awards. His work was first recognised in Singapore at the ASCILITE 2019 conference, where he won the ASCILITE award for innovation. He also won the 2021 Vice-Chancellor's Learning Innovation award at Macquarie University, Sydney, Australia.
