The Learning Analytics Team is a new team that brings together the Teaching Evaluation for Development Service (TEDS) and the iLearn Insights project. Our goal is to provide a strong evidence base for enhancement and innovation in teaching practices across the University. We also provide data for monitoring student engagement and learning, to assist both students and their teachers in understanding and developing the MQ student learning experience.
The Learning Analytics Team consists of:
- Shamim Joarder – Learning Analyst – creator and curator of the fabulous iLearn Insights tool and the soon-to-be-available MyLearn tool – a student-facing version of iLearn Insights
- Alana Mailey – Learning Analytics Officer – survey processor, inquiries wrangler and TEDS front-of-house problem-solver
- Michael Marston – Learning Analytics Survey Coordinator, TEDS Team knowledge bank and trouble-shooter
- Cathy Rytmeister – Learning Analytics Manager, evaluation and feedback adviser, and living MQ history and culture storage unit (or so people seem to think 😊)
The Teaching Evaluation for Development Service (TEDS) provides student feedback surveys for teachers and unit convenors who wish to obtain a student perspective on their teaching practice and unit curriculum. TEDS data can be used to inform development and quality enhancement of the student learning experience, from individual to institutional levels. The team can also assist with establishing and implementing L&T evaluation plans at project, unit and course level. Contact firstname.lastname@example.org if you have any questions about evaluation and student feedback.
Our current big project in TEDS involves refreshing our surveys in preparation for all units to have a standard LEU survey in Session 2. This will feed into the enhancement of the student experience as well as providing baseline data for the review and monitoring procedures that the Higher Education Standards require. We’re also contributing to the revision of the Student Experience Surveying Policy; a draft of the revised policy will be circulated in due course.
The iLearn Insights online tool for teachers and unit convenors is a one-stop shop for the visualisation of student activity data within iLearn. It enables staff to monitor their students’ iLearn access frequency, interaction with iLearn activities and submission of assessment tasks, all in one place. iLearn Insights also provides possible indicators of students at academic risk (e.g. through missed activities, late submissions and/or infrequent iLearn access), along with options for staff to contact students who may be falling behind.
Note that staff users of iLearn Insights can provide feedback and request developments using the feature request form. Staff are also welcome to join the iLearn Insights Group to receive information on new functionalities as well as a summary of key observations following each teaching session.
We are very excited to be developing a student-facing version of iLearn Insights, to be known as “MyLearn”. The goal is to provide students with the information they need to monitor their progress, gain insights into (and hence enhance) their study practices, and manage their time effectively in order to succeed in their studies.
The prototype MyLearn tool will be demonstrated in several forums over the next few weeks, starting, of course, with students. Our intention is to have an initial version of MyLearn available to students in Session 2 this year. MyLearn will be most useful to students when iLearn unit design complies with the standards currently being developed by the Learning Enhancement Team.
As a new team formed through the restructure of the learning and teaching portfolio, we’ve had some fun getting to know each other and learning about each other’s projects and areas of knowledge. We enjoy having a cuppa together at our Monday Team Tea catchup – sometimes f2f, sometimes on Zoom.
We all love working with data and helping others not only to understand it, but also to put it to good use in making the student experience at MQ the best it can be. We often act as “critical friends” for each other’s projects – that gives us access to different perspectives and new ideas. The shift from static to interactive reporting via data dashboards brings a whole lot of new demands for data summaries and visualisation. Our challenge is to help people use data in sound and sensible ways, so we’re gearing up to ensure that the data we collect, the reports we produce, and the uses to which they are put, are robust and informative.
And now, some Learning Analytics Team Analytics…