Recently I attended the Learning Analytics & Knowledge Conference in Sydney. This is an international conference for research into learning analytics in the higher education sector. (Also check out Jeremy Hind's and Fidel Fernando's experiences of the conference.)

There were over 50 academic and 13 practitioner papers presented. You can read the papers by going to the research section of the Arts COP on iLearn.

Whilst I found some of the papers presented overly technical and theoretical, there were a few that could have application at Macquarie University. Below I summarise the themes of the conference and a few papers that were of interest to me.

Some of the themes running through the conference included (in order of the number of papers submitted): user-centred design, evaluation and feedback, retention, dashboards, performance prediction, self-regulation, infrastructure, academic analytics, and student behaviour. Retention had previously been the primary concern for professionals in learning analytics; however, as the field matures, other areas are becoming of greater interest to the LA community, including dashboards (for teachers and/or students) and self-regulation.

In the practitioner paper by Rehrey et al., Implementation of a Student Learning Analytics Fellows program, the authors outline a program at Indiana University that supports scholarly research on teaching, learning and student success. Its primary purpose is to encourage institutional change in the use of learning analytics and to help embed data-driven decision making. Academic and professional staff can apply for a grant to support a research question that aligns with the university's objectives: supporting student retention and graduation, and developing best practices for recruiting and retaining diverse students. This is an annual process supported by the Institutional Research Office, which helps to refine the research proposal, develop a research strategy, and provide the data and tools required. One of the biggest obstacles has been providing the data needed to support these research projects; this has been overcome to an extent by the office coordinating with the relevant data stewards on campus and developing standard protocols for access to the data. The program is in its third year and has supported 29 research projects across the university. Participants completed pre- and post-surveys to assess changes in attitudes towards learning analytics.

Another practitioner paper, this time by Whitmer et al., Do students notice notifications? Large-scale analysis of student interactions with automated messages about grade performance, has some lessons for the types of messages students can receive from the various learning analytics tools available at Macquarie University. The authors found it more useful to separate the student cohort into five distinct clusters, rather than treating it as a single group: low performers, decliners, improvers, the "malleable middle" and high performers. The clusters refer to the type of rule-based notification a student receives (e.g. a student who gets a message because their grade has decreased is a decliner). The authors then looked at the percentage of students in each cluster who actually opened (and presumably read) the notification. High performers showed the highest open rate at 44%, followed by improvers at 41%. Students receiving varied notifications (the malleable middle) opened their messages at rates similar to the low performers, who had the lowest open rates. Another interesting finding is that students interested in comparative information (e.g. you did better or worse than the class average) are "indifferent to whether the feedback is positive or negative, while students show a significant preference for positive feedback when presented with a notification focusing on the student's own performance". The study had numerous limitations, one of which is that it did not examine whether these messages actually changed student behaviour, merely whether students opened them. However, it at least offers an insight into thinking about the student cohort in terms of different clusters and perhaps targeting different types of messages to each; a rough sketch of how such rule-based clustering might look is included below. The authors also concluded that students are more interested in receiving messages recognising positive achievement than in messages identifying areas for improvement. This suggests that while students at risk should still be identified and contacted, students will also appreciate being contacted when they have performed well in a task.
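To make the idea of rule-based clusters more concrete, here is a minimal illustrative sketch in Python. The paper does not publish its exact rules or thresholds, so the grade cut-offs, cluster names as code labels, and the two-grade comparison below are all my own assumptions for demonstration only, not the authors' method.

```python
# Illustrative sketch only: thresholds and the two-grade comparison are assumed,
# not taken from Whitmer et al. The point is simply that a student's cluster
# (and therefore the notification they receive) can be derived from simple rules.

def classify_student(previous_grade: float, current_grade: float) -> str:
    """Assign a student to one of five hypothetical clusters based on grade trend."""
    if current_grade >= 85:
        return "high performer"      # consistently strong results
    if current_grade < 50:
        return "low performer"       # at risk, below an assumed pass threshold
    if current_grade >= previous_grade + 5:
        return "improver"            # grade has risen noticeably
    if current_grade <= previous_grade - 5:
        return "decliner"            # grade has dropped noticeably
    return "malleable middle"        # stable or mixed mid-range performance


# Hypothetical students: (previous grade, current grade)
students = {"student_a": (70, 62), "student_b": (55, 88), "student_c": (40, 45)}
for student_id, (prev, curr) in students.items():
    print(student_id, "->", classify_student(prev, curr))
```

A tool built along these lines could then pair each cluster with a different message template, for example a congratulatory note for high performers and improvers, and a supportive check-in for decliners and low performers.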

Posted by Chris Froissard
