7th March, 2018.
The Grand Lodge, SMC Conference and Function Centre:
The 8th International Learning Analytics and Knowledge Conference (LAK ’18) is about to begin. Jeremy and Fidel find themselves in the midst of the brightest researchers and practitioners of Learning Analytics (LA) from around the globe.
Mirroring the architecture of the SMC Conference and Function Centre (designed in Masonic style), ‘Learning Analytics’ appears to have its own kind of enigmatic symbolism, language and abbreviations. LAK? SoLAR? ML? Big Data? Quantitative Ethnography? Epistemic Network Analysis?
“I’ll need Google Translate to help me out,” Fidel thought.
Applause! At the conclusion of the first engaging keynote, it all slowly began to make sense – puzzlement changed to understanding, uncertainty to awe.
They shared a meaningful look as if to silently say to each other: “This is going to be good.”
(Cut to) Present day
Now back at MQ, they thought they’d share their top 6 takeaways from LAK ‘18! (Also, check out Chris Froissard’s post about the same conference).
1. A learning dashboard designed to support student and study advisor dialogue by visualising meaningful student data.
The research paper by Millecamp et al., A Qualitative Evaluation of a Learning Dashboard to Support Advisor-Student Dialogues, evaluates a learning dashboard that enhances advisor-student interactions by visualising meaningful student data, enabling more personalised support and specific advice. The types of data it displays include grades, course progress, position among peers, and predictions of the length of a student’s program based on past data. It was refreshing to see Learning Analytics used beyond traditional Learning Management System data (e.g. iLearn).
(A) Student / Peer performance (B) Key moments of a course (C) Histogram of peer performance for a course (D) Failed courses and the option to deliberate (E) Course Planning (F) Study trajectory of previous students with a similar profile (G) Study trajectories for different profiles (Screenshot from the paper)
2. A dating-app-style algorithm to pair students with complementary competencies.
What struck me about the practitioner paper “Reciprocal Peer Recommendation for Learning Purposes” by Potts et al. is the creative use of reciprocal recommendation systems – more popularly used in online dating apps – to connect students with their peers based on reciprocal preferences and knowledge gaps. The platform, RiPPLE (Recommendation in Personalised Peer-Learning Environments), works as follows: within a course, the topic-level competencies of a student are approximated from their responses to a set of multiple-choice questions on the topic material. With this data, and the student’s selected preferences (e.g. “I can help with”, “I need help with”, “Looking for a study partner in”), the algorithm recommends another student who matches their competencies and preferences. Future work with the platform is ongoing and it will be interesting to see where it goes from here!
Sample RiPPLE student interface
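To make the idea concrete, the reciprocal flavour of the matching can be sketched in a few lines of Python. This is a toy illustration only: the function names, the 0.5 “knowledge gap” threshold, and the harmonic-mean scoring are our own assumptions, not the actual RiPPLE algorithm.

```python
def competency(responses):
    """Approximate topic-level competency as the fraction of
    multiple-choice questions answered correctly per topic."""
    return {topic: sum(answers) / len(answers)
            for topic, answers in responses.items()}

def reciprocal_score(a, b):
    """Mutual fit between two students: how well A's strengths cover
    B's weak topics, and vice versa. The harmonic mean keeps
    one-sided matches from scoring highly."""
    def one_way(helper, seeker):
        gaps = [t for t, s in seeker.items() if s < 0.5]  # assumed threshold
        if not gaps:
            return 0.0
        return sum(helper.get(t, 0.0) for t in gaps) / len(gaps)
    ab, ba = one_way(a, b), one_way(b, a)
    return 0.0 if ab + ba == 0 else 2 * ab * ba / (ab + ba)

# Example: Alice is strong where Bob is weak, and vice versa,
# so they score highly as a reciprocal pair.
alice = competency({"algebra": [1, 1, 1, 0], "calculus": [0, 0, 1, 0]})
bob = competency({"algebra": [0, 1, 0, 0], "calculus": [1, 1, 1, 1]})
print(round(reciprocal_score(alice, bob), 2))  # → 0.86
```

A real system would rank every candidate pair by a score like this and recommend the top matches consistent with each student’s stated preferences.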
3. Leaders of all kinds across the institution need to work together on Learning Analytics adoption.
To round off my top 3, the paper by Dawson et al., Rethinking Learning Analytics Adoption Through Complexity Leadership Theory, discussed in detail how large-scale adoption of Learning Analytics at an institution faces roadblocks that hinder efficient implementation. They argue that Complexity Leadership Theory (CLT), in which various types of leaders (senior, administrative, technical etc.) work together to tackle complex problems such as Learning Analytics adoption, could help navigate the ever-changing nature of Australian universities. What I found useful was the concept of a Learning Analytics Readiness Instrument (LARI), which attempts to answer the question “Is your institution ready for Learning Analytics?”. It measures the following dimensions on a maturity scale: Leadership, Data Availability, Tools/Technology/Infrastructure, Culture, and Strategy.
Using the Learning Analytics Readiness Instrument, where does Macquarie stand? We are on our way – but far from ready.
4. Learning Analytics is broader than engagement dashboards that identify students at risk of dropping out.
Learning Analytics tools and projects are investigating aspects of learning beyond the prediction of students at risk of dropping out (no simple task in its own right!). Learning Analytics can be used to make data-driven decisions about anything from the optimal use of on-campus teaching spaces, as shown in Brennan et al., Classroom Size, Activity and Attendance: Scaling up Drivers of Learning Space Occupation, to advising students on optimal unit selection by looking at past unit performance (S. Fiorini et al., An Application of Participatory Action Research in Advising-Focused Learning Analytics). When data is used to aid decision making, it’s up to creative minds to work out how to leverage the information these systems already collect in order to create tools and processes that improve the way teaching and learning occur both online and on campus.
5. Getting the policy right is a long term process that needs to involve a broad cross section of stakeholders.
In the current environment of scepticism around data protection and data privacy, it is particularly important to have a well-thought-out and informed policy governing the use of user data in Learning Analytics. Ensuring all relevant stakeholders, including staff and students, have input into the creation of policies and tools is vital to ensure that any use of Learning Analytics is trusted. This stakeholder consultation will also ensure that Learning Analytics projects use the right tools for the job and are actually useful to staff and students. An example of one possible policy framework was presented by Y.-S. Tsai et al., SHEILA Policy Framework: Informing Institutional Strategies and Policy Processes of Learning Analytics. For a critique of the increased datafication of education and the ethical considerations of the use of analytics, view this keynote speech by Neil Selwyn from Monash.
6. Learning Analytics should be used to enhance existing L&T practices, not replace them.
A thought-provoking example of a Learning Analytics tool that could be coming to classrooms in the future was given by Ken Holstein, a PhD student at Carnegie Mellon. The Classroom as a Dashboard: Co-designing Wearable Cognitive Augmentation for K-12 Teachers describes using Augmented Reality glasses to give teachers in computer-based maths classrooms a real-time heads-up display of analytics showing how each student is progressing. The AR display allows the teacher to make data-driven interventions, sometimes even before a student realises they are having difficulty. Watch a video demonstration of his prototype system.
Herder et al., Supporting Teachers’ Intervention in Students’ Virtual Collaboration Using a Network Based Model, describes another real-time Learning Analytics teaching tool: an Epistemic Network Analysis (ENA) dashboard. The dashboard analyses the content of students’ online interactions and maps it in a network diagram, allowing the instructor to see whether discussions are covering all the key elements of the problem; it can even provide suggested interventions to point groups in the right direction.
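The core idea of mapping discussion content into a network – counting how often key concepts appear together in students’ contributions – can be sketched simply. Note this is a toy illustration of concept co-occurrence, not the actual ENA method, which applies normalisation and dimensional reduction on top of co-occurrence counts; the concept list here is invented for the example.

```python
from collections import Counter
from itertools import combinations

# Hypothetical set of key concepts the instructor wants discussions to cover.
KEY_CONCEPTS = {"budget", "timeline", "risk", "stakeholders"}

def concept_network(posts):
    """Build edge weights for a concept network: for each post, count
    every pair of key concepts that co-occur in it. Heavier edges mean
    concepts that students discuss together more often."""
    edges = Counter()
    for post in posts:
        found = sorted(KEY_CONCEPTS & set(post.lower().split()))
        for pair in combinations(found, 2):
            edges[pair] += 1
    return edges

posts = [
    "We should fix the budget before the timeline",
    "The timeline depends on the budget and the risk",
]
print(concept_network(posts))
# → Counter({('budget', 'timeline'): 2, ('budget', 'risk'): 1,
#            ('risk', 'timeline'): 1})
```

A dashboard built on counts like these could flag, for instance, that “stakeholders” never appears in any edge – a cue for the instructor to steer the group toward that missing element.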
Overall, we found the LAK 2018 presentations demonstrated that the Learning Analytics field is still in its infancy and its future direction has yet to be determined, despite growing interest in implementing analytics as part of learning and teaching in higher education.