What is Learning Analytics?

LEARNING ANALYTICS is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs. This definition, written in 2011 for the first International Conference on Learning Analytics and Knowledge (LAK), still holds true even as the field has grown. Learning analytics is both an academic field and a commercial marketplace, and both have taken rapid shape over the last decade. As a research and teaching field, Learning Analytics sits at the convergence of Learning (e.g. educational research, learning and assessment sciences, educational technology), Analytics (e.g. statistics, visualization, computer/data sciences, artificial intelligence), and Human-Centered Design (e.g. usability, participatory design, sociotechnical systems thinking).

SO WHAT’S ALL THE FUSS ABOUT? People have been researching learning and teaching, tracking student progress, analysing school or university data, designing assessments and using evidence to improve teaching and learning for a long time. Learning Analytics builds on these well-established disciplines, but seeks to exploit the new opportunities that arise once we can capture new forms of digital data from students’ learning activity and analyse them with computational techniques from data science and AI.

KEY USES. Historically, one of the most common uses of learning analytics has been the prediction of student academic success, and more specifically, the identification of students who are at risk of failing a course or dropping out of their studies. While it is understandable that these two problems attracted a lot of attention, learning analytics can do far more. Evidence from research and practice shows that there are more productive and potent ways of using analytics to support teaching and learning. Some of the most common goals of learning analytics include:

  1. Supporting students in developing lifelong learning skills and strategies
  2. Providing personalised and timely feedback to students on their learning
  3. Supporting the development of important skills such as collaboration, critical thinking, communication and creativity
  4. Developing student awareness by supporting self-reflection
  5. Supporting quality learning and teaching by providing empirical evidence on the success of pedagogical innovations

METHODOLOGIES

Descriptive Analytics: insight into the past

Uses data aggregation and data mining to provide insight into trends and evaluative metrics over time. The majority of everyday statistics use falls into this category, which is limited to past data and includes:

  • Student feedback gathered from student satisfaction and graduate surveys
  • Analysis of data at all stages of the student lifecycle, from the admissions process through student orientation, enrolments, pastoral care, study support and exams, to graduation
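To make the descriptive idea concrete, here is a toy sketch in Python. The survey data, term labels and ratings below are entirely invented for illustration; the point is only that descriptive analytics aggregates past data to expose a trend:

```python
from statistics import mean

# Hypothetical student satisfaction ratings (1-5), gathered per term.
survey_responses = [
    {"term": "2023-S1", "rating": 4}, {"term": "2023-S1", "rating": 3},
    {"term": "2023-S2", "rating": 5}, {"term": "2023-S2", "rating": 4},
    {"term": "2024-S1", "rating": 3}, {"term": "2024-S1", "rating": 2},
]

# Descriptive analytics: aggregate historical data to show a trend over time.
by_term: dict[str, list[int]] = {}
for response in survey_responses:
    by_term.setdefault(response["term"], []).append(response["rating"])

for term in sorted(by_term):
    print(f"{term}: mean satisfaction {mean(by_term[term]):.2f}")
```

In practice the same aggregation would run against a survey or student-records database rather than an in-memory list, but the "group, summarise, report" shape is the same.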

Diagnostic Analytics: why did it happen

This form of advanced analytics is characterised by techniques such as drill-down, data discovery, data mining and correlation analysis, used to examine data or content and answer the question “Why did it happen?” It includes:

  • Analysis of data to inform and uplift key performance indicators across the organization
  • Analysis of patterns to design appropriate metrics
  • Equity access reporting and analysis of effective strategies to support students
  • Learning management system metrics to improve student engagement
Predictive Analytics: understanding the future

Combines historical data to identify patterns, and applies statistical models and algorithms that capture relationships between data sets in order to forecast trends. It includes:

  • Development of staff dashboards to help predict student numbers and cohort mobility through programs, assisting in identifying areas for improvement
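As a minimal sketch of the predictive step, the snippet below fits a least-squares trend line to historical student numbers and extrapolates one year ahead. The enrolment figures and years are made up, and a real dashboard would use richer models than a straight line:

```python
# Hypothetical enrolment counts per year for one program.
years = [2019, 2020, 2021, 2022, 2023]
enrolments = [180, 195, 205, 220, 230]

# Ordinary least squares for a straight line: enrolment = slope * year + intercept.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(enrolments) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(years, enrolments))
    / sum((x - mean_x) ** 2 for x in years)
)
intercept = mean_y - slope * mean_x

# Forecast the next year's intake from the fitted trend.
forecast_2024 = slope * 2024 + intercept
print(f"Projected 2024 enrolment: {forecast_2024:.0f}")
```

The value of even a crude forecast like this is that it turns past data into a forward-looking planning figure that staff can question and refine.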

Prescriptive Analytics: advise on possible outcomes

Goes beyond descriptive and predictive analytics by recommending one or more courses of action, using a combination of machine learning, algorithms, business rules and computational modelling, such as:

  • Focusing on subject/courses where small changes could have a big impact on improving student engagement, feedback and outcomes
  • Data visualisation via specific tools to provide program/degree level metrics on student enrolments, program stage, results and survey feedback to give teaching staff visual snapshots of students in their programs
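The prescriptive step can be sketched as a rule layer on top of a prediction. Everything below — the student IDs, risk scores, thresholds and recommended actions — is invented for illustration; a real system would derive the risk scores from a predictive model and the rules from institutional policy:

```python
# Hypothetical predicted risk of failure (0.0-1.0) per student,
# as would come from an upstream predictive model.
predicted_risk = {"s001": 0.15, "s002": 0.55, "s003": 0.85}

def recommend(risk: float) -> str:
    """Business rules mapping a predicted risk score to a suggested action."""
    if risk >= 0.7:
        return "contact student and offer one-to-one tutoring"
    if risk >= 0.4:
        return "suggest optional revision workshop"
    return "no intervention needed"

actions = {student: recommend(risk) for student, risk in predicted_risk.items()}
for student, action in actions.items():
    print(f"{student}: {action}")
```

Keeping the rules explicit and readable, rather than buried inside a model, is one way to keep prescriptive analytics inspectable by the educators who act on it.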

WHO BENEFITS? Learning Analytics provides researchers with exciting new tools to study teaching and learning. Moreover, as data infrastructures improve — from data capture and analysis, to visualization and recommendation — we can close the feedback loop to learners, offering more timely, precise, actionable feedback. In addition, educators, instructional designers and institutional leaders gain new insights once the learning process is persistent and visible.

ORGANIZATIONAL INFRASTRUCTURES. The community that SoLAR is fostering is distinctive in its interest in understanding the organizational systems required to introduce and sustain analytics. SoLAR’s events and publications draw an eclectic audience that welcomes leaders and policy makers in education/business/government concerned with strategies for organizational change.

ETHICS OF DATA, ANALYTICS & AI. There is rightly much public and professional debate around the ethics of ‘Big Data’ and AI, including privacy, the problem of opaque ‘black box’ algorithms, the risk of training machine learning classifiers on biased datasets, and the dangers of incorrectly predicting someone’s behaviour. These concerns are just as relevant in education, so the ethics of educational data, analytics and AI are front and center in SoLAR’s work, with a very active stream in our events and publications.

Learning analytics in a nutshell

A short introduction to learning analytics

By Yi-Shan Tsai, University of Edinburgh


Society for Learning Analytics Research (SoLAR)
 