LASI23 Workshops & Tutorials
We are pleased to present a great LASI23 program that includes 10 Workshops and 10 Tutorials!
This will be our first in-person LASI since 2019 and our first LASI hosted by SoLAR outside of North America.
This year, LASI participants will take a deep dive into TWO workshops (5.5 hours in total). In addition, each participant will take part in two 2-hour tutorials, giving them a flavor of a range of topics.
Each individual can choose 2 workshops and 2 tutorials. Capacity is limited in each session so be sure to register early to ensure your spot in your top choices.
Should you have any registration-related or general LASI questions, please reach out to us at info@solaresearch.org.
*Note: This will not be a hybrid event. Should you be unable to travel to Singapore, we recommend hosting a local LASI event in your country/region. Successful local LASIs over the years have included ALASI, Nordic LASI and LASI Spain. Please be sure to review our SoLAR Events In-Cooperation program, which provides marketing and branding support for your event.

Workshops
Description:
Engagement in self-regulated learning (SRL) has been documented to benefit learners’ learning experience, motivation and achievement, and to promote life-long learning. Many learners, however, need external support to engage in productive SRL. The multiple, interweaving learning processes involved in SRL have traditionally been hard to observe, measure and support. With the advancement of learning analytics methods and the increased use of learning technologies that collect fine-grained trace-data about student learning behaviors, researchers are afforded the opportunity to dynamically measure and analyse SRL processes, and to tailor personalised SRL support to learners as they work on different learning tasks.
This interactive workshop will provide participants with a series of short presentations, group discussions, software demonstrations, and seven hands-on activities related to measuring and supporting SRL with fine-grained trace-data and learning analytics methods. The activities will be organised following the three phases in the SRL analytics loop: data, model, and transformation (Gašević et al., 2019). The participants will also be given the opportunity to examine FLoRA (https://floraproject.org/), an analytics-based adaptive learning environment that dynamically collects learners’ trace-data and generates personalised prompts to support learners’ SRL as they work on a task. The workshop will contribute towards expanding the community in this area of research and practice.
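To make the “data” phase concrete, here is a minimal, hypothetical Python sketch of how raw LMS trace events might be coded into SRL processes. The event names, SRL labels and mapping are invented placeholders for illustration, not the FLoRA or workshop coding scheme.

```python
import pandas as pd

# Hypothetical mapping from raw LMS events to SRL processes.
# This is NOT the FLoRA coding scheme; it is a placeholder for illustration.
EVENT_TO_SRL = {
    "open_task_instructions": "Orientation",
    "create_note": "Planning",
    "open_reading": "Strategy use (reading)",
    "highlight_text": "Strategy use (elaboration)",
    "check_rubric": "Monitoring",
    "revise_essay": "Regulation",
}

# One row per logged event: (student_id, event, timestamp).
events = pd.DataFrame({
    "student_id": [7, 7, 7, 7],
    "event": ["open_task_instructions", "open_reading",
              "highlight_text", "check_rubric"],
    "timestamp": pd.to_datetime(
        ["2023-06-01 09:00", "2023-06-01 09:04",
         "2023-06-01 09:10", "2023-06-01 09:25"]),
})

# Code each event, then inspect the resulting SRL process sequence,
# which downstream temporal analyses would consume.
events["srl_process"] = events["event"].map(EVENT_TO_SRL)
print(events[["timestamp", "event", "srl_process"]])
```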
Activities
- Introduction - Closing the loop of SRL analytics: data, models and transformation
Data
- Using common trace-data collected in a conventional learning management system to measure SRL (demo and hands-on activity 1)
- FLoRA learning environment for SRL analytics: instrumentation tools and multichannel, fine-grained trace-data about student SRL processes (demo and plenary discussion)
- Mapping FLoRA trace-data to SRL processes using an SRL theoretical framework (hands-on activity 2)
Model
- The importance of temporal analysis in SRL research (short presentation and plenary discussion)
- Generating analytics on SRL processes from FLoRA trace-data using the Disco tool (demo and hands-on activity 3)
- Challenges related to temporal analysis of SRL processes (hands-on activity 4)
- Advanced methods for temporal analysis: process mining and ordered network analysis (short presentation and plenary discussion; a minimal process-mining sketch follows this list)
- What learning products are created by the SRL processes identified in the previous activities? How can these products be analysed? (hands-on activity 5)
Transformation
- FLoRA SRL analytics on processes and products (hands-on activity 6 and plenary discussion about the tool affordances/challenges that remain)
- Designing process- and product-based SRL analytics for students and instructors (hands-on activity 7)
- Closing remarks
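For a flavour of the temporal-analysis step referenced in the list above: the workshop itself uses the Disco tool, but as an open-source illustration, here is a minimal sketch that mines a directly-follows graph over coded SRL sequences with the pm4py library (assuming its 2.x simplified API; the data is an invented placeholder).

```python
import pandas as pd
import pm4py

# Invented SRL process sequences: one "case" per student attempt.
df = pd.DataFrame({
    "case_id": ["s7", "s7", "s7", "s8", "s8"],
    "activity": ["Orientation", "Strategy use (reading)", "Monitoring",
                 "Planning", "Monitoring"],
    "timestamp": pd.to_datetime(
        ["2023-06-01 09:00", "2023-06-01 09:04", "2023-06-01 09:25",
         "2023-06-01 09:02", "2023-06-01 09:15"]),
})

# Convert to a pm4py event log and count how often each SRL process
# directly follows another: a simple temporal model of SRL behaviour.
log = pm4py.format_dataframe(df, case_id="case_id",
                             activity_key="activity",
                             timestamp_key="timestamp")
dfg, start_activities, end_activities = pm4py.discover_dfg(log)
print(dfg)  # e.g., {('Orientation', 'Strategy use (reading)'): 1, ...}
```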
Target Audience
This workshop is well-suited for beginner level participants. It will be equally relevant for educators and researchers who are interested in personalised and scalable feedback and how to design feedback using learning analytics approaches in alignment with effective feedback principles.
Takeaways
- Enhancing research capacity and identifying challenges in contemporary SRL analytics
- Strengthening and building new connections among participants interested in researching and/or applying learning analytics for SRL
Preparations & Pre-requisites
References
Gašević, D., Tsai, Y. S., Dawson, S., & Pardo, A. (2019). How do we start? An approach to learning analytics adoption in higher education. The International Journal of Information and Learning Technology, 36(4), 342-353.
Instructors: Dragan Gašević, Centre for Learning Analytics, Monash University; Xinyu Li, Centre for Learning Analytics, Monash University; Mladen Raković, Centre for Learning Analytics, Monash University; and Yizhou Fan, Graduate School of Education, Peking University
Description:
Research in the field of Learning Analytics (LA) has made much progress over the past decade. There are numerous and extensive works ranging from predictive models of learning outcomes to semantic analyses, with an increasing focus on multi-modal data sets. Arguably, innovations in LA have largely plateaued as the field establishes itself as a major contributor to advancing the learning sciences and the process of education. To date, the complexities associated with the deployment and translation of LA-based findings have diminished the scale of impact of the field. Transitioning LA from a novel, small-scale research approach to institution-wide impact requires different thinking and different approaches. This workshop presents SPARK, a leadership framework for deploying and evaluating the impact of LA at the institutional level. The framework accounts for the complexities of learning environments, organizational policies, and external environment practices while guiding leadership teams to deploy and scale LA initiatives to achieve organizational impact. The workshop will present the framework and examples of how it has been used in other higher education institutions.
Activities
The objective of the workshop is to explore the elements that influence how learning analytics is adopted at the institutional level, and to devise a strategy that promotes productive tensions, transitioning the use of learning analytics from small research projects to fully deployed operational functions in an institution.
The workshop is divided into the following activities:
- Discussion of the contextual model of LA and the SPARK framework:
  - Elements to be considered in the context of LA (data infrastructure, pedagogical approach, data-supported decision making)
  - Stakeholder groups influencing institutional adoption
  - Leadership models and context for LA deployments
  - The SPARK framework
- Review of successful and unsuccessful case studies of LA institutional adoption:
  - Attendees will explore case studies of institutional adoption of LA processes. The activity requires mapping the case context onto SPARK dimensions.
  - Groups will diagnose problems and propose changes in the context that would improve institutional uptake.
- Contextualization of the model in your institution:
  - Identify the elements in the models and framework that map onto your institution
  - Map the processes, stakeholder groups, relationships and institutional structures required for institutional adoption
Target Audience
The workshop is ideal for researchers, practitioners and administrators engaged in and responsible for the deployment of learning analytics tools and outcomes at the institutional level. The workshop will be of interest especially to those developing leadership skills in the area.
Takeaways
- A strategic view of the elements and processes that influence how educational institutions effectively adopt LA.
- A framework to articulate actions, policies, teams and initiatives to transition from small research initiatives to institutional adoption.
Preparations
The attendees are encouraged to read the following two documents:
- Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the International Conference on Learning Analytics and Knowledge - LAK '18, Sydney, Australia. doi:10.1145/3170358.3170375
- Uhl-Bien, M., Marion, R., & McKelvey, B. (2007). Complexity leadership theory: Shifting leadership from the industrial age to the knowledge era. The Leadership Quarterly, 18(4), 298-318.
The workshop will be conducted in a highly interactive and exploratory fashion. Attendees are expected to approach the problem of adoption through their institution’s lens and role, sharing their insights and contextual information, but at the same time to explore alternative scenarios, contexts and initiatives.
Instructors: Shane Dawson and Abelardo Pardo, University of South Australia (UniSA)
Description
This workshop introduces a collection of techniques, and provides some hands-on experience, for generating relational databases containing synthetic student data. We will start with inspection and pre-processing of an anonymized dataset of real student data, followed by an introduction to some relevant Python packages. The data generation techniques include the Synthetic Data Vault (SDV), Generative Adversarial Networks (GANs) and the Incremental Relational GAN (IRGAN). We conclude with a guide to evaluating:
- the similarity of the generated synthetic student data to real student data,
- the utility of the generated synthetic student data for research, instructional and administrative purposes, and
- the privacy protection of the generated synthetic student data.
Activities
Each participant will be registered with NUS as an official visitor prior to the workshop. Participants will make use of a virtual machine (VM) on the NUS network to access the workshop’s data files and activities. The VM will come preinstalled with Python, a Jupyter notebook environment with the relevant packages (pandas, SDV, IRGAN), and a small dataset D containing anonymized student data in multiple relational tables. Participants will learn to:
- Pre-process the data with Python and Jupyter notebook
- Generate the necessary metadata
- Generate a synthetic version of one student data table (a minimal sketch of this step and its evaluation follows this list)
- Compare the similarity between the real student and synthetic student versions of the table
- Train SDV and IRGAN to generate a synthetic database (D’) of the provided database (D)
- Rigorously evaluate the similarity of D’ and D.
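As orientation for the single-table steps above, here is a minimal sketch assuming the open-source SDV 1.x API; the table, columns and values are invented placeholders rather than the workshop’s dataset D.

```python
import pandas as pd
from sdv.metadata import SingleTableMetadata
from sdv.single_table import GaussianCopulaSynthesizer
from sdv.evaluation.single_table import evaluate_quality

# Tiny invented stand-in for one anonymized student data table.
real = pd.DataFrame({
    "gpa": [3.1, 2.7, 3.8, 3.3, 2.9, 3.6],
    "credits": [20, 16, 24, 20, 12, 24],
    "major": ["CS", "Math", "CS", "Stats", "Math", "CS"],
})

# Describe the table, fit a synthesizer, and sample a synthetic version.
metadata = SingleTableMetadata()
metadata.detect_from_dataframe(real)
synthesizer = GaussianCopulaSynthesizer(metadata)
synthesizer.fit(real)
synthetic = synthesizer.sample(num_rows=len(real))

# Compare column shapes and pairwise trends of real vs. synthetic data.
report = evaluate_quality(real_data=real, synthetic_data=synthetic,
                          metadata=metadata)
print(report.get_score())
```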
Target Audience
Learning analytics and educational data mining researchers, instructors, administrators and others who are interested in hands-on experience with the issues and techniques for generating synthetic student data across multiple correlated tables.
Preparation and Pre-requisites
- Experience with Python.
- Familiarity with notebook environments for programming
- Basic knowledge of relational database schema (attributes, primary and foreign keys), deep learning (artificial neural networks, hyper-parameters), probability (random variables, distributions) and statistics (correlation, samples).
- Recommended: familiarity with our tutorial (Generating Synthetic Relational Databases for Learning Analytics).
Takeaways
- Understand the necessary considerations for generating synthetic student datasets and databases.
- Hands on experience processing research-ready student data within a secure environment
- Generate synthetic student datasets and databases with a combination of machine learning and statistical techniques
- Conduct comparisons between real and synthetic student datasets and databases
- Evaluate the usefulness of synthetic student datasets and databases for research, administrative and instructional purposes
Instructors: Jiayu Li and Y.C. Tay, National University of Singapore
Description
From a human-centered perspective, supporting educational stakeholders by providing them with dashboards, visualizations and other forms of learning analytics end-user interfaces poses critical design challenges that are often trivialized. Teachers’ and students’ interpretation of visualized data is essentially the construction of a narrative about the learning process. Yet, as researchers and designers of learning analytics systems, we should not assume this interpretation process will just “happen”. Students, and many teachers, have mixed visualisation literacy and data literacy skills, which are often required to meaningfully make sense of visual representations of data and act upon them. While an ideal strategy may be to upskill teachers and students in data interpretation, we can also improve the design of our dashboards and visualisations to help non-data-savvy end-users interpret and make sense of them.
Data Storytelling can be seen as a set of data compression, guidance and visualisation techniques aimed at making it easier for a specific audience to identify the key insights in the data. Data Storytelling is rapidly gaining traction in communities such as Information Visualisation (InfoViz) and Human-Computer Interaction (HCI), since the problem of extracting actionable insights from vast amounts of data is becoming commonplace in our fast-paced society. In learning analytics, we urgently need to re-think how to communicate insights, rather than data, to make a meaningful impact on teaching and learning practices. The aim of this workshop is to introduce participants to embracing data storytelling techniques in the design of visualizations and dashboards that communicate meaningful insights.
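To illustrate the kind of transformation involved, here is a minimal, hypothetical sketch in Python/matplotlib (the workshop itself uses Excel; the principles are tool-agnostic). It applies three common data storytelling moves: mute the context, highlight the one series carrying the insight, and state the takeaway in the title. All names and numbers are invented.

```python
import matplotlib.pyplot as plt

# Invented example data: weekly forum posts per tutorial group.
groups = ["Group A", "Group B", "Group C", "Group D"]
posts = [42, 17, 39, 45]

# Mute the context (grey bars) and highlight the bar carrying the insight.
colors = ["#d62728" if g == "Group B" else "#b0b0b0" for g in groups]

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(groups, posts, color=colors)
# State the takeaway as the title rather than a generic chart label.
ax.set_title("Group B posted less than half as often as its peers last week")
ax.set_ylabel("Forum posts")
for side in ("top", "right"):
    ax.spines[side].set_visible(False)  # remove non-essential chart junk
plt.tight_layout()
plt.show()
```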
Activities
- Participants will be introduced to basic foundations of data storytelling and examples from related literature in learning analytics and beyond.
- Participants will engage in hands-on tasks to craft some basic data stories manually.
- Small working groups will design and report on learning analytics user-interfaces enhanced by applying data storytelling principles.
- Small working groups will discuss and report on issues related to the automation of data storytelling for learning analytics in terms of i) ethics, ii) agency and iii) risks.
Target Audience
Anyone interested in visualising data for educational purposes.
Takeaways
- Perspectives on data storytelling and its potential role in learning analytics.
- Understanding the key principles for manually or automatically creating data stories for research, teaching or learning analytics tool design.
- Development of simple data storytelling prototypes.
- First steps toward establishing collaborations to continue work on data storytelling in education.
Preparations
- Echeverria, V., Martinez-Maldonado, R., Buckingham Shum, S., Chiluiza, K., Granda, R., & Conati, C. (2018). Exploratory versus Explanatory Visual Learning Analytics: Driving Teachers’ Attention through Educational Data Storytelling. Journal of Learning Analytics, 5(3), 73-97. https://learning-analytics.info/index.php/JLA/article/view/6114
- Bring a laptop with Excel or your favourite data visualisation tool installed (we will mostly be using Excel given the time constraints, and because the purpose is to apply the data storytelling principles regardless of the tools used).
Workshop Leader: Roberto Martinez-Maldonado, Monash University
Description
This workshop looks at two K-12 learning analytics interventions explored and used in Singapore classrooms: Knowledge Forum (KF) and WiREAD+. The workshop will first introduce the two tools. Participants will then gain hands-on experience with the two learning analytics applications and explore their various affordances. How these applications are used in classrooms will be described, followed by the learning gains and limitations of the interventions.
In brief, WiREAD+ is an augmented web-based collaborative critical reading and learning analytics environment addressing the classroom challenges of print-based texts, scarcity of dialogue, and formative feedback. Knowledge Forum (KF) is an online discourse environment developed by Marlene Scardamalia and Carl Bereiter (1993, 2002, 2006) to support collaborative discourse and advance community knowledge. KF offers a suite of learning analytics, such as scaffold trackers and word clouds, to visualize the idea-improvement effort on Knowledge Forum. The workshop will explore two newly designed KF learning analytics: the Curriculum-ideas-analytics (CiA) tool and the Automated Note Recommender System. These tools aim to increase students’ agency in pursuing their ideas and learning goals, and support students in accessing related resources and KF notes shared by peers.
The final 1.5 hours of the afternoon will be an interactive session with practitioners about their pedagogical designs. Educators and students will share how they have used the tools in their own classrooms and how the tools have helped them in their knowledge-building endeavor.
Activities
- Participants will have hands-on experience with each learning analytics application.
- Interactive session with practitioners and researchers
Target Audience
Anyone keen to gain hands-on experience with the learning analytics tools, as well as practical and insightful understandings of learning innovations in K-12 in Singapore.
Takeaways
- Understanding of the learning affordances of two learning analytics tools
- Practical and conceptual insights into learning innovations in K-12 in Singapore.
Preparations
- Bring a laptop with the Google Chrome browser installed.
Workshop Leaders: Chew Lee Teo, Elizabeth Koh, Katherine Yuan, Alwyn Lee, Aloysius Ong and Christin Jonathan, National Institute of Education, Nanyang Technological University, Singapore
Description
As the adoption of digital learning materials in modern education systems increases, the analysis of reading behavior and of the successful strategies that students employ is gaining attention. In this workshop, we will give an overview of reading systems and examine key learning tasks and indicators that can be analyzed to predict academic performance. Participants will gain hands-on experience with reading systems and with the analysis of the data they collect to answer research questions in education. Finally, groups within the workshop will discuss applications of reading behavior analysis, and identify and test possible new strategies that could be used to predict academic performance.
Activities
- Introduction and hands-on experience using a learning material reading system.
- Discussion of what data is collected and how it can be interpreted and analyzed.
- Applying different methods of analyzing reading behavior data to predict academic performance using Python (a minimal sketch follows this list).
- Group discussion and activities on how to use reading analytics to identify learning strategies.
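As a taste of the kind of analysis referenced above, here is a minimal, hypothetical sketch using pandas and scikit-learn. The events, features, labels and model are invented placeholders, not the workshop’s actual data or methods.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical reading log: one row per logged reader event.
logs = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "event": ["OPEN", "NEXT", "ADD MARKER", "OPEN", "NEXT",
              "OPEN", "NEXT", "ADD BOOKMARK", "ADD MARKER"],
    "page":  [1, 2, 2, 1, 2, 1, 2, 2, 3],
})

# Aggregate per-student reading-behavior features.
X = logs.groupby("student_id").agg(
    n_events=("event", "size"),
    n_pages=("page", "nunique"),
    n_annotations=("event",
                   lambda e: e.isin(["ADD MARKER", "ADD BOOKMARK"]).sum()),
)

# Hypothetical outcome labels: 1 = passed the assessment.
y = pd.Series([1, 0, 1], index=X.index)

# With real data you would hold out a test set; this toy data is far
# too small for that, so we just fit and inspect the coefficients.
model = LogisticRegression().fit(X, y)
print(dict(zip(X.columns, model.coef_[0])))
```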
Target Audience
This workshop is intended for anyone interested in reading analysis and no previous experience is required; however, it is recommended that participants familiarize themselves with basic statistics and machine learning concepts. Participants should have some prior experience with programming in Python.
Takeaways
Participants will gain hands-on experience with reading systems and with how the collected data can be analyzed to predict academic performance on different learning tasks.
Prep required
Bring a laptop with Google Chrome and create a Google Account (for using Google Colab).
Suggested reading:
Flanagan, B., Majumdar, R., & Ogata, H. (2022). Early-warning Prediction of Student Performance and Engagement in Open Book Assessment by Reading Behavior Analysis. International Journal of Educational Technology in Higher Education 19(41).
Flanagan, B., Majumdar, R., & Ogata, H. (2022). Fine Grain Synthetic Educational Data: Challenges and Limitations of Collaborative Learning Analytics. IEEE Access 10, pp. 26230-26241.
Akçapınar, G., Hasnine, M.N., Majumdar, R., Flanagan, B., & Ogata, H. (2019). Developing an early-warning system for spotting at-risk students by using eBook interaction logs. Smart Learning Environments 6(1), pp. 4.
Flanagan, B., & Ogata, H. (2018). Learning analytics platform in higher education in Japan. Knowledge Management & E-Learning: An International Journal 10(4), pp. 469-484.
Instructor: Brendan Flanagan, Kyoto University
Description:
Effective feedback is timely, personalised, actionable, and conveyed in a tone that communicates the instructor’s care and encouragement. However, contemporary higher education makes it hard to scale this level of feedback for instructors at the coalface of large and diverse cohorts, perhaps even more so with the increased shift to online learning.
Learning analytics has the potential to counter these challenges, by offering a technological solution to scaling support in a personalised way to all learners. As research in learning analytics feedback continues to grow, there is a concern that these new forms of feedback may, if not deployed effectively, promote the idea of feedback as a product, rather than a process. In this respect, the notion of learner-centred feedback is key.
Learner-centred feedback requires a consideration of feedback recipience. Drawing on Winstone et al.’s (2017) work, this workshop guides participants in exploring how to design learning analytics feedback with respect to learning design, as well as practical considerations of feedback recipience: characteristics of the sender, the receiver, the message, and the context.
During this in-depth workshop, participants will also have the opportunity to explore OnTask (Pardo et al., 2018), a learning analytics tool that facilitates personalised communication of feedback and other learning support.
Activities:
[Part 1]
- Discussion on effective feedback practices
- Introducing learner-centred feedback and feedback designs
- Introducing learning analytics and feedback
[Part 2]
- Introduction to OnTask
- Feedback designs for learner-centred feedback: Scenarios of OnTask use
- Discussion activity: How would you design feedback with OnTask?
[Part 3]
- Hands-on session with OnTask
- Discussion activity: Reflections on using OnTask for designing learner-centred feedback
Target audience:
This workshop is well-suited for beginner level participants. It will be equally relevant for educators and researchers who are interested in personalised and scalable feedback and how to design feedback using learning analytics approaches in alignment with effective feedback principles.
Takeaways:
By the end of this workshop, participants will be able to demonstrate the following:
- Understanding of feedback design for learner-centred feedback
- Appreciation of considerations for using learning analytics for personalised feedback and support
- Ability to use basic functions of OnTask to prepare personalised feedback messages
Preparation:
Participants should bring their own laptops for the workshop.
Readings and references:
Lim, L.-A., Dawson, S., Gašević, D., Joksimović, S., Fudge, A., Pardo, A., & Gentili, S. (2020). Students’ sense-making of personalised feedback based on learning analytics. Australasian Journal of Educational Technology, 36(6), 15-33. https://doi.org/10.14742/ajet.6370
Ryan, T., Gašević, D., & Henderson, M. (2019). Identifying the Impact of Feedback Over Time and at Scale: Opportunities for Learning Analytics. In M. Henderson, R. Ajjawi, D. Boud, & E. Molloy (Eds.), The Impact of Feedback in Higher Education: Improving Assessment Outcomes for Learners (pp. 207-223). Springer International Publishing. https://doi.org/10.1007/978-3-030-25112-3_12
Tsai, Y. S. (2022). Why Feedback Literacy Matters for Learning Analytics. In the 16th International Conference of the Learning Sciences (ICLS) (pp. 27-34).
Pardo, A., Bartimote-Aufflick, K., Buckingham Shum, S., Dawson, S., Gao, J., Gašević, D., ..., & Vigentini, L. (2018). OnTask: Delivering Data-Informed Personalized Learning Support Actions. Journal of Learning Analytics, 5(3), 235-249. https://doi.org/10.18608/jla.2018.53.15
Winstone, N. E., Nash, R. A., Parker, M., & Rowntree, J. (2017). Supporting learners' agentic engagement with feedback: A systematic review and a taxonomy of recipience processes. Educational Psychologist, 52(1), 17-37. https://doi.org/10.1080/00461520.2016.1207538
Instructor: Lisa-Angelique Lim, Connected Intelligence Centre, University of Technology Sydney
Description
The potential for learning and teaching to benefit from the application of Learning Analytics (LA) is expanding. E-learning is becoming ever more pervasive in educational institutions, from K-12 to higher education and adult education. However, LA applications that are actually deployed to inform teachers and learners are still rare; most involve either computer scientists/engineers who engage in design research on the use of LA to improve their own teaching, or research collaborations between teaching faculty and their colleagues involved in LA research.
A key challenge for teachers without the requisite technical expertise to leverage the potential of LA is the lack of a robust conceptual framework, and an associated technology platform, for them to easily select the LA outputs and interpretations appropriate for their specific curriculum and pedagogical context. Doing so requires a “design-aware learning analytics system” that allows teachers in different practice contexts to define the LA questions they are interested in exploring, and to select relevant user-interpretable displays of the analysis results that are appropriate for the intended learning outcomes and the implemented pedagogical design principles in context. The tutorial introduces participants to a conceptual framework for the operationalization of design-aware learning analytics, the Learning Design Studio (LDS), which is underpinned by a robust, hierarchically layered Learning Design (LD) language, and to how to generate design-aware LA questions for learning designs represented on LDS. The workshop provides participants with hands-on opportunities to generate LA questions and explore answers to those questions using learning data from a course that has its LD well documented on LDS.
Activities
- Participants will be introduced to a fully online course for which the asynchronous learning activities took place on Moodle, and for which the course LD is available on LDS.
- Participants will work in groups to generate two LA questions and identify the data and analyses that could be used to answer them.
- Groups will use the learning analytics plugin module provided to generate displays that address the LA questions identified, and write interpretations and feedback for the teacher and students respectively.
- Groups will discuss the usefulness of the LA outputs and feedback for the teacher and students respectively, and the extent to which the system is able to support design-aware learning analytics.
- Each group will present (i) their LA questions, analyses, LA displays and feedback for the course teacher and students, and (ii) share their reflections on the value of the conceptual framework and technology system introduced in the tutorial and workshop, and suggestions for their improvement.
Target audience
Anyone interested in deploying or developing technology systems that can provide LA-informed feedback about students’ learning and teachers’ learning design in diverse teaching and learning contexts.
Takeaways
- Meaningful LA starts with the specification of LA questions that are meaningful to the teachers and learners within their specific curriculum and pedagogical context.
- For LA research and development to benefit the general population of teachers and their students (i.e., teachers who do not have specialized technical knowledge), the LA system needs to be connected to a well-articulated LD representation grounded on a robust LD language.
- User-modifiable “storytelling” templates connected to the LA questions selected could be a useful approach to provide contextually meaningful feedback to teachers and students.
Preparation and Pre-requisites
- No pre-requisites needed.
- Read:
- Law, N., Li, L., Farias Herrera, L., Chan, A., & Pong, T. C. (2017). A Pattern Language Based Learning Design Studio for an Analytics Informed Inter-Professional Design Community. Interaction Design and Architecture(s), 33, 92 - 112.
- Law, N., & Liang, L. (2020). A Multilevel Framework and Method for Learning Analytics Integrated Learning Design. Journal of Learning Analytics, 7(3), 98-117. https://doi.org/10.18608/jla.2020.73.8
- Bring a laptop with one of the common browsers (e.g., Google Chrome, Firefox, Safari) installed.
Workshop Leaders: Nancy Law & Xiao Hu, University of Hong Kong
Description:
Knowledge building (Scardamalia and Bereiter, 2006) focuses on collaborative learning and collective knowledge creation. The Knowledge Forum (a computer supported platform) was created to support this process. Different learning analytics tools have been developed to visualize student interactions on Knowledge Forum. While understanding the collaborative learning process is important, it is also essential to understand how individual learning contributes to collective knowledge advancement.
In the workshop, participants will create their own Maps of Dialogical Inquiry (MDIs) and analyse their own (and others’) generated MDI data. At the end of the workshop, participants will engage in a final MDI experience and compare the resulting data with those from their first trial. We will explore how the MDI supports the teaching of adult learners, as well as how the MDI can be used in the participants’ different teaching and learning contexts. We will conclude the workshop by showing how we have combined the MDI with Knowledge Forum to support adult learning in our IAL courses.
Activities:
- Participants will be introduced to the MDI and will create their own MDIs.
- Participants will categorise data samples using the MDI.
Target Audience:
Anyone interested in the practice of collective knowledge building in teaching and learning, reflective practice and the use of analytics to strengthen pedagogical practices.
Takeaways:
- How Knowledge Forum has been used in courses for adult learners
- How the Map of Dialogical Inquiry can be used in knowledge building
- Insights on how the Map of Dialogical Inquiry can help learners become more self-aware of their learning journey and support educators in reflecting on their pedagogical practices
Preparations:
Read the following:
- Bound, H. (2010). Developing Quality Online Dialogue: Dialogical Inquiry. International Journal of Teaching and Learning in Higher Education, 22(2), 107-119.
- Scardamalia, M., & Bereiter, C. (2006). Knowledge building: Theory, pedagogy and technology. In K. Sawyer (Ed.), Cambridge Handbook of the Learning Sciences (pp. 97-118). New York: Cambridge University Press.
- Stack, S., & Bound, H. (2012). Tools for Reimagining Learning. Institute for Adult Learning.
Instructors: Dr Lin Feng, Lecturer, Teaching and Learning Centre, Singapore University of Social Sciences; Dr Priscilla Pang, Head of Programme, Master in Boundary-Crossing Learning and Leadership, Institute for Adult Learning, Singapore University of Social Sciences; Associate Professor Helen Bound, Institute for Adult Learning, Singapore University of Social Sciences
Description:
Using visualisations for communication purposes is very common. In the educational field, individual visualisations or even whole dashboards are often used to show the results of data analysis to learners in an easily digestible form. Often, however, what a visualisation was intended to communicate and how students then interpret it differ. Similarly, dashboards are often built without a clear purpose or reason, but simply because the data is available. In this workshop, we will look at principles and guidelines of data visualisation and work on a structured approach to the why, what and how of effective dashboard design for student-facing learning analytics. Participants will be introduced to guidelines for dashboard design, will analyse and interpret examples of LA dashboards, and will design their own dashboard mock-ups.
Activities:
- Participants will be introduced to the world of data visualisation and will get to see dashboard examples from the field of LA as well as other fields.
- Based on the analysis of examples and mock-ups, principles and guidelines will be formulated on how to design effective LA dashboards for learners.
- In small groups, participants will then design a learning analytics dashboard for learners in a given learning context: they will first explore educational problems (i.e. what problem do they want to solve with a dashboard, how can it be grounded in theory and practice) and then identify relevant information and data to work on the problem. Based on this, participants will draft dashboard mock-ups using the principles. Finally, they will prioritise design features and sketch evaluation criteria and plans.
Target Audience:
Anyone interested in the visualisation of learning data and the design of learning analytics dashboards for learners, anyone from students to teachers to practitioners to educational institution managers.
Understanding what learning analytics is, how and where it can be used and who its stakeholders are is beneficial.
Takeaways:
- Learning the process of designing dashboards for learners
- Understanding of principles and guidelines for dashboard design and data visualisation
- Getting a glimpse into the art of storytelling with data
- Understanding the importance of grounding dashboard designs in theory and practice
Preparation:
- Reading list coming soon
Workshop Leader: Ioana Jivet, Goethe University Frankfurt & DIPF, Germany
Tutorials
Description:
Engagement in self-regulated learning (SRL) has been documented to benefit learners’ learning experience, motivation and achievement, and to promote life-long learning. Many learners, however, need external support to engage in productive SRL. The multiple, interweaving learning processes involved in SRL have traditionally been hard to observe, measure and support. With the advancement of learning analytics methods and the increased use of learning technologies that collect fine-grained trace-data about student learning behaviors, researchers are afforded the opportunity to dynamically measure and analyse SRL processes, and to tailor personalised SRL support to learners as they work on different learning tasks.
This interactive tutorial will provide participants with a series of short presentations, group discussions, software demonstrations, and hands-on activities related to measuring and supporting SRL with fine-grained trace-data and learning analytics methods. The activities will be organised following the three phases in the SRL analytics loop: data, model, and transformation (Gašević et al., 2019). The participants will also be given the opportunity to examine FLoRA (https://floraproject.org/), an analytics-based adaptive learning environment that dynamically collects learners’ trace-data and generates personalised prompts to support learners’ SRL as they work on a task. The tutorial will contribute towards expanding the community in this area of research and practice.
Activities
- Introduction - Closing the loop of SRL analytics: data, models and transformation
Target Audience
This tutorial is well-suited for beginner level participants. It will be equally relevant for educators and researchers who are interested in personalised and scalable feedback and how to design feedback using learning analytics approaches in alignment with effective feedback principles.
Takeaways
- Enhancing research capacity and identifying challenges in contemporary SRL analytics
- Strengthening and building new connections among participants interested in researching and/or applying learning analytics for SRL
Preparations & Pre-requisites
References
Gašević, D., Tsai, Y. S., Dawson, S., & Pardo, A. (2019). How do we start? An approach to learning analytics adoption in higher education. The International Journal of Information and Learning Technology, 36(4), 342-353.
Instructors: Dragan Gašević, Centre for Learning Analytics, Monash University; Xinyu Li, Centre for Learning Analytics, Monash University; Mladen Raković, Centre for Learning Analytics, Monash University; and Yizhou Fan, Graduate School of Education, Peking University
Description:
This tutorial is designed for everyone with an interest in increasing the impact of their learning analytics research. The tutorial will begin with a short introduction to the field and to the learning analytics community. It will go on to identify significant challenges that learning analytics needs to address, and factors that should be taken into account when implementing analytics, including ethical considerations related to development and implementation. As a participant, you’ll have opportunities to relate these challenges to your own work, and to consider how your research is situated in the field. You’ll be encouraged to reflect on how your work aligns with the learning analytics cycle, how it contributes to the evidence base in the field, and ways in which you can structure your work to increase its impact.
Activities:
The tutorial will include opportunities to share ideas and experiences using Google Docs and Padlet or similar openly accessible online tools.
Target Audience:
All are welcome
Advanced preparation:
None
Instructor: Rebecca Ferguson, Open University UK
Description
The move toward data-driven decision making and research in education calls attention to a long running tension in a new way. Educational institutions must protect student privacy while still creating space and resources for researchers and learning designers to monitor, evaluate and improve the system. Providing too much student data too freely increases the risk of data exposure and misuse. Providing too little student data limits the potential of new interventions and initiatives. One way of partially resolving the tension is to generate synthetic student data that is structurally, qualitatively and quantitatively similar to real student data and provide the synthetic data to interested parties.
Synthetic student data can be generated through machine learning. However, unlike classical machine learning tasks, student data is typically stored in multiple relational tables with multiple dependencies. This tutorial will introduce techniques for generating synthetic student data tables that account for structural dependencies and correlated values.
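To illustrate what such multi-table generation can look like in code, here is a minimal sketch using the open-source SDV library’s hierarchical multi-table synthesizer (assuming the SDV 1.x API); the two-table schema and values are invented placeholders.

```python
import pandas as pd
from sdv.metadata import MultiTableMetadata
from sdv.multi_table import HMASynthesizer

# Invented two-table schema: students (parent) and enrolments (child).
students = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5],
    "gpa": [3.2, 3.8, 2.9, 3.5, 3.0],
})
enrolments = pd.DataFrame({
    "enrolment_id": [10, 11, 12, 13, 14, 15],
    "student_id": [1, 1, 2, 3, 4, 5],  # foreign key into students
    "grade": [85, 72, 91, 64, 78, 88],
})
tables = {"students": students, "enrolments": enrolments}

# Build metadata per table, then declare keys and the relationship.
metadata = MultiTableMetadata()
metadata.detect_table_from_dataframe("students", students)
metadata.detect_table_from_dataframe("enrolments", enrolments)
metadata.update_column("students", "student_id", sdtype="id")
metadata.update_column("enrolments", "enrolment_id", sdtype="id")
metadata.update_column("enrolments", "student_id", sdtype="id")
metadata.set_primary_key("students", "student_id")
metadata.set_primary_key("enrolments", "enrolment_id")
metadata.add_relationship(
    parent_table_name="students", child_table_name="enrolments",
    parent_primary_key="student_id", child_foreign_key="student_id",
)

# Fit the hierarchical synthesizer and sample a synthetic database D'.
synthesizer = HMASynthesizer(metadata)
synthesizer.fit(tables)
synthetic = synthesizer.sample(scale=1.0)  # dict of synthetic DataFrames
print(synthetic["enrolments"].head())
```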
Target Audience
Learning analytics and educational data mining researchers, instructors, educational administrators and others who are interested in the issues and techniques for generating synthetic data across multiple correlated tables.
Preparation and Pre-requisites
Basic knowledge of relational database schema (attributes, primary and foreign keys), deep learning (artificial neural networks, hyper-parameters), probability (random variables, distributions) and statistics (correlation, samples).
Takeaways
- Identify expectations of a particular relational synthetic database.
- Understand some models for generating synthetic relational databases and their pros and cons.
- Evaluate how useful a synthetic database is as a surrogate of a real one.
- Ensure privacy preservation in synthetic data.
Instructors: Jiayu Li and Y.C. Tay, National University of Singapore
Description
The term human-centred learning analytics (HCLA) refers to the emerging subcommunity of learning analytics researchers and practitioners interested in creating reliable and trustworthy learning analytics systems that amplify and augment the abilities of educational stakeholders and which are aligned with intentions, revealed preferences, ideal preferences, interests and values. It also points to the increased interest in embracing Human-Centred Design and Human-Centred Artificial Intelligence theory and practice to effectively design learning analytics systems with teachers and students as active partners in the design process.
In this tutorial, we will have a conversation about potential ways to appropriate concepts such as “participatory”, “co-design” and “human-centredness”, each of which points to different bodies of literature and communities beyond learning analytics. We will also discuss how we can design learning analytics systems for and/or with educational stakeholders. Finally, grounded in broad Human-Centred AI principles, we will discuss how we can create learning analytics systems that are reliable, safe and trustworthy.
Activities
- Participants will be introduced to the basic foundations of human-centred design and emerging human-centred learning analytics literature.
- Participants will engage in discussions around the following questions:
  - what are participatory design, co-design and human-centredness beyond learning analytics?
  - how can design practices from other areas be adapted or adopted in learning analytics?
  - what mechanisms need to be put into place to ensure we design learning analytics systems that are reliable, safe and trustworthy?
- Small working groups will discuss and report on a set of questions posed to ignite discussions. The expertise of all the attendees will enrich such discussions.
Target Audience
Anyone interested in the human factors of educational systems and learning analytics innovations.
Takeaways
- Perspectives on human-centred design in the context of learning analytics systems development.
- Understanding the key vocabulary, communities of practice and research and sources of key literature that can inform HCLA developments.
- First steps toward establishing collaborations to continue work on human-centred learning analytics.
Preparations
- Dimitriadis, Y., Martinez-Maldonado, R., and Wiley, K. (2021). Human-centered Design Principles for Actionable Learning Analytics. In S. Demetriadis, V. Dagdilelis, T. Tsiatsos and A. Mikropoulos (Eds.), Research on E-Learning and ICT in Education – Technological, Pedagogical and Instructional Perspectives. Cham, Switzerland: Springer (277-296). http://martinezmaldonado.net/files/Chapter_31_Human-centeredPrinciples.pdf
Tutorial Leader: Roberto Martinez-Maldonado
Description
This tutorial looks at K-12 learning analytics interventions explored and used in Singapore classrooms. The tutorial will mainly cover the concepts, design and case studies of two learning analytics applications: WiREAD+ and a suite of analytics toolkits embedded in Knowledge Forum. The case studies will include student and teacher perspectives, as well as the challenges and potential of the tools when used in class. The following provides a brief description of each application.
WiREAD+, pronounced “we-read-plus”, is a web-based collaborative critical reading and learning analytics environment first developed for secondary English Language teachers and students and subsequently augmented to scale across subjects, levels and schools. WiREAD+ addressed classroom challenges of print-based texts, scarcity of dialogue, and formative feedback and was designed to motivate and scaffold students to develop richer dialogue and quality interactions with peers around multimodal texts. It was co-designed with teachers, school leaders and education officers over two research projects. The latest trial involved two primary schools, three secondary schools, a junior college, and a tertiary institute.
Knowledge Forum (KF) is an online discourse environment developed by Marlene Scardamalia and Carl Bereiter (1993, 2002, 2006) to support collaborative discourse and advance community knowledge. KF engages users in knowledge building discourse, where members go beyond just sharing information and work as a collective community to figure out better ways to move forward, such as improving solutions and theories. KF allows customization of scaffolds (sentence starters) to help users post, elaborate, and refine ideas and theories. A suite of learning analytics is embedded within Knowledge Forum, each aiming to provide users with information about a different dimension of the idea-improvement effort. Two learning analytics will be explored further in the tutorial: (i) the Curriculum-ideas-analytics (CiA) tool, which provides word clouds visualizing how ideas from students’ notes on KF overlap with key concepts in curriculum documents of different grade levels; the latest trial of CiA took place in two primary schools (Science and Social Studies classes) and two secondary schools (Science and History classes); and (ii) the Automated Note Recommender System, which recommends ways for writers to improve their KF notes, based on a large number of choices previously recorded on KF and filtered down to the context most relevant to a particular learner.
Activities
- Participants will be introduced to the concepts, design and findings of the learning analytics interventions.
Target Audience
Anyone interested in understanding learning interventions in the K-12 space.
Takeaways
- Designs of learning analytics interventions
- Challenges and potentials of interventions at K-12
Preparations
1. None required
Tutorial Leaders: Chew Lee Teo, Elizabeth Koh, Katherine Yuan, Alwyn Lee, Aloysius Ong and Christin Jonathan, National Institute of Education, Nanyang Technological University, Singapore
Description
Digital learning materials, especially digital textbooks, are a core part of modern education, and their adoption is increasing. Digital textbooks and e-books are being introduced into education at the government level in a number of countries in Asia. This tutorial introduces participants to developments in reading behavior analytics research and its application to supporting lecture and learning material revision, predicting academic performance, group formation, and more. This tutorial will enable participants to start considering how reading behavior analytics, and the general use of reading systems, can be employed for a range of learning strategies and educational tasks.
Activities
- Participants will have the opportunity to gain hands-on experience with a learning material reading system and an introduction to how its fundamental functions can be used for different learning tasks.
- Key concepts in reading behavior analysis will be discussed, including what data can be collected and how it can be interpreted and analyzed to examine the strategies learners are employing.
Target Audience
As an introductory tutorial to reading behavior analysis, no prerequisite knowledge about reading analytics is required. It is primarily for anyone interested in reading analytics or wanting to further pursue research into reading systems.
Takeaways
Participants will learn the fundamentals of reading systems, the data collection process, and how data can be analyzed to identify reading strategies and learning performance indicators.
Prep required
Bring a laptop with Google Chrome installed.
Suggested reading:
Ogata, H., Oi, M., Mohri, K., Okubo, F., Shimada, A., Yamada, M., ... & Hirokawa, S. (2017). Learning analytics for e-book-based educational big data in higher education. In Smart sensors at the IoT frontier (pp. 327-350). Springer, Cham.
Majumdar, R., Flanagan, B., & Ogata, H. (2021). E-book technology facilitating university education during COVID-19: Japanese experience. Canadian Journal of Learning and Technology 47(4), pp. 1-28.
Tutorial Leader: Brendan Flanagan, Kyoto University
Description:
Feedback has long been recognised as critically important for learners’ growth and performance. Research has established that effective feedback is timely, actionable, and personalised to the individual’s learning needs. However, contemporary education presents significant challenges: large enrollments, an increasing shift to online learning, and high workloads, all of which are barriers to instructors tailoring feedback in their teaching and learning contexts.
Learning analytics offers a technological solution to this challenge. Within the last decade, the field has advanced to a stage where many automated feedback systems have been developed as a solution to scaling personalised feedback. OnTask (Pardo et al., 2018) is one example of such learning analytics-based feedback systems.
The objectives of this tutorial are to introduce participants to OnTask, a personalised feedback system based on learning analytics, and to consider how this tool may be used effectively to support students in their learning and enhance their learning experience.
Activities:
In this hands-on tutorial, participants will be introduced to OnTask. The session will guide participants through a step-by-step process for using the tool: from preparing data for a new workflow, to creating conditions for personalised feedback or support messages, to sending off batch emails. Participants will work with a toy dataset during the session; this will be provided prior to the tutorial and should ideally be downloaded before the session.
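To convey the underlying idea before the hands-on session (conditions over a student data table gate which message fragments each student receives), here is a minimal, hypothetical Python sketch. It is purely illustrative and does not use OnTask’s actual interface or API; in the tutorial, everything is done through the OnTask web application.

```python
import pandas as pd

# Hypothetical workflow table: one row per student, since OnTask-style
# conditions operate over columns of a student data table.
students = pd.DataFrame({
    "name": ["Aisha", "Ben", "Chen"],
    "email": ["aisha@uni.edu", "ben@uni.edu", "chen@uni.edu"],
    "quiz_score": [92, 55, 74],
    "videos_watched": [10, 2, 7],
})

def feedback_message(row: pd.Series) -> str:
    """Assemble a personalised message from condition-gated fragments."""
    parts = [f"Hi {row['name']},"]
    if row["quiz_score"] >= 80:    # condition: high achievement
        parts.append("great work on the quiz; try the extension problems next.")
    else:                          # condition: needs support
        parts.append("your quiz result suggests revisiting this week's topic.")
    if row["videos_watched"] < 5:  # condition: low engagement
        parts.append("The lecture videos are a good place to start.")
    return " ".join(parts)

# Preview the batch of personalised emails.
for _, row in students.iterrows():
    print(f"To: {row['email']}\n{feedback_message(row)}\n")
```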
Target audience:
This tutorial is well-suited for beginner level participants. It will be equally relevant for educators and researchers who are interested in personalised and scalable feedback and how to design feedback using learning analytics approaches in alignment with effective feedback principles.
Takeaways:
- Understanding of how contextualized learning data can be used to inform personalised feedback and support
- Understanding of considerations in segmenting learning data for meaningful personalised feedback
- Ability to use basic functions of OnTask to scale personalised feedback or communications
Preparations:
- It will be helpful for participants who are new to learning analytics feedback to read the following:
Pardo, A., Poquet, O., Martinez-Maldonado, R., & Dawson, S. (2017). Provision of data-driven student feedback in LA and EDM. In C. Lang, G. Siemens, A. F. Wise, & D. Gašević (Eds.), Handbook of learning analytics (pp. 163-174). https://doi.org/10.18608/hla17.014
Lim, L., Dawson, S., Gašević, D., Joksimović, S., Pardo, A., Fudge, A., & Gentili, S. (2021). Students’ perceptions of, and emotional responses to, personalised LA-based feedback: An exploratory study of four courses. Assessment & Evaluation in Higher Education, 46(3), 339-359. https://doi.org/10.1080/02602938.2020.1782831
- Bring your own laptop.
Tutorial Leaders: Lisa Lim, Connected Intelligence Centre, University of Technology Sydney; Abelardo Pardo, UniSA STEM, University of South Australia
Description
The potential for learning and teaching to benefit from the application of Learning Analytics (LA) is expanding. E-learning is becoming ever more pervasive in educational institutions, from K-12 to higher education and adult education. However, LA applications that are actually deployed to inform teachers and learners are still rare; most involve either computer scientists/engineers who engage in design research on the use of LA to improve their own teaching, or research collaborations between teaching faculty and their colleagues involved in LA research.
A key challenge for teachers without the requisite technical expertise to leverage the potential of LA is the lack of a robust conceptual framework, and an associated technology platform, for them to easily select the LA outputs and interpretations appropriate for their specific curriculum and pedagogical context. Doing so requires a “design-aware learning analytics system” that allows teachers in different practice contexts to define the LA questions they are interested in exploring, and to select relevant user-interpretable displays of the analysis results that are appropriate for the intended learning outcomes and the implemented pedagogical design principles in context. The tutorial introduces participants to a conceptual framework for the operationalization of design-aware learning analytics, the Learning Design Studio (LDS), which is underpinned by a robust, hierarchically layered Learning Design (LD) language, and to how to generate design-aware LA questions for learning designs represented on LDS. The workshop provides participants with hands-on opportunities to generate LA questions and explore answers to those questions using learning data from a course that has its LD well documented on LDS.
Activities
Tutorial:
- Participants will be introduced to LDS and its underpinning conceptual framework and LD language.
- Participants will explore a learning design represented on LDS and generate LA questions considered to be relevant to the LD context.
Target audience
Anyone interested in deploying or developing technology systems that can provide LA-informed feedback about students’ learning and teachers’ learning design in diverse teaching and learning contexts.
Takeaways
- Meaningful LA starts with the specification of LA questions that are meaningful to the teachers and learners within their specific curriculum and pedagogical context.
- For LA research and development to benefit the general population of teachers and their students (i.e., teachers who do not have specialized technical knowledge), the LA system needs to be connected to a well-articulated LD representation grounded on a robust LD language.
- User-modifiable “storytelling” templates connected to the LA questions selected could be a useful approach to provide contextually meaningful feedback to teachers and students.
Preparation and Pre-requisites
- No pre-requisites needed.
- Read:
- Law, N., Li, L., Farias Herrera, L., Chan, A., & Pong, T. C. (2017). A Pattern Language Based Learning Design Studio for an Analytics Informed Inter-Professional Design Community. Interaction Design and Architecture(s), 33, 92 - 112.
- Law, N., & Liang, L. (2020). A Multilevel Framework and Method for Learning Analytics Integrated Learning Design. Journal of Learning Analytics, 7(3), 98-117. https://doi.org/10.18608/jla.2020.73.8
- Bring a laptop with one of the common browsers (e.g., Google Chrome, Firefox, Safari) installed.
Tutorial Leaders: Nancy Law & Xiao Hu, University of Hong Kong
Description:
Knowledge building (Scardamalia and Bereiter, 2006) focuses on collaborative learning and collective knowledge creation. The Knowledge Forum (a computer supported platform) was created to support this process. Different learning analytics tools have been developed to visualize student interactions on Knowledge Forum. While understanding the collaborative learning process is important, it is also essential to understand how individual learning contributes to collective knowledge advancement.
In the tutorial, we will introduce participants to the use of an analytics tool that helps understand and visualize individual cognition in knowledge building and complements current knowledge building analytics. Specifically, participants will learn how the Map of Dialogical Inquiry (MDI) helps learners see the different learning modes they engage in during the course of a learning experience (Bound, 2010; Stack and Bound, 2012). The MDI consists of eight aspects (theorising, imagining, reflecting, relating, experiencing, procedural, applying and analysing). It has been used by learners to become more self-aware of their ways of learning and inquiring. This in turn supports them in moving beyond habitual patterns of thinking and learning as they try out other aspects of the MDI. We will discuss how this tool can complement current knowledge building analytics.
Activities:
- Participants will be introduced to the MDI and will create their own MDIs.
- Participants will categorise data samples using the MDI.
Target Audience:
Anyone interested in the practice of collective knowledge building in teaching and learning, reflective practice and the use of analytics to strengthen pedagogical practices.
Takeaways:
- How Knowledge Forum has been used in courses for adult learners
- How the Map of Dialogical Inquiry can be used in knowledge building
- Insights on how the Map of Dialogical Inquiry can help learners become more self-aware of their learning journey and support educators in reflecting on their pedagogical practices
Preparations:
Read the following:
- Bound, H. (2010). Developing Quality Online Dialogue: Dialogical Inquiry. International Journal of Teaching and Learning in Higher Education, 22(2), 107-119.
- Scardamalia, M., & Bereiter, C. (2006). Knowledge building: Theory, pedagogy and technology. In K. Sawyer (Ed.), Cambridge Handbook of the Learning Sciences (pp. 97-118). New York: Cambridge University Press.
- Stack, S., & Bound, H. (2012). Tools for Reimagining Learning. Institute for Adult Learning.
Tutorial Leaders: Dr Lin Feng, Lecturer, Teaching and Learning Centre, Singapore University of Social Sciences; Dr Priscilla Pang, Head of Programme, Master in Boundary-Crossing Learning and Leadership, Institute for Adult Learning, Singapore University of Social Sciences; Associate Professor Helen Bound, Institute for Adult Learning, Singapore University of Social Sciences
Description:
Learning analytics dashboards are applications designed to support teachers and students in reflecting on their teaching and learning, aiding them in decision-making. Often, however, the information encoded in the dashboard is visualised in an ambiguous manner, and interpretations among students and teachers differ. In this tutorial, participants will be introduced to the concept of sense-making and to how dashboard design elements can facilitate or hinder sense-making. Using a dashboard mock-up as a case study, we will discuss principles of dashboard design and data visualisation that contribute to sense-making, increasing the positive impact of dashboards.
Activities:
- Participants will be introduced to the concept of sense-making with dashboards.
- Based on the analysis of one use case, we will discuss principles for designing effective LA dashboards that support sense-making.
- The dashboard mock-up used as a use case will be adapted to reflect the discussed principles.
Target Audience:
Anyone interested in the visualisation of learning data and the design of learning analytics dashboards, anyone from students to teachers to practitioners to educational institution managers.
Takeaways:
- Exploring the concept of sense-making with dashboards.
- Understanding of principles for dashboard design and data visualisation
- Understanding the role of user perspectives in dashboard designs
Preparation:
- Reading list coming soon
Tutorial Leader: Ioana Jivet, Goethe University Frankfurt & DIPF, Germany