General Call

The 2023 edition of the International Conference on Learning Analytics & Knowledge (LAK23) will take place in Arlington, Texas, USA. LAK23 is organized by the Society for Learning Analytics Research (SoLAR) and hosted locally by the University of Texas at Arlington. LAK23 is a collaborative effort by learning analytics researchers and practitioners to share the most rigorous, cutting-edge work in learning analytics.

The theme for the 13th annual LAK conference is Toward Trustworthy Learning Analytics. As we continue to develop learning analytics techniques, we also need to reflect on how ethical and social change affects learning analytics.

Understanding the societal impact of learning analytics requires considering the role of humans in preventing biases in data and algorithms by design. Ongoing work on data and algorithms that explain the reasons for each decision has the potential to increase the trustworthiness of learning analytics systems that make complex decisions or recommendations. Researchers and practitioners need to consider the role of data and algorithms, including their misuse; understand their impact and influence on society; and address ethics, fairness, privacy, transparency, security, safety, and accountability by design, toward a responsible education system.


The LAK conference is intended for both researchers and practitioners. We invite researchers and practitioners of learning analytics to join a proactive dialogue about the future of learning analytics and its practical adoption. We further extend our invitation to educators, leaders, administrators, and government and industry professionals interested in the field of learning analytics and its related disciplines.

We welcome submissions from both research and practice, encompassing different theoretical, methodological, empirical, and technical contributions to the learning analytics field. Learning analytics research draws on many distinct academic fields, including psychology, the learning sciences, education, neuroscience, computer science, and design. We encourage the submission of work conducted in any of these traditions, as long as it is done rigorously. We also welcome research that validates, replicates, and examines the generalizability of previously published findings, as well as work that examines the adoption of existing learning analytics methods and approaches.


Specifically, this year we encourage contributors to consider how fairness, accountability, transparency, and ethics (FATE) can be incorporated into the design, implementation, and evaluation stages of learning analytics. Fair learning analytics means that the analysis technology should produce fair results. Accountable analysis refers to providing a degree of transparency and explanation, adjusting the transparency of data and computation to the needs of different stakeholders. Trust goes hand in hand with transparency in decision-making; whether the decisions behind predictions and interventions are fair and explainable is an ethical issue. There is still much to be done with respect to human behavior and social values, such as respecting privacy, providing equal opportunities, and ensuring accountability. Grounded in diversity, equity, and belonging, inclusive learning analytics identifies and breaks down systemic barriers to inclusion and fosters a culture in which every learner knows they belong, feels empowered to bring their whole self to learning, and is inspired to learn.

Thus, for our 13th annual conference, we encourage authors to address some of the following questions in their submissions:

  • Is there anything distinctive about teaching and learning that requires distinctive approaches to explainable LA, or can we borrow unproblematically from other fields?
  • What would an algorithm impact assessment for LA look like?
  • How does “trustworthiness” vary depending on the stakeholder, or the academic discipline?
  • How do we give diverse stakeholders a voice in defining what will make LA trustworthy?
  • What counts as evidence that a deployed system is trusted?
  • We don’t demand transparency of many of our technologies, so how important is it really?

Further information regarding tracks and submission guidelines coming soon!
