Learning analytics is the collection, analysis, interpretation, and communication of data about learners and their learning, undertaken to provide theoretically relevant and actionable insights that enhance learning and teaching, as defined by the Society for Learning Analytics Research (SoLAR). It integrates approaches from the learning sciences, data science, and human‑centered design, and operates across formal and informal settings in education and training. According to SoLAR's 2025 definition taskforce, the term now emphasizes interpretability and actionability in addition to data collection and modeling. SoLAR
History and community
The modern field coalesced around the inaugural International Conference on Learning Analytics & Knowledge (LAK), held in Banff, Canada, on February 27–March 1, 2011, with peer‑reviewed proceedings archived by the ACM. The LAK series has continued annually, becoming the premier forum for research and practice. researchr.org; SoLAR
The Society for Learning Analytics Research (SoLAR) formed as an international network to coordinate conferences (LAK), summer institutes (LASI), publications, and policy engagement. It serves as the official sponsor of the Journal of Learning Analytics, the first peer‑reviewed, open‑access journal dedicated to the field. SoLAR; Journal of Learning Analytics
Foundational syntheses and handbooks have mapped the domain’s scope, methods, and open questions, including the open‑access 2022 second edition of the Handbook of Learning Analytics. SoLAR
Concepts, scope, and distinctions
Learning analytics applies analytic techniques to educational data with the goal of understanding and improving learning. Typical analytic orientations include descriptive, diagnostic, predictive, and prescriptive approaches, each supporting different decisions for learners, instructors, and institutions. SoLAR
The field is closely related to educational data mining (EDM) but maintains a distinct emphasis on connecting computational analysis with pedagogy, organizational context, and stakeholder needs. Scholarly overviews stress the importance of aligning metrics with learning theory and caution against over‑reliance on correlations without interpretation. TechTrends; Educational Technology & Society
Data sources and infrastructure
Learning analytics systems aggregate data from learning management systems, assessment platforms, discussion forums, video tools, and student information systems to analyze patterns linked to engagement and achievement. Early descriptions emphasized combining these “digital breadcrumbs” to build models that identify at‑risk students and support timely interventions. EDUCAUSE; EDUCAUSE Review
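As an illustration of this aggregation step, the sketch below joins hypothetical CSV exports from an LMS, an assessment platform, and a student information system on a shared student identifier. The file names and column names are assumptions for illustration, not references to any specific product.

```python
# Minimal sketch: combining exports from an LMS, an assessment platform,
# and a student information system into one table keyed by student ID.
# File names and column names are hypothetical.
import pandas as pd

lms = pd.read_csv("lms_activity.csv")    # columns: student_id, logins, forum_posts
grades = pd.read_csv("assessments.csv")  # columns: student_id, quiz_avg
sis = pd.read_csv("sis_roster.csv")      # columns: student_id, program, enrolled

# Left-join on the shared identifier; students missing from a source keep NaN,
# which downstream models must handle explicitly.
merged = (
    sis.merge(lms, on="student_id", how="left")
       .merge(grades, on="student_id", how="left")
)
print(merged.head())
```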
Interoperability standards enable consistent event streams across tools. IMS Caliper Analytics provides a JSON‑LD event model and metric profiles for capturing learning events; version 1.2 refines the information model and the use of linked data. In parallel, the Experience API (xAPI) standard records learning experiences to a Learning Record Store (LRS); xAPI 1.0 was released in April 2013, and subsequent work has been aligned with IEEE 9274.1.1. IMS Global/1EdTech; xAPI.com; ADL GitHub
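The xAPI specification expresses each learning experience as an actor–verb–object statement delivered to an LRS over HTTP. The sketch below sends one such statement; the LRS endpoint and credentials are placeholders, and real deployments require authentication and the version header shown.

```python
# Minimal sketch of an xAPI statement sent to a Learning Record Store.
# The endpoint URL and credentials are placeholders.
import requests

statement = {
    "actor": {"mbox": "mailto:learner@example.edu", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.edu/courses/stats101/module-3",
        "definition": {"name": {"en-US": "Module 3: Regression"}},
    },
}

resp = requests.post(
    "https://lrs.example.edu/xapi/statements",  # placeholder LRS endpoint
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("client_key", "client_secret"),       # placeholder credentials
)
resp.raise_for_status()
print(resp.json())  # the LRS returns the assigned statement ID(s)
```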
A Learning Record Store functions as the repository and access point for xAPI statements; vendors and standards bodies provide conformance test suites and adopter registries to ensure interoperability. xAPI.com; ADL
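Because the LRS is also the access point for analysis, stored statements can be read back with filtered queries against the statements resource. A minimal retrieval sketch, again with placeholder endpoint and credentials:

```python
# Minimal sketch: retrieving one learner's "completed" statements from an LRS.
import json
import requests

params = {
    "agent": json.dumps({"mbox": "mailto:learner@example.edu"}),
    "verb": "http://adlnet.gov/expapi/verbs/completed",
    "since": "2024-01-01T00:00:00Z",
}
resp = requests.get(
    "https://lrs.example.edu/xapi/statements",  # placeholder LRS endpoint
    params=params,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("client_key", "client_secret"),       # placeholder credentials
)
resp.raise_for_status()
for s in resp.json()["statements"]:
    print(s.get("timestamp"), s["verb"]["id"], s["object"]["id"])
```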
Methods and techniques
Methods range from classical statistics and visualization to machine learning, natural language processing, network analysis, and process mining. Systematic reviews of empirical studies report common objectives such as predicting academic risk, supporting self‑regulated learning, and providing actionable dashboards and feedback loops. Educational Technology & Society; SpringerOpen
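To make the "predicting academic risk" objective concrete, the sketch below fits a logistic regression to synthetic engagement features. The data and threshold are invented for illustration; real studies validate such models on held‑out cohorts and examine calibration and fairness rather than a single score.

```python
# Illustrative sketch of risk prediction from engagement features.
# All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
logins = rng.poisson(20, n)             # weekly LMS logins
submissions = rng.binomial(10, 0.7, n)  # assignments submitted out of 10
# Synthetic ground truth: low combined activity raises failure risk.
at_risk = (logins + 2 * submissions < 30).astype(int)

X = np.column_stack([logins, submissions])
X_train, X_test, y_train, y_test = train_test_split(
    X, at_risk, random_state=0, stratify=at_risk)

model = LogisticRegression().fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```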
Analytics workflows typically include data instrumentation, feature engineering from clickstream and assessment data, model training and validation, and communication of findings through dashboards, alerts, or recommended actions. Many institutions organize these around the four analytics types noted above to structure use cases from monitoring to decision support. SoLAR; EDUCAUSE
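The feature‑engineering step can be illustrated with a small clickstream example: raw events are aggregated into per‑student features such as activity counts and active days. The event schema here is hypothetical.

```python
# Sketch: turning raw clickstream events into per-student model features.
import pandas as pd

events = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 2],
    "event_type": ["page_view", "submit", "page_view", "page_view", "forum_post"],
    "timestamp": pd.to_datetime([
        "2024-03-01 09:00", "2024-03-02 10:30",
        "2024-03-01 11:00", "2024-03-05 14:00", "2024-03-06 08:15",
    ]),
})

features = events.groupby("student_id").agg(
    n_events=("event_type", "size"),
    n_submissions=("event_type", lambda s: (s == "submit").sum()),
    active_days=("timestamp", lambda t: t.dt.normalize().nunique()),
)
print(features)
```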
Applications in practice
Common applications include early‑alert systems that surface indicators of non‑submission, low engagement, or atypical activity; personalized feedback cycles; and program‑level quality improvement. The Open University, for example, describes monitoring, early‑warning indicators based on historical data, and evaluation of teaching as three core uses. Open University
Institutions and instructors also use dashboards to inform timely feedback, advising, and activity design changes. Sector guidance from EDUCAUSE highlights pattern detection across LMS and assessment data to inform proactive support for student success. EDUCAUSE Review
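A minimal sketch of such an early‑alert check follows, flagging the indicator types named above; the thresholds are illustrative assumptions, not recommendations from the cited guidance.

```python
# Sketch of a rule-based early-alert check over one student's activity record.
# Thresholds are illustrative only.
from datetime import datetime, timedelta

def early_alerts(student, now):
    """Return a list of alert reasons for one student's activity record."""
    alerts = []
    if student["missed_submissions"] >= 2:
        alerts.append("repeated non-submission")
    if student["weekly_logins"] < 1:
        alerts.append("low engagement")
    if now - student["last_active"] > timedelta(days=14):
        alerts.append("atypical inactivity")
    return alerts

record = {"missed_submissions": 3, "weekly_logins": 0,
          "last_active": datetime(2024, 3, 1)}
print(early_alerts(record, now=datetime(2024, 3, 20)))
```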
Governance, ethics, and law
Ethical frameworks in learning analytics focus on transparency, consent, data minimization, validity of models, and the potential for adverse impacts. Influential analyses identify challenges around surveillance, fairness, accountability, and interpretability, and propose principles to guide responsible practice. American Behavioral Scientist; British Journal of Educational Technology
Sector codes and institutional policies codify practice. Jisc’s Code of Practice for Learning Analytics (2015) provides principles spanning responsibility, transparency and consent, privacy, validity, access, enabling positive interventions, minimizing adverse impacts, and stewardship of data. The Open University produced a policy (and later a Data Ethics Policy) to govern the ethical use of student data for analytics. Jisc; Jisc blog; Open University
Legal compliance varies by jurisdiction and context. In the European Union, the General Data Protection Regulation (Regulation (EU) 2016/679) has applied since May 25, 2018, setting requirements for lawful processing, transparency, and data subject rights. In the United States, the Family Educational Rights and Privacy Act (FERPA) governs access to and disclosure of student education records. legislation.gov.uk; U.S. Department of Education
Community initiatives such as the Asilomar conversations (2014, 2016) further articulated responsible use principles for student learning data in higher education. Ithaka S+R
Research, venues, and current developments
The Journal of Learning Analytics publishes peer‑reviewed research spanning computational, pedagogical, institutional, and policy perspectives. The LAK conference series, organized with ACM, anchors annual dissemination of advances, with companion proceedings broadening practitioner and workshop contributions. Journal of Learning Analytics; SoLAR
Recent community updates include SoLAR’s 2025 revision of the field’s definition and discussions of how generative AI and agents intersect with analytics to provide real‑time, personalized feedback, alongside renewed emphasis on ethical safeguards. SoLAR
Standards and nomenclature
- IMS Caliper Analytics: a standardized event vocabulary and data model for learning activity, designed for interoperability among educational tools. IMS Global/1EdTech
- Experience API (xAPI): a specification for recording learning experiences to an LRS; version 1.0 released in April 2013; current work aligns with IEEE 9274.1.1 (xAPI 2.0). xAPI.com; ADL GitHub
- Society for Learning Analytics Research (SoLAR): an international community coordinating conferences, publications, and initiatives. SoLAR
