A Decade of AI in xAPI: Building, Not Chasing
In the current wave of excitement around artificial intelligence, it’s easy to forget how long some of this work has actually been underway.
For the team at Yet Analytics, AI in the xAPI ecosystem isn’t a recent addition or a repositioning. It’s been a continuous line of inquiry, development, and application that stretches back more than a decade. We were instrumenting AI systems well before the current moment made “AI-powered” a default descriptor.
Early Foundations: Neural Networks and Educational Econometrics
Our work with AI in learning systems began in 2016 with LESI Alpha, when we built our first neural network models to explore educational data. At the time, the question wasn’t “how do we add AI?” but rather:
What becomes possible when learning data is structured well enough alongside economic data to support advanced econometric modeling and multi-variate predictive analytics?
This work culminated in a presentation at the Education World Forum in 2017, where we demonstrated an educational econometrics use case featuring data from over 180 countries. The key insight then, and now, was that AI is only as useful as the data model it operates on.
That belief would shape everything that followed.
From Data to Experience: The TLA as an Activity Ecosystem and the First xAPI-Driven LXP
As xAPI matured as a standard for capturing rich, behavioral learning data, we saw an opportunity: not just to analyze learning after the fact, but to shape it in real time.
In 2018, we published a white paper: Mission Control for Learning and Performance: The enterprise learning ecosystem in the age of advanced streaming data architectures and real-time learning analytics. This paper would later inform Shelly Blake-Plock’s 2019 contribution on Analytics and Visualization to the book Modernizing Learning: Building the Future Learning Ecosystem, edited by Walcutt and Schatz and published by Advanced Distributed Learning.
This work, and research that we took on for ADL on the nature of data analytics and learning algorithms, informed the second generation of the Total Learning Architecture (TLA) as a streaming data system. This system would be designed to support AI as the expectation, not as an add-on.
All of this led to two projects. The first was a partnership with The Learning Accelerator to link learner activity to nodes on a competency graph. The second was a project with the SparkX Cell at Joint Base Andrews, where we designed and built a Learning Experience Platform (LXP) to deliver AI-powered content recommendations based directly on a learner’s xAPI activity stream. We called it Training Commons. Rather than relying on static rules or surface-level metadata, the system used behavioral data to decide what each learner should see next. It seems commonplace now, but this was over six years ago, when the preeminent example of AI was the Netflix recommendation engine.
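The core idea, ranking content against a learner's behavioral stream rather than static metadata, can be sketched in a few lines. This is a toy heuristic with hypothetical names, not the Training Commons implementation: it favors activity types a learner engages with and penalizes items already seen.

```python
from collections import Counter

def recommend_next(statements, catalog, k=3):
    """Rank catalog items against a learner's recent xAPI stream:
    favor activity types the learner engages with, penalize items
    already seen. (Toy heuristic, not the production LXP model.)"""
    # Tally how often each activity type appears in the stream
    seen_types = Counter(
        s["object"]["definition"]["type"] for s in statements
    )
    seen_ids = {s["object"]["id"] for s in statements}

    def score(item):
        # Familiarity with the item's type, minus a penalty for repeats
        return seen_types[item["type"]] - (5 if item["id"] in seen_ids else 0)

    return sorted(catalog, key=score, reverse=True)[:k]
```

A real system would learn these weights from outcomes; the point is that the input is the learner's lived xAPI activity, not abstract tags.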
This was an early expression of a principle we continue to hold: AI in learning systems should be grounded in lived activity, not abstract assumptions.
AI as Infrastructure: GIFT, STEEL-R, and xAPI Profiles
Our work deepened through collaboration with major research and defense initiatives, including GIFT (Generalized Intelligent Framework for Tutoring) and the STEEL-R project.
Within this context, we developed the xAPI Profile for the GIFT AI tutoring framework to define how intelligent tutoring interactions could be expressed, shared, and reused as structured data.
More importantly, we helped establish a pattern where AI systems function not as isolated components, but as data brokers within a larger ecosystem:
Translating learner activity into meaningful signals
Routing those signals to competency assertion engines
Enabling downstream systems to reason about performance and mastery
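The broker pattern above can be sketched as a small pipeline. All names here are hypothetical (real deployments define signals via xAPI Profiles and use dedicated competency assertion engines); the sketch shows statements translated into signals and routed to a downstream engine that reasons about mastery.

```python
def to_signal(statement):
    """Translate a raw xAPI statement into a compact signal.
    (Hypothetical mapping; an xAPI Profile would define this per verb.)"""
    verb = statement["verb"]["id"].rsplit("/", 1)[-1]
    success = statement.get("result", {}).get("success")
    return {"actor": statement["actor"]["mbox"],
            "event": verb,
            "success": success}

class CompetencyEngine:
    """Toy competency assertion engine: asserts mastery after
    two successful signals from the same actor."""
    def __init__(self):
        self.wins = {}
        self.assertions = []

    def receive(self, signal):
        if signal["success"]:
            n = self.wins.get(signal["actor"], 0) + 1
            self.wins[signal["actor"]] = n
            if n == 2:
                self.assertions.append((signal["actor"], "mastered"))

def route(statements, engine):
    """Broker loop: every statement becomes a signal for downstream systems."""
    for s in statements:
        engine.receive(to_signal(s))
```

The broker itself stays thin; the intelligence lives in how signals are defined and what downstream systems do with them.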
This approach positioned AI not as a feature but as infrastructure, deeply integrated with how data moves, evolves, and becomes actionable. And it was all done within the context of xAPI.
Intelligence at the Data Layer: SQL LRS and Reactions
As our work evolved, we began to push intelligence closer to the data itself.
With SQL LRS, that philosophy took concrete form in Reactions: an onboard conditional logic engine that operates directly on xAPI data as it is received and processed.
Reactions enable systems to evaluate incoming xAPI statements in real time, apply conditional logic based on behavioral patterns, and trigger transformations, assertions, or downstream actions by producing the requisite xAPI statements within the LRS.
In practice, this means that intelligence is no longer confined to external analytics pipelines or AI services. Instead, it becomes part of the data infrastructure itself. SQL LRS can continuously pre-process deterministic data, making processes that depend on learning activity interpretable and actionable as it happens. This is key to feeding AI platforms clean, well-structured data and guarding against learning data slop.
This is a subtle but important shift. Before you apply machine learning, you need systems that can understand and respond to data structurally.
Reactions provide that capability, turning raw xAPI streams into AI-ready, decision-capable data flows. They bridge the gap between instrumentation and intelligence, making it possible to operationalize learning data in real time and at scale.
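The shape of that capability can be illustrated conceptually: a condition evaluated against each incoming statement, and a template for the derived statement written back into the LRS. This is an illustrative sketch only; SQL LRS defines its own ruleset format for Reactions, and the names here are hypothetical.

```python
def make_reaction(condition, template):
    """Conceptual reaction: when an incoming xAPI statement satisfies
    `condition`, produce a derived statement from `template`.
    (Illustrative only; not the SQL LRS Reactions ruleset syntax.)"""
    def react(statement):
        if condition(statement):
            derived = dict(template)          # copy the template
            derived["actor"] = statement["actor"]  # carry the actor forward
            return derived
        return None
    return react

# Example rule: a scaled score of 0.8 or higher triggers a
# "completed" assertion for a hypothetical course activity.
passed = make_reaction(
    condition=lambda s: s.get("result", {})
                         .get("score", {})
                         .get("scaled", 0) >= 0.8,
    template={"verb": {"id": "http://adlnet.gov/expapi/verbs/completed"},
              "object": {"id": "http://example.com/course/intro"}},
)
```

Because the derived statement is itself xAPI, downstream systems consume it exactly like any other activity data.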
The Ecosystem Approach
Through initiatives like Training Commons and STEEL-R, we explored how AI and xAPI intersect at both research and operational scale.
And throughout, our efforts have focused on establishing shared data models for learning and training, enabling interoperability between AI-enabled systems, and supporting communities of practice around learning engineering.
Since building our first ML use case over a decade ago, the goal has always been the same: to ensure that AI capabilities are not siloed but connected, grounded in standards, and extensible across contexts relevant to learning ecosystems.
The Last Two Years: Prototyping What Comes Next
More recently, our work has expanded into new frontiers that build directly on this foundation.
We’ve developed prototype AI-powered games built on xAPI-native frameworks, where every interaction is captured as structured data. This allows both real-time adaptation and post hoc analysis, and it foreshadows a near future in which games are connected directly into business systems.
AI + Games + Business Systems = An interesting new opportunity.
At the same time, we’ve been working with transformer-based technologies to generate simulation scenarios directly as xAPI statements. This opens up a new paradigm, as well:
AI-generated experiences that are immediately interoperable
Simulations that are not just content, but data-producing systems (think about running simulations for the express purpose of generating training sets for ML/AI models)
Learning environments that can be analyzed, adapted, and recombined at scale
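The target data shape is straightforward to show. In this sketch, the scenario steps stand in for decoded model output (the transformer itself is out of scope here, and all names are hypothetical); the point is that each generated step lands as a complete, interoperable xAPI statement.

```python
import datetime
import uuid

def scenario_to_statements(actor_email, steps):
    """Render a generated scenario, a list of (verb, activity) steps
    such as might be decoded from a language model, as xAPI statements.
    (Hypothetical example IRIs; real profiles define the vocabulary.)"""
    now = datetime.datetime.now(datetime.timezone.utc)
    return [{
        "id": str(uuid.uuid4()),
        # Offset timestamps so the generated steps form an ordered stream
        "timestamp": (now + datetime.timedelta(seconds=i)).isoformat(),
        "actor": {"mbox": f"mailto:{actor_email}"},
        "verb": {"id": f"http://adlnet.gov/expapi/verbs/{verb}"},
        "object": {"id": f"http://example.com/sim/{activity}"},
    } for i, (verb, activity) in enumerate(steps)]
```

Because the output is standard xAPI, a generated simulation run can flow straight into an LRS, including as training data for further ML models.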
This is not a departure from our earlier work. It’s a continuation of it.
A Simple Distinction
There’s a growing conversation in the xAPI space about AI, and that’s a good thing. The field is moving forward, and new ideas are emerging.
At the same time, it’s worth recognizing a simple distinction.
Some approaches treat AI as an overlay, as a capability added onto existing systems.
Our work, by contrast, has consistently treated AI as something that emerges from well-structured, interoperable learning data, and from systems that can act on that data in real time.
That difference shows up in the details. It is in how systems are designed. In how data is modeled. In how intelligence is operationalized (from Reactions to ML models). In how capabilities scale over time.
And ultimately, in what those systems are able to do.
Looking Ahead
If there’s a single thread that runs through all of this work, from early neural networks to AI-generated simulations, it’s this:
The future of AI in learning is inseparable from the quality, structure, and responsiveness of the data it depends on.
xAPI has always been about capturing the richness of human activity. When combined with systems that can interpret, react to, and learn from that data, AI becomes far more than a label. It becomes a capability embedded throughout the entire learning ecosystem.
We’ve been building toward that vision for over a decade. And we’re just getting started.