By Rose Burt • October 2, 2017

Tracking Live Event Data with AR + xAPI

Float and Yet Analytics recently partnered on an augmented reality conference app and data collection project; the following is an excerpt from the full white paper.

Update: Here's a quick demo video from DevLearn DemoFest, where this project won the top prize:

Introduction

Scanning the learning industry in 2017, it’s clear that few technologies are turning more heads than Augmented Reality (AR) and the Experience API (xAPI). These two leading-edge technologies are shaping the future of where learning, measurement, and organizational performance intersect.

Even though these two technologies make headlines and appear regularly at learning conferences as separate entities, very little work has been done to integrate them and explore what becomes possible when they are combined.

Float and Yet partnered to bring the two together for a research project in conjunction with the eLearning Guild’s Realities360 conference, held in San Jose July 26–28, 2017.

This white paper explores the project and shares some of what Float and Yet learned in creating this groundbreaking effort.

Measuring Success in the App

The Yet xAPI LRS is designed to make connecting your xAPI sources simple: paste in your authentication credentials and you’re good to go. While collecting data in an LRS is an important goal for any learning data program, Yet goes one step further by providing live, interactive visual analytics designed for data discovery. Yet’s philosophy is that your data is only as good as what you can learn from it.
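For readers new to xAPI, here is a minimal sketch of what sending a statement to an LRS looks like once those credentials are in place. The endpoint URL and key/secret pair below are placeholders rather than Yet’s actual values; the statement structure and version header follow the xAPI specification.

```python
import requests

# Hypothetical endpoint and credentials -- substitute the values your
# LRS provides when you register a source.
LRS_ENDPOINT = "https://lrs.example.com/xapi"
AUTH = ("client-key", "client-secret")  # HTTP Basic auth pair

# A minimal xAPI statement: an actor, a verb, and an object.
statement = {
    "actor": {
        "mbox": "mailto:attendee@example.com",
        "name": "Example Attendee",
    },
    "verb": {
        "id": "http://id.tincanapi.com/verb/viewed",
        "display": {"en-US": "viewed"},
    },
    "object": {
        "id": "http://example.com/sessions/learning-becomes-doing",
        "definition": {"name": {"en-US": "Learning Becomes Doing"}},
    },
}

response = requests.post(
    f"{LRS_ENDPOINT}/statements",
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()
print(response.json())  # the LRS responds with the stored statement ID(s)
```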

High-Level Insights: Most Active Times, Most Active Users, Most Popular Activities

Let’s start with some high-level insights we can get from the overall conference data.

Here we’ve scoped the data view to just the two days of the conference, though participants also used the app before and after. The first pattern we can see is activity over time, which lets us understand overall patterns in participant activity. Participants appear to have been most active in two large blocks, with a couple of high points within each. Day one also looks somewhat more active than day two.
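As a rough sketch of the aggregation behind an activity-over-time view, assuming each statement carries an ISO 8601 timestamp as the xAPI spec requires, you might scope to a date range and count statements per hour like this. The sample data and dates are illustrative, not the actual conference dataset.

```python
from collections import Counter
from datetime import datetime, timezone

# Illustrative statements -- in practice these come from the LRS.
statements = [
    {"timestamp": "2017-07-27T09:15:00Z"},
    {"timestamp": "2017-07-27T09:40:00Z"},
    {"timestamp": "2017-07-28T14:05:00Z"},
]

def hourly_activity(stmts, start, end):
    """Count statements per hour within the window [start, end)."""
    counts = Counter()
    for stmt in stmts:
        ts = datetime.fromisoformat(stmt["timestamp"].replace("Z", "+00:00"))
        if start <= ts < end:
            counts[ts.strftime("%Y-%m-%d %H:00")] += 1
    return counts

# Scope the view to the conference days only (placeholder dates).
start = datetime(2017, 7, 27, tzinfo=timezone.utc)
end = datetime(2017, 7, 29, tzinfo=timezone.utc)
print(hourly_activity(statements, start, end))
```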

Now if we take a look at this same data – statements over time – broken down in different ways, we can start to get some more detailed insights.

Here we’re looking at the breakdown of different activities (in an xAPI statement, the “object”) over time, with hover-over detail highlighting the selected time window. We can see that a couple of activities were far more popular than others in this window, specifically “Learning Becomes Doing” and “Alternate Reality Games.”

If we look at the same moment in time with the stack sorted by verb, we can see which types of actions (favoriting, attending, viewing) were most popular over time.
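Both of these stacked views amount to grouping the same hourly buckets by a second key, either the activity name or the verb. Here is a sketch under the same illustrative-data assumptions as above:

```python
from collections import defaultdict, Counter
from datetime import datetime

# Illustrative statements -- real ones come back from the LRS query API.
statements = [
    {"timestamp": "2017-07-27T09:15:00Z",
     "verb": {"display": {"en-US": "viewed"}},
     "object": {"definition": {"name": {"en-US": "Learning Becomes Doing"}}}},
    {"timestamp": "2017-07-27T09:40:00Z",
     "verb": {"display": {"en-US": "favorited"}},
     "object": {"definition": {"name": {"en-US": "Alternate Reality Games"}}}},
]

def stacked_counts(stmts, key_fn):
    """Count statements per (hour, key); the key is what the chart stacks by."""
    buckets = defaultdict(Counter)
    for stmt in stmts:
        hour = datetime.fromisoformat(
            stmt["timestamp"].replace("Z", "+00:00")
        ).strftime("%Y-%m-%d %H:00")
        buckets[hour][key_fn(stmt)] += 1
    return buckets

# Stack by activity (the statement's object) ...
by_activity = stacked_counts(
    statements, lambda s: s["object"]["definition"]["name"]["en-US"])
# ... or by verb (favorited, attended, viewed).
by_verb = stacked_counts(
    statements, lambda s: s["verb"]["display"]["en-US"])
```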

We can also look at aggregate information to see which actors were the most active, and which activities were most popular.

One of the interesting things we can see in this view is that the breakdown of actions varies by participant and activity: while “viewed” is clearly the largest share of actions for each of our participants, our most active participant has a much higher percentage of “unfavorite” actions than the rest of our top five. This visibility into patterns of behavior can help us identify what kinds of interactions are happening with different types of content and with different learners.
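For completeness, here is a sketch of the aggregation behind these per-participant breakdowns, again over placeholder data rather than the real conference statements:

```python
from collections import Counter, defaultdict

# Illustrative statements -- actor names and verbs are placeholders.
statements = [
    {"actor": {"name": "Attendee A"}, "verb": {"display": {"en-US": "viewed"}}},
    {"actor": {"name": "Attendee A"}, "verb": {"display": {"en-US": "unfavorited"}}},
    {"actor": {"name": "Attendee B"}, "verb": {"display": {"en-US": "viewed"}}},
]

# Total statements per actor gives us the "most active users" list.
activity_per_actor = Counter(s["actor"]["name"] for s in statements)

# Verb distribution per actor gives the per-participant breakdown.
verbs_per_actor = defaultdict(Counter)
for s in statements:
    verbs_per_actor[s["actor"]["name"]][s["verb"]["display"]["en-US"]] += 1

for actor, _ in activity_per_actor.most_common(5):
    total = sum(verbs_per_actor[actor].values())
    for verb, n in verbs_per_actor[actor].most_common():
        print(f"{actor}: {verb} {n / total:.0%}")
```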

For more data exploration details and information on how the AR event app was built, download the full white paper.
