At just about every organization, making more data-driven decisions has become a shared strategic priority. But to succeed in this endeavor, departments need more than just more data; they need access to the right kind of data for the types of decisions they want to be able to make.
When the Learning & Development team takes on this challenge, they often take their first steps using web analytics tools like Google Analytics. What the team quickly realizes is that while the data they see is good in the aggregate, it lacks the specificity they need to make informed changes to content and interventions with learners.
In this article, we’ll take a look at what web and usage analytics are and how they can provide value to a learning and talent team, and then we’ll dig a little deeper into what it takes and why you may want to make the transition to robust learning analytics.
The most commonly used web analytics tool is Google Analytics. It’s free and fairly easy to set up. With the simple addition of a snippet of code to a website, you can easily start gathering data on behaviors including the number and frequency of visits, bounce rates, duration of stay, patterns of repeat visits, and common paths through content. This is great data to have; it gives you a window into high-level usage patterns. You can track multiple web properties by adding a separate Google Analytics script to every site or application your organization owns.
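To make these metrics concrete, here is a small sketch of how a few of them (bounce rate, average duration, repeat visits) can be computed from raw session records. This is purely illustrative: the `Session` structure and the sample data are made up for this example, not Google Analytics’ actual data model or API.

```python
from dataclasses import dataclass

@dataclass
class Session:
    visitor_id: str       # anonymous visitor identifier
    pages_viewed: int     # pages seen during this session
    duration_seconds: int # how long the session lasted

# Hypothetical session records; a real analytics tool collects these
# automatically via its tracking snippet.
sessions = [
    Session("a", 1, 10),
    Session("a", 4, 320),
    Session("b", 1, 5),
    Session("c", 6, 600),
]

def bounce_rate(sessions):
    """Share of sessions that viewed only a single page."""
    return sum(1 for s in sessions if s.pages_viewed == 1) / len(sessions)

def avg_duration(sessions):
    """Mean session duration in seconds."""
    return sum(s.duration_seconds for s in sessions) / len(sessions)

def repeat_visit_rate(sessions):
    """Share of visitors who came back for more than one session."""
    visits = {}
    for s in sessions:
        visits[s.visitor_id] = visits.get(s.visitor_id, 0) + 1
    return sum(1 for n in visits.values() if n > 1) / len(visits)
```

Note that every metric here is an aggregate over anonymous sessions, which is exactly the strength, and the limitation, of web analytics discussed below.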
The primary value of Google Analytics is that it enables you to track aggregate patterns in user and content interaction. It’s a staple in the marketing and advertising tool kit. Knowing which content is engaging and how engagement is trending over time is crucial to understanding a website’s usage and, in turn, its value to your learners.
If you don’t currently have this kind of data tracking, and you have access to install the tracking code, you should set it up! It’s a great foundation for understanding aggregate behavior and engagement with your learning content. However, because Google Analytics is designed to help you understand website traffic, it lacks the granularity needed to see how content is being used by individual learners or groups of learners, or how that content impacts the results they see in their learning or performance.
Usage analytics are primarily used by builders of software tools. They are usually focused on user behavior or on system performance. Usage analytics help software teams understand how users interact with their products, including which features are getting used, how often users log in, what kinds of devices are being used, and many of the other statistics available through typical web analytics.
Software teams leverage usage analytics to better understand how their users interact with the tools they build. Usage analytics are sometimes built in-house, and sometimes added via a specialized external application. Some usage analytics tools also offer user communication tools to embed within a user interface. Software teams often use a combination of passive data collection and active polling to understand what users like, don’t understand, or would like to have in their products.
While usage analytics try to answer some of the same questions web analytics address, the difference is that usage analytics begin to focus down to the experience of individual users. Web analytics are by necessity aggregate and fairly anonymous: content on the open web is viewable by anyone, so the statistics describe behavior in bulk rather than by person. Usage analytics typically deal with authenticated users, in an environment where behavior can be tracked at the individual level.
As an LMS administrator, you likely have access to information about usage in the LMS. Information like logins, logouts, student interactions with content, page views, comments or responses in discussion forums, quiz results, and similar data is available to you in access reports. When a learning analyst or course manager logs in to view this usage data, they may be able to make correlations based on other independently known information. In general, these usage analytics are limited in granularity; at best they provide enough information to know whether someone interacted with any part of the course and whether they passed or failed an associated assessment.
When we talk about learning analytics, we have both a different type of environment and a different set of goals to take into account. Web analytics can give us an aggregate view of anonymous user visit behavior and usage analytics can tell us what parts of an application users are interacting with, but the thing we most care about in a learning environment is the learning itself.
Learning analytics are anchored around learning specific interactions, including answers to questions, time spent interacting with different learning elements, pathways through content, and overall retention and completion rates.
Learning analytics take place in the context of learning content. The interactions tracked are at a much higher resolution than those captured by web or usage analytics: in learning we care not just about time spent on a page of content or which components of a course were used in an LMS, but about specific interactions within a piece of learning content.
By tracking individual learning activity, instructional designers can get data on how learners interact with learning experiences. Much like usage analytics give product designers insight into how users interact with applications, learning analytics provide insight into the specific interactions of an individual learner within a designed learning experience.
Learning analytics can also be tailored to help learning and talent managers understand the answers to key questions they have about learning engagement and performance. For instance, a visualization we might use to represent engagement could show time spent, number of interactions, and score outcomes all together, allowing managers or training leaders to see engagement and performance patterns across cohorts and content.
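The kind of visualization described above is typically built from a per-cohort summary table. The sketch below shows one way to derive such a table from per-learner records; the field names and sample data are illustrative assumptions, not a standard learning analytics schema.

```python
from statistics import mean

# Hypothetical per-learner records combining engagement (minutes,
# interactions) and performance (score); field names are made up.
records = [
    {"learner": "l1", "cohort": "sales",   "minutes": 42, "interactions": 30, "score": 0.85},
    {"learner": "l2", "cohort": "sales",   "minutes": 15, "interactions": 9,  "score": 0.60},
    {"learner": "l3", "cohort": "support", "minutes": 55, "interactions": 41, "score": 0.90},
]

def cohort_summary(records):
    """Aggregate engagement and performance per cohort -- the kind of
    table a time/interactions/score visualization would be built from."""
    cohorts = {}
    for r in records:
        cohorts.setdefault(r["cohort"], []).append(r)
    return {
        name: {
            "avg_minutes": mean(r["minutes"] for r in rs),
            "avg_interactions": mean(r["interactions"] for r in rs),
            "avg_score": mean(r["score"] for r in rs),
            "learners": len(rs),
        }
        for name, rs in cohorts.items()
    }
```

A manager reading this table can compare cohorts side by side: for example, a cohort with high time spent but low scores suggests a content problem rather than an engagement problem.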
Making the jump to learning analytics means taking advantage of the capabilities of xAPI-powered learning technologies. These technologies enable L&D to aggregate learning data from multiple platforms, providing a higher level view into overall learning engagement and performance. While web analytics track each web property separately, and usage analytics focus on interactions within individual applications, only a learning analytics approach can offer a unified view into individual and aggregate learner experience across multiple tools.
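The unit of data that makes this cross-platform aggregation possible is the xAPI statement: an actor-verb-object record describing a single learning interaction. Here is a minimal statement built by hand for illustration; the learner identity and activity IDs are made up, while the verb URI comes from the ADL-published verb vocabulary. Real implementations typically send statements like this to a Learning Record Store (LRS) over its REST API.

```python
import json

# A minimal xAPI statement: who (actor) did what (verb) to what (object),
# with what result. IDs below are illustrative placeholders.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/answered",
        "display": {"en-US": "answered"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.com/courses/onboarding/quiz-1/question-3",
        "definition": {"name": {"en-US": "Quiz 1, Question 3"}},
    },
    "result": {
        "success": True,
        "score": {"scaled": 0.75},
    },
}

# Statements are serialized as JSON when sent to an LRS.
payload = json.dumps(statement)
```

Because every tool in the ecosystem emits statements in this same shape, an LRS can collect them from an LMS, a simulation, a video player, or a mobile app and report across all of them at once.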
Data to Power Learning and Talent Decisions
Each of these types of analytics is designed to answer a specific set of questions for particular stakeholders. Web analytics give marketers and site managers insight into high level web traffic patterns. Usage analytics give software developers and platform managers insight into application usage patterns. Learning analytics give learning and talent professionals insights into learning engagement and performance across the learning ecosystems they manage.
By harnessing the power of learning analytics, you’ll be able to take a much more personalized approach to developing learning content, designing training solutions, and demonstrating the value of the tools you use. And you’ll have the granular learning and experience data needed to achieve your goal of using data to make decisions and set the course for your organization’s learning and training.