The focus of the March 25, 2019 xAPI Camp at the eLearning Guild’s Learning Solutions Conference is “xAPI for Instructional Design.” Having attended xAPI Camps since 2015 and participated in the Guild’s “Tin Can Alley” for a couple of years before that, I had to stop and ask myself, “Why now? What is different now versus 2012, when I first began writing about xAPI, before its 1.0 release? What is xAPI’s impact on instructional design as we move into 2019? Furthermore, what has taken so long?”

In this article, I will summarize my interviews with some of the presenters of the upcoming xAPI Camp to get their perspectives. I’ll also explore some of the things designers should keep in mind during the design process to take advantage of xAPI’s capabilities.

I am NOT going to address xAPI as a technical specification. I am NOT going to explore JSON statements or how to hand-code content to send xAPI statements. Why? First, there are a ton of good resources that get down into the technical weeds. (See the resources linked within and at the end of this article.) Second, a technical understanding of xAPI is not as critical to instructional designers as the possibilities xAPI holds.

If I were designing a house, it would be important for me to know that light fixtures exist so I could include them in my design. It would also be important for me to know that it is possible to have a light switch installed near the door where I enter the room. I will appreciate a design that keeps me from bumping around in the dark looking for a pull chain or searching out a lamp. I do not need to be an electrician, however, to make these design decisions—I need to know what is possible. Sure, I will eventually request something that is not feasible, and that is fine. Failing to make the request out of ignorance, however, is not.

How does xAPI change the learning landscape?

xAPI changes our optics on learning in a couple of different ways. First, it helps us get out of a course or module mindset. With xAPI, we really do have the opportunity to look at learning as a collection of experiences and interactions that shape learning over time, versus designing a one-off event.
Second, xAPI gives us the ability to analyze data that we have never had or data that, in the past, required elaborate data gathering. Access to learning analytics and the ability to tie learning to performance is what gives L&D a seat at the table when it comes to making organizational decisions.

—Art Werkenthin, RISC Inc

Getting started—the fundamentals

There are a few high-level concepts that warrant a mention to establish a common language. First, xAPI is a way of recording experience as data—it is the Experience API. xAPI provides a common structure, or syntax, so this experience data can be recorded in a way that can be read, retrieved, or reported on in a consistent manner. Essentially, xAPI statements become a chronological activity stream similar to a Twitter feed, where every statement includes an Actor (who or what did something), a Verb (what they did), and an Object (what they did it to). The activity stream data is written to a Learning Record Store, or LRS. The LRS is a database that allows an activity provider to write these statements and read them back. Yes, you must have an LRS to “do xAPI.” An LRS can be stand-alone or embedded as part of an LMS or other system. Megan Torrance has great information on Learning Record Stores and possible configurations within a learning ecosystem. (See the resources at the end of this article.)
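For the curious (and feel free to skip this), the Actor–Verb–Object idea can be sketched in a few lines. This is an illustrative statement only—the learner, email address, and activity URI below are made up, though the “completed” verb URI is one of the standard ADL verbs:

```python
import json

# A minimal, illustrative xAPI statement: Actor (who did something),
# Verb (what they did), and Object (what they did it to).
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Sally Learner",                      # hypothetical learner
        "mbox": "mailto:sally.learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/activities/fire-safety-module",  # made-up activity URI
        "definition": {"name": {"en-US": "Fire Safety Module"}},
    },
}

# An activity provider sends this as JSON to the LRS's statements endpoint.
print(json.dumps(statement, indent=2))
```

That is really all a statement is: “Sally completed Fire Safety Module,” expressed as data an LRS can store and report on.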

Okay, that’s it. The end of “What is xAPI?” See, painless and non-technical, just like I promised!

So, what can I do? The possibilities

One of the challenges xAPI has had to overcome is its very open structure. When suddenly presented with the ability to track any learner behavior—how learners interact with content, how they perform tasks on the job, what resources or performance support systems they access, and even social learning engagements—it is easy to be overwhelmed by the possibilities. Rather than taking the approach of capturing all the data you can and hoping to see actionable trends emerge, begin with the end in mind. What is your goal in capturing this data? What insight can you gather from this new intelligence?

From a design standpoint, try not to get bogged down in “how” to implement xAPI, and instead focus on “what” you want as your end result. Let’s consider some options following each phase of the ADDIE model of ISD. Don’t worry, similar uses of xAPI can be applied to iterative design and other models. The key is looking for opportunities to use data to improve the learner experience, drive retention, and streamline training where we can.

Why has it taken so long for instructional design to embrace xAPI?

Many instructional designers held back for a few years after xAPI was first announced because it seemed to require more programming than we would normally do, but it’s come a long way. It’s now a part of all major authoring tools and there are standalone, easy-to-use tools that make it possible to create xAPI statements without doing any actual coding. Put simply, xAPI has been coming to meet us, and it has arrived.

What do you hope to get out of xAPI Camp?

This won’t be my first xAPI Camp, but it will be my first as a speaker, and I’m thrilled to share how instructional designers can dive into using xAPI in ways that help us read our learners’ stories and provide more effective, more usable learning experiences for their needs.

—Judy Katz, Eduworks


Analysis

There are a number of ways xAPI can assist in analyzing the training that needs to occur, as well as providing insight into potential delivery mechanisms. Certainly, we can use xAPI data to look at past performance in training. What about your students' use of performance support tools? Are your learners posting questions to forums or discussion groups that may indicate a learning need? What resources on a corporate intranet or SharePoint site are students using? Are they accessing this information from a mobile device or web browser?

While it is easy to focus on learning data, don’t neglect what happens on the job. An xAPI record can be written by just about any system or application, and even by IoT devices. Could a worker’s interaction with a software application indicate where training or real-time performance support could be offered to reduce frustration, time, and errors? Even if in-house tools and applications do not natively send xAPI statements, there are a number of add-on tools that can be used to send statements on their behalf. xAPI allows us to step back and take a broad assessment of the data sources we can use to analyze need.


Design

When designing a learning intervention, what information do you want to capture? If you want to track questions back to an objective to determine mastery, it is important to include that objective data in your xAPI statement for analysis. If your users are accessing content from their smartphones, does it make sense to bypass the LMS and design a learning experience that is mobile-first? xAPI can provide this insight.

Two highly valuable but often overlooked features of xAPI that can influence design are: 1) the ability for content or other applications to "read" xAPI statements from the LRS, and 2) the ability to include attachments with xAPI statements. Content can be designed to adapt based on a learner’s performance in past training. If a student mastered a topic in a past course, that topic could be hidden altogether in a subsequent course because the content can access the previous completion from the LRS. Perhaps the option to test out of learning content is only offered if a student reaches a particular score on another piece of content, or a student’s answer in one content module drives branching in a subsequent module.
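To make the "read" side concrete, here is a sketch of the query an adaptive module might send to an LRS to ask, "has this learner already completed the prerequisite topic?" The base URL, learner email, and activity URI are placeholders; the query parameters (agent, verb, activity) are the standard filters the xAPI statements resource accepts:

```python
import json
from urllib.parse import urlencode

# Placeholder LRS endpoint and learner identity for illustration.
lrs_base = "https://lrs.example.com/xapi"
agent = {"mbox": "mailto:sally.learner@example.com"}

params = urlencode({
    "agent": json.dumps(agent),                            # who we are asking about
    "verb": "http://adlnet.gov/expapi/verbs/completed",    # the verb to match
    "activity": "http://example.com/activities/topic-1",   # the prerequisite topic
    "limit": 1,                                            # one match is enough
})
query_url = f"{lrs_base}/statements?{params}"
print(query_url)
```

If a GET to that URL returns a non-empty list of statements, the content knows the learner already completed the topic and can hide it, unlock a test-out option, or branch accordingly.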

These adaptive learning options are further enhanced by the ability for the xAPI statement to contain an attachment. Consider a curriculum on writing operating procedures. In one module, a student could be asked to upload a picture of a piece of equipment. This picture is saved in the xAPI statement and a later module could request this picture back from the LRS to display to the student. This is just one case where xAPI helps us move past a didactic model of delivery and on to a truly learner-centric delivery model.

Finally, what opportunities does xAPI open for the assessment of students? xAPI can certainly be used for tracking traditional measures like Level 1 reaction surveys or knowledge-based tests. The xAPI data from these assessments can be used to create visualizations around the learning experience and/or mastery of a topic. xAPI gives us the opportunity to go even further with assessments. Mobile applications can send xAPI statements to document field observations of tasks. AR (augmented reality) performance support tools can use xAPI to track how a student performs on a task and the resources used for completion. Interested in behavior data? The Ann Arbor Hands-on Children’s Museum now uses xAPI beacons at each exhibit to evaluate visitor usage to align with state-defined curricula. During the design phase, brainstorm on your ideal method of evaluating student performance and then consider whether xAPI could be leveraged in the evaluation instruments you create.


Development

During development, the strategic design decisions around the type of learning experience and the data we want to consume or capture should already be defined. The critical consideration for development is how we will create statements. Is there an existing profile that can be used to structure statements? What verb libraries will be used? A while back, a developer said to me, “I can’t believe we spend eight hours determining how to write a statement, and five minutes of development time to implement it.” While a bit funny, it is truly the right approach.

If you shove data into the LRS without developing governance for statement structure and being faithful to that structure during development, how will you report on that data? If everyone uses the same statement structure (profile) when a student plays a video, I can report on video use across all content consistently. Take the time during the development phase to research what others are doing and try to avoid creating rogue statements that may skew analytics or cause confusion later. Good and consistent statements provide confidence in the story that the data tells.
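One lightweight way to enforce that governance is a simple check that runs over statements before they ship. This is a hypothetical sketch, not part of any xAPI tool: the approved-verb list stands in for whatever profile your team agrees on (the first two URIs are standard ADL verbs; the third is from the xAPI video profile):

```python
# Hypothetical governance check: flag statements that stray from the
# verb list and structure the team's profile has agreed on.
APPROVED_VERBS = {
    "http://adlnet.gov/expapi/verbs/completed",
    "http://adlnet.gov/expapi/verbs/answered",
    "https://w3id.org/xapi/video/verbs/played",
}

def validate_statement(stmt):
    """Return a list of governance problems; an empty list means it passes."""
    problems = []
    for key in ("actor", "verb", "object"):
        if key not in stmt:
            problems.append(f"missing required part: {key}")
    verb_id = stmt.get("verb", {}).get("id")
    if verb_id and verb_id not in APPROVED_VERBS:
        problems.append(f"rogue verb (not in profile): {verb_id}")
    return problems

# A rogue statement gets flagged rather than silently skewing later reports.
rogue = {
    "actor": {"mbox": "mailto:pat@example.com"},
    "verb": {"id": "http://example.com/verbs/watchified"},  # not in the profile
    "object": {"id": "http://example.com/activities/safety-video"},
}
print(validate_statement(rogue))
```

A check like this takes minutes to write, and it is far cheaper than discovering six months of inconsistent "played" variants when you try to build your first video-usage report.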

What do you see as a benefit of xAPI to instructional designers?

Instructional design has always been about designing effective training experiences. As instructional designers we all learned about theories behind learning, best practices, and what research had shown us about training effectiveness. Stepping into the business world, we sought to apply many of these principles, but continually found ourselves butting up against one key problem. While we knew in our gut that our approaches could yield superior results and overall improvement, we had little data to demonstrate the effectiveness of one design over another, much less the ability to quantify that effectiveness. Businesses need to measure value, and budgets and time are often linked to that value, perceived or otherwise. When faced with a lack of data, the business often responds by shrinking the budgets and time invested in training.
With xAPI we can begin to realize the key part of the equation we’ve been missing: true results and detailed information beyond a simple test score. At the root level, we can begin to easily measure the base of all instructional design: instructional objectives. From there we can tie these objectives into other data and measures the business is already collecting and show effects and relationships.
This is only the beginning, though. From here we can look at what materials are being used and how, but more importantly, at how effective the people who use this training are, and whether one design approach improves effectiveness over another. Take it a step further, and we can even begin to efficiently drive the right learning to the right people. It won’t happen overnight, but xAPI is the huge step forward we always knew we needed for training to be recognized as an integral part of the business, and for training effectiveness to mean more than simple information distribution.

—Paul Schneider, PhD, dominKnow


Implementation

There are some technical implementation decisions that must be made when using xAPI. Taking inventory of your current technology ecosystem is important, as is evaluating cybersecurity and data privacy. The best technical "plumbing" for your xAPI is highly dependent on the organization and audience. Without discounting the importance of this infrastructure, when I think of the impact of xAPI on the implementation phase of instructional design, my focus is on delivery.

Unlike SCORM, which defines the way a learning management system and content interact, xAPI makes no mention of an LMS in the standard. There is a profile, cmi5, that defines the use case for launching content from an LMS using xAPI. What can I design if I am no longer beholden to the LMS? Can I provide a better user experience through a native mobile app that sends xAPI statements? Should I leverage existing social networks within the organization to send statements? What if we deliver training via text message that sends xAPI statements, completely bypassing the LMS? Suddenly xAPI frees us from the limitations of the LMS in crafting a solution. The LMS can become just another activity provider rather than the central hub for all training.

What do you see as the payoff for xAPI?

xAPI doesn’t just tell you what someone has done. It can be used to collect and illustrate the user’s experience—the hows and whys of what they did. By learning these, we can make our learning products far more tight, tailored, and targeted, ultimately resulting in less time spent on training interventions and better, more efficient employee performance.

—Anthony Altieri, Omnes Solutions


Evaluation

It is easy to jump straight into the summative evaluation potentials for xAPI. We can use xAPI to identify how content is used and whether elements are being neglected. We can use xAPI to identify topics where students struggle. We can even use xAPI to compare how someone performs in training versus how they perform on the job. All of these data points provide the ability to evaluate a course and recommend enhancements. But what about formative evaluation of learning content?

How do you perform user testing of new content? If you are observing users, or having them self-report their experience with content through a questionnaire, are you getting fair and unbiased feedback? By using xAPI to quietly monitor how users interact with content, can you minimize the Hawthorne effect and improve the validity of your evaluation?

Getting started

So, where do we begin? One option is certainly to join the upcoming xAPI Camp at Learning Solutions for a deep dive into xAPI and its impact.

Learning Solutions 2019 will host a number of additional conference sessions on xAPI. The Torrance Learning xAPI cohorts are another great place to roll up your sleeves and work with a team to build and present an xAPI project.

If this is your first foray into xAPI, consider starting small. There are several free resources and even trial Learning Record Stores that can be used to get started without a capital project or the need for extensive technical support. Finally, take advantage of xAPI communities of practice. Many early adopters of xAPI started from scratch, without the benefit of tools with native xAPI functionality, and built communities of practice to formalize profiles you can use today.