As the eLearning world begins to adopt and develop the xAPI standard, it’s not hard to find products, tools, and platforms that offer xAPI compliance. But the standard is still new enough that actual examples of it at work in the real world are hard to find. And yet, those stories are important cases to guide us as we move forward. This is the first article in an “adventure series” that will explore the xAPI in its natural habitat.

If you’ll forgive me, in the first couple of stories I’ll share projects my team is working on. After that, the sky’s the limit, the more the merrier, and so on. Do you have stories to share? Drop me a line—I’d love to hear what you’re doing and help you share your story.

RFED—the Ann Arbor Hands-On Museum’s personalized interactive exhibits

(This project won the first xAPI Hyperdrive at DevLearn in October 2014.)

The Hands-On Museum is a children’s STEaM (science, technology, engineering, art, math) museum based in Ann Arbor, Michigan, USA. Museum educators were eager to offer an enhanced field trip experience for school children, one that would show quantitative data related to the learning experience and the science standards that schools and teachers are held to. At the same time, they wanted to provide a richer experience at each exhibit, slowing the kids down long enough to interact meaningfully with the exhibit and learn something more.

To do so, the museum would need to track which students were at which exhibits and what they did there. The resulting data also helps teachers tie their museum field trip visits to curriculum standards to ensure that their time (and their budget) is spent meaningfully. The RFED project is an ongoing effort to meet all these needs and, at the same time, explore some cool technology and connect with the community. (RFED is a play on the underlying RFID technology and ED for education.)

What it does

In the first iteration of this project, the museum used RFID tags embedded into nametag lanyards to passively log students into each exhibit as they came into range of a wall-mounted antenna. Here's how it works: As the children come into range, a tablet computer mounted nearby greets them by name and engages them in a short series of questions and explorations with the exhibit, which is typically an analog device of some sort (Figure 1).

Figure 1:
As the children come into range, a tablet computer mounted nearby greets them by name and engages them in a short series of questions and explorations with the exhibit

Data from each student activity is then sent back to the LRS (Learning Record Store) immediately after the student(s) complete the interaction, or after they “log out” by leaving the area (as determined by the antenna). Teachers and museum staff can then access a dashboard showing an activity stream and some simple data visualizations by student and by exhibit. A simple search function allows teachers to search for specific text strings found in either the xAPI statements or in the text responses typed in by students.
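Under the hood, a dashboard like this is simply issuing queries against the LRS’s statements resource. Here’s a sketch of what such a request could look like; the endpoint and credentials are invented, but the GET statements resource, its filter parameters (`verb`, `since`, `limit`, and so on), and the `X-Experience-API-Version` header are all part of the xAPI specification.

```javascript
// Sketch: build a query against an LRS's xAPI statements resource.
// The endpoint URL and credentials below are placeholders.
function buildStatementsQuery(endpoint, params) {
  const url = new URL("statements", endpoint);
  for (const [k, v] of Object.entries(params)) {
    url.searchParams.set(k, v);  // standard xAPI filter parameters
  }
  return {
    url: url.toString(),
    headers: {
      "X-Experience-API-Version": "1.0.3",  // required by the spec
      Authorization: "Basic " + Buffer.from("user:pass").toString("base64")
    }
  };
}

const q = buildStatementsQuery("https://lrs.example.org/xapi/", {
  verb: "http://adlnet.gov/expapi/verbs/answered",
  since: "2014-10-01T00:00:00Z",
  limit: "50"
});
// q.url and q.headers can then be handed to fetch(); the LRS responds
// with a StatementResult object: { statements: [...], more: "..." }.
```

The free-text search over typed responses is a feature of the dashboard itself, layered on top of statements retrieved this way, rather than something the LRS query interface provides directly.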

As this project moves into Phase 2, the somewhat finicky RFID technology is being replaced by beacons that offer more granular proximity tracking and a more robust signal. As it turns out, kids wiggle a lot and don’t stand with their name badges neatly lined up in front of an antenna. Go figure!

How it works

When a student—or group of students—approaches the exhibit, the antenna picks up on their name badges and the tablet opens up the appropriate grade level content. One of the advantages of the xAPI over SCORM is that it can record the same activity data for multiple learners at once. Rules are being worked out to determine what happens when students from multiple grades approach all at once.
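That difference from SCORM can be sketched in a few lines: where SCORM records one attempt for one logged-in user, a single shared interaction can be fanned out into one statement per learner in range. This is an illustrative sketch, not the project’s actual code; the helper name and data are invented, though the statement shape follows the xAPI spec.

```javascript
// Sketch: fan one shared exhibit interaction out into one xAPI
// statement per learner detected by the antenna.
function statementsForGroup(actors, verb, object, result) {
  const timestamp = new Date().toISOString();  // same moment for all
  return actors.map(actor => ({ actor, verb, object, result, timestamp }));
}

const group = [{ name: "Ada Lovelace" }, { name: "George Washington Carver" }];
const stmts = statementsForGroup(
  group,
  { id: "http://adlnet.gov/expapi/verbs/answered", display: { "en-US": "answered" } },
  { id: "https://example.org/rfed/blast-off/Q3" },  // hypothetical activity IRI
  { response: "It isn't big enough to get into space" }
);
// Two statements, identical except for the actor.
```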

Each time a question is answered or a response given on the tablet, an xAPI statement is formed. To allay data privacy concerns about the children, each student is assigned a badge with the name of a famous scientist, engineer, inventor, or mathematician. Back at school, the teacher simply matches each student with the correct badge. (At the same time, it’s an opportunity for even more learning about the person on the name badge!)

Here’s the corresponding activity data as sent to and retrieved from the LRS:

Following the xAPI statement structure of Actor-Verb-Object-Context:

Actor = Ada Lovelace

Verb = answered

Object = “It isn't big enough to get into space”

Context = “Blast Off” (the name of the exhibit) and “Q3” (Question #3 in the interaction)
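Expressed as actual xAPI JSON, that breakdown might look something like the sketch below. The activity IRIs and account `homePage` are placeholders rather than the project’s real identifiers; the “answered” verb IRI is a genuine one from the ADL verb registry.

```javascript
// Sketch of the "Ada Lovelace answered Q3 at Blast Off" statement as
// xAPI JSON. IRIs and homePage are illustrative placeholders.
function buildStatement(pseudonym, response, exhibitSlug, questionId) {
  return {
    actor: {
      name: pseudonym,  // the assigned scientist name, not the student's
      account: {
        homePage: "https://example.org/rfed",  // hypothetical namespace
        name: pseudonym.toLowerCase().replace(/ /g, "-")
      }
    },
    verb: {
      id: "http://adlnet.gov/expapi/verbs/answered",  // ADL registry verb
      display: { "en-US": "answered" }
    },
    object: {
      id: `https://example.org/rfed/${exhibitSlug}/${questionId}`,
      definition: { name: { "en-US": questionId } }
    },
    result: { response },  // the learner's chosen or typed answer
    context: {
      contextActivities: {
        parent: [{ id: `https://example.org/rfed/${exhibitSlug}` }]  // the exhibit
      }
    }
  };
}

const stmt = buildStatement(
  "Ada Lovelace",
  "It isn't big enough to get into space",
  "blast-off",
  "Q3"
);
```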

Admittedly, a multiple-choice interaction like this example from the original prototype is pretty SCORM-like. You could accomplish many of the same things using an LMS and some spiffy reporting tools. However, the xAPI allows us to pose and record more complex interactions like those in Figures 2 and 3.

Figure 2:
An example of a more complex interaction enabled by use of the xAPI

Figure 3:
Another example of a complex interaction

The data is sent to a LearnShare LRS for storage. A separate dashboard page requests these statements from the LRS to create the near-real-time report. A key benefit in this application is that the xAPI statements easily enable a human-readable activity stream. While this is interesting, and sometimes fun to read, in any sort of quantity it soon becomes overwhelming and considerably less useful.

We used JavaScript widgets to create simple graphs to quickly and concisely display activity by exhibit and by student, but there’s still more data available that we could use to create whatever meaningful visualizations we might require in the future (Figure 4).

Figure 4:
Simple graphs quickly and concisely display activity by exhibit and by student
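The tallies behind graphs like these come down to a simple reduction over the statement stream. Here’s a minimal sketch, assuming each statement carries its exhibit as a parent context activity, as in the “Blast Off”/“Q3” example earlier; the sample data is invented.

```javascript
// Sketch: reduce a list of xAPI statements into per-exhibit counts,
// the kind of tally a dashboard graph widget would plot.
function countByExhibit(statements) {
  const counts = {};
  for (const s of statements) {
    const parents = (s.context && s.context.contextActivities &&
                     s.context.contextActivities.parent) || [];
    const exhibit = parents.length ? parents[0].id : "unknown";
    counts[exhibit] = (counts[exhibit] || 0) + 1;
  }
  return counts;
}

// Invented sample statements, trimmed to the fields the tally needs.
const sample = [
  { context: { contextActivities: { parent: [{ id: "blast-off" }] } } },
  { context: { contextActivities: { parent: [{ id: "blast-off" }] } } },
  { context: { contextActivities: { parent: [{ id: "water-table" }] } } }
];
const counts = countByExhibit(sample);
// counts: { "blast-off": 2, "water-table": 1 }
```

The same reduction keyed on the actor’s name instead of the exhibit gives the by-student view.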

As I mentioned earlier, the push for teachers to connect all their instructional decisions to state and national standards is becoming more prevalent. To assist teachers and school districts in aligning field trip experiences with these standards, Phase 2 xAPI reporting will include the standard(s) that the exhibit and corresponding tablet interaction address. In this way, teachers not only have quantitative data about how students interacted with exhibits but also how those interactions helped that student move toward standards mastery.

The ah-ha!

We learned a number of things on this project and we continue to learn more with each new iteration. Some of the key findings so far include:

RFID was somewhat problematic with highly mobile learners. Beacons will likely work better, although they will also raise the cost of each name badge considerably.

When we think about tracking interactions, it’s very easy to get caught up in question-and-answer, SCORM-like quiz items. The first four interactions we built all fit this model. The later interactions started to push the boundaries a bit more: collecting data, free-form text, multi-part constructions, and so on. In future phases, the xAPI will allow us to record objective data from the exhibits themselves.

One of our most interesting findings came on the night of the event at which we released RFED. The students were completely unimpressed: of course their technology knows them by name! And tablet-based learning interactions are nothing new to them. The biggest fans of RFED were the teachers, who were very excited about the ability to get real data, in real time, about their students. The fact that the dashboard offered multiple easy-to-use options for working with the data was even better.

In a SCORM world, a user generally must be logged into a system to record individual progress and interactions. The xAPI, by contrast, lets us record the activity of anonymous users in any location. That choice might be driven by privacy concerns, or simply by the desire to gather anonymous data without a complex authentication system that slows down interaction (especially for kids running around a science museum).

For RFED, we fed the predetermined scientist names (triggered by their RFID badges) into the xAPI statements in place of a user’s real name, and the LRS just records the statement—at its most basic, it doesn’t care who people are. Then those who have access can do what they want with the data—display, report, analyze patterns, course-correct, justify decisions, etc.
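Mechanically, that substitution is just a lookup before the statement is built. Here’s a minimal sketch; the badge IDs, the fallback name, and the account namespace are all invented for illustration.

```javascript
// Sketch: resolve an RFID badge ID to its assigned scientist pseudonym
// so a real student name never appears in a statement. All IDs and
// names here are invented.
const badgePseudonyms = {
  "04A1B2": "Ada Lovelace",
  "04C3D4": "George Washington Carver",
  "04E5F6": "Katherine Johnson"
};

function actorForBadge(badgeId) {
  return {
    name: badgePseudonyms[badgeId] || "Visiting Scientist",  // never a real name
    account: {
      homePage: "https://example.org/rfed",  // hypothetical account namespace
      name: badgeId                          // opaque badge ID, not the student
    }
  };
}
```

The roster mapping badges back to real students stays with the teacher at school; the LRS only ever sees the pseudonym and the opaque badge ID.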

We’re looking forward to learning more from this project as we continue with the next iterations. Phase 2 will include beacons, a new front-end login, 20 exhibits, three grade levels, and enhanced reporting for the museum, teachers, and students. Stay tuned!