You’ve got data. Lots of data. Your learning management system (LMS) and learning record store (LRS) collect all kinds of information about learners and their activities: You know what learning modules each learner has registered for, how much time they’ve spent, whether they watched an entire video or only the first minute, what their quiz scores were, and who they chatted with during the virtual class session. You know who did what; you also have aggregate data about time spent and progress in eLearning across an entire department, or even across the whole company.

Great!

Now what?

Managers, instructional designers (IDs), and developers are constantly told that they can use data to improve learners’ engagement, results, or job performance, to personalize eLearning, or to make learning stick. Are these claims realistic? What data do managers or designers need, and how should they use it to achieve any or all of these goals?

What data should you collect?

Data can help a manager or an ID improve learners’ job performance only if it is the right data and it is used in a strategic way. To begin, the manager has to have a clear goal in mind. What performance is targeted? What improvement is sought? What will change look like? The answers to these questions determine what data to collect—but even that is not enough.

Change has to be measured against something: current performance. You won’t know if employees’ performance improved after the eLearning if you don’t measure performance before the eLearning.

“To correlate to on-the-job performance, a baseline is needed prior to the learning intervention,” said Sean Putman, an instructional designer and xAPI expert at Learning Ninjas. “Managers can compare the baseline to the post-learning intervention data to help answer the question, ‘Did the learning intervention work?’ With the right data collected, this comparison could be done—and is being done today—by companies.”
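To make that comparison concrete, here is a minimal sketch in Python. The record structure loosely follows the xAPI specification’s actor/verb/object/result format; the learner names, scores, and course URL are invented for illustration, and a real analysis would query statements from an LRS rather than hard-coding them.

    # Hypothetical pre- and post-learning assessment scores (0-100).
    baseline = {"maria": 62, "devon": 71, "priya": 58}
    post = {"maria": 80, "devon": 73, "priya": 77}

    # A simplified xAPI-style statement, roughly as an LRS might store
    # one post-assessment result (identifiers here are illustrative):
    statement = {
        "actor": {"name": "maria", "mbox": "mailto:maria@example.com"},
        "verb": {"id": "http://adlnet.gov/expapi/verbs/scored",
                 "display": {"en-US": "scored"}},
        "object": {"id": "https://example.com/courses/safety-101/post-assessment"},
        "result": {"score": {"raw": 80, "min": 0, "max": 100}},
    }

    # "Did the learning intervention work?" Compare each learner's
    # post-learning score to the baseline captured beforehand.
    for learner, before in baseline.items():
        after = post[learner]
        print(f"{learner}: {before} -> {after} ({after - before:+d})")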

To further the goal of using learner data to evaluate whether eLearning is effective, Jigsaw’s virtual classroom platform collects hundreds of “data points” on learners and their interactions with the learning tools on the platform and with one another. Jigsaw also helps managers evaluate different kinds of learning.

“We’re looking at the value of engagement learning versus learning by PowerPoint,” said Ginger Ackerman, Jigsaw’s vice president of sales and marketing. She distinguishes between lecture and PowerPoint slides—which she calls “guides” for the instructor—and “learning tools,” which are activities that foster learner engagement. On Jigsaw, these include chats in breakout rooms, large-group discussions, collaborative work using shared whiteboard space, and participation in role-playing exercises.

While Jigsaw—like other learning platforms—doesn’t actually collect job-performance data, the data it does collect helps managers decide whether eLearning is effective. For example, managers measure pre- and post-learning job performance; the learning platform captures data on what activities learners engaged with (and for how long), what they did in a role-play exercise, and their scores on quizzes. IDs, virtual classroom instructors, or learners’ managers can then “tie that together with application learning,” Ackerman said. “Because we can do role-play reporting, because we can do individual project work as well as breakout rooms, the results of those pieces come back to the facilitator. And the facilitator can then review that with the performance.” They “absolutely” can and do track and correlate individuals’ learning activities with job performance, she said.
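One simple way to “tie that together” is to test whether time spent in engagement activities tracks with measured performance change. The sketch below is an assumption-laden illustration, not a description of Jigsaw’s actual reporting: the activity minutes and performance deltas are invented, and it uses a plain Pearson correlation via statistics.correlation (Python 3.10+).

    from statistics import correlation  # Pearson's r; Python 3.10+

    # Hypothetical per-learner data: minutes spent in engagement
    # activities (breakouts, role plays, whiteboards) and the change
    # in a manager-supplied performance metric (post minus baseline).
    engagement_minutes = [12, 45, 30, 5, 50, 22]
    performance_change = [1.0, 6.5, 4.0, 0.5, 7.0, 3.0]

    r = correlation(engagement_minutes, performance_change)
    print(f"Pearson r: {r:.2f}")  # a high r suggests a link, not causation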

Creating a better learner experience

In addition to collecting data about learners’ activities (time spent, material viewed, test results), learning systems generally collect demographic data on the learners themselves. That might include conventional demographics like age, but job data is often more relevant: What is the learner’s role in the company? What tasks does she perform? How much experience does she have?

Mapping this data against baseline and post-learning job performance data can help determine whether an eLearning module correlates with improved job performance. This analysis can also help managers map out effective training plans for employees in different job roles or with varying amounts of experience, thus personalizing and targeting the learning experience for future learners.

“When data is collected and compared to actual business data, we can see what actually works for a given demographic. When we have the knowledge of what works, we can create learning paths to personalize the learning for future learners by demographic,” Putman said. “As data is collected, patterns can be used to provide alternate paths through content based on answers given or selections made.”
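Here is a minimal sketch of what that demographic comparison and routing might look like, assuming each learner record pairs a job role and a content path with the measured improvement; the roles, paths, and numbers are all hypothetical.

    from collections import defaultdict

    # Hypothetical learner records: job role, content path taken, and
    # measured improvement (post-learning performance minus baseline).
    records = [
        {"role": "field-tech", "path": "video-first", "improvement": 2.0},
        {"role": "field-tech", "path": "hands-on", "improvement": 7.5},
        {"role": "sales", "path": "video-first", "improvement": 6.0},
        {"role": "sales", "path": "hands-on", "improvement": 3.5},
        {"role": "field-tech", "path": "hands-on", "improvement": 6.5},
    ]

    # Average improvement for each (role, path) combination.
    totals = defaultdict(lambda: [0.0, 0])
    for rec in records:
        key = (rec["role"], rec["path"])
        totals[key][0] += rec["improvement"]
        totals[key][1] += 1
    averages = {key: total / count for key, (total, count) in totals.items()}

    def recommend_path(role):
        """Route a future learner to the path that worked best for their role."""
        candidates = {path: avg for (r, path), avg in averages.items() if r == role}
        return max(candidates, key=candidates.get) if candidates else "default"

    print(recommend_path("field-tech"))  # -> "hands-on" with this sample data

The same pattern extends to the “alternate paths” Putman describes: instead of grouping by job role, group by answers given or selections made, and branch the content accordingly.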

Ackerman said that the data collected and aggregated in Jigsaw’s virtual classroom platform aids IDs in improving the learners’ experience. “When you look at that information from an organization aggregately, what it does is it provides the instructor the opportunity to understand what type of information they need to be building and designing for their corporate training going forward.”

Collecting and analyzing learner data can enhance the virtual classroom experience for both instructors and learners, Ackerman said. For instance:

  • Improving instructors’ skills—Jigsaw measures instructors’ use of platform tools. Managers and IDs can use that data to determine whether an instructor is spending most of the virtual session lecturing—or engaging with learners. They can then identify which instructors might need coaching to develop a more engaging, activity-based teaching style. (A minimal sketch of this kind of check follows this list.)
  • Using learners’ preferred tools—Data about which tools learners engage with most readily, or most often, helps IDs and virtual classroom facilitators create lessons that emphasize the tools that learners prefer and are willing to use. It can also highlight a need to teach learners how to use other tools, helping them get more out of their eLearning.
  • Providing useful job aids—Data collected about learners’ use of materials, both within the virtual session and as downloaded job aids for use in the workflow, helps designers create tools that will improve learners’ job performance.
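As one example of the first item above, here is a minimal sketch of a lecture-versus-engagement check, assuming a session log that tags each block of time by activity type. The event format and the 70 percent cutoff are assumptions for illustration, not Jigsaw’s actual schema.

    from collections import defaultdict

    # Hypothetical session log: (instructor, activity type, minutes).
    events = [
        ("lee", "lecture", 40), ("lee", "breakout", 10), ("lee", "lecture", 35),
        ("kim", "lecture", 20), ("kim", "role-play", 25), ("kim", "whiteboard", 30),
    ]

    lecture = defaultdict(int)
    total = defaultdict(int)
    for instructor, kind, minutes in events:
        total[instructor] += minutes
        if kind == "lecture":
            lecture[instructor] += minutes

    # Flag instructors who spend most of the session lecturing;
    # the 70% cutoff is an arbitrary, illustrative threshold.
    for instructor in total:
        share = lecture[instructor] / total[instructor]
        flag = "may need coaching" if share > 0.70 else "engagement-oriented"
        print(f"{instructor}: {share:.0%} lecture -> {flag}")
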
Not all of the data collected counts or measures activity, though. Qualitative data, such as comments gathered from chats, discussions, role-play exercises, or observations of learner performance, also provides valuable information. Managers and IDs can study the vast amounts of quantitative and qualitative data they’ve gathered to gauge the effectiveness of eLearning and to improve the experience of future learners.