How does your work translate to business results? How much better are our numbers after we complete your learning program? Are the learning resources worth the investment?

Familiar questions?

You are measured everywhere, and the learning and development (L&D) business is no exception. Learning leaders, and especially your managers, CEOs, and boards, want to see the impact of your work. Yet trying to tell this story can cause you headaches.

Measuring learning outcomes is not an easy task. As soon as we move beyond basic metrics—counting participants or minutes spent in eLearning, studying satisfaction surveys and assessment results—we face difficulties, particularly when an informal (and social) learning component is present.

What else do we need to provide a compelling story about our (successful) learning efforts? The answer is obvious: the story itself. Or, better, several of them.

Think about stories in advance

To be successful in analyzing and presenting data about learning impact, you need a story. Your data must be wrapped in and augmented by stories—stories about the value of learning and effective knowledge transfer, parameters that demonstrate the return on investment (ROI) but are not so easy to quantify.

So, to enrich the learning data, it is crucial to think about the data presentation and visualization in the planning phase, even before data acquisition begins. The best way to visualize the data is to “draw a picture” via stories that both:

  • explain the data itself, and
  • tell about the real-life impact of learning, use cases, best practices, etc.

With such an approach, you will be able to demonstrate clear, valuable, impactful presentations of L&D achievements and the value they bring to the organization.

Let me share an example from our recent knowledge-sharing survey. We found that tracking the number of posts, views, and contributors provides only a baseline. We wanted to hear from people about the value, and about the ways they use information stored on the platform.

With a carefully crafted survey, we were able to capture some qualitative aspects of knowledge sharing, including how users value it. We also managed to distinguish active contributors from passive observers and to learn how exactly users apply the knowledge they find on the platform.

A well-designed survey enabled us to augment our learning data and add value, so that we could tell a holistic story. We could thus gauge the real vibrancy of knowledge sharing by combining statistics (quantitative) with survey (qualitative) data.
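As a simplified illustration of pairing the two data sources, the sketch below merges per-user platform counts with free-text survey answers into one holistic record. All names, metrics, and survey fields here are hypothetical, not the actual platform's export format:

```python
# Hypothetical sketch: pairing platform statistics (quantitative)
# with survey responses (qualitative) to build one holistic record.

platform_stats = {  # e.g., counts exported from a knowledge-sharing platform
    "alice": {"posts": 14, "views": 310},
    "bob": {"posts": 0, "views": 95},
}

survey_responses = {  # free-text answers to "How do you use the platform?"
    "alice": "I reuse troubleshooting posts when configuring routers.",
    "bob": "I read best-practice posts before customer meetings.",
}

def holistic_view(stats, responses):
    """Merge the counts with each user's qualitative answer."""
    combined = {}
    for user, numbers in stats.items():
        combined[user] = {
            **numbers,
            # Anyone with at least one post counts as a contributor;
            # everyone else is a passive observer.
            "role": "contributor" if numbers["posts"] > 0 else "observer",
            "reported_value": responses.get(user, "(no survey answer)"),
        }
    return combined

report = holistic_view(platform_stats, survey_responses)
print(report["bob"]["role"])  # bob never posted, so he is an "observer"
```

The point of the merge is that each row now carries both a number and a sentence: the number shows activity, the sentence supplies the story behind it.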

Measure the right things

If you want to have a good story, you have to measure (or survey) the right things. It is so easy to fall into a trap.

Long ago, we implemented an internal social (learning) platform. The purpose was to improve knowledge transfer within the company. To show that we were serious about the effort, we decided to reward the contributors of the top posts. The idea quickly deteriorated: The top posts were the ones where the employees discussed their sushi orders, beer parties in the afternoon, etc. Ultimately, the CEO simply shut down the platform.

This example shows not only how important it is to measure the right things, but also why it is essential to filter the “noise” out of your data. It is the value of the information that should determine the “top posters”—not the quantity of posts.

That is why it is of utmost importance that the person analyzing the data (the “data scientist”) is fully aware of the processes, procedures, and other elements being observed, measured, and surveyed. Only with knowledge of the dynamics behind processes, systems, and tools can they determine correlations that typically lead to stories, to impactful “punch lines.”

Measuring value

Once again, we face the key question: How can we measure the value of our solutions without asking people?

We can't.

To a certain degree, we do look at the number of views (likes, shares), but that is insufficient. We need interaction with people, the consumers of information, that goes beyond mere counting: a survey, a focus group, a 1:1 discussion.

In these interactions, we must ask the right questions. Rather than asking “How did you like it?” ask “How did it help you to perform your task?”

For example, in our recent survey about being a member of a virtual team (VT), an essential driver of our knowledge sharing, we asked “How much do VTs help you in your daily work?” The respondents really had to reflect on their daily work, and their responses were encouraging.

To create a great story about learning and the data you collect, you have to understand the relation of the specific knowledge and skills to the overall business and how they fit into the organizational strategy. Seeing a holistic picture helps you determine the impact of your L&D solution beyond “attendance” numbers.

Get more stories

Many of us constantly seek feedback. Numbers, surveys … what else? One suggestion is creating a “lessons learned” document for every learning program you complete. To do so, talk to participants—and to facilitators and mentors. Ask them to provide a written report about every participant, where possible. And ask them to identify skills developed beyond the ones the training was supposed to build.

Maintain connections with the participants even after the “event” by treating training as a long-lasting experience. Encourage participants to stay on a platform and discuss things they encounter while transferring the learned knowledge and skills into their work environment or when they take the relevant exam. Ask managers what improvements they have seen in their team, and encourage them to share the feedback they hear from team members with you.

In our recent upskilling project, the goal was to develop and improve the coding skills of our network engineers. We tracked the number of attendees who completed the mentored program, including a final project, and assessed each project's relevance to the company's business. The number of such projects was really high.

And we went one step further—we wanted to know if any of these projects gained the attention of our customers. Had participants promoted their work outside the company?

We found a participant who started using his code in a customer environment. He did not say “I learned coding skills.” He said, “I created this piece of code that can help you automate your process.” There was a perfect match. The customer bought the software; pure commercial success! A measurable ROI! And—we had a story of learning transfer.

An additional indicator that created an inspirational story was the fact that some participants took on the role of mentors in the new training group. Quite an achievement!

Change boring data into inspirational stories

To learn more, to promote learning, to clearly demonstrate the value of learning, inspiration is key. Participants, business stakeholders, and top management are far more inspired when L&D work is presented through stories that augment the data. No analytics tool can create great stories the way a person can: a human being who understands the learning business, cognitive science, and the data can convey information with enthusiasm through stories.

Guidelines for presenting learning data

In closing, here are some guidelines that will help you present data about your learning programs in an inspirational way.

  • When you plan what data you will acquire, have the data presentation in your mind. Think about how you want to present it, who is in your target audience, and what you want the data to tell you.
  • Make sure you have a baseline; use an industry one or establish one at your first measurement. Think about continuity when setting your measurement plan. In a month or a year, you will need to observe the same parameters and ask the same questions to be able to determine the trends.
  • To enrich the data, survey the participants not only with “How satisfied are you?”-style questions, but in a way that prompts them to describe the value they got. This might be done using focus groups, asking facilitators or mentors for verbal or written feedback, creating a “lessons learned” document, etc.
  • In addition to data acquisition, try to “capture” as much non-quantitative data as possible—things that happen during and after the training that can show additional proof of learner engagement, learning transfer, etc.
  • Make sure the data representation (graphs, diagrams, pie charts) is attractive and clear, and then point to some key takeaways with a short “story,” perhaps two or three powerful sentences. Do not leave a single slide in your presentation without a clear and focused comment. People, especially top management, need short, focused, useful information.
  • When you get a lot of verbal feedback (qualitative data), group related comments together and summarize; create a story from the unstructured information—and remember to record all of the comments in an appendix to your report.
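The last guideline, grouping related comments before summarizing, can be sketched in a few lines. The comments, theme names, and keyword lists below are purely illustrative, not a fixed taxonomy:

```python
# Hypothetical sketch: grouping free-text feedback into themes by keyword,
# so related comments can be summarized into one story (and the full list
# of comments can still be kept for the report's appendix).

comments = [
    "The mentor sessions helped me automate a customer process.",
    "I liked the pace of the course.",
    "Mentoring was the most valuable part for me.",
    "The course pace felt right, never rushed.",
]

themes = {  # illustrative keyword lists; refine them as feedback comes in
    "mentoring": ["mentor", "mentoring"],
    "pacing": ["pace", "rushed"],
}

def group_by_theme(comments, themes):
    """Assign each comment to every theme whose keywords it mentions."""
    grouped = {name: [] for name in themes}
    grouped["other"] = []  # catch-all for comments matching no theme
    for comment in comments:
        lowered = comment.lower()
        matched = [name for name, keywords in themes.items()
                   if any(kw in lowered for kw in keywords)]
        for name in (matched or ["other"]):
            grouped[name].append(comment)
    return grouped

grouped = group_by_theme(comments, themes)
print(len(grouped["mentoring"]))  # two comments mention mentoring
```

A real feedback set would need richer matching than substring checks, but even this crude grouping turns a pile of quotes into a handful of themes you can narrate, one story per theme.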

When you feel that learning is hard to measure or learning KPIs are hard to define, think about how people consume information. As Amy Borsetti, of LinkedIn Learning Solutions, wrote, “… highlighting the outcomes influenced with stories that bring it to life will catch the eye of any executive.” So, augment your learning data with unstructured information that proves the value of your work—and “visualize” it through stories.

Learn from your peers

Are you seeking the strategies and skills required to navigate the needs of today’s ever-changing workplace? Are you an experienced or aspiring leader looking for a community to connect with to explore today’s biggest learning leadership challenges?

The Learning Leaders Alliance is a vendor-neutral global community for learning leaders who want to stay ahead of the curve, and for aspiring leaders wanting to build their skillsets. The Learning Guild’s Alliance Membership package includes access to exclusive digital events and content curated for today’s modern learning leader, as well as opportunities to attend in-person learning leadership events held around the globe. See the details here.