How are you measuring the effectiveness of your learning programs? Surveys? Quizzes? In today’s business climate, what gets measured gets managed. What doesn’t get measured doesn’t matter. What doesn’t matter is soon made redundant.

It is important to make a business case for learning solutions, but it may not be clear to L&D practitioners how to do that. Even if you know how your larger organization measures and calculates return for other business investments, you still need to discern what matters to stakeholders when it comes to proving impact and showing which activities are delivering business value. ROI (return on investment) is the same measure applied to any business investment, and training is an investment of time and money.

Making a business case for learning

While it may be difficult to know exactly what is required to make a successful business case for a given learning initiative, we do know what causes a business case to fail. In her Learning Solutions article, “Building a Business Case for Learning,” Pam Boiros points out four key failures:

  • The business case is not in line with strategic business objectives.
  • The business case lacks alignment with CEO and CFO goals.
  • The business case as presented requests spending without financial benefit projections.
  • The business case uses HR and L&D terminology that is a “different language.”

It is best, of course, to do the opposite.

Where to begin?

First, understand that ROI has these sources:

  • Revenue increases
  • Cost reduction
  • Risk reduction

These are what matter to the business and what you need to show.

There are a number of ways that organizations measure and calculate return on investment, and the L&D organization should understand the details that apply in the business as a whole. But regardless of the method of calculation, those three sources are where the impact must be shown, and they must be factored into the design of your learning programs.
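
To make the arithmetic concrete, here is a minimal sketch (in Python, purely for illustration) of one common way to express ROI. Every figure below is hypothetical, and your organization may recognize different benefit categories or use a different formula.

    # Hypothetical ROI calculation for a learning program (illustrative only).
    revenue_increase = 120_000   # e.g., added sales attributed to the program
    cost_reduction = 45_000      # e.g., fewer support escalations and less rework
    risk_reduction = 15_000      # e.g., estimated value of avoided compliance fines

    program_cost = 80_000        # design, delivery, tools, and learner time

    net_benefit = (revenue_increase + cost_reduction + risk_reduction) - program_cost
    roi_percent = net_benefit / program_cost * 100

    print(f"Net benefit: ${net_benefit:,.0f}")   # Net benefit: $100,000
    print(f"ROI: {roi_percent:.0f}%")            # ROI: 125%

The arithmetic is the easy part. The hard part is attributing revenue, cost, and risk outcomes to the learning program in the first place, which is why those three sources have to be designed and measured into the program from the start.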

What keeps you from being able to show ROI?

Many L&D organizations do things that make it difficult to show ROI. These are the things to stop doing:

  • Solutions that are “one size fits all”: generic ideas based on tradition rather than on measurement and identification of the root causes of problems with revenue, costs, and risk. You know you are dealing with one of these when the basic idea is “the way everybody has always done it.” A path is set, a test is given at the end of the course, and when learners pass the test they are sent to the field. No follow-up, no coaching, no assessment of results. Nothing gets measured before or after training, so no demonstration of return is possible.
  • Too much time passes between the learning event and the moment of need. There is no practice, no reinforcement, and no feedback on performance. People forget what they learned, or they lose whatever edge they may have received in training.
  • If trainers get any data about employee performance, they don’t know what to do with it. In Putting Data to Work (an eLearning Guild research report, 2018), Ellen D. Wagner, PhD, points out: “While business operations in corporate, educational, and government settings have made good use of business intelligence techniques for business optimization purposes, most training (and school- or university-based education) has been slower to adopt learning and learner analytics than to adopt data analytics for business applications. ... In corporate education and training, data analytics use cases have emerged in the context of determining return on investment (ROI) from L&D programs and assets. Without national catalysts sparking movements in a unified or comprehensive direction, companies and government agencies make these decisions based on their own internal growth and development plans. Without a regulatory requirement in place, movements in this direction are voluntary.”

At the same time, there are things that successful L&D organizations do to show ROI, and those are good things to do.

What can you do about all this?

One good way is to find out what successful organizations do, then adapt and imitate their practices. Jeremy Negrey, director of customer education at PartsSource, will present a session, "Data-Driven Design: Identifying Core Metrics for Your Learning Program," during The Measurement & Evaluation Online Conference (September 30-October 1, 2020).

In this session, you will learn how PartsSource identified key measures of success for their customer education program. You will understand why they chose the metrics they did and see how they measured and reported on them to key stakeholders across the organization. You'll also learn how measurement is embedded into the design of each course and learning event offered, and you'll get a look at some of the internal metrics their customer education team uses to measure efficiency and productivity.

Register now for this Online Conference and learn new strategies for enhancing your L&D projects! If you are interested in this online event but are unable to attend on September 30 or October 1, register anyway and you’ll receive access to the recorded sessions and handouts after the event.