In my last article (“SkillCamp: Linking Learning to Business Outcomes Through Performance Support”) I outlined the importance of performance support in learning, using the example of SkillCamp, a global learning framework for marketing and sales at Bayer’s Crop Science Division. SkillCamp has established a stable approach for measuring success that I will share in this article.


SkillCamp is a globally steered marketing and sales training initiative targeting several thousand marketing and sales colleagues across many countries.

Bayer’s Crop Science Division is headquartered in Germany and is structured in global, regional, and local country organizations. Marketing and sales (M&S) activities are organized as commercial excellence and partially supported by a global marketing department. Figure 1 (later in this article) illustrates the scope of the commercial excellence activity as it relates to our 12-Step Go-To-Market approach. Global training activities are steered from headquarters.

The cornerstone of SkillCamp learning activities is our electronic performance support system (EPSS)—an electronic repository for all relevant and actionable information that serves as a single point of reference and truth for all global colleagues in marketing and sales as well as their key interfaces. SkillCamp also encompasses regular instructor-led training (ILT), virtual instructor-led training (VILT), webinars, newsletters, and individual learning paths.

SkillCamp metrics approach

The metrics approach was initiated along with the setup of the SkillCamp program, and refined with the growth of content and structure, covering tactical, functional, operational, and strategic elements as described below.

It targets all marketing and sales units and colleagues in our organization, spanning headquarters as well as all our country organizations. It is globally integrated into learning for these functions.

Overall, the metrics approach can be broken down into three stages or levels of integration that I describe in more detail in this article:

  • Level 1: Initial SkillCamp objectives
  • Level 2: Focusing on learner performance: our five SkillCamp goals
  • Level 3: Focusing on impact: impact measurement and impact improvement in SkillCamp

These three stages represent our SkillCamp Maturity Model for learning metrics.

Level 1: Initial SkillCamp objectives

In our stakeholder interviews, we learned that a successful learning approach will need to provide a common and consistently used terminology or language, close cooperation between marketing and sales, relevant examples from our and other industries, continuity, top-down support, and a close link to implementation.

Based on this insight we defined our first milestones:

  • To train around 4,500 colleagues in marketing and sales functions and their key interfaces with a SkillCamp foundational classroom training
  • To ensure a common mindset and language around our 12-Step Go-To-Market approach
  • To provide relevant and actionable processes and tools for the daily work of those colleagues in an EPSS

All these initial objectives were achieved. After rolling out our SkillCamp Foundational Training and introducing performance support for marketing and sales, employees now truly use a common language and common processes for commercial excellence. If a marketing manager from Malaysia meets a colleague from India at a regional conference, they can quickly align and compare their activities along our 12-Step Go-To-Market approach. Figure 1 shows these 12 steps and identifies the content or skill areas we cover with our commercial excellence initiative. A sales manager from the US can easily contact the right colleagues at headquarters with specific questions, and a manager from New Zealand can benefit from a case study provided in France. A common language, common basic approaches, examples, tools, concepts, videos, and contacts are shared in the community via our EPSS.

Figure 1: The 12-Step Go-To-Market approach

Level 2: Focusing on learner performance: our five SkillCamp goals

Once Level 1 was achieved and SkillCamp was up and running, we launched Level 2 of the measurement approach. Following our performance support approach, we wanted to measure learner performance. This would enable us to improve learner performance and adapt our learning and performance offerings accordingly.

We defined five clear objectives for SkillCamp as a sustainable functional learning and performance initiative. For Level 2, the target group is the individual learner.

While it is difficult to relate any learning or training initiative directly to business success as top management measures it (e.g., market share or turnover), it is nevertheless possible to link individual learner performance to successful learning and training activities or initiatives. We conceive of training and learning as a performance improvement process for the individual. The individual's performance improvement objectives are derived from business processes and therefore contribute to business success. Business success on the individual level can, for example, mean:

  • Greater business impact from training—for example, a sales or marketing manager gains greater customer understanding based on a better leverage of customer insights
  • Increased capability to achieve business results from learning—for example, a sales or marketing person can easily find and use the resources they need on the job
  • Greater capability to meet emerging business needs—for example, staff can access emerging and existing trends for marketing campaigns as well as tools and how-to explanations on where and how to use these tools

As a result, the following five objectives are defined along the individual learner’s learning journey:

  1. SkillCamp provides sustainable learning, i.e., learning that sticks
  2. SkillCamp helps learners to use available resources, e.g., via an EPSS called “SkillCamp Online” that hosts resources like tools, examples, plans, or videos
  3. SkillCamp aims to improve the performance of individual learners and their communities, for example, their effectiveness, speed, depth of knowledge, or contextual understanding
  4. SkillCamp aims to improve sharing and community-building across our organization globally, regionally, and locally
  5. SkillCamp continuously improves the content, learning material, know-how, and messaging of commercial excellence

We measure these objectives with specific indicators for each of our five goals.

The hardest part was identifying realistic, helpful, and obtainable key performance indicators (KPIs) for our five overarching performance objectives. It was not the identification of relevant metrics that was difficult; it was the availability of data that drove us toward a pragmatic approach. Some KPIs were discarded along the way; others were accepted as compromises. Overall, we found that many quantifiable KPIs are often just a good approximation of the measure we are actually looking for. We decided, however, to take the available data and work with it rather than have nothing to work with.

Examples of such KPIs are:

  • Number of people trained (plan vs. actual)
  • Satisfaction levels for ILTs
  • Number of people attending webinars (plan vs. actual)
  • Degree to which learners changed their working behavior based on SkillCamp

Specifically for the EPSS KPI, examples are:

  • Monthly number of unique users of the EPSS
  • Monthly number of page views
  • Topics visited
  • Top downloads
  • Average visit duration (we aim at a duration of approximately one minute, as we want the learner/performer to quickly find what they need and go back to the working process)
  • Other metrics include shared best practices, update cycles, or content quality
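
As an illustration, the EPSS usage KPIs above can be reduced to simple aggregations over raw visit logs. The sketch below is a minimal, hypothetical example; the record fields and values are assumptions for illustration, not the actual SkillCamp Online data model or analytics stack.

```python
from collections import defaultdict

# Hypothetical EPSS visit records; fields and values are illustrative
# assumptions, not actual SkillCamp Online data.
visits = [
    {"user": "u1", "month": "2016-01", "page": "go-to-market/step-3", "seconds": 45},
    {"user": "u2", "month": "2016-01", "page": "tools/campaign-planner", "seconds": 80},
    {"user": "u1", "month": "2016-01", "page": "tools/campaign-planner", "seconds": 55},
    {"user": "u1", "month": "2016-02", "page": "go-to-market/step-7", "seconds": 60},
]

def monthly_kpis(visits):
    """Aggregate unique users, page views, and average visit duration per month."""
    users = defaultdict(set)      # month -> set of distinct users
    views = defaultdict(int)      # month -> total page views
    seconds = defaultdict(list)   # month -> visit durations
    for v in visits:
        users[v["month"]].add(v["user"])
        views[v["month"]] += 1
        seconds[v["month"]].append(v["seconds"])
    return {
        m: {
            "unique_users": len(users[m]),
            "page_views": views[m],
            "avg_visit_seconds": sum(seconds[m]) / len(seconds[m]),
        }
        for m in views
    }

kpis = monthly_kpis(visits)
# e.g. kpis["2016-01"] -> {"unique_users": 2, "page_views": 3, "avg_visit_seconds": 60.0}
```

In practice these figures would come from the EPSS platform's own analytics export; the point is that each dashboard KPI reduces to a straightforward aggregation over visit records.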

All these metrics are collected and displayed in a monthly dashboard as well as in quarterly infographics. Examples are provided in the results section. The monthly dashboard meeting has become a de facto steering meeting, as the dashboard review always results in specific improvement measures.

The investment in SkillCamp and SkillCamp Online cannot be justified only via classic feedback mechanisms like participation rates and participant feedback sheets. We must also ask if participants actually apply the learning in their daily life, at their moment of need, if they actively use SkillCamp (Online) resources, and how they interact with our performance support and learning offers. Eventually we should be able to see if these measures lead to more success in our market space—via linking it back to individual performance improvements.

However, Level 2 of our measurement approach did not aim at measuring financial impact or showing a correlation between learning and business results. Level 3 of the measurement approach changes that.

Level 3: Focusing on impact: impact measurement and impact improvement in SkillCamp

Currently SkillCamp is in the process of understanding more concretely which specific metrics help us to comprehend individual performance improvements more deeply and to identify what we need to measure to really demonstrate impact.

This leads to a pragmatic focus along our five goals that fits the current situation of our business, which is constantly undergoing change. It makes sense to adapt the metrics reporting and communication to these changing needs while sticking to a clearly defined approach, so as to ensure the validity of collected data over time and to be able to understand trends (in our activities and their results) over several years.

Specifically, Level 3 aims at taking those KPIs from Level 2 that measure impact and then adding KPIs and ways of measuring that enable us to improve impact.

We want to find out what business impact SkillCamp has on the individual level. To do that, we identify those learners who report a high impact as well as those who report no impact at all. As a result, we are able to:

  • Showcase best practices so other employees can learn from high performers
  • Improve the SkillCamp ecosystem to increase its impact on the company
  • Identify ways to adapt our offerings to further fit the needs of the organization
  • Quantify the impact of SkillCamp and show how it has improved the business and financial performance of the company

Specifically, the process we are currently piloting, based on Robert O. Brinkerhoff's "Success Case Method," is:

  1. Build an impact map for a training to define the desired learning and business outcome of the training and to identify how exactly the training is linked to the overall business strategy
  2. Execute a workshop or training event
  3. Do pre-selection surveys to identify success-case and non-success-case candidates
  4. Run success and non-success interviews: These are interviews with participants who implemented a lot after the workshop as well as interviews with those who had not changed anything after attending
  5. Analyze the interviews and create the success and non-success cases
  6. Use the cases to verify our goals, adjust the workshop or learning setup, and communicate intensively with stakeholders
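
To make step 3 concrete, a pre-selection survey can be reduced to a simple scoring rule that splits respondents into the two interview pools. The following sketch is a hypothetical illustration; the survey scale, participant names, and thresholds are assumptions, not Brinkerhoff's prescribed instrument.

```python
# Hypothetical survey scores (1-5) answering "How much have you applied
# the training on the job?"; names and thresholds are illustrative.
responses = {
    "participant_a": 5,
    "participant_b": 1,
    "participant_c": 4,
    "participant_d": 2,
    "participant_e": 3,
}

def select_candidates(responses, high=4, low=2):
    """Split respondents into success-case and non-success-case interview pools."""
    success = [p for p, score in responses.items() if score >= high]
    non_success = [p for p, score in responses.items() if score <= low]
    return success, non_success

success, non_success = select_candidates(responses)
# Mid-range respondents fall into neither pool and are not interviewed.
```

The success-case pool then feeds the interviews in step 4; the thresholds would of course be calibrated against the actual survey instrument.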


Results

For Level 1: We managed to establish a sustainable learning initiative after thoroughly analyzing the needs for nine months. SkillCamp has been running for four to five years now and has created a common mindset for commercial excellence. The metrics approach is continuously evolving and improving. SkillCamp was integrated into the Crop Science Balanced Scorecard.

For Level 2: We can report 100 percent coverage of the identified target groups with our SkillCamp Foundational Training. Sixty-seven percent of the trained people state that they have changed their behavior and use the tools and SkillCamp Online. A global commercial excellence community has been established and is actively exchanging best practices and discussing commercial excellence both offline and online. SkillCamp reports to global marketing at the headquarters of Crop Science. It resides in the business itself. The performance support tool (EPSS) is used heavily—doubling its usage from 2015 to 2016.

Entering Level 3: We will also be able to better shape our program with learning paths for different roles as well as skill levels—e.g., from a rookie level to a master level. As we are measuring this in the system and for the individual learner (for example, with the help of badges), we are immediately using the metrics data to improve learning in the organization. Level 3 is now starting to show that SkillCamp has a financial impact on the organization.


Lessons learned

Overall, it makes sense to start with a common base of content that is valid for the whole target group before extending to other content areas or target groups. Further lessons from our experience:

  • Community involvement and identifying multipliers is key.
  • It is imperative to align all stakeholders when building or making big changes to a large training framework—e.g., for EPSS changes, design changes, or new search strategies.
  • It is difficult to keep track of and prioritize all the topics that can or should be communicated within a performance-support learning framework, but the effort is worth it: take enough time to set up and run a well-planned communication framework for your learning framework.
  • Provide enough human resources, internal or external, to run a performance-based learning framework.
  • Getting the data to analyze impact is difficult; a focus on impact, however, is key. A good idea is to link learning metrics to company performance metrics.


We are evolving from a simple spreadsheet-based analysis toward a more integrated, tool-based approach by integrating relevant metrics and parts of the analysis into our EPSS structure. This way, analysis of our data becomes available in real time and accessible for different user groups—in other words, metrics start to become scalable and available as a service, for example, for our country organizations.

One aspect that helps us implement this is keyword-based automated search. The keywords and faceted search are continuously optimized, among other inputs, based on the results of our metrics approach: measured user behavior is collected, analyzed, and fed back into the system to improve usability.
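
As a sketch of this feedback loop, measured click counts can be used to reorder facet keywords so that frequently used topics surface first. The keywords and counts below are illustrative assumptions, not actual SkillCamp data.

```python
# Hypothetical click counts per search keyword, collected from EPSS
# usage metrics; keywords and counts are illustrative assumptions.
keyword_clicks = {
    "customer segmentation": 120,
    "pricing": 340,
    "campaign plan": 95,
    "value proposition": 210,
}

def rank_keywords(clicks):
    """Order facet keywords by measured usage so popular topics surface first."""
    return sorted(clicks, key=clicks.get, reverse=True)

ranked = rank_keywords(keyword_clicks)
# Most-clicked keyword appears first in the facet list.
```

Feeding the reordered list back into the search configuration closes the loop from measured behavior to improved usability.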

Our measurement journey evolved from traditional training metrics to clear objectives for a holistic learning framework, with a focus on the “Learning Moment of Apply” and our performance support approaches. The third stage of our maturity model focuses on understanding the impact of learning at the individual level and on using these insights to demonstrate business impact while at the same time improving learning content and approaches.