Any eLearning design and development project begins with a goal—and with effective, measurable learning objectives that will lead learners to achieve the goal. “If you don’t know where you are and where your learners need to be, you can’t figure out how to get them there,” Julie Dirksen of Usable Learning, a Guild Master and consultant, wrote in Design for How People Learn, which has a detailed chapter on writing learning objectives.

It’s not enough for an instructional designer (ID) to know that there’s a performance gap that training is expected to bridge, or that managers want their employees to learn specific information. To write effective learning objectives, the ID also needs to understand why the learners need that skill or knowledge, how it will be applied, and how successful mastery and use of the information will be measured. Five questions that an ID must be able to answer before writing learning objectives are:

  1. What is the problem? A manager requests training. The ID’s first question should be, “Why?” Without understanding what performance gap or problem the manager is trying to resolve or what knowledge learners need and why, the ID cannot even determine whether training would help. In many cases, training or eLearning is not the best way to resolve the problem.
  2. What does success look like? The overall goal of the learning aid is often very broad. Goals like “improve communication skills” or “provide training for new managers” are not only broad, but they also offer no indication of what it would mean to achieve them. Dirksen notes, “Not all journeys are about the destination.” Sometimes “success” is simply enjoying time spent learning. This, too, is important for the ID to know.
  3. What are the consequences of failure? Knowing what would happen if the learners did not master the skill or information helps the ID understand the urgency of the training and the level of competence needed. It also helps IDs figure out what tools and job aids might work in a particular situation. If the consequence of failure is catastrophic—a pilot unable to land an aircraft when the automatic systems fail—then training is needed to ensure that people don’t die. If the consequence of failure is minor—a rarely used form that is filled out incorrectly gets bounced back to the employee for revision—a better solution might be a job aid where the employee can look up instructions for completing the form.
  4. What do the learners already know? Audience analysis is a key element of design. A course that begins by covering information learners already know cold, or by teaching tasks they have performed successfully on the job for years, will not accomplish its goals; learners will be annoyed or offended and tune out the instruction that follows. Alternatively, a course that starts at a level well beyond learners’ knowledge will frustrate them and lead to similar disengagement.
  5. What specific goals feed into the overarching goal or solution? The learning objectives, based on these smaller, more focused goals, can lay out a path from learners’ current level of skill or knowledge to the desired level.

Writing clear, measurable learning objectives

Armed with the answers to the above five questions, IDs can write their learning objectives. Dirksen advises being very specific and using what she calls “doing” words—verbs that represent observable actions. “Understand” is not observable. “Describe,” “define,” and “explain” are better but not great. “This is a hedge,” Dirksen wrote. “Besides, you don’t actually care if they can define it—you want to know if they can do it.” The learning objectives should be explicit and concrete enough to measure. Two questions that Dirksen suggests an ID ask about a learning objective are:

  1. Is this something the learner would actually do in the real world?
  2. Can I tell when they’ve done it?

One guideline that many IDs use when creating learning objectives is Bloom’s Taxonomy, often shown as a continuum from lower-order thinking skills to higher-order skills. Each of its six building blocks suggests a list of verbs that can form the kernel of a measurable learning objective; Bloom’s Digital Taxonomy, shown in Figure 1, adapts those verbs for eLearning. The six blocks are:

  1. Remember: Recognize and recall learned information
  2. Understand: Extrapolate meaning from learned information
  3. Apply: Use learned information in a product, process, presentation, etc.
  4. Analyze: Break concepts into parts and determine how the parts relate to one another or to a structure
  5. Evaluate: Make judgments based on criteria and standards
  6. Create: Put elements together to form a new, coherent pattern or structure

 

Bloom’s Digital Taxonomy associates verbs with the Bloom’s Taxonomy stages, emphasizing actions that are appropriate in an eLearning environment.

Figure 1: Bloom’s Digital Taxonomy infographic by Lee Watanabe-Crockett, from the Global Digital Citizen Foundation 

Functions of learning objectives

Not all learning objectives are useful in the same way or intended for the same audience. In 2006, Will Thalheimer, president of Work-Learning Research, published a “New Taxonomy for Learning Objectives,” which delineates four types of learning objectives, each with a specific function. These are:

  1. Focusing objective: Guide learners’ attention to the most important aspects of the learning material
  2. Performance objective: Provide learners with a quick understanding of the competencies covered in the learning material
  3. Instructional design objective: Guide the design and development of learning and instruction
  4. Instructional evaluation objective: Guide the evaluation of instruction

The first two are learner-focused; they are generally presented to learners at the beginning of a course of instruction. They distinguish between what learners should pay attention to (focusing) and what they’ll ultimately need to do with the new knowledge or skill (performance). The latter two are not aimed at the learners: They guide IDs and others in designing, developing, and evaluating the eLearning.

It’s customary to tell learners what the focusing and performance objectives are; often, eLearning opens with a screen listing these learning objectives. Dirksen points out that a list is far from the only—or best—way to do so, and suggests presenting learners with a challenge or a mission, or introducing a scenario, instead. Thalheimer points to research finding that “pre-questions” are at least as powerful as learning objectives in directing learners’ attention to the most important material.

Don’t forget about the gaps

After writing the learning objectives, Dirksen suggests revisiting the question of “gaps” or reasons that employees are not currently meeting the stated objectives. Gaps might occur due to missing knowledge or skills; these are easily filled by instructional materials. But instruction alone cannot fill gaps in motivation, gaps created by habit or environmental factors, or those that result from poor communication. Teaching people to perform a skill in a software package that they don’t have, or instructing them to do something that violates accepted practice or cultural norms, is not likely to bridge the gap. Determining whether learning objectives and identified gaps align serves as a final check on whether training—or some other job aid, tool, or strategy—will effectively solve the business problem.

With a set of measurable learning objectives in hand, an ID can progress to the next stages of designing and developing eLearning, such as determining the instructional approach and storyboarding or wireframing the course or tools. “Each Phase of ADDIE Encompasses Core Tasks for IDs” offers one model for progressing through the design and development stages.
