Not many people really like writing assessments of any kind, whether the end product is a test, quiz, or complex certification examination. In fact, I have always thought of learning assessment as one of the toughest challenges facing learning and development (L&D) practitioners.
Today’s L&D practitioners are typically not statisticians, assessment specialists, standardized test writers, or learning psychologists. Their responsibilities tend to be broader and deeper than that: they must prepare their learners for the realities of working in a technology-mediated, information-rich, and increasingly collaborative workplace.
The need for practical assessment tools and resources
Here’s the point: We need to provide assessment-writing tools and resources for members of the learning and development profession who are not assessment specialists but still need specialized guidance in this increasingly important skill area. Providing those practical tools, templates, professional perspectives, and resources for creating today’s learning assessments is the goal of our latest eBook, Writing Assessments to Validate the Impact of Learning.
Edited by experienced learning practitioner and assessment expert A.D. Detrick, our eBook begins with Jane Bozarth’s insightful introduction and then presents current perspectives from several industry thought leaders, including Mike Dickinson and Marc Rosenberg. We also provide usable guidelines, downloadable templates, assessment websites, and other practical resources for all aspects of hands-on learning assessment. These include in-depth references, an annotated bibliography of the Guild’s assessment resources for further reading, and a glossary of terms for those new to the specialty field of learning assessment.
Selecting the best type of assessment item
Table 1 shows an example of these detailed guidelines. Of the six question types described below, multiple choice is often the preferred item type for cognitive tests, because multiple-choice items can assess most of Bloom’s cognitive categories and can be scored quickly and reliably by a person or a machine. Bloom’s Taxonomy and the newer “Digital Taxonomy” are critically important assessment tools; the eBook provides detailed information about the cognitive categories within the taxonomy, as well as additional templates and web resources.
Table 1. Advantages and disadvantages of common question types

Question Type | Advantages | Disadvantages
True/False questions | Quick to write and easy to score objectively | Learners have a 50 percent chance of guessing correctly; limited to recognition-level knowledge
Matching questions | Efficient for testing associations among related facts | Largely limited to factual recall
Multiple choice questions | Can assess most of Bloom’s cognitive categories; quickly and reliably scored by a person or a machine | Writing plausible distractors is difficult and time-consuming
Fill-in-the-blank questions | Reduces guessing by requiring recall rather than recognition | Acceptable answers can be ambiguous, which complicates scoring
Short answer questions | Lets learners demonstrate understanding in their own words | Scoring is slower and more subjective
Essay questions | Can assess higher-order thinking, such as analysis and synthesis | Time-consuming to answer and to score; scoring is subjective

Source: The eLearning Guild Research, 2016.
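One reason multiple-choice items dominate machine-scored tests is that grading them is purely mechanical: each response is simply compared against the keyed answer. The short Python sketch below illustrates that point. It is not from the eBook; the class, function names, and sample items are illustrative assumptions.

```python
# Minimal sketch of machine-scoring a multiple-choice quiz.
# Items and field names are illustrative, not from the eBook.

from dataclasses import dataclass


@dataclass
class MultipleChoiceItem:
    stem: str            # the question text
    options: list[str]   # answer choices, including distractors
    correct_index: int   # index of the keyed (correct) option


def score_quiz(items: list[MultipleChoiceItem], responses: list[int]) -> float:
    """Return the fraction of items answered correctly.

    `responses` holds one chosen option index per item. Scoring is a
    mechanical comparison against the key, which is why multiple-choice
    items can be graded quickly and consistently by a machine.
    """
    correct = sum(
        1 for item, chosen in zip(items, responses)
        if chosen == item.correct_index
    )
    return correct / len(items)


if __name__ == "__main__":
    quiz = [
        MultipleChoiceItem(
            stem="Which taxonomy classifies cognitive learning objectives?",
            options=["Bloom's Taxonomy", "Moore's Law", "Fitts's Law"],
            correct_index=0,
        ),
        MultipleChoiceItem(
            stem="Which item type gives a 50 percent guessing probability?",
            options=["Essay", "True/False", "Matching"],
            correct_index=1,
        ),
    ]
    print(score_quiz(quiz, [0, 1]))  # 1.0 -- both items answered correctly
```

Essay and short-answer items, by contrast, require human judgment to score, which is exactly what makes them slower and more subjective, as Table 1 notes.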
Best practices for writing assessment items
Writing individual assessment items, whatever type you choose, is often the most daunting part of the process. Table 2 presents an abbreviated list of the best practices included in the eBook. These practices apply to all questions regardless of type and help ensure that your questions are as effective as possible.
Table 2. Best practices for writing assessment items

Best Practice | Question Guidelines
Write effective questions | Use clear, concise language; test a single idea per question; avoid negative phrasing, trick questions, and unintentional clues to the answer; make every answer option plausible
Review questions | Have subject matter experts verify accuracy and have peers check for clarity and bias; pilot test questions with representative learners and revise items that perform poorly

Source: The eLearning Guild Research, 2016.
Looking to the future of learning assessment
In addition to these practical resources, we thought it important to look at the “future of assessment”: its practices and issues. Here is a brief sampling of what our contributors (myself included) said. As you will see, we often paint a less-than-enthusiastic picture of where assessment is going.
“I’ve long warned of my concerns about so much ‘evaluation by autopsy’: our tendency to assess at the end of training or another learning intervention. We offer a smile sheet at the end of the day. We offer a multiple-choice quiz at the end of a module. The learning management system spits out reports about number of completions and average time to complete and average quiz scores … [But] none of that really tells us much about how well a learner can apply new learning, nor do we find out much about how to fix a course that isn’t working well. The future is bringing new tools to help us assess—and respond—as needs emerge or conditions evolve. There will be a quantum shift in the needs assessment phase—the beginning, not the end—of our work.” (Jane Bozarth)
“Probably the biggest irony of discussing the future of learning assessment is that the future looks so much like the past. Rather, the future looks very much like the past we should’ve been implementing all along. Indeed, technology has provided us with new and exciting ways to deliver assessments and capture assessment data, but the future remains relatively unchanged. We need to create valid, reliable assessments that provide some certainty that the knowledge was a product of learning—and not guessing or prior knowledge. Then we need to use these measurements for the performance management of our learners as well as our own internal learning and development efforts.” (A.D. Detrick)
“My hope for the future is that instructors and trainers will view assessments as an integral part of the learning process, not an isolated side step, especially not one that focuses on lower levels of learning, such as terminology and taxonomies. I think that, too often, writers of paper-and-pencil tests (whether given on actual paper or on a computer screen) are lured by those very media into writing simple multiple-choice and true/false assessments, when more authentic assessment is within reach… So for the future of assessments, my hope is that we will continually challenge ourselves to go beyond the lower levels of learning when we write assessments, and that we will not be restrained by the media at hand (e.g., multiple-choice questions) from finding imaginative, more authentic ways to measure students’ learning.” (Mike Dickinson)
“If we want learning assessment to have a future, here are three suggestions. First, take some money away from your instructional design budget and build your organization’s evaluation expertise. You may produce fewer courses, but what you do build might have a shot at actually demonstrating real performance improvement. Second, move compliance training away from measures of attendance and completion to better measures of actual performance (a hard slog, I know). And finally, put your clients and customers in charge of evaluation by letting them tell you what constitutes success and then, together, you measure it … or not, and see what happens.” (Marc Rosenberg)
“We all hear about the need for assessments to produce better and more precise data. The most important use of these assessment data (as we’re told) is to isolate the precise impact of training interventions on business outcomes. This is an increasingly tall order for most learning organizations… [Instead] we need to focus our future assessment efforts on gleaning actionable and practical assessment data rather than producing an ever-growing morass of precise data points that try to connect a single training event to a change in business metrics, such as sales or costs. At the end of the day, by focusing on actionable assessment data, we’ll save ourselves from an exhausting waste of effort and resources because—in most cases—precise assessment data aren’t required to produce practical, feasible, repeatable, and actionable results.” (Sharon Vipond)
Finding the pathway to better and more effective learning assessment
To conclude, I believe that our contributing editor, A.D. Detrick, has said it best: “For far too long, we have relegated assessments to an afterthought in the design of courses. Every instructional model that includes assessments will start with the assessment and build the course design from that, but I have seen very few courses designed that way. Instead, assessments are assembled at the last minute, and their design is primarily informed by an insufficient allotment of time. The job of writing the questions is often left to subject matter experts who have knowledge of the content, but (usually) no experience writing assessment questions. Worst of all, the analysis of the results is often completely absent. This is a common dysfunctional cycle, in which it feels useless to measure something that was poorly designed, and hard to properly design something that won’t be effectively measured… The purpose of this eBook is to compile enough information to provide a pathway that would allow anyone to write [effective] assessment questions, regardless of experience or role.”
To emphasize what A.D. Detrick and all of our contributors are saying: applied properly, these guidelines and resources can help you find a better “pathway” to learning assessment. These immediate, practical steps help you avoid the pitfalls that compromise so many of today’s learning assessments and break the dysfunctional cycle of bad measurement and bad assessment design.