Evaluation should be an integral part of every phase of the instructional development process, including eLearning in its various forms, yet designers often overlook it or leave it out. Evaluation is important because it is the most reliable tool for continuously improving the quality of a training system.

Properly done, evaluation starts in the planning stage with development of an evaluation plan and continues for the life cycle of the training system. In this article, I propose a method for conducting the training evaluation from start to finish.

Let’s get started!

Evaluation overview

Evaluation provides different kinds of information to those who are directly involved with the project (the participants) and to those who otherwise invest in it, whether through credibility, control, or other capital (the stakeholders).

The evaluation plan should clearly identify the participants and stakeholders.

Methodology

Table 1 summarizes the steps, the process and inputs for each step, and the outcomes.

Table 1. Methodology for conducting a training evaluation

| Step | Process and Inputs | Outcome |
| --- | --- | --- |
| 1 – Plan the evaluation | Users of the evaluation data and analysts identify the evaluation questions. | Evaluation plan |
| 2 – Determine the evaluation design | Identify the appropriate level of evaluation, the population, sampling methods, and the method of data collection. | Evaluation design |
| 3 – Develop the instruments | Develop and validate the data collection instruments. | Data collection instruments |
| 4 – Collect data | Conduct interviews, distribute questionnaires, or collect other data. | Raw data |
| 5 – Analyze the data | Use a statistical package (e.g., SPSS) to analyze the data and interpret the results. | Summary descriptive and statistical data |
| 6 – Report the findings | Write draft and final reports to present the findings and make recommendations. | Evaluation report |

Evaluation step-by-step

1. Plan the Evaluation

Begin by developing a written evaluation plan to:

  • State the objectives of the evaluation,
  • Determine the questions to answer,
  • Select the information you will collect to answer these questions, and
  • Set a timeline for when the collection of information will begin and end.

This plan can guide you through each step of the evaluation process because it details the practices and procedures for successfully conducting your evaluation.
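
If it helps to keep the plan machine-readable alongside other project artifacts, the four elements above map naturally onto a small data structure. Here is a minimal Python sketch; the class name, fields, and sample program details are illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class EvaluationPlan:
    """The four elements of a written evaluation plan."""
    objectives: list[str]           # what the evaluation should accomplish
    questions: list[str]            # the questions the evaluation must answer
    information_needed: list[str]   # data to collect to answer the questions
    collection_start: date          # when data collection begins
    collection_end: date            # when data collection ends

# Illustrative entries only -- the program and dates are hypothetical.
plan = EvaluationPlan(
    objectives=["Determine whether the safety course changes on-the-job behavior"],
    questions=["Do trainees follow lockout/tagout procedures after training?"],
    information_needed=["Supervisor observation checklists", "Post-training quiz scores"],
    collection_start=date(2024, 1, 15),
    collection_end=date(2024, 6, 30),
)
print(plan.questions[0])
```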

2. Determine the Evaluation Design

In this step, provide detailed descriptions of what you plan to do, how you plan to do it, and whom you want to reach. Use this information to answer evaluation questions pertaining to your objectives, such as: Were the objectives attained? If not, why not?

For each objective, the evaluation plan must describe the following:

  • Types of information needed – The program objectives guide the types of information you want to assess.
  • Sources of information – From whom, what, and where will you obtain your information?
  • Methods for collecting information, such as questionnaires and procedures – To the extent possible, you should integrate the collection of this information into ongoing program operations. For example, in training programs, you can use participants' registration forms and their initial assessments to collect evaluation-related information as well as information relevant to conducting the training.
  • Time frame for collecting information – Although you will have already specified a general time frame for the evaluation, you will need to specify one for collecting data relevant to each implementation objective. Again, the objective being assessed will guide the times for data collection.
  • Methods for analyzing information – This section of your evaluation plan describes the practices and procedures to use in analyzing the evaluation information. For assessing program implementation, the analyses will be primarily descriptive and may involve tabulating frequencies (of services and participant characteristics) and classifying narrative information into meaningful categories, such as types of barriers encountered, strategies for overcoming barriers, and types of facilitating factors.
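
For the descriptive analyses just described (tabulating frequencies and classifying narrative information into categories), a short script is often sufficient. The sketch below assumes a hypothetical list of free-text responses and a hand-built keyword map; the categories and keywords are illustrative, and in practice they would come from a proper coding pass.

```python
from collections import Counter

# Hypothetical narrative responses from participant questionnaires
responses = [
    "Not enough time away from the floor to attend sessions",
    "My supervisor supported the training",
    "Scheduling conflicts made attendance difficult",
]

# Hand-built categories; real keyword lists would come from a coding pass
categories = {
    "barrier: scheduling": ["time", "schedul", "conflict"],
    "facilitator: supervisor support": ["supervisor", "support"],
}

counts = Counter()
for text in responses:
    lowered = text.lower()
    for label, keywords in categories.items():
        if any(k in lowered for k in keywords):
            counts[label] += 1

for label, n in counts.most_common():
    print(f"{label}: {n}")
```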

3. Develop the Data Collection Instruments

Use the methods specified in your evaluation design to develop data collection instruments, which may include the following (a validation sketch follows the list):

  • Multiple-choice items
  • Matching items
  • Short-answer items
  • Essay questions
  • Paper-and-pencil questions
  • Job performance measures
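
If you pilot a multiple-choice instrument before full use, you can check items quantitatively as part of validation. The sketch below computes a classical item-difficulty index (the proportion of test-takers answering each item correctly) from hypothetical pilot data; the cutoffs shown are a common rule of thumb, not a standard.

```python
# Hypothetical pilot results: each row is one test-taker,
# each column is 1 (correct) or 0 (incorrect) for one item.
pilot = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 1],
    [1, 1, 1, 1],
]

n_takers = len(pilot)
for item in range(len(pilot[0])):
    difficulty = sum(row[item] for row in pilot) / n_takers
    # Flag items almost everyone passes or fails; these cutoffs are a
    # common rule of thumb, not a fixed standard.
    flag = " <- review" if difficulty < 0.2 or difficulty > 0.9 else ""
    print(f"Item {item + 1}: difficulty index = {difficulty:.2f}{flag}")
```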

4. Collect the Data

Use the methods specified in your evaluation design to collect and organize your data. The data collection methods may include the following (a loading sketch follows the list):

  • Focus groups
  • One-to-one or small-group interviews
  • Written questionnaires
  • Data from learners
  • Data from supervisors
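
However the data arrive, getting them into one consistent tabular form early will simplify the analysis step. A minimal sketch, assuming questionnaire responses exported to a CSV file; the filename and column names (respondent_id, q1, q2) are assumptions for illustration.

```python
import csv

# Assumed export format: one row per respondent.
rows = []
with open("survey_responses.csv", newline="", encoding="utf-8") as f:
    for record in csv.DictReader(f):
        rows.append({
            "respondent_id": record["respondent_id"],
            "q1": int(record["q1"]),   # e.g., a 1-5 Likert rating
            "q2": int(record["q2"]),
        })

print(f"Loaded {len(rows)} responses")
```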

5. Analyze (Interpret) the Data

Once you have collected the data and organized it for either a formative or summative evaluation, analyze it and report on it. Analyze the data to answer the evaluation questions specified in the evaluation plan and to identify training or education deficiencies and any instructional needs. The analysis may surface issues in areas such as the following (see the sketch after this list):
  • Curriculum design
  • Target audience
  • Instructor qualifications
  • Environmental conditions
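
If your team scripts its analysis instead of (or alongside) a package such as SPSS, the descriptive summaries take only a few lines of Python. The scores below are hypothetical.

```python
from collections import Counter
from statistics import mean, median, stdev

# Hypothetical post-test scores (percent correct)
scores = [72, 85, 90, 65, 88, 79, 92, 70, 84, 77]

print(f"n = {len(scores)}")
print(f"mean = {mean(scores):.1f}, median = {median(scores):.1f}, "
      f"sd = {stdev(scores):.1f}")

# Frequency table by 10-point band
bands = Counter((s // 10) * 10 for s in scores)
for band in sorted(bands):
    print(f"{band}-{band + 9}: {bands[band]}")
```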

6. Report the Findings

Produce the Evaluation Report. I suggest a report that consists of the following content sections:

Section 1.0: Executive Summary

Prepare an Executive Summary as the first part of your Evaluation Report. The Executive Summary provides an overview of the evaluation results for the training program and highlights key findings and recommendations so readers can locate and digest the key points quickly.

Executive Summaries may include:

  • Purpose of the program
  • Program activities, location(s), and target audience(s)
  • Overview of findings and outcomes
  • Overview of recommendations

Helpful Hints:

  • Do not include technical details in the Executive Summary
  • The Executive Summary gives the reader the key points of the evaluation without requiring them to read the complete evaluation report

Section 2.0: Introduction

The Introduction is the second part of an Evaluation Report, and it provides a description of the evaluated program. The Introduction may include such details as:

  • The relationship of this program to the organization’s mission and broader organizational efforts.
  • Explanation of how the program originated – This explanation should include a description of this particular program within the total instructional and training program, summary of a literature review, and summary of the Needs Assessment and the political climate.
  • Program overview – This section should focus on the program’s purpose and highlight key program activities. Additionally, the Introduction should describe the program’s target population, when and where activities took place, and why the program was set up the way it was (program design).
  • Goals and objectives – List the program's goals and objectives.
  • Significant program revisions – Use this section to describe any changes to the program’s objectives or activities that occurred prior to or during the evaluation, and provide a rationale for those changes.
  • History of the program’s development, or changes in the program since its initial implementation, including prior accomplishments or gaps that your program seeks to address. This is especially relevant for programs that have been in existence for several years, and for programs that received funding from the same agency in prior years.
  • Comparison of the program evaluated to similar programs sponsored by your organization or by a competitor.

Section 3.0: Evaluation Methodology

Describe in detail the research methods used for your evaluation. The purpose of this section is to explain the design and implementation of your evaluation. A well-prepared Methodology section:

  • Demonstrates that the evaluation and the procedures for collecting data were planned carefully and systematically.
  • Tells readers how you gathered the information presented in the report, which allows them to assess the quality of the data-collection procedures.
  • Provides documentation that program staff can use to repeat procedures if they want to collect comparable data in the future.
  • Documents your methods, providing a framework that staff with similar programs can draw on as they design or improve their own evaluation procedures.
  • Helps readers judge whether the data-collection tools used were appropriate.

Remember: You may have already described your evaluation methods in another document, such as a grant application. If so, you may be able to draw on that text for your final report, editing it to reflect any changes in methods used and challenges faced.

Section 4.0: Findings

Prepare an accurate and concise summary of your results. The purpose of the Findings section is to give the user a coherent account of the evaluation information. The Findings section:

  • Organizes your findings as qualitative, quantitative, and/or mixed-method information that lets your client(s) see how the objectives, and any other specific questions, were addressed.
  • Analyzes your data to detail whether and how well you met your program objectives (research questions).
  • Presents detailed results, such as item analyses or specific patterns.

Example. Here is how you might summarize and analyze data for presentation in the Findings section of a final report:

Objective: “Sixty percent of carpenters completing the training program during program year 2010 will have acquired the skills needed to pass the carpenters’ entrance exam.”
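
Reporting against an objective like this comes down to comparing the observed pass rate with the 60 percent target. A minimal sketch with hypothetical counts; substitute the actual program-year figures.

```python
# Hypothetical program-year 2010 results
completed_training = 48      # carpenters who completed the program
passed_entrance_exam = 31    # of those, how many passed the exam

pass_rate = passed_entrance_exam / completed_training
target = 0.60

print(f"Pass rate: {pass_rate:.0%} (target: {target:.0%})")
print("Objective met" if pass_rate >= target else "Objective not met")
```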

Section 5.0: Conclusion(s)

Interpret your data to determine what the analysis results say about your program. Were the results what you expected? If not, why not?

  • Explain which objectives were met and which objectives were not met.
  • Did unrelated circumstances or events influence this evaluation? If so, how?
  • Provide appropriate information for stakeholders.
  • Determine if different data sources produced different results. If so, why?
  • Discuss any relevant unexpected findings.
  • Compare the current program evaluation results with previous evaluations of this program or other similar programs.
  • Determine if the user needs to know any information other than the specific results determined by this evaluation.

Section 6.0: Recommendation(s)

Prepare detailed suggestions based on your evaluation report's conclusions. Explain the basis for each recommendation. Additionally, explain why each recommendation will, or will not, improve your program.

Conclusion

The purpose of instructional and training evaluation is to provide continuous feedback for improving training. Training improvement should lead to learners achieving higher scores on tests, quizzes, on-the-job assessments, and other evaluation measures. The six-step process in this article provides guidelines for conducting a training evaluation, and the evaluation report template should help you document your program design and results.