Performance-based surveys: how to collect clues that forecast the impact of training


In a three-part series, L&D data detective Kevin M. Yates explores how to collect facts, evidence and data for learning’s impact on employee performance and organisational goals. Part three shows you how to collect facts and clues that forecast and predict the impact of training.

Just like a detective looks for clues that solve mysteries, you can use clues from a performance-based post-training survey to forecast performance impact. More specifically, you can estimate behaviour and performance change as a result of training.

The clues you get from a performance-based post-training survey will help you to answer questions about the effectiveness of training, such as:

  • How much of what people learned will they use on the job?

  • Are there barriers that prevent people from using what they learned?

  • Is the connection between performance and achieving goals clear?

Training fulfils its highest purpose when it impacts performance. Unlike traditional ‘smiley sheets’, which ask whether attendees liked, for example, the food, the instructor and the facilities, performance-based surveys measure training’s influence on performance and the effectiveness of instructional design.

These five questions will help you gather clues and tell a story about expected performance outcomes as a result of training.

How to forecast the amount of performance change

Question 1: How much will your performance improve because of this training?

The answers from which to choose for this question are: Very Little (less than 20%), Some (30% to 40%), Moderate Amount (50% to 60%), Large Amount (70% to 80%) and Substantial Amount (90% or more). The percentages will help to quantify the expected change. Here’s an example of what the results would look like using a horizontal bar chart series:

Question 1: How much will your performance improve because of this training?

This question provides a clue to the depth of change in performance as a result of training. Moderate, large and substantial amounts of performance change forecast measurable results. Some to very little change suggests that the programme will have minimal performance impact.
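One simple way to turn these answer bands into a single forecast figure is to score each response at the midpoint of its percentage band and average the scores. Here is a minimal sketch, assuming hypothetical responses and assumed midpoint values; it is an illustration, not part of the survey tool itself:

```python
# Midpoint of each Question 1 answer band (assumed values for illustration).
MIDPOINTS = {
    "Very Little": 10,         # less than 20%
    "Some": 35,                # 30% to 40%
    "Moderate Amount": 55,     # 50% to 60%
    "Large Amount": 75,        # 70% to 80%
    "Substantial Amount": 95,  # 90% or more
}

# Hypothetical responses from five participants.
responses = ["Some", "Moderate Amount", "Large Amount",
             "Moderate Amount", "Substantial Amount"]

# Average the band midpoints to get one forecast figure.
forecast = sum(MIDPOINTS[r] for r in responses) / len(responses)
print(f"Forecast performance improvement: {forecast:.0f}%")
```

A forecast in the moderate-to-large range points towards measurable results; a figure near the bottom of the scale is a signal to investigate further.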

What to investigate:

  • Does the training include examples of the outcomes expected when participants apply what they learned?

  • Do participants see the expected outcomes as achievable or unrealistic?

  • Are there factors beyond training that prevent participants from using what they learned?

How to forecast performance impact on goals

Question 2: I see the connection between using what I learned and achieving the organisational goal(s) specified in this training.

The answers from which to choose for this question are Strongly Disagree, Disagree, Not Sure, Agree and Strongly Agree. Here’s an example of what the results would look like using a single horizontal bar chart:

Question 2: I see the connection between using what I learned and achieving the organisational goal(s) specified in this training.


This question measures participants’ clarity about the connections between using what was learned, their performance and achieving goals. The results for strongly disagree and disagree are combined. So are the results for agree and strongly agree.

You can also ask the same question and use Yes, No and Not Sure as the answer choices. Here’s an example of what the results would look like with a pie chart.

Question 2 (pie chart): I see the connection between using what I learned and achieving the organisational goal(s) specified in this training.

This question is a clue for predicting the impact of performance on goals. High levels of agreement forecast that participants are committed to using their performance to achieve goals. High levels of uncertainty or disagreement suggest a lack of clarity about goals and/or doubt about one’s own contribution to achieving them.
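Tallying the agreement scale, and combining the two disagree and two agree choices as described above, can be sketched like this (the responses here are hypothetical):

```python
from collections import Counter

# Hypothetical Question 2 responses on the five-point agreement scale.
responses = ["Agree", "Strongly Agree", "Not Sure", "Disagree",
             "Agree", "Strongly Agree", "Agree", "Strongly Disagree"]

counts = Counter(responses)
total = len(responses)

# Combine the paired choices, as in the bar chart example.
agreement = counts["Agree"] + counts["Strongly Agree"]
disagreement = counts["Disagree"] + counts["Strongly Disagree"]
not_sure = counts["Not Sure"]

print(f"Agreement:    {agreement}/{total}")
print(f"Disagreement: {disagreement}/{total}")
print(f"Not sure:     {not_sure}/{total}")
```

The same three combined counts also feed the pie chart version of the question directly.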

What to investigate:

  • Are the goals to which the training programme is aligned clearly communicated?

  • Are there examples of performance outcomes?

  • Does the training illustrate the impact on goals when performance expectations are met or not met?

How to forecast the frequency of using what was learned

Question 3: How often will you use what you learned?

The answers from which to choose for this question are Rarely (less than 20%), Sometimes (30% to 40%), Frequently (50% to 60%), Often (70% to 80%), or Very Often (90% or more). As before, the percentages help with quantifying the results.

Here’s an example of what results would look like using a vertical bar chart:

Question 3: How often will you use what you learned?

The answers to this question provide a clue to training’s influence on regular use of the skills and behaviours that impact performance. Higher frequency forecasts consistency in using the skills and behaviours that achieve goals, while lower frequency predicts a potential barrier to achieving goals.

What to investigate:

  • Is there a clear message that frequent use of what was learned impacts performance?

  • Are the connections between using what was learned and achieving goals illustrated?

  • Does training show expected outcomes for consistently using what was learned?

How to forecast the amount of learning that will be used

Question 4: How much of what you learned will you use?

The answers to choose from for this question are percentage amounts from 0% to 100% in increments of 10, i.e. 0%, 10%, 20%, 30% and so on. Here’s what results would look like using a table:

Question 4: How much of what you learned will you use?

Option average = 51.11%

In the table, Option = answer choice, e.g. 0%, 10%, 20%. N Count = number of people who selected the option. Percent = percent of people who selected the option.

Option Average = (30% + 30% + 30% + 40% + 50% + 50% + 70% + 70% + 90%) divided by 9, the total number of people.
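The same calculation in code, using the nine hypothetical responses from the example above:

```python
# The nine responses from the Question 4 example, as percentages.
responses = [30, 30, 30, 40, 50, 50, 70, 70, 90]

# Option average = sum of responses divided by the number of people.
option_average = sum(responses) / len(responses)
print(f"Option average = {option_average:.2f}%")  # 51.11%
```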

This question is a clue to how much learning will be used on the job and forecasts the depth of impact.

Higher amounts of applied learning forecast measurable performance improvement, while lower amounts predict less likelihood of achieving goals.

What to investigate:

  • Is there an appropriate volume of content in the training programme, neither too much nor too little?

  • Does training include knowledge checks for comprehension?

  • Are expectations clear for how much learning to use to meet performance expectations and achieve goals?

How to forecast low performance impact

Question 5: Is there anything about this training that will not improve your performance?

This is an open-ended question and collects clues about where change in performance is least likely. It helps with identifying where instruction and experience can be changed to have higher impact on performance outcomes. It also forecasts where you will not see a change.  

What to investigate:

  • Are there critical skills and behaviours for achieving goals that were less relevant?

  • Is the content aligned with goals and performance expectations?

  • Is there a better way to teach a skill or behaviour?

Are you forecasting results and gaining insight?

Are your training programmes impacting performance? Are you forecasting results and gaining insights about the effectiveness of instructional design for influencing performance outcomes?

The clues you collect from a performance-based post-training survey will help you forecast and predict the impact of training on performance and inform decisions for instructional design.

Becoming an L&D data detective

In this three-part series, we've shown you how to use facts, evidence and data to answer the question, “Did training work?”.

We’ve explored how to collect facts about impact on performance and goals before training starts, examined how to forecast the effort of collecting facts, and concluded with clues and data that forecast performance impact.

My hope is that you are on your way to becoming an L&D detective!


About Kevin M. Yates


Kevin is a Learning & Development detective and just like Sherlock Holmes, he solves mysteries. The mystery he solves is, “Did training work?”. He uses facts, evidence and data to show training and learning’s impact on behavior, performance and goals.

His work is global and multi-industry. He’s served in a variety of roles across training, learning and talent development which guides and informs his perspective and actions. Kevin’s guiding principle is, “Find one thing about a person’s behavior or performance you can attribute to training or learning and let that lead to the facts about impact.”  
