Jackie Clifford
Director, Clarity Learning and Development


How L&D can partner with managers to surface meaningful learning data

Digging beneath the figures is the only way to understand the true impact of learning

A few weeks ago I was listening to Andrew Marr interviewing the Chancellor of the Exchequer, Rishi Sunak, ahead of the Budget. One of the areas discussed was the training of HGV drivers through Skills Bootcamps.

Finding the right data

Research from the Department for Education was quoted (see the Department for Education's evaluation of Wave 1 of the Bootcamps): 69 people finished a Bootcamp and achieved the certification; of those, three got jobs. Mr Marr described this as a “less than five per cent success rate” and went on to comment that the Bootcamps would need to be much more effective in the future.

Later that week I listened to the Budget speech and heard Mr Sunak say a couple of things: “Whilst today’s Budget delivers historically high levels of public spending, its success will be measured not by the billions we spend but by the outcomes we achieve and the difference we make to people’s lives.”


“As well as investing in infrastructure and innovation, there is one further part of our plan for growth that is crucial: Providing a world-class education to all our people; higher skills lead to higher regional productivity. And higher productivity leads to higher wages.”

Listening to all of this made me think about how and why we use data within the learning context. Whilst less than five per cent of people in the quoted Bootcamp got jobs, is that really a measure of the success of this intervention? What other learning took place? What impact did the whole Bootcamp experience have on the individuals who participated? 

A stark statistic cannot give us the answers to these questions and can have a potentially detrimental impact on future activities. And the Chancellor himself was talking not only about outputs, but also about outcomes. (I also have a question about cause and effect – do higher skills always lead to higher regional productivity? But that’s a topic for another article!)

Why focus on the numbers?

Definitions of learning analytics tell us that we should be collecting and analysing data about learners and their contexts so that we can understand and optimise learning and the environments in which learning takes place. 

It strikes me that when we look at data as L&D professionals, even in 2021, we still focus on the number of individuals who ‘need’ a specific piece of learning and the number who have completed it. This puts the focus firmly on outputs, with far less attention paid to the outcomes for organisations, teams and individuals.


To what extent do we truly understand the impact of our learning interventions, and how can we deepen this understanding? I wonder whether there is mileage in using our relationships with line managers in our organisations to get beneath the surface of the outcomes we are trying to achieve, and then to measure those achievements.

Here’s what I’m thinking… Line managers are in a prime position to notice and recognise changes that are needed in individual and team performance. Good managers know their team members as individuals and can spot times when they need some extra support or when pressures (inside and outside the working context) are impacting performance.

When managers notice and recognise good performance and areas for improvement, they must be comparing a desired scenario with the actual scenario. This means there must be something to measure, both quantitative and qualitative.

Asking the right questions

As learning professionals, we can work with managers in our organisations to help them uncover and define their expectations. Once this has been achieved, we can collect data that helps to measure those outcomes.

We can use a range of questions which will help managers to clarify exactly what they need, identify any gaps and consider what they will recognise when their desired state has been achieved. Here are some examples of the questions we can ask:

  • Overall, what does good performance look like?
  • Specifically, what do you need to see your team members doing?
  • How would you expect your team members to behave as they perform each of their allocated tasks / activities?
  • At the end of each day (shift, week, month) what tells you that you have achieved maximum productivity?
  • How do you know that your team / team member has not had a good day?
  • What specifically do you notice that tells you that productivity hasn’t been where you needed it to be?
  • What do you see your team members doing which tells you that they aren’t ‘up to scratch’?
  • How do you know that your team members are using all the skills at their disposal?
  • How do you know that your team members are taking the initiative and going beyond the accepted norms?
  • In explicit detail, what distinguishes your best performing team members from your worst performing team members?

The message behind the metrics

A conversation which uses these types of questions might take some time, because managers are not always used to this level of analysis and evaluation. Investing the time will support the development of meaningful metrics that can then be used to assess progress towards the desired outcomes.


Of course, all of this relies on us having regular and impactful dialogue with managers in our organisations. In turn, this means we need to develop positive relationships in which we are viewed as a trusted advisor by all our stakeholders.

As we reflect on all of this, there are so many things to consider, and we shouldn’t take data in isolation. We need to continue to look at the wider picture and always contextualise what we are doing and how we are doing it. 

To conclude for today, my view is that we are still in transition from training order-takers to learning and development partners. Using learning analytics effectively can help this transition to continue and will enable us to optimise the impact that learning has on the success of our organisations.

As always, I’d love to read your thoughts on this. And if you have any powerful questions to add to my list, please do share them.

Interested in this topic? Read Re-evaluating our learning culture in a time of major uncertainty.
