
Becky Norman

TrainingZone

Managing Editor


An interview with an L&D data detective on finding the facts about learning impact


Following on from his session at LEARNING LIVE 2018 on data for learning’s impact, we caught up with Kevin M. Yates, Fact-Finder for Learning & Development, to find out more about his detective work in the world of L&D data...

Kevin, you describe yourself as “a learning and development detective solving the mystery of where, when and how training, learning and professional development made a difference.” Can you provide more insight on your detective work?

I investigate efficiency, effectiveness and outcomes for training, learning and professional development. This involves collecting facts, evidence and data for learning’s impact and searching for clues that reveal learning’s influence on behaviour, performance and business results.

My guiding principle is: ‘Fact-based evidence answers questions about training, learning and professional development.’

In what ways is learning analytics disrupting the L&D space already?

Senior leaders are looking for facts about learning’s impact. Historically, L&D has not been held accountable for results in the same way as other parts of the organisation. Analytics is disrupting L&D by removing excuses for not being able to answer the question, ‘Did training work?’ with facts, evidence and data.

We’ve defaulted to the idea that you can’t measure the impact of learning. Analytics gives us insights we’ve not seen before. Data helps us answer questions and take action in ways we’ve not been able to before.

I’m excited to see the rise of analytics in L&D. We can no longer fall back on the false idea that you can’t measure the impact of training, learning and professional development.

A big challenge that L&D teams commonly face is proving that their learning and development efforts are making an impact. How can data help with this?

Data is fact-based evidence. Rather than rely on what we hope or feel about learning’s impact, we can use data as proof that learning and development efforts are working. But we must do it in a way that is credible, reliable and honest.

You must use data that follows the trail from the results of learning, to the influence of learning on performance, to the influence of performance on achieving organisational goals and, ultimately, business results.


So, if we use data to tell a story about how performance impact comes from learning outcomes, and how performance outcomes drive business results, we can tell a story the organisation will believe.

If the data shows that a business’s L&D strategy is in fact not making a positive impact, what should the next steps be to change this?

My answer is simple, and it may be controversial: if what you’re doing is not working, stop! I’m not afraid when the facts tell me that training didn’t work. I’m afraid of not doing anything with what the facts tell me.

I also don’t want to overlook the value of facts and data in cases where there is no impact and the cause has nothing to do with training at all. Training is not a silver bullet and winning is a team sport.

There are instances where other influences work against people applying what they learned in a way that helps the organisation win, or where training was never the right solution in the first place.

Just as there are facts, evidence and data for the impact of learning, there are also facts, evidence and data that show where other influences are roadblocking training’s efforts.

Would you say professionals in the L&D industry need to better hone their fact-finding and data analysis skills to measure learning impact? If so, how should they go about doing this?

I believe fact-finding and data analysis are areas of specialty and expertise, just like instructional design, facilitation, learning technology, learning operations – the list goes on. So I’d say that not all professionals in the L&D industry need to better hone their fact-finding and data analysis skills.

It’s important for learning professionals to be aware of how facts, evidence and data inform decisions and answer questions about our work... but we don’t all need to be experts in data and analytics.

For those who are interested and curious about L&D data and analytics, or who want to hone their fact-finding and data analysis skills, I say take our own medicine and get some training!


There are organisations, conferences and workshops providing education on methods for measurement, data and analytics for L&D. There are also podcasts and a ton of books on the topic for self-development.

I recommend building relationships with L&D practitioners who are already doing the work of fact-finding and data analytics for L&D. People like me!

Finally, what most excites you about the direction in which L&D is heading today?

I’m excited about seeing L&D become more data-driven in its solutions, strategy and demonstration of value and impact. I’ve been in L&D long enough to see trends come and go, mindsets shift and change, and methods move in and out. I believe using facts, evidence and data is here to stay.

 
