Diagnosing your learning evaluation needs
Kenneth Fee and Dr Alasdair Rutherford are back with another great resource for all your ROI needs.
Have you ever diagnosed your learning evaluation needs? Identification and analysis of needs is something L&D practitioners tend to associate with learning itself, often as the first stage of the systematic training cycle, leading to planning training interventions, design and delivery of training, then evaluation of its effectiveness. As we discussed in our earlier paper, Getting it Right, this is the wrong place for evaluation. But perhaps the solution is to hand.
We think it makes sense to diagnose your learning evaluation needs at the outset of any learning and development work. This focuses you on thinking about the effects and impact of learning and development activities, and ensures your plan for evaluation of business impact is built in from the start.
We recommend a model that considers five dimensions:
- Learning evaluation approach
- Learning evaluation practice
- Stakeholder involvement
- Research methods
- Reporting
Your learning evaluation approach is about clarifying the outcomes you’re interested in and how you are going to investigate whether you achieve them – what judgements and measurements you’re going to make. Many organisations take a one-size-fits-all approach and focus exclusively on the Kirkpatrick model or Return on Investment (ROI) or something else. We believe this is wrong, and your approach needs to be tailored to your specific situation and needs, which may mean adopting different approaches to different kinds of learning initiative. Diagnosing your learning evaluation approach helps you understand what you are (explicitly or implicitly) doing at the moment, and helps you make decisions about where to go in future.
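To make the ROI option above concrete, here is a minimal sketch of the standard ROI calculation, expressed in Python. The figures are invented purely for illustration; in practice the hard part is monetising the benefits, not the arithmetic.

```python
def training_roi(net_benefits: float, costs: float) -> float:
    """Return on Investment for a training programme, as a percentage.

    ROI (%) = (net programme benefits / programme costs) * 100,
    where net benefits = monetised benefits minus programme costs.
    """
    if costs <= 0:
        raise ValueError("programme costs must be positive")
    return (net_benefits / costs) * 100

# Illustrative figures only: a programme costing 20,000 that yields
# 50,000 in monetised benefits has net benefits of 30,000.
roi = training_roi(net_benefits=50_000 - 20_000, costs=20_000)
print(f"ROI: {roi:.0f}%")  # ROI: 150%
```

Note that ROI is calculated on *net* benefits: a programme that merely recovers its own costs has an ROI of 0%, not 100%.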
Your learning evaluation practice is about what you currently do, regardless of your aims, strategy or intentions. Your diagnosis should clarify what activities you currently undertake, such as baseline measures, reaction sheets, follow-up questionnaires or interviews, and impact studies. This helps highlight which techniques you are aware of, and which you are currently implementing, and provides a point of comparison for considering alternatives that may better meet your needs. Most organisations exhaustively measure learner reactions to training (the dreaded “happy sheets”) – and many do little with the information – while few organisations can show evidence of following the link through from training to impact on business results.
Stakeholder involvement is about how you involve everyone in and around your organisation who has an interest in learning and development, and what these people contribute to evaluation. It considers learners, line managers, senior managers, customers and others. Your diagnosis should help you identify who your key stakeholders are, to what extent they need to be involved or informed, and what specific roles they should fulfil. Comparing what actually happens with what you believe should happen helps identify what you need to do to ensure full engagement throughout your organisation, and perhaps beyond.
The research methods you use are important in checking first that you’re using the most effective techniques for collating and analysing quantitative or qualitative information – or both. Many L&D practitioners have little experience or training in robust research methods, and yet this is an essential part of their role. The second critical question is whether you’re using the tools in the right way – are you using informal, structured or semi-structured interviews and what determines that choice? Are you correctly implementing focus groups, sampling or control groups? Your diagnosis should consider the what and the why.
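To illustrate one of the quantitative techniques mentioned above, here is a minimal sketch of a trained-versus-control-group comparison. The scores are invented for illustration, and a real study would also test whether the difference is statistically significant rather than stopping at a comparison of means.

```python
from statistics import mean, stdev

# Invented post-training performance scores, for illustration only.
trained = [78, 85, 72, 90, 81, 77, 88]
control = [70, 74, 68, 79, 72, 71, 75]

# The simplest comparison: the difference in group means, with the
# spread of each group shown alongside for context.
difference = mean(trained) - mean(control)
print(f"Trained mean: {mean(trained):.1f} (sd {stdev(trained):.1f})")
print(f"Control mean: {mean(control):.1f} (sd {stdev(control):.1f})")
print(f"Difference:   {difference:.1f}")
```

The design choice this sketch embodies is the one the diagnosis should surface: a raw mean difference is easy to report, but without a control group there is no difference to compute at all, which is why the "are you correctly implementing control groups?" question matters.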
Finally, there is the question of reporting. The most rigorous evaluation system in the world is worthless if it is not effectively communicated. You need to think ahead about who you are reporting to, the format you’re using, and exactly what you are reporting. Are you going to be clear about the findings – facts and figures – and will your analysis, conclusions and recommendations flow naturally from them? Your diagnosis should compare your current reporting practice with where you want to be, and outline your plan for improvement.
All of this should be fairly straightforward, but it represents the minimum required to evaluate your evaluations, and to think about what you need to do to get better at them.
Charity Director. Consultant in OD, strategic HR and leadership development. Author of five books including Delivering E-Learning (Kogan Page, 2009). Interested in strategy, OD, leadership, technology, evidence and evaluation. Available for writing and speaking.