What’s the Value of Learning?

A new Chartered Institute of Personnel and Development (CIPD) report, based on research by the University of Portsmouth Business School, prompts Donald H Taylor to ask how organisations can – and should – value their learning.


For The Value of Learning: from Return on Investment to Return on Expectation (published on November 26), the CIPD commissioned Valerie Anderson of the University of Portsmouth Business School (author of April’s The value of learning: a new model of value and evaluation) to examine ‘how organisations are currently measuring and reporting on the contribution of learning to strategic value’.

It’s a fascinating account of what organisations really do when assessing how and whether to invest in learning.

The findings of the 56-page report are based on a process using three inputs. The first two are pretty standard – an online discussion thread and an online poll – but the third is fascinating. Replicating an approach established by the American Society for Training and Development (ASTD), the researchers ran semi-structured interviews with two individuals from a spread of 12 organisations: a senior operational manager and a learning, training and development (LTD) executive. That provides a perspective you seldom see: the internal supplier and consumer side by side.

The research was driven by the perception that:
“A strategic model of value and evaluation is required as organisations recognise that developing and sustaining a strategic approach to learning is necessary to enable them to thrive in a knowledge economy.”

In other words, organisations know that learning is important, and want to be able to do it better.

Evidence
It’s a solid report, peppered with real-life evidence and anecdote, which attempts neither to over-simplify nor to over-complicate the issue of valuing training. A theme repeated throughout is that there is no simple, single answer to the question of how to evaluate training. Instead, the report outlines four possible approaches:

  • Learning function efficiency measures (is learning delivered with minimum waste?)

  • Return on investment measures (cost/benefit analysis for specific training interventions – a simple illustration follows this list).

  • Return on expectation measures (focuses on the extent to which the anticipated benefits of learning have been realised).

  • Key performance indicators and benchmark measures (comparison of processes with KPIs and benchmarks of good practice).
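
For anyone unfamiliar with the cost/benefit calculation the second approach refers to, the conventional training ROI formula is shown below. The figures in the example are purely hypothetical and are not drawn from the report; this is an illustration of the arithmetic, nothing more.

\[
\text{ROI}\,(\%) \;=\; \frac{\text{programme benefits} - \text{programme costs}}{\text{programme costs}} \times 100
\]

On these terms, a programme costing £40,000 that is credited with £60,000 of benefits returns 50%. The arithmetic is trivial; as the report’s conclusions make clear, the real question is whether such figures interest senior decision-makers at all.
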
The message from this list is clear: one size certainly does not fit all. But that doesn’t mean the report sits on the fence. Indeed, the full meaning of the subtitle – from Return on Investment to Return on Expectation – becomes clear in the conclusions. Here, the report suggests that whatever techniques are used, it is crucial to follow a three-step process:

1) Establish how well learning is currently aligned to strategic priorities.
2) Identify a range of methods to establish learning’s contribution.
3) Establish relevant evaluation techniques by talking to senior decision-makers.

    So far, so obvious. But the report goes on to identify what these senior decision makers are looking for:

    “Learning processes are particularly valued by key stakeholders when they can be shown to contribute to:

  • The ‘strategic readiness’ of employees

  • The delivery of performance improvement

  • The delivery of cost-effective labour

  • Career/talent-management processes”

This is the meat of the matter.

    This is not some relativistic world where training is fine as long as the manager says it’s good (what David Wilson of Elearnity terms the ‘conspiracy of convenience’). No, there are particular types of measure that are of value to stakeholders, and as long as the learning function can show its effect on these, and be seen to be operating effectively, then it is doing its job.

    Hence Dr Anderson’s stress on moving from rigid ROI studies to showing the effect of learning on these organisational metrics. She concludes:

    “Return on Investment measures, for so long considered to be the ‘holy grail’ for LTD professionals, are of limited interest to senior decision makers.... Return on expectation measures make use of both ‘hard’ and ‘soft’ information and assess the extent to which the anticipated benefits of the learning investment have been realised.”

    Does the report stand up?
    It is a pragmatic look at the real world by someone who doesn’t have a vested interest in pushing a particular line. The real value of the report is two-fold. First, it demonstrates that there is emphatically not a single way to answer the complex question ‘what is learning worth?’

    Second, it suggests that although there may be no single, silver bullet that shows the value of learning, some measures are better than others. And just how should you find out what is important for your organisational stakeholders? Simple. Ask them.

    On this last point, however, I don’t believe the report goes far enough.

    Quite rightly it divides interventions into those meeting ‘short term business plan priorities’ (tactical training) and those meeting the ‘longer-term capability requirements of the organisation’ (strategic learning). The aims of these two types of activity are different (even though the interventions and delivery mechanisms may be the same). But there is a key point to be made – especially with tactical training – that the report avoids.

The L&D department is often the first port of call for managers dealing with performance issues. But just because they’re asking for it doesn’t mean that training is the solution to their problems. Neither does it mean, though, that slamming the door in the line manager’s face is a good idea. Instead, the department should be prepared to take on an additional role: that of the line managers’ performance consultant. This means taking managers through a structured analysis of the issue in hand and helping them decide both where the department can help and where other interventions might be suitable.

    The alternative to this approach is something all too familiar. Managers knock on the door asking for training, but training is not always the answer. Supply it, and the inevitable result is a kick in the teeth a few months later when someone judges that the training was useless, because it didn’t fix the problem.

But then, performance management was never the focus of this report; rather, as the CIPD’s Martyn Sloman notes in his foreword, it is all about measuring and reporting on the value of learning. As he says:

    “This has become a central feature of the management of learning; it should not be solely about the impact of a training initiative, still less about justifying the role of the training department. We must focus on the needs of the organisation and the learners.”

    It’s time to engage with the rest of the organisation – and at senior levels, where strategy is set, because that is where the learning and development function deserves to operate.


    About the author: Donald H Taylor is Alliances Director at InfoBasis, and Chairman of the Learning Technologies and IITT National Trainers conferences. In January he was presented with the Colin Corder Award for outstanding services to IT training. He blogs at learningtechnologiesconference.wordpress.com.

