Brought to you by TrainingZone

Four not-so-new learning innovations that still matter today

It’s hard to promote ‘emerging’ tools and technologies as genuinely unique and transformative year on year. There are, however, a handful of more mature innovations that are really coming into their own today.

15th Jan 2020
[Image: old typewriter and new laptop. Rouzes/iStock]

Whenever I’m asked to write about learning innovations it is tempting to create a sneery piece which effectively says: “Seen it all before.” One of the challenges of being quite so old is that, very often, I have seen in previous incarnations what is now being promoted as new, different and innovative.

But I’m determined to add a sober reflection on some innovations in L&D which – while not brand-spanking new – at least may have found their time.

1. Competence badges

I was never a big fan of digital badges. They were somewhat simplistic – I could never imagine someone in a senior role thinking ‘I must complete that piece of e-learning otherwise I won’t earn my badge’. Often, badges were awarded for completing something – not doing something.

Gaining a badge for finishing an online module or attending a course is just another certificate of attendance. I think we all recognise that these are pretty worthless.

But here’s another thought. How many of us join some august institute because we get letters after our name? How many of us attach those letters to our email signature or LinkedIn profile? If you are allowed to put CIPD, LPI or similar after your name, does it appear on your CV? Of course it does.

So, here’s a challenge. Badges could mean something but only if they are linked to competence. They should be awarded only when something has been done – and done well, to a standard previously agreed. This takes some of the best of Continuing Professional Development (CPD) and makes achievement visible and practical.

Badges should be focused on what people do, not which online modules they have completed or courses attended. Workplace evidence, validated by managers and – where required – trained assessors, is the only valid way of showing professional development. In a digital world, it makes sense to record and disseminate these in digital format.
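To make the ‘digital format’ point concrete: the Open Badges standard already supports exactly this, via an evidence field that can point at validated workplace activity rather than mere completion. Below is a minimal sketch of an Open Badges 2.0 assertion in Python – every URL, name and identifier is an invented placeholder, not a real issuer:

```python
import json

# A minimal, hypothetical Open Badges 2.0 assertion. All URLs and
# identities below are invented placeholders. The key field for
# competence-based badges is "evidence": it links the badge to
# something done to a standard, not to a completed module.
assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://badges.example.org/assertions/1234",            # hypothetical
    "recipient": {"type": "email", "hashed": False,
                  "identity": "learner@example.org"},              # hypothetical
    "badge": "https://badges.example.org/classes/coaching-level2", # BadgeClass URL
    "issuedOn": "2020-01-15T00:00:00Z",
    "verification": {"type": "hosted"},
    # Workplace evidence, validated by a manager or trained assessor:
    "evidence": [{
        "id": "https://portfolio.example.org/observations/987",    # hypothetical
        "narrative": "Observed coaching session assessed against the agreed standard.",
    }],
}

print(json.dumps(assertion, indent=2))
```

The evidence entry is what separates a competence badge from a digital certificate of attendance: anyone inspecting the badge can follow it back to what was actually done, and to what standard.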

A shout out to Rob Stewart at the Scottish Social Services Council for his input to the World of Learning Conference – a worthy winner of Learning Professional of the Year at the 2019 TJ Awards.

Some of you will not think this is particularly innovative. You’d be right. But there are a couple of reasons for including it here:

  • We must recognise and engage with non-traditional learners and those who are otherwise excluded – such as part-timers or those in the gig economy who, as quasi-contractors, rarely get corporate support to improve their capability. This focus on what people can do, rather than which courses they have been able to attend, levels the playing field a little.

  • With recent political changes, expect job security to be somewhat up in the air in the coming months. Being able to advertise your ability to do things may become increasingly valuable.

2. Learning experiences that properly facilitate transfer to the workplace

On a slightly similar note, it is impossible to ignore the argument that capability and performance are the proper business of those of us working in learning and development. However, when I look at some of the trumpeted innovations in learning experience platforms or learning experience design, it quickly becomes apparent that what is being discussed and eulogised is just another way of providing yet more content and monitoring access.

We are still in the learning management system mentality – churning out content and then making sure people have looked at it (and got 80% in the quiz at the end!).

But there seems to be increased focus on how we ensure learning interventions make a difference to what people do. Encouragingly, lessons are not considered to have been learned until someone does something differently in their work. 

In some environments, the focus is on learning in the workflow or entirely informal approaches to learning – coupled with new resources to design learning out of work.

While this might work in certain environments and with certain groups of people, there is little significant evidence to support a view that learning interventions are never required. For sure, it doesn’t always have to be a course or a suite of online modules, but sometimes we do have to show people what good looks like in our organisations.

There are standards to be achieved and we can’t blame people for not meeting them if we haven’t given them a chance to know what they are, understand not only why they’re important but what needs to change, and support them as they work towards meeting them.

In my world, working on interpersonal skills, these standards have been established through in-depth research over many years. Some behaviours that have been observed to work are counter-intuitive. A lot of the time my colleagues and I are myth-busting – explaining why what people believe to be the right way of doing something, isn’t.

This seems to me to require a process of education and a chance to repeatedly practise new ways of doing things with high quality feedback.

This isn’t going to happen only through an informal discussion between peers. Sometimes we should acknowledge that the blind leading the seriously short-sighted is not the best route to capability improvement. 

But what of those learning experiences? With technology platforms enabling collaboration and sharing of experiences, it seems we can provide assignments and tasks for people that demonstrate what they have done differently. We can make achievements visible.

Sharing experiences as part of a learning process is not a nice-to-do, it is essential. Why should it end at the classroom door? How do we hold people to account for preparing implementation plans (which are actually actionable – not bland wish lists)?

Some platforms, in the hands of those who know what people can and should be able to do in the workplace, are starting to provide this transparency and rigour. They follow programmes of learning through into workplace activity.

This is being achieved through collaboration areas – think MOOCs or social media with a clear focus – where tasks and assignments focus on participants being accountable for carrying through on action plans. 

Workplace activity is documented and the outcomes shared. This requires a major focus on organisational culture. If this is seen to be at odds with ‘the way we do things around here’ it will be an uphill struggle which swallows resources for little effect.

3. Learning culture

It was impossible to attend an L&D conference in 2019 without talking about learning culture. What this meant was not always clear. Some folks presented a terrific learning strategy but renamed it ‘culture’ because learning strategies have somehow fallen out of favour. (We’re dead good at re-branding in L&D – new names for the same old stuff.)

Here’s my take on a learning culture:

  • People know what good looks like – and, where standards are absent, the environment recognises and encourages them to take risks and try new things. Licence to challenge is fine so long as experiments are in tune with why the organisation exists.

  • Managers wholeheartedly support this definition of performance and recognise that it is their role to help people achieve the standards – and go beyond them. They are rewarded and recognised for supporting performance improvement.

  • L&D resources are built around what people need to do. How people access these resources is not ‘one size fits all’. Resources, content and courses should be kept to a minimum and where possible sculpted around me and my circumstances. (To be clear, this does not mean psycho-babble like learning styles!)

  • Managers coach. By which I mean they recognise much of the learning is on the job and this requires support, feedback and an acceptance that sometimes people won’t get everything right first time. Success is celebrated. Lessons learned are shared – especially if they can prevent someone else repeating your mistakes. Managers know to focus on how jobs are done, not simply what the results are. Only by understanding, in detail, how we achieved a result can we learn from it.

  • People are given time to learn. This does not mean just time to attend courses or complete modules – though that would be nice. Doing things differently and doing different things takes time. Proper workplace learning sometimes runs contrary to achieving performance goals and targets. People who are working to improve need to be cut a little slack, not constantly confronted with their current inability.

  • Recruitment is focused on potential. If your job descriptions include a statement such as ‘must have carried out a similar role in a similar organisation for x number of years’ you do not have a learning culture.

Again, is this new? Is it innovative? Given that I was doing some of this stuff in the 1980s it would be hard to make a case for this being a learning innovation. What’s new, and perhaps changing, is that finally there are people beginning to talk about this.

Maybe we have learned – the hard way – that launching a shiny new platform or some cool new online modules complete with user generated videos doesn’t change a thing unless there is an environment in which this technology is enabled to make a difference.

4. Data analysis

Notice I don’t mention ‘learning analytics’. Occasionally, this is simply an Amazon-style ranking of the popularity of different modules. There is no relationship between popularity and impact on performance.

We can all agree we have lots of data. It’s more accessible than ever. The skills of data analysis have become ever more important. I see many reports which cite data but which draw frankly bizarre, misguided or downright dishonest conclusions. 

Recently JISC – the agency that supports universities and colleges in deploying and innovating with technology – released a report. It showed a correlation between 16-year-olds accessing the virtual learning environment (VLE) on their college course and success in their qualifications two years later. In short, students who logged in within the first three weeks of starting the course did better.

When talking to a group about this, I was somewhat taken aback when many believed that this related to the quality of the content included in the VLE within the first three weeks. 

This ignored the other factors at work. These were the students who could get online. They had high speed broadband at home and were prepared to log in once they’d been told they should. These are the young people who are a) connected and b) biddable.

The college in question used this data not to trumpet the brilliance of their VLE, but to act as an early warning system – to find out why these young people hadn’t logged on and to take action to help.

Often it was about IT capability or technology access. For example, care leavers and others don’t get access to high speed broadband and don’t feel comfortable going into a library. Many, shamefully, stayed away because they expected to have to pay, never having experienced how libraries used to function on every high street.

The skill for the L&D person is not in generating the data, or even being able to extract the important information. The skill is in analysing that data, seeing patterns and drawing robust conclusions. It’s easy to gather data in support of your case – especially if you’re not that fussed about ignoring facts that may detract from your arguments. The real skill in a data rich world is being able to make sense of what we have. 
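The JISC example above shows why this analysis skill matters. The sketch below uses entirely invented numbers (not the JISC data) to illustrate how a raw correlation – early VLE login predicting success – can be manufactured by a confounder such as home broadband access, and how stratifying the data exposes it:

```python
# Hypothetical illustration with invented numbers (not the JISC data):
# early VLE login *appears* to predict success, but home broadband
# access drives both, so the raw comparison is confounded.

# (has_broadband, logged_in_early) -> (n_students, n_passed)
cohort = {
    (True,  True):  (720, 540),   # 75% pass
    (True,  False): ( 80,  60),   # 75% pass
    (False, True):  ( 40,  18),   # 45% pass
    (False, False): (160,  72),   # 45% pass
}

def pass_rate(rows):
    """Pooled pass rate over a list of (n_students, n_passed) tuples."""
    total = sum(n for n, _ in rows)
    passed = sum(p for _, p in rows)
    return passed / total

# Raw comparison: early logins look roughly 18 points better...
early = pass_rate([v for k, v in cohort.items() if k[1]])
late = pass_rate([v for k, v in cohort.items() if not k[1]])

# ...but within each broadband stratum the login effect vanishes:
for has_broadband in (True, False):
    rates = [p / n for (b, _), (n, p) in cohort.items() if b == has_broadband]
    assert abs(rates[0] - rates[1]) < 1e-9

print(f"early={early:.2f} late={late:.2f}")  # → early=0.73 late=0.55
```

This is exactly the trap the college avoided: the raw numbers flatter the VLE, but the stratified view points at access and support as the real story – hence the early-warning use of the data.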

Again, innovative? Hardly. Research is research and it should meet certain standards. But in the same way that we are able to be our own editors, publishers and marketers, we can also be our own researchers and data analysts. We can generate new insights and check the dubious claims made by others.

It is our responsibility to do so. Not just for our own practice but so we can really drive the capability of those with whom we work. 
