Learning analytics: how to make a small bit of data go a long way
Learning analytics is constantly developing with new systems and information available all the time, so how do you get buy-in from management to take your programme to the next level? Securing a series of quick wins from your existing data will help.
What can be done with just a few data points of learning data from a couple of different systems, while waiting for more to be available? As L&D professionals start down the road with learning analytics, they will naturally start small, usually with basic descriptive data about which learners viewed which resources.
One of the goals of learning analytics is to eventually get to the point of generating predictive data about job success, but getting to this point is a journey.
This leads naturally to the question: how can I start getting value from the data that I have, and start improving my programmes right away?
Content engagement, ROI, and effectiveness
One immediate benefit from aggregating your learning analytics across multiple resource types (and, ideally, across multiple learning systems) is the ability to see what kinds of content, and indeed which individual learning objects, are the most engaging, and for which types of learners.
This can help you optimise your investment in the development of learning content.
For example, if a given course (or series of courses) contains a number of PDFs and a number of videos, you can see whether learners preferred viewing the PDFs or watching the videos.
You can also see which resources are worth keeping current, and which are candidates for deletion.
Furthermore, if the engagement data looks poor for a particular resource that contains critical information, it’s useful to ask why the engagement is poor. Perhaps the information presented is confusing? Perhaps it’s boring?
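Even a flat export of view events is enough to start this kind of comparison. The sketch below assumes a hypothetical record shape of (learner, resource, content type) — field names and data are illustrative, not any particular LMS's export format:

```python
from collections import defaultdict

# Hypothetical view-event records, e.g. from a basic LMS activity report.
# Each event: (learner_id, resource_id, resource_type)
events = [
    ("alice", "intro.pdf", "pdf"),
    ("alice", "intro.mp4", "video"),
    ("bob",   "intro.mp4", "video"),
    ("bob",   "deep-dive.mp4", "video"),
    ("carol", "intro.pdf", "pdf"),
]

# Count distinct viewers per resource, and total views per content type.
viewers_by_resource = defaultdict(set)
views_by_type = defaultdict(int)
for learner, resource, rtype in events:
    viewers_by_resource[resource].add(learner)
    views_by_type[rtype] += 1

for resource, viewers in sorted(viewers_by_resource.items()):
    print(f"{resource}: {len(viewers)} viewer(s)")
print(dict(views_by_type))
```

Resources with critical content but few distinct viewers are exactly the ones worth investigating for confusing or unengaging presentation.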
With limited data, it is also possible to start evaluating the effectiveness of the content.
By looking at which learners successfully completed the course and at their pattern of consumption of the course resources, you can start to see which resources are likely to be more effective than others, and highlight those as especially important for learners to attend to.
It’s especially important to gather data across your resource types, as you may discover patterns across different document types. For example, those who watched the videos but did not read the PDFs may have performed well, while those who read the PDFs but did not watch the videos may have performed poorly, or vice versa.
Certain modes of learning may be better suited for certain learner cohorts and you may consider splitting your course into one that is more PDF-based and one that is more video-based.
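A minimal sketch of that comparison, assuming you can join per-learner consumption flags with a pass/fail outcome (the records and thresholds here are made up for illustration):

```python
from collections import defaultdict

# Hypothetical per-learner records: which resource types they consumed,
# and whether they passed the course assessment.
learners = [
    {"id": "alice", "watched_video": True,  "read_pdf": False, "passed": True},
    {"id": "bob",   "watched_video": True,  "read_pdf": True,  "passed": True},
    {"id": "carol", "watched_video": False, "read_pdf": True,  "passed": False},
    {"id": "dave",  "watched_video": False, "read_pdf": True,  "passed": False},
    {"id": "erin",  "watched_video": True,  "read_pdf": False, "passed": True},
]

# Group learners by consumption pattern and compute the pass rate per group.
groups = defaultdict(list)
for rec in learners:
    pattern = ("video" if rec["watched_video"] else "-",
               "pdf" if rec["read_pdf"] else "-")
    groups[pattern].append(rec["passed"])

for pattern, results in sorted(groups.items()):
    rate = sum(results) / len(results)
    print(pattern, f"pass rate: {rate:.0%}")
```

With only a handful of learners per group the rates are suggestive rather than conclusive, but a consistent gap between patterns is a reasonable prompt to flag certain resources as especially important, or to rethink the mix.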
This is how, with limited data, you can begin to optimise your training resources and course effectiveness.
Personalisation and adaptive learning
While the ability to personalise your courses and implement some adaptive learning may be somewhat constrained by the learning systems you have available, you might be surprised at how much is possible with a bit of learning data to work with.
Let’s start with the engagement data and some completion or success data (whether the learner got a passing score in a given course).
If it turns out that certain individuals are highly engaged with certain courses, and less engaged with others, this is valuable information with which you can begin to craft a learning pathway ideal for this person.
This could be because they have a natural interest or talent (which is visible in their engagement data) or because they simply respond better to certain kinds of pedagogy (whether in a certain language, more activity-based, oriented toward a particular skill level, etc).
From this data you can start to recommend other courses along similar lines of interest or talent, or other courses where the instructional model is a better fit for that individual learner.
You can also start to build a notion of learner adaptivity into their learning path or suggested courses.
Perhaps the learner is particularly engaged and successful at basic level Excel. A logical recommendation for this user to try next would be intermediate level Excel. If the learner is less successful, suggestions for further Excel courses may not be the best advice.
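The Excel example above is really just a rule of thumb, and even a rule of thumb can be made explicit. This sketch assumes hypothetical course names, an invented engagement score, and a placeholder threshold:

```python
# Minimal rule-based recommender: if the learner both engaged with and
# passed a course, suggest the next level; otherwise don't push more of
# the same. Course names and thresholds are illustrative placeholders.
NEXT_LEVEL = {"Excel Basics": "Excel Intermediate",
              "Excel Intermediate": "Excel Advanced"}

def recommend(course, passed, engagement_score, threshold=0.7):
    """Return a next-step suggestion from completion and engagement data."""
    if passed and engagement_score >= threshold and course in NEXT_LEVEL:
        return NEXT_LEVEL[course]
    # Low success or low engagement: suggest a different route instead.
    return f"alternative format or refresher for {course}"

print(recommend("Excel Basics", passed=True, engagement_score=0.9))
print(recommend("Excel Basics", passed=False, engagement_score=0.4))
```

The point is less the specific rules than that the decision logic is written down, so it can be refined as more data arrives.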
The ability to recommend courses that are a good fit in content, level, and pedagogic style will make the training more relevant and more efficient, and result in a better trained employee.
From here, a basic gamification framework can start to emerge: not only can you reward employees (with badges, certificates, or other kinds of recognition) for individual course performance, but also for their overall level of engagement in their own professional development, with rewards for the number and type of courses they engage in.
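As a sketch of what "rewarding engagement, not just performance" might look like, here is a toy badge-awarding function. The badge names and thresholds are entirely made up:

```python
def award_badges(completed_courses, engagement_hours):
    """Toy badge rules: recognise volume of completed courses and overall
    engagement, not just individual course results. Thresholds are
    placeholders to be tuned against real data."""
    badges = []
    if len(completed_courses) >= 3:
        badges.append("Learning Streak")      # rewards breadth of courses
    if engagement_hours >= 10:
        badges.append("Dedicated Learner")    # rewards time invested
    if any(c.startswith("Advanced") for c in completed_courses):
        badges.append("Specialist")           # rewards depth in one topic
    return badges

print(award_badges(["Excel Basics", "Excel Intermediate", "Advanced Excel"], 12))
```

Because the rules draw on engagement data as well as completions, they reward the habit of ongoing development rather than only the pass mark.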
With well-crafted further learning suggestions, we can start to encourage employees to keep developing themselves (and keep the employee engagement with the organisation high as well).
Subsidiary learning resources and third-party systems
This ROI optimisation and personalisation work is not limited to the LMS. Basic learning data (taking ‘learning’ very broadly) can be aggregated from a number of systems, using xAPI or other telemetry, which fall outside the traditional learning stack.
It is now easier than ever to implement third party learning services or courseware, and gather all of an employee’s learning data across all of these services.
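The common currency for this cross-system aggregation is the xAPI statement, which at its core is just an "actor / verb / object" triple. The sketch below shows the shape of a minimal statement; the learner and activity identifiers are made up for illustration:

```python
import json

# A minimal xAPI statement, as a system outside the traditional learning
# stack might emit to a Learning Record Store (LRS). IDs are illustrative.
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "A. Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/activities/excel-basics",
        "definition": {"name": {"en-US": "Excel Basics"}},
    },
}

# In practice this would be sent to the LRS's statements endpoint;
# here we just serialise it to show the shape on the wire.
print(json.dumps(statement, indent=2))
```

Because every system emits the same statement shape, the LRS can hold one unified activity stream per learner, regardless of where the learning happened.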
If you’re using a learning experience platform or other technology to help track learning objects across various systems, you can start to feed this platform with learner data, thereby helping to surface relevant learning objects from anywhere in the organisation.
For instance, someone who does well at a particular course in the LMS could be encouraged to engage in more advanced work on that topic in other corporate or third party systems.
If users are being directed to additional learning resources (in another company system or in a third-party system) and they are not using those resources, this is a good sign that those resources are not worth investing in, and should be reconsidered.
All of this can be done with very basic interaction data, ideally from as many systems as a learner might touch in their lifecycle as an employee.
From here, of course, more and more advanced analytics can be captured and greater insights generated; but as with any journey, getting some benefit from a small investment is usually the best way to get a more expensive and more complicated project off the ground, and can provide short-term results to show the benefit of your analytics initiative.
Interested in this topic? Read Setting a clear direction on your workplace learning analytics journey.
Jeff Rubenstein is the VP of Product Strategy - Learning + Collaboration for Kaltura, Inc. He has held senior roles in a number of educational and technology companies, including 2U and Wimba (prior to the Blackboard acquisition). He works with a number of other companies and standards bodies on learning interoperability standards, and how to...