Senior Analytics Consultant Jisc
Brought to you by TrainingZone

Learning analytics in academia: the problems and pitfalls of collecting learner data

Learning analytics can be a fantastic tool for assessing how well learners are engaging with their course, but it also presents real concerns and should be implemented with care. Universities, in particular, have some issues to work through in this regard.

20th Feb 2020

First, let’s begin by defining exactly what we mean when we refer to ‘learning analytics’, as it can be interpreted differently in different contexts. When we refer to it here, we mean:

‘The measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs’.

What we’re about to discuss refers mostly to academic use of learning analytics in institutions such as universities, but there are applications and concerns that equally apply to a business context.


Analytics can assess individual engagement and wellbeing, or a class or group's engagement with a module. It can tell the trainer or tutor how their students are progressing and inform their conversations and interventions. It can show the programme leader how the department is coping and predict outcomes for learners, so that appropriate interventions can be made to re-engage them before it's too late.

The insight that learning analytics provides gives great power to the institutions/organisations using it, but with that comes responsibility. Jisc has created a code of practice for learning analytics that provides guidance on privacy, consent, use of learner data and anonymisation. This code of practice is essential, as there are so many pitfalls and potential unintentional misuses of the data that need to be considered by anyone thinking about implementation.

Ethics and consent

The ethics of holding engagement data, especially predictive data, can be controversial. For example, if an institution knows through the data that a student is at risk of academic failure, it arguably has an obligation to intervene with that student. This can lead some institutions to hold predictive outcomes back from view until they've been tested against actual outcomes.

Some institutions will want to know how likely a student is to succeed before they’ve ever walked through the door. Others know they have mostly disadvantaged students so will concentrate on supporting them all, regardless of background and demographics.


Getting student consent can be a delicate balance. Students must be clear about what data they are consenting to have collected and how it will be used. It's critical that, when gathering data about students, consent is asked for in a positive way that makes clear the data is being used to help them, not as a stick to beat them with.

Students need to give their consent, especially around protected characteristics, or the data will have gaps and any predictions won't be accurate, or won't work at all. The old adage 'bad data in, bad data out' is apt here. Even if datasets are complete, it's worth remembering that predictive algorithms aren't perfect. They can carry unconscious bias, just like the humans who put them together.
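A minimal sketch of the 'bad data in, bad data out' point, using a toy 'at-risk' score; the field name and the data are hypothetical, not any real institution's model:

```python
# Illustrative sketch only: a toy "at-risk" score showing how gaps in the
# data distort predictions. The "attendance" field and values are invented.

def at_risk_score(records):
    """Average attendance across records, silently ignoring missing values."""
    present = [r["attendance"] for r in records if r["attendance"] is not None]
    if not present:
        return None  # no usable data: refuse to predict rather than guess
    return sum(present) / len(present)

# Complete data: the score reflects the whole group's engagement.
complete = [{"attendance": 0.9}, {"attendance": 0.2}, {"attendance": 0.7}]

# Gappy data: two students withheld consent, so the "prediction"
# now rests on a single observation.
gappy = [{"attendance": None}, {"attendance": 0.2}, {"attendance": None}]

print(round(at_risk_score(complete), 2))  # 0.6
print(round(at_risk_score(gappy), 2))     # 0.2 -> the same group looks "at risk"
```

The same cohort looks very different depending on which students' data is missing, which is exactly why consent gaps undermine predictions.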

Using the data

Universities can get very excited about the possibilities of sending notifications to students about lack of engagement, or about falling behind in grades or attendance. A fine balance should be struck with student interventions: if they are too frequent, recipients simply tune them out. So, a note of caution for those who want to bombard students with well-intentioned nudges: they must carry the right message, and they have to be infrequent enough to make a real impact.
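One simple way to keep nudges infrequent is to throttle them per student; a sketch under the hypothetical assumption of a 14-day minimum gap (the article prescribes no specific window):

```python
# Illustrative sketch only: throttling well-intentioned nudges so they stay
# infrequent enough to carry weight. The 14-day window is a made-up choice.
from datetime import date, timedelta

MIN_GAP = timedelta(days=14)

def should_nudge(last_nudge, today):
    """Send a nudge only if none has gone out within the minimum gap."""
    if last_nudge is None:
        return True
    return today - last_nudge >= MIN_GAP

today = date(2020, 2, 20)
print(should_nudge(None, today))               # True: never nudged before
print(should_nudge(date(2020, 2, 15), today))  # False: only 5 days ago
print(should_nudge(date(2020, 1, 10), today))  # True: well outside the window
```

In practice the gap, and which messages count against it, would be a policy decision agreed with stakeholders rather than a hard-coded constant.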

Learning analytics affects all parts of an organisation. The key driver is often an executive team with an eye on retention. There has also been a realisation, however, that this data could assess staff engagement with resources too. This is a very recent focus, but any attempt to use learning analytics as a staff management tool will face strong resistance from staff and the unions that represent them.

The institution is responsible for the legal, ethical and effective use of the data. Anonymisation must be put in place where appropriate. There is a danger, however, that if too many data points are collected about a user, they may be identifiable anyway. Who might this be?

[Image: data collection example]
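The re-identification risk above can be made concrete with a simple k-anonymity check: the smallest group of people sharing the same combination of attributes. A sketch with invented fields and data, not any real dataset:

```python
# Illustrative sketch only: combining individually "harmless" data points
# can single a student out. All fields and records here are hypothetical.
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest group size sharing the same combination of attribute values."""
    groups = Counter(tuple(r[f] for f in quasi_identifiers) for r in records)
    return min(groups.values())

students = [
    {"course": "History", "year": 1, "campus": "North"},
    {"course": "History", "year": 1, "campus": "North"},
    {"course": "History", "year": 1, "campus": "South"},  # unique combination
    {"course": "Physics", "year": 2, "campus": "North"},
    {"course": "Physics", "year": 2, "campus": "North"},
]

print(k_anonymity(students, ["course"]))                    # 2
print(k_anonymity(students, ["course", "year", "campus"]))  # 1: identifiable
```

A k of 1 means at least one person is uniquely identifiable from supposedly anonymised data, which is why collecting fewer data points is itself a privacy control.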

Using data about students' locations raises ethical issues. This data can't be anonymous initially if it's used for registers, but it must be deleted once it's no longer needed. There is real concern from students that their data will be misused or even sold. These concerns are largely unfounded, but it is the duty of any organisation to transparently let students know its objectives, data sources and metrics, and how the data will be interpreted.

Drivers for learning analytics

There are many motivations for learning analytics, but retention has always been key. It’s to the benefit of both the institution and the student experience that students have the correct interventions when they need them, but learning analytics exposes the fact there are many different motives within an institution for making it a success.

Leadership teams want ROI from learning analytics. Personal and academic tutors want to support their students and improve their courses respectively, and students want to know how they are performing in terms of grades, but increasingly also want to benchmark themselves against their peers.

As has been mentioned, learning analytics affects all stakeholders, so people, processes and politics can hinder progress. It can be a big change to working practices, and the data must be right or users won't trust it.

There is also the danger of education being prescribed by the data. These data points are starting to inform policy, curriculum and behaviour. The sector needs to consider if that’s the right approach or if some of the humanity is being taken out of decision making to the detriment of the student experience.

De-motivation of students is another potential pitfall of learning analytics. Students have a legal right to see all of their data. This will include predictions, outcomes, grades and benchmarks against others if those calculations have been made. Some students are extremely motivated to know where they are in the class with attendance and attainment.

Others who may be struggling financially, emotionally or academically could become extremely de-motivated by knowing that, despite trying their best, they are at the bottom of the pile. Other students may not be at university to pass with a top degree; their motivation is learning for its own sake, and constant notifications that they're under a certain threshold may be de-motivating and alienating.

Learning analytics can also place too much emphasis on the deficit model, making it easy to lose sight of those who are doing OK, or even those who excel. Institutions must endeavour to give those students the same attention and support to push them further. For example, if a student hasn't been to class and has no virtual presence but gets good grades, how can staff intervene to make them even better?

Issues with technology

Although learning analytics is essentially a people and change management project, the technology needs to be considered too. Vendors that provide analytics systems need to keep both the student experience and data security in mind. For each feature, they should consider what data is being collected, why, and whether it gathers unnecessary data.

Another concern for those putting trust in the data being presented is its potential for manipulation. Students can and will cheat the system. Whether by sharing access cards to show they'd been present at a lecture when in fact they were elsewhere, or by logging a few entries in the VLE to get their figures up, students have been shown to game the system.

Recipe for success  

  1. Make sure you have a visionary to carry forward the project.
  2. Plan learning analytics as a people and change management project.
  3. Consult with stakeholders on how to implement the changes and be clear why you’re doing it.
  4. Make sure you ask students for their consent if not capturing data under ‘legitimate interest’.
  5. Make sure you have enough resource to extract, clean and manage the data.
  6. Have the right policies and procedures in place before you start.

Interested in this topic? Read Learning analytics: why we need to get better at measuring performance.


Replies (1)


By ben_posmyk
24th Feb 2020 11:47

Maybe one of the challenges that you mention with respect to de-motivation has to do with the different learning styles of our students? Classroom experience works for people with similar skill and know-how levels. If we employ individual (AI-assisted) learning, maybe we can "re-motivate" the outliers?
