Making sense of analytics for learning


As big data captures the hearts, minds and souls of everybody engrossed in the smorgasbord of delights on the internet, the education community is enthralled by a new and related trend, learning analytics. Whilst this offers new insights into the factors determining success and failure on an academic course, have we missed the point? How can analytics be used to support the learner and the learning process?

The science and practice of learning analytics is still in its infancy, with research, practice and technologies seeking to grapple with the logistics of supporting the educational process, the ethics of collecting and sharing personal data, and the practicalities of linking data between university and supplier systems. And yet, universities have been quick (relatively speaking!) to understand the potential for finding efficient ways to identify struggling students by mapping key indicators. Other providers, notably the Open University, FutureLearn, Coursera, EdX and Knewton, are beginning to harness real-time data to offer differentiated support and content.

Learning analytics can provide students with an opportunity to take control of their own learning, give them a better idea of their current performance in real-time and help them to make informed choices about what to study. (JISC, 2016)

Beyond this, however, we find a self-service culture beginning to dominate. I’m sure I am not alone in enjoying a dashboard of my own progress data on MapMyRun or DuoLingo, for example. It is addictive, engaging and somewhat rewarding. Similarly, VLE providers such as Blackboard, Canvas and BrightSpace are now including comprehensive dashboards for students, academics, administrators and senior managers. Tempting as these are for revealing gross trends in personal and cohort practice, one is left with the feeling that the data is dislocated from reality, capturing numbers and displaying pretty graphics, with all the personality of a quarterly sales report.

What is missing?

In our rush to collect, analyse and report on the data, we seem to have forgotten the learner and the teacher. Perhaps these are early days in the evolution of systems, but I suspect we are seeing a trend towards the intensification of education, in which pressure on teacher time means quality feedback is being sacrificed for automated reporting tools. MOOCs are leading this trend, and analytics serves their poor retention figures well. Nevertheless, this weakness in the MOOC model should not be allowed to reduce standards in mainstream education delivery.

The figure below depicts the three primary domains for data-informed learning.


Educational analytics represents the collection of information available from within university VLE/LMS platforms, libraries and MIS data centres. It may also include UCAS admissions data, personal information and data from other sources, such as portfolios and assessment tools.

Learning systems themselves capture student page use, interaction with modules (activities, discussions and quizzes), together with assessment results.

Conversation spaces exist in most VLE/LMS, intranet, portfolio and email systems, for assessment feedback, module discussions and peer-to-peer conversations.

Where these domains overlap we discover two of the main formats for reporting learning analytics to students and staff. These are dashboards and tutor spaces.

The dashboards available within learning systems provide a real-time representation of cohort data, often with traffic-lighting to alert faculty staff to low engagement or performance. The data is often unprocessed, resulting in large data sets which take time to load and are challenging to interpret rapidly. Typically, these lack any means to interact with the students other than through email or an in-system mail function.
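By way of illustration, the traffic-lighting described above amounts to little more than a threshold rule applied to engagement indicators. The sketch below is a hypothetical Python example: the indicator names (logins per week, average score) and the thresholds are my own assumptions for illustration, not any vendor’s actual scheme.

```python
# A minimal sketch of dashboard traffic-lighting.
# Indicators and thresholds are illustrative assumptions only.

def traffic_light(logins_per_week: float, avg_score: float) -> str:
    """Classify a student's engagement as red/amber/green."""
    if logins_per_week < 1 or avg_score < 40:
        return "red"    # low engagement or failing: alert staff
    if logins_per_week < 3 or avg_score < 55:
        return "amber"  # borderline: worth a check-in
    return "green"      # on track

cohort = [
    {"name": "Student A", "logins_per_week": 0.5, "avg_score": 62},
    {"name": "Student B", "logins_per_week": 4.0, "avg_score": 71},
    {"name": "Student C", "logins_per_week": 2.0, "avg_score": 50},
]

for s in cohort:
    print(s["name"], "->", traffic_light(s["logins_per_week"], s["avg_score"]))
```

The point of the sketch is how crude such a rule is: it flags a student, but carries none of the context a tutor conversation would surface.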

Tutor spaces offer an opportunity for students and personal tutors to communicate. These are often located outside the learning space, resulting in a dislocation from the learning data; instead, they rely on data imported from an MIS to inform the conversation. Such tools typically allow both groups to exchange messages, but lack any further functionality to direct the learner in their learning.


More about the ??? space

At the heart of this data-informed learning space is the intersection between data, learning and communication. For many centuries, we have successfully evolved practices to construct teaching models which inform, instruct and coach learners, often involving a conversation with a teacher. Access to complex learning data offers something new. Learning analytics brings vital contextual information for learners to reflect on, and to feed forward to direct future learning. As real-time, comparative data, it gives both student and teacher a powerful tool with which to sensitively construct learning development in many new ways.

Do we still need a teacher when provided with rich data? Yes. Though rich and immediate, the data does not yet have the intelligence to understand the learner’s emotions, their habits, problems and targets. A conversation with a coach would yield important new insights for moving beyond mere retention, through to stretching the potential of the individual.

Data-informed teaching may seem like an intensive model of learning which cannot be offered by all providers. Not so. The personalisation of the data informs the conversation, replacing what was formerly a blinkered, limited tutor understanding of student engagement and performance.

The learning space plays an important role in this conversation by situating it in the learning domain, and by providing access both to the content of learning (resources and assessment) and to the tools to direct future learning. This offers the teacher the opportunity to set activities for the learner to reflect on, plan or explore a subject arising in the conversation.

What next?

The JISC project reviewing international practice in learning analytics is offering valuable insights into institutional developments, supported by a growing research community (e.g. Gasevic, Rientes, Ferguson, Chow). The lessons being learnt from this will inform systems development for years to come. However, my fear is that the valuable educational dimension will be lost in the rush to solve financial problems for providers.

A new conversation is required on how we best shape learning analytics to meet the needs of educators and learners. This should involve academic practitioners, technology suppliers and learners, to ensure that current and emerging pedagogies can benefit from the potential of big data without compromising the educational experience.

I’ll be exploring this further at ALTC this September with Steve Powell (University of Lancaster) and Patrick Lynch (University of Hull). Come along to the session or look out for future posts following the workshop.

To be continued…