
Learning analytics: From inception to maturity

We’re here to discuss learning analytics so bear with me whilst I go off on a tangent… You’ll get my gist, I’m sure! Just 5.7 miles from Learning Pool’s Oxfordshire office is London Oxford Airport. For several years, my partner has worked in the Air Traffic Control department there and often comes home with tales of the latest star to travel through – David Beckham, Emma Watson, and even Donald Trump himself have made an appearance or two over the last few years. Now, these anecdotes I find fairly interesting, but when we’re out walking the dog and a plane or helicopter flies overhead and he proceeds to tell me it’s an Agusta 109 or a Cessna 182, I have to admit I wonder to myself, “why do I care?”

But recently, I got to thinking about how a huge part of his role isn’t just about describing what’s happening in the air but analyzing the current situation. For example, the direction of an inbound aircraft, the runway in operation, even the weather conditions, all help to determine what should happen next. It is in prescribing action that he and the rest of the Air Traffic team are able to ensure air traffic regulations are adhered to and the airspace remains safe.

What’s that got to do with analytics?

It’s this balance between describing and prescribing that L&D is still struggling to get right. For many learning professionals using data to inform future L&D strategies, one of the biggest challenges is the inability to see a path between where they currently are with learning data and where they need to be. For others, the difficulty lies in taking the data analytics work they have already done, assessing their progress against learning analytics as a knowledge field, and working out where they need to go next to develop their practice.

Where are we with data?

One of the most interesting developments of recent years, affecting societies across the world, is that the subject of data is now a huge concern for “ordinary” people. Within our day-to-day lives, we are now aware, in ways we never were before, that everything we do generates data. Banking, exercising, liking things on Instagram – even sitting in a chair passively consuming TV. All these normal everyday activities generate data that affects the world around us, whether we own and control it or not. Through a tech lens, the underlying driver for this new awareness has been the digitalization of content and social relationships. And, like all aspects of digital transformation, this has been accelerated by the global reaction to COVID-19.

But when it comes to learning analytics, specifically in the workplace, the current state can look a little stuck. There seems to be a consensus among thought leaders, analysts, and high-profile practitioners that L&D needs to up its game on data if it is to improve its standing within organizations. More specifically, it needs to show that its activities result in tangible, measurable performance improvements. As industry expert Donald Clark puts it: “Learning departments need to align with the business and business outcomes.”

To learn more about the practical way forward with learning data, download our new whitepaper, ‘Data and learning: A new-common-sense approach’, now.

What is the Learning Analytics Maturity Model (LAMM)? 

As a means to help organizations offset these challenges and to better understand where they are in their learning analytics journey, Learning Pool has put together a simple diagnostic, the Learning Analytics Maturity Model (LAMM). The basic model has been adapted from L&D expert Donald Clark’s forthcoming book, Artificial Intelligence for Learning, Chapter 14: Data Analytics.

In this chapter, Donald describes an evolving schema to help organizations focus their analytics efforts on a specific goal: what are they trying to achieve with learning analytics?

His schema has four levels:

Describe – What does the learning data tell us about what things are happening?

Analyze – What does the learning data tell us about why things are happening? 

Predict – What does the learning data tell us is likely to happen?

Prescribe – What does the learning data tell us that should happen?

For the purposes of LAMM, we added a fifth, more basic level to this list – Starting Out. Whilst not a goal in itself, there is a certain aspirational quality to being at the ‘starting out’ end of the scale: before anything else, you must consider why you’d want to use learning analytics at all.
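To make the five levels concrete, here is a minimal sketch of how the schema might be represented in code. The level names and guiding questions come from the list above; the enum itself and its numbering are purely illustrative and not part of Learning Pool’s actual diagnostic.

    from enum import IntEnum

    class LAMMLevel(IntEnum):
        """The five LAMM levels, from most basic to most advanced (illustrative only)."""
        STARTING_OUT = 1  # Why would we want to use learning analytics at all?
        DESCRIBE = 2      # What does the learning data tell us about what is happening?
        ANALYZE = 3       # What does the learning data tell us about why things are happening?
        PREDICT = 4       # What does the learning data tell us is likely to happen?
        PRESCRIBE = 5     # What does the learning data tell us should happen?

    # Because IntEnum values are ordered, level comparisons read naturally:
    assert LAMMLevel.PRESCRIBE > LAMMLevel.DESCRIBE

The numbering also lines up with the way results are reported later in this post: an average placement of 2.1 sits just above Describe.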

What are the requirements?

The requirements, and the complexity of what you are trying to achieve, increase significantly as you move up through the schema. Those companies whose goal is to prescribe actions and interventions based on data are at the very top end of what can currently be achieved.

Goals are of fundamental importance to the model. Those who aspire to recommend learning experiences based on data, for instance, will need to look seriously across each aspect of the diagnostic and consider how to start a path forward. If, on the other hand, they just need to know more about what is happening with the learning they already provide, the end goal could be closer than they think.

The output of the model then ends up looking more like a matrix, showing maturity across four areas (a rough code sketch follows this list):

  • Strategic Buy-in 
  • People & Capabilities 
  • Market Best Practices 
  • Technology & Processes
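As a rough illustration, a single organization’s result could be expressed as one row of that matrix: a score for each of the four areas plus an overall position. In the sketch below, the 1–5 scale, the field names and the unweighted averaging are assumptions made for illustration, not the actual scoring method behind the diagnostic.

    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class MaturityProfile:
        """One organization's scores across the four LAMM areas, on an assumed 1-5 scale."""
        strategic_buy_in: float
        people_and_capabilities: float
        market_best_practices: float
        technology_and_processes: float

        def overall_position(self) -> float:
            # Simple unweighted mean as a stand-in for the real scoring method.
            return mean([
                self.strategic_buy_in,
                self.people_and_capabilities,
                self.market_best_practices,
                self.technology_and_processes,
            ])

    # Example: strong strategic buy-in but weak People and Technology scores,
    # the pattern discussed under Observations below.
    profile = MaturityProfile(3.5, 1.5, 2.0, 1.5)
    print(round(profile.overall_position(), 2))  # 2.12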

Initial findings

From our initial set of results, we have observed that the average placement for an organization is a little over Level 2: Describe (2.1 to be precise). But overwhelmingly, organizations are still stuck at Level 1: Starting Out, with a few trendsetters skewing the mean above the mode.
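A quick toy example shows how that gap between mean and mode arises; the placements below are invented for illustration and are not the survey data.

    from statistics import mean, mode

    # Hypothetical placements on the 1-5 LAMM level scale (not the real survey results):
    # most organizations sit at Level 1 (Starting Out), a handful reach higher levels.
    placements = [1, 1, 1, 1, 1, 1, 2, 2, 3, 4, 4, 5]

    print(mode(placements))            # 1   -> the most common placement (the mode)
    print(round(mean(placements), 1))  # 2.2 -> a few high scorers pull the average up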

Our results thus far have shown that organizations in the services sector are currently scoring the highest (2.3), whereas those in the retail industry are at the lower end of the scale (1.7). Having said that, there is little significant difference between industries, geographies, or even business sizes, although larger organizations do tend to perform better than smaller ones (2.2 vs 1.9).

Observations

But looking at the Overall position score and where organizations rank themselves in terms of Strategy, People, Market Awareness and Technology, we see for the first time evidence of what others have reported elsewhere: we are really quite deficient when it comes to the skills it takes to successfully deploy a data-driven learning approach. While the Strategy score is high (leadership says data is important), the People and Technology scores are low (leaders are not investing despite the vision). Perhaps unsurprisingly, 75% of all organizations surveyed suggest their technology is holding them back from collecting all the data they would like to use in assessing their learning efforts.

For organizations further along the Maturity Model, the most compelling driver turns out to be not their leadership, nor their vision and commitment to People Analytics, but their customers. If internal (or external) customers are demanding data, then we see organizations move to deliver it. Statistically, this relationship appears to be clear: the higher an organization rates the question “How important is Learning Analytics to your Customers?”, the higher its overall position on the Maturity Model is likely to be.
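One way to sanity-check that kind of relationship is a simple correlation between the customer-importance rating and the overall position. The paired values below are invented purely to illustrate the method and are not the survey data.

    from statistics import correlation  # available in Python 3.10+

    # Hypothetical paired responses (not the survey data):
    # "How important is Learning Analytics to your Customers?" (rated 1-5)
    customer_importance = [1, 2, 2, 3, 4, 4, 5]
    # Overall position on the Maturity Model (1-5)
    overall_position = [1.2, 1.8, 2.0, 2.1, 2.6, 3.0, 3.4]

    # A Pearson coefficient near +1 would support the pattern described above.
    print(round(correlation(customer_importance, overall_position), 2))  # ~0.97 on this toy data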

Find out where your organization sits 

So whether you’re looking up at the sky observing air traffic protocol, reviewing your social media usage to cut down on screen time, or analyzing the data from your organization’s latest cohort of learners, the key is in understanding the data currently available to us and how we use it to get to where we want to be. From an L&D perspective, the aim of the Learning Analytics Maturity Model (LAMM) is to help businesses plan a learning analytics strategy that uses data to support working people as they strive to improve their knowledge and skills while making sense of a confusing, complex, fast-changing business reality.

The Learning Analytics Maturity Model diagnostic takes just 10 minutes to complete. Within a few days, you’ll receive a free five-page report that will benchmark you against your peers and provide an actionable roadmap for your future learning data strategy. The only requirement for the survey is that your company has more than 250 people or your L&D department serves an audience of more than 250. Start the diagnostic now.

