Measuring the return on investment vs return on expectation of learning

If you’re responsible for delivering any training in your organisation, you are probably already familiar with the concept of measuring the return on investment (ROI) of your learning. ROI is commonly sought post-implementation to measure the value and cost impact of training, but what about return on expectation?

Have you ever considered managing and measuring any level of return on expectation (ROE) from your learning? You should if you aren’t already.

By defining expectations and developing clear objectives that connect to tangible business goals, you can align any learning and development programme to deliver results that resonate with all your key stakeholders. The results you achieve are then directly correlated with the results your stakeholders expect. And by defining these goals early on, you can benchmark more effectively and understand the true business impact of your training, well beyond cost savings alone.

Want real stakeholder buy-in? Then develop a clear ROE strategy for your learning and truly understand its value and impact within the organisational landscape.

Great expectations of training evaluation

Defining return on expectation is a process in which learning professionals ask questions to clarify and refine the expectations of key business stakeholders, so that those expectations can be satisfied while results remain realistically achievable through training. It’s about converting often generic expectations into observable, measurable success outcomes that the business as a whole values.

The concept has been around for a long time. In November 1959, Don Kirkpatrick presented his thoughts on training evaluation, built around four critical words that have since formed the “Kirkpatrick Four Levels Evaluation Model”: Reaction (Level 1), Learning (Level 2), Behaviour (Level 3), and Results (Level 4). Even back in 1959, Kirkpatrick observed:

“Managers, needless to say, expect their manufacturing and sales departments to yield a good return and will go to great lengths to find out whether they have done so… likewise, training directors might be well advised to take the initiative and evaluate their programs before the day of reckoning arrives.”

This statement captures what the ultimate goal of the four levels of evaluation was then, and remains today: to demonstrate the business value and worth of training. Nearly sixty years later, it seems very little has changed when it comes to proving the value of our training.

Building success from the ground up

The Kirkpatricks, in their 2009 refresh of the model, describe return on expectation as the ultimate indicator of value: a holistic measurement of all of the benefits (both qualitative and quantitative) realised from a programme or initiative brought about through a package of interventions, with formal training typically as the cornerstone.

The Four Levels were designed to enable the definition and development of a plan for an effective programme and evaluation methodology, separate from the actual collection of impact and sentiment data. Unfortunately, since their conceptualisation, L&D departments across the globe have attempted to apply the Four Levels retrospectively, after a programme has been developed and delivered. But it is difficult, nigh on impossible, to create and substantiate significant training value this way.

Kirkpatrick also observed that most organisations tend to focus only on achieving Levels 1 and 2, with the latter two Levels chalked up to a kind of “hope for the best” scenario. We put the majority of our time into designing, developing and delivering training (Levels 1 and 2), and almost none into the follow-up activities that translate learning into the positive behaviour change (Level 3) and subsequent results (Level 4) we expect our training programmes to deliver.

So how do we begin to understand whether our training instigated behaviour change, or achieved targeted outcomes?

Paving your own way

Sadly, there is no cookie-cutter template for defining and measuring your return on expectation. Each organisation’s key stakeholder expectations for training will differ, so there is no simple, straight answer here. Ask yourself: “Who are we trying to impress with this training?” and remember that it is vital to focus on defining expectations and developing objectives that link to meaningful business measures.

ROE could focus on whether expectations are met across a range of parameters, such as utility, relevance, and value. Could it really be as simple as asking key stakeholders if they are satisfied with the programme? And if so, does that truly answer the question of value and impact?

Perhaps instead it’s about achieving objectives or certain outcomes in your business. For example, if you want to increase productivity or sales, your measure becomes results or impact: Level 4 under Kirkpatrick’s evaluation framework. By contrast, if ROE represents an objective about what learners should do after the learning is undertaken, then the result is a shift in behaviour (Level 3). If instead your stakeholders expect participants to acquire certain knowledge or skills, the objective is a classic learning objective (Level 2). Remember: the goal here is to achieve business alignment and break down the barriers around the L&D department.

Where do you start in defining your own ROE? Find out in part two of this article where we’ll explore some of the ways you can begin to introduce it in your programmes. In the meantime, start thinking about your current training programmes and how you approached devising them, and how you’ve measured their success post-implementation.

Final thoughts

To really understand your stakeholders’ expectations, it’s important to look forward instead of back. Take some time to explore what the future of learning is expected to look like, to help you understand what your teams will want in the coming years.
