While many companies tout their “analytics” capabilities, there remains confusion among compliance practitioners about what the differences actually are, and those differences are vast. This blog post breaks down a few of the key differences so that your team can separate Analytics from analytics.
Behavioral insight, on a per-learner level
To extract the best data, you have to deploy the best training. Delivered in a web-based format, the best training is adaptive: it changes dynamically throughout the course based on the level of proficiency each employee demonstrates. Courses should be designed for maximum situational simulation, guiding learners through scenarios that may arise in the course of their jobs. In other words, training should speak to the learner and not at them, be realistic in the context of your company, and be interactive, not just for the learner’s sake but for the quality of the analytics that kind of training produces.
As learners navigate the course, the difficulty level goes up or down depending on how they answer. Learners who do not handle a scenario correctly receive immediate coaching and feedback, along with an equivalent alternate scenario where they must demonstrate proficiency before moving on.
Because of this, Learner A may see completely different scenarios from Learner B, yet both move through the course at their own pace and both are coached up the proficiency curve by demonstrating knowledge all the way through. A seasoned employee’s knowledge of specific concepts and risk areas can yield a training experience that’s up to 50% faster.
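A minimal sketch of that adaptive mechanic, assuming a simple tiered scenario bank. The scenarios, tiers, and attempt cap below are illustrative assumptions, not any vendor’s actual engine:

```python
import itertools
import random

SCENARIO_BANK = {
    # difficulty tier -> equivalent scenarios as (prompt, expected action)
    1: [("A vendor offers you tickets to a game.", "check_gift_policy"),
        ("A supplier sends you a holiday gift basket.", "check_gift_policy")],
    2: [("A customer hints a payment could speed up approval.", "refuse_and_escalate"),
        ("An agent asks for an unusually large 'facilitation fee'.", "refuse_and_escalate")],
    3: [("A regulator calls with informal questions about a live deal.", "route_to_legal")],
}

def run_adaptive_course(answer_fn, max_attempts_per_tier=4):
    """Walk one learner up the proficiency curve.

    answer_fn(prompt, tier) returns the learner's chosen action. A miss gets
    immediate coaching plus an equivalent alternate at the same tier; a correct
    answer raises the difficulty, so strong learners finish in fewer screens.
    """
    history = []
    for tier in sorted(SCENARIO_BANK):
        pool = SCENARIO_BANK[tier][:]
        random.shuffle(pool)
        alternates = itertools.cycle(pool)  # equivalent alternates at this tier
        for attempt in range(1, max_attempts_per_tier + 1):
            prompt, expected = next(alternates)
            choice = answer_fn(prompt, tier)
            history.append({"tier": tier, "prompt": prompt, "choice": choice,
                            "attempt": attempt, "correct": choice == expected})
            if choice == expected:
                break  # proficiency demonstrated; move up the curve
            print(f"Coaching: for '{prompt}', the expected action is '{expected}'.")
    return history  # a per-learner behavioral record, not just a score

# A learner who already knows the material clears each tier on the first try:
answer_key = {p: a for scenarios in SCENARIO_BANK.values() for p, a in scenarios}
proficient_run = run_adaptive_course(lambda prompt, tier: answer_key[prompt])
```

The point of the sketch is the return value: a per-learner record of choices and attempts at each difficulty tier, which is what makes behavioral analytics possible.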
It’s important to note that this type of technology is entirely distinct from “branching” technology, which creates different pathways for different sets of learners based on demographic data (i.e., managers vs. individual contributors, employees in Italy vs. Mexico, and so on). Branching is an important, useful technology, but it does not produce a dynamic learning experience.
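By contrast, branching can be sketched as a one-time lookup on demographic attributes; the role/country keys and module names below are hypothetical:

```python
# Hypothetical sketch of branching: the pathway is chosen once, up front,
# from demographic data, and never changes based on in-course performance.
BRANCH_MAP = {
    ("manager", "Italy"): ["anti_bribery_for_managers_it", "gifts_and_hospitality_it"],
    ("individual_contributor", "Mexico"): ["anti_bribery_core_mx", "gifts_and_hospitality_mx"],
}

def assign_branch(role, country):
    """Return a static module sequence based only on who the learner is."""
    return BRANCH_MAP.get((role, country), ["anti_bribery_core", "gifts_and_hospitality"])

# Unlike run_adaptive_course above, assign_branch never sees the learner's
# answers, so it cannot coach, remediate, or shorten the course mid-stream.
print(assign_branch("manager", "Italy"))
```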
The flow chart below shows a sample course journey:
Garbage in, garbage out
Your analytics are only as good as the data they pull from. Many vendors that refer to “training analytics” are actually talking about quiz question data.
This means that all learners get the same questions, rather than scenarios based on what they know and what they need to know. Although the journey may begin with role-based or self-identifying questions, from there it becomes a one-size-fits-all setup: content, followed by a topically aligned question, followed by answer choices, say A through D. If the learner guesses ‘A’ and it’s incorrect, no equivalent alternate is presented. Instead, they are shown the same question again and this time choose ‘B’, because they now know it’s not ‘A’. It’s a process of elimination; learners can almost brute-force their way through.
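Under the same illustrative assumptions as the sketches above, a static quiz item looks like this: the same question and the same options return after every miss, so an attempt count is the only signal it can produce.

```python
# Hypothetical sketch of a static quiz item: the identical question and the
# identical A-D options come back after each miss, so a learner can work
# through it by elimination.

def run_static_question(options, correct, answer_fn):
    """Return how many attempts it took before the learner chose `correct`."""
    remaining = list(options)  # the learner mentally crosses off wrong guesses
    attempts = 0
    while remaining:
        choice = answer_fn(remaining)
        attempts += 1
        if choice == correct:
            return attempts
        remaining.remove(choice)  # process of elimination
    return attempts

# Even random guessing succeeds within len(options) tries, and the attempt
# count is the only datum left behind; there is no equivalent alternate and
# no record of the behavior the learner would actually choose on the job.
print(run_static_question(["A", "B", "C", "D"], "C",
                          lambda remaining: remaining[0]))  # -> 3
```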
While this type of training data is better than no data at all, it comes with limitations. Here’s a breakdown of the differences:
| | Quiz Question “Analytics” | Adaptive Behavioral Insight |
| --- | --- | --- |
| What it is | | |
| What it shows | The average number of attempts it took before answering the question correctly | Behavioral data based on the choices the employee made within the given scenario and its equivalent alternates |
| Bias | | |
| Statistical Validity | | |
| Discoverability | High – brute force; can’t demonstrate learning | Low – coaching up the proficiency curve demonstrates the learning journey to mastery |
| Segmentation and Benchmarking | Low – no real point of comparison or meaningful insight that contributes to benchmarking | High – ability to compare segments across the organization and industry to pinpoint behavioral risk hotspots and areas that need remediation; built to be a powerful tool for assessing performance year over year (see the sketch after this table) |
| Actionability | Low – no true analytics that allow teams to target risk areas or trouble segments | High – situational simulation yields actionable insights compliance can use to target remediation and guidance |
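As a rough illustration of the segmentation and benchmarking row above, here is a sketch that rolls hypothetical per-learner behavioral records (like the history produced by the adaptive loop earlier) into segment-level first-attempt pass rates. The records, segment names, and threshold are assumptions for the example.

```python
from collections import defaultdict

records = [
    {"segment": "Sales - EMEA", "topic": "gifts", "first_attempt_correct": True},
    {"segment": "Sales - EMEA", "topic": "gifts", "first_attempt_correct": False},
    {"segment": "Finance - US", "topic": "gifts", "first_attempt_correct": True},
    {"segment": "Sales - EMEA", "topic": "third_party", "first_attempt_correct": False},
]

def risk_hotspots(records, threshold=0.75):
    """Flag (segment, topic) pairs whose first-attempt pass rate falls below threshold."""
    tally = defaultdict(lambda: [0, 0])  # (segment, topic) -> [passes, total]
    for r in records:
        key = (r["segment"], r["topic"])
        tally[key][0] += r["first_attempt_correct"]
        tally[key][1] += 1
    return {key: passes / total
            for key, (passes, total) in tally.items()
            if passes / total < threshold}

print(risk_hotspots(records))
# e.g. {('Sales - EMEA', 'gifts'): 0.5, ('Sales - EMEA', 'third_party'): 0.0}
```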
As you can see, the differences are significant. While there are many visually impressive training styles on the market, including extensive video content, video alone is not adaptive, and that lack of adaptivity prevents in-course remediation and seat-time savings. And while quiz data does filter through to the LMS, by the time the client sees knowledge deficiencies, those deficiencies haven’t been remediated in-course and are now potentially discoverable.
Ultimately, all of this comes back to the question of effectiveness. If you’re looking for jazzy training that’s fun for your employees and counts completions, okay. But if you’re in the market for training data you can leverage to meaningfully impact your program, then ask questions about bias, statistical validity, actionability, and the other items that matter to you as the champion of your company’s ethics and compliance program.