How Learning Pool overhauled its approach to User Experience Design

LEAFS: An evidence-based approach to optimizing the user experience.

User Experience Design is often accused of being subjective, yet in a mature organization UX design relies heavily on data to make decisions. Good design decisions can show up as an increase in sales, while bad ones can show up as an increase in support cases, although it is often difficult to trace those outcomes back to a specific decision. In a learning environment, the user experience works in combination with learning strategies and content to shape the learner's experience. Even when the content is exactly what the learner needs, a poor UX can cause frustration that distracts from learning. A 2021 survey by CIL Management Consultants found that user experience was the number one reason respondents gave for wanting to change their learning technology.

In short, the user experience in any learning environment plays an integral role in whether someone is able to learn. Simple actions like finding and enrolling in a course are affected by a platform's user experience. If users can't find a course, they will see no value in the learning platform and simply leave. Even once they find a course, if the steps to enroll are too complex or take too long, they won't complete the journey. A poor user experience can prevent a learner from ever reaching course content; every learner deserves a satisfying user experience that empowers their learning and encourages them to return regularly.

With this belief as our guide, Learning Pool decided to invest in and expand the UX team in an effort to provide our customers with a learning environment that has the best user experience in the market. The team added Senior UX Designers, UX Researchers, and a new Head of UX, Derek Doherty, who brought with him a strong background in data-driven user experience design.

As a young UX team, there were so many possible areas of focus that it was difficult to decide where to begin. To understand what our UX Vision could be, it was necessary to document the current state of the platform. In UX, this is more complicated than you'd expect, because there aren't any standard ways to measure the user experience that aren't tied to metrics owned by other departments around the business. The team needed metrics focused on both the UX and the UI (User Interface) that could help predict improvements that would later show up in standard Key Performance Indicators (KPIs). Researching user experience on a learning platform also brings a unique requirement: the evaluation has to account for the learner's experience and the learning strategies at play, not just the usability of a standard application. Pulling all these parts together was a challenge the team was excited to tackle.

Derek Doherty, Learning Pool’s Head of UX, presented the team with a tool to measure the user experience called LEAFS, which he had built while designing for EA, a leading video game firm. It uses five main criteria: Learnability, Effectiveness, Attitude, Flexibility, and Satisfaction. In its original form, the description for each section was geared towards gaming; however, the overall system was based on his twenty-six years of experience in UX and leveraged the best practices of other evaluation systems. Together, the team reviewed the criteria and translated the elements that were specific to the gaming industry into descriptions more relevant to evaluating experiences within a learning platform. Armed with the new Learning Pool LEAFS tool for measuring the user experience of a learning platform, the real work began.

Where design meets science 

The science behind LEAFS relies on decades of research by experts in user experience. One example is the criterion to “Match the system language with real-world language”, which is part of Jakob Nielsen’s Usability Heuristics and states that a design should “Use words, phrases, and concepts familiar to the user, rather than internal jargon”. Although the standards may seem simple, applying them consistently in a product creates a seamless experience for users.

When LEAFS was included in the design process, the team was able to predict ahead of time whether the decisions being made were good ones or ones that might result in a poor experience. As our LEAFS scores increased, so did customer approval of the designs presented to them. Because customers could share, early in the design process, exactly what made a design good or bad rather than just whether they liked it, designers were able to catch poor decisions long before they reached production and apply what was learned to the design decisions that followed. Using LEAFS in the design process, and revisiting it at multiple stages along the way, has resulted in product features that are truly data-driven and best in class.

How LEAFS Works

LEAFS contains three different levels of criteria, with Level 1 being just the five main categories – Learnability, Effectiveness, Attitude, Flexibility, and Satisfaction. Level 2 breaks those main areas down into smaller categories for a more in-depth evaluation. Level 3 gets even more granular, containing further subcategories with very specific evaluation questions. Depending on the need, the level of criteria can be adjusted to help pinpoint the problems that are causing issues with the user experience. See below for an example.


Each criterion is assessed using a 10-point scale from 1 (Terrible) to 10 (Best in class). Anything less than a 7 needs to be addressed and improved to meet the UX Vision. The scores for the categories within each of the main areas are averaged together to get a score for that main area.


For example, if a Level 2 evaluation is being used, the smaller categories (Ease of Use scored 7 and Language scored 3) would be averaged to give a main Learnability score of 5.


The Level 1 category scores are added together and the total is multiplied by 2 to give an overall score out of 100. The final LEAFS score for this evaluation would be 56.


5 + 7 + 4 + 6 + 6 = 28

28 × 2 = 56
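
For readers who want to see the arithmetic end to end, here is a minimal Python sketch of the calculation described above. The Learnability breakdown (Ease of Use 7, Language 3) comes from the example; the single placeholder criteria under the other four categories are assumptions chosen so each category averages to the Level 1 scores in the worked sum, and the code is illustrative rather than part of Learning Pool's tooling.

```python
LOW_SCORE_THRESHOLD = 7  # anything below 7 needs to be addressed

# Level 2 scores (1-10) grouped under their Level 1 categories.
# Only the Learnability breakdown comes from the example above;
# the other criteria names and scores are illustrative placeholders.
evaluation = {
    "Learnability": {"Ease of Use": 7, "Language": 3},  # averages to 5, as above
    "Effectiveness": {"Task Success": 7},
    "Attitude": {"Trust": 4},
    "Flexibility": {"Customization": 6},
    "Satisfaction": {"Enjoyment": 6},
}

# Average the Level 2 scores to get each Level 1 category score
level1_scores = {
    category: sum(scores.values()) / len(scores)
    for category, scores in evaluation.items()
}

# Sum the five Level 1 scores and multiply by 2 for an overall score out of 100
overall = 2 * sum(level1_scores.values())

# Flag the categories that fall below the improvement threshold
needs_work = [c for c, s in level1_scores.items() if s < LOW_SCORE_THRESHOLD]

print(level1_scores)  # {'Learnability': 5.0, 'Effectiveness': 7.0, 'Attitude': 4.0, ...}
print(overall)        # (5 + 7 + 4 + 6 + 6) * 2 = 56.0
print(needs_work)     # ['Learnability', 'Attitude', 'Flexibility', 'Satisfaction']
```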

To ensure consistency and accuracy, and to make the evaluation process easier to use with customers and internal stakeholders, the team created a form that leads an evaluator through each category and provides the score at the end. The evaluator is also given the option to add comments for each main category as well as overall feedback, so that scores can be explained and customers' improvement ideas can be recorded for action later.


Once the form is completed, the scores for each category and the overall usability score are revealed.


An example LEAFS Result
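
As a rough illustration of what such a form captures, the sketch below models a single evaluation record with per-category scores and comments plus overall feedback, reusing the same arithmetic as above. The structure and field names are assumptions made for this example, not Learning Pool's actual form schema.

```python
from dataclasses import dataclass, field

@dataclass
class CategoryResponse:
    scores: dict[str, int]      # Level 2 criterion -> score (1-10)
    comments: str = ""          # optional comments explaining the scores

@dataclass
class LEAFSEvaluation:
    evaluator: str
    responses: dict[str, CategoryResponse] = field(default_factory=dict)
    overall_feedback: str = ""

    def level1_scores(self) -> dict[str, float]:
        # Average each category's Level 2 scores to get its Level 1 score
        return {c: sum(r.scores.values()) / len(r.scores)
                for c, r in self.responses.items()}

    def overall_score(self) -> float:
        # Sum of the Level 1 scores, multiplied by 2, on a scale of 100
        return 2 * sum(self.level1_scores().values())

# Example record for one evaluator (two of the five categories shown)
record = LEAFSEvaluation(
    evaluator="Customer A",
    responses={
        "Learnability": CategoryResponse({"Ease of Use": 7, "Language": 3},
                                         comments="Menu labels use internal jargon."),
        "Satisfaction": CategoryResponse({"Enjoyment": 6}),
    },
    overall_feedback="Enrolment took too many steps.",
)
print(record.level1_scores())   # {'Learnability': 5.0, 'Satisfaction': 6.0}
```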

How LEAFS Is Used

When LEAFS was originally adapted for Learning Pool, the main use case was to assess the current level of usability. As it was used more and more, the team identified that it could be useful in other situations. One of the benefits of the LEAFS system is that it is flexible enough to be applied at every stage of the UX Design process.

Establishing a Baseline

Before a design project can begin, it’s important to establish a baseline of usability. LEAFS can be used with internal stakeholders and with customers to determine the current user experience of a specific area of the product, a specific workflow, or even a specific course. The scores and the feedback are compiled to highlight areas for improvement.

Competitive Analysis

Product comparisons often focus on whether products contain specific features, not on how easy or hard those features are to use. Using LEAFS to score both internal and competitor products gives a better understanding of the user experience in each, along with the strengths and weaknesses of each product.

Critiquing a Design

One of the hardest skills to master in UX Design is critiquing the work of others. Design critiques can often result in hurt feelings, embarrassment, anxiety, insecurity, and stress for the whole team. When done correctly, everyone leaves motivated, with specific actions assigned that will improve their work. By using LEAFS as the basis for design critiques, the designer and the team share criteria ahead of time, resulting in better feedback, more focused responses, and actionable items that everyone can incorporate into their own work going forward. It also allows designers to show improvement between iterations of the same design when presenting to stakeholders. In an agile development environment, there can be several iterations before a design is released to customers, and it is important to show progress with each version.

Usability Testing

The most common use of LEAFS is during usability testing with internal stakeholders and customers. The UX team gives a customer or stakeholder a specific task, such as enrolling in a particular course, then takes them through the LEAFS evaluation process and records their responses. The benefit of LEAFS in this use case is that it looks holistically at all aspects of the design, not just whether the user can complete the task.

What’s Next for LEAFS

Learning Pool has already seen the benefits of applying LEAFS to the design process with the updated Dashboard UX/UI for its Platform. A huge success at Learning Pool Live 2022, the new Dashboard increased the usability score by over 50%, giving the Learning Pool Platform a competitive advantage in the learning industry. The UX team is excited to continue using this system across the full Learning Pool Learning Suite.

Written by:

Dr. Page Chen, Chief Experience Officer

Derek Doherty, Head of UX

Kristin Walker, Sr. Product Researcher
