Background: Online learning at work
Recently Pearson created a new team to explore how to address the highly competitive lifelong-learner market. We chose to address the segment of recent graduates who are employed and need to improve their skills, either to raise their standing on their current team or to find a new position. Our plan was to develop a minimum viable product, test it in a beta release, learn from that, and then make modifications in an effort to reach a good product-market fit.
The product, called Pearson Career Courses, was released in beta in July 2020 and to the public that November.
Pearson Career Courses
Online learning platform combining asynchronous course material, individual mentoring, and real-world projects
Target audience: Early-career employees and individual mentors and reviewers
Problems to solve: Boost confidence and skills, strengthen teams, increase recognition internally and externally, and incentivize mentoring
Business challenge: Could we create a learning model that delivers valuable experiences that are personalized, relevant, scalable, and repeatable?
Team and Roles
As the senior UX designer for the business-to-business version of the product, I worked with a visual designer, a UX design lead, UX researchers, writers, product managers, and developers. The team was spun up during COVID restrictions and was completely remote, based in Atlanta, Denver, San Jose, Boston, New York, Chicago, London and Chennai.
Some of the work I was responsible for included:
User journey maps
Detailed wireframes, with annotations for developers
Visual design system, including styles and components (in Figma)
Flow and pay-point analysis
User testing assets and questions
In the discovery phase of the project, we conducted user interviews. We knew that employers valued workers who continually develop their skills, and we had learned what these employers valued most in a non-degree credential, but we wanted to talk to them to understand more about why. We focused on mid-level managers at large enterprises, as they were likely to have early-career employees as well as a budget for learning and development.
Questions we explored
What challenges do managers currently have for developing their teams?
What are the motivations and barriers for managers to adopt a professional learning system with real work deliverables?
What do managers expect their own involvement to be in proposed learning experiences that include real work deliverables?
What do managers need to demonstrate outcomes of learning?
Understanding types of users and improving onboarding
Managers were open to the benefits to their team from the courses and mentoring. However, they needed to have more guidance to determine what content or courses to pair with which team members. Merely seeing a catalog of options put many of them at a loss for what to do. That posed a problem for the system as a whole because it relied on managers to assign courses to their team members.
We improved the onboarding experience for managers. Instead of a “tour of the product,” the new onboarding focused solely on the manager’s goal of matching a single course to specific team members. A future improvement might entail offloading that work from managers to an assessment process that individual learners would use to find their own recommended courses. A manager could still override, but inaction on the manager’s part would no longer jeopardize a whole team’s successful use of the service.
A/B testing showed a stark difference in effectiveness
We wanted to know whether our marketing landing page conveyed the value propositions for the product, so we put together an A/B test using UserTesting.com. The two designs were very different from each other, although they used similar copy. Version A used the corporate style and approach currently live across Pearson, while version B used a new design, informed by user research, intended to highlight the key value propositions and product features.
Testers were divided into two groups. Each group only saw one version, never both.
Questions we asked
What is this product offering?
What about this product stands out?
What do you think about the price?
Results: Version A failed to convey the value props as well as B.
We were surprised to learn not only that one version was much better at conveying the key differentiating features, but that it was also more likely to be described as priced fairly. Both A and B presented the same price for courses.
Version A: Most testers thought the price was “a bit too much,” by a wide margin.
Version B: Responses were more evenly distributed, leaning toward “not too expensive” or “cheap” (their words).
Results: Version B communicated the salient product features better than version A.
“What you'll gain” section
“Learn it, Try it, Use it” structure
Portfolio, work samples
Working with a mentor
Doing a real-world project
Version B more successfully conveyed the key value propositions of working with a mentor and doing a real-world project. This awareness may have influenced the opinions testers had about the course pricing.
Among version A testers, awareness of working with a mentor barely registered, and no one noticed that real-world projects and in-person video calls were part of the course experience. Those are the most important value propositions and differentiators.
Using a new visual design within a corporate system
Visual design research and testing revealed a distinct preference for a more exciting palette and a more modern typographical approach. Our visual designers created a theme that harmonized with the overall corporate brand style guide, yet allowed us freedom to explore new solutions.
We translated the visual theme into UI design and I created a style guide for our design and engineering teams. Creating and maintaining a shared component library in Figma allowed us all to move quickly from wireframes to final visual design, not just for these landing pages, but for the entire product.
Process flows: Connecting teams and tasks
I created process flows for most of the user tasks. These included two classes of user we called the “learner” and the “manager.” The latter could be anyone whose role is to review and give feedback on a learner's work.
This wireframe shows a manager task: the high-level view for assigning learners from your team to a particular course.
Dashboard design: Surfacing actionable items for time-strapped managers
My approach to dashboard design was guided by the fact that we were creating a new product. We knew that there could be a need for all sorts of information and metrics, as well as the ability to perform many different actions. But we decided to start with the minimal set of information and actions needed to successfully use the product. Later we would let user feedback and usage analytics guide the introduction of new dashboard features and functions, or their removal.
The dashboard for managers allows them to add people to their team. They become learners, which managers can then assign to a course. In addition, the dashboard is the place to see the status of each learner, as well as to take actions, such as reviewing their submitted work.
With any sort of list view, there are questions about sort order and display logic. It's not enough to simply design how a list looks, or even to design a few responsive configurations. Developers need to know what logic determines the visual layout and behavior of the list.
I indicated the core ideas within the design file. Additional detail is provided in separate documentation.
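To give a sense of what that documented logic looked like, here is a minimal sketch in TypeScript. The status names, priority ordering, and tiebreaker are hypothetical stand-ins, not the product's actual rules: rows that need manager action sort to the top, and ties break by most recent activity.

```typescript
type LearnerStatus = "not_started" | "in_progress" | "submitted" | "reviewed";

interface LearnerRow {
  name: string;
  status: LearnerStatus;
  lastActivity: number; // epoch milliseconds of the learner's last action
}

// Hypothetical priority: "submitted" work needs the manager's review,
// so it surfaces first; fully reviewed learners sink to the bottom.
const actionPriority: Record<LearnerStatus, number> = {
  submitted: 0,
  in_progress: 1,
  not_started: 2,
  reviewed: 3,
};

// Sort by action priority, then by recency within each priority group.
function sortLearnerRows(rows: LearnerRow[]): LearnerRow[] {
  return [...rows].sort(
    (a, b) =>
      actionPriority[a.status] - actionPriority[b.status] ||
      b.lastActivity - a.lastActivity
  );
}
```

Encoding the rule as data (the priority map) rather than a chain of conditionals makes it easy for designers and developers to review, and to change, the ordering in one place.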
It's important to keep busy managers and their employees informed about the status of a weeks-long course, including when there is work to review, whether it has been reviewed, and whether revisions need to be made. Our research clearly showed that managers would not regularly “check in” to the application. So we created an email and push notification strategy.
The messaging strategy was similar to the dashboard approach: provide the minimal number of messages needed to keep things on track. I prioritized those points in the process where one user’s inaction would block another’s ability to use the product.
We limited messaging to the smallest number of notices and nudges. We used beta test data to revise the notifications to include reminders where we had seen inactivity.
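The "nudge only at blocking points" idea can be sketched as a simple trigger rule. This is an illustration under assumed names and thresholds, not the product's actual notification logic: a manager is nudged only when a learner's submitted work has sat unreviewed long enough to block the learner's progress.

```typescript
interface SubmissionState {
  submittedAt: number | null; // when the learner submitted work (epoch ms)
  reviewedAt: number | null;  // when the manager reviewed it, if ever
}

const DAY_MS = 24 * 60 * 60 * 1000;

// Nudge the manager only when their inaction blocks the learner:
// work was submitted, has not been reviewed, and the wait has
// exceeded a (hypothetical) threshold.
function shouldNudgeManager(
  s: SubmissionState,
  now: number,
  thresholdDays = 3
): boolean {
  return (
    s.submittedAt !== null &&
    s.reviewedAt === null &&
    now - s.submittedAt >= thresholdDays * DAY_MS
  );
}
```

Tuning the threshold from beta inactivity data, rather than messaging on every event, keeps the total notice count low while still unblocking stalled learners.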
Eliminate friction by showing less information
It's very easy to try to be so helpful to users that every situation, every case you can think of begets a solution or a feature. Trust that the research and the solutions you design actually deliver for the users. Of course, we don't want things to be unclear or frustrating, but providing too many nudges, too many functions, too much explanation to consider and learn can be equally frustrating, if not more so.
What if people simply stop using it?
My biggest concern was that users would sign up to take a course and then never return to finish it. Online courses have a dismal completion rate that only slightly improves when users pay for their courses. Would the fact that these learners were being assigned a course by their managers make a difference? Would a spot-on recommendation engine or process increase completion rates? What about recognizing and celebrating success? That was an important outcome to learners, but was not a part of the beta product.
In our beta test, we did, in fact, see low completion rates. This suggests that acquiring and retaining engaged customers who repeatedly interact with each other and with the product could be an uphill climb.
Compelling content and valuable interactions with real people beat nudges and reminders
I realized that a robust messaging/reminder strategy is only effective if the core content and interactions are intrinsically important or compelling to the users. No amount of nudging was going to get an uninterested learner to get back to their course. The bottom line is that focusing on creating engaging and highly relevant course content is a far more effective retention strategy than focusing on building ever more messaging to try to keep users interested.
Without a great learning experience, no amount of nudging will convert a disaffected user into a fan.
The next phase of the project intends to address this by focusing on these important findings:
Engagement with the manager/mentor created very positive feelings among learners, which boosted completion rates
Compelling course content that is relevant and exciting keeps learners involved
Being able to shine in front of co-workers and managers, or in a job interview, increases the likelihood that users share their positive experiences in their social circles.
Ultimately, the main lesson is a basic one: The core interactions and experiences need to be more compelling to users. We focused too much on the supporting experiences like uploading and commenting on files, setting up an account, and the minutiae of rescheduling live learner/mentor calls. That left little time for all of us on the team to think about improving the core interactions that are at the heart of the experience.