UX Strategists


This report seeks to demonstrate that applying an evidence-based methodology that involves end users throughout the creative process resulted in the identification, conceptualisation and design of a solution which is measurably easier to use, learn and remember.

The Project

I work as a lead experience designer for one of the largest British organisations providing infrastructure consultancy, facilities management, property management, energy, and healthcare services in the United Kingdom and Ireland. As part of my role, I was asked to consult on UX for the E-Timesheet solution, which allows the company's mobile engineering workforce to input information regarding:

  • core working hours 
  • overtime/standby
  • annual leave/sickness 

This new digital solution was going to replace the paper-based processes the company used at the time.

The Problem

The company had previously attempted to design a solution without involving the end users, entirely skipping the discovery phase and UX input. The project was considered straightforward, so the business unit formed project requirements as outputs (rather than outcomes) and handed them over directly to the development team.

As a result of this approach, users did not adopt the solution that was created and there was little budget left to fix all the problems.

When the business realised that end users were not able to use the solution, they asked me to identify the problems and define a plan of action to roll out the solution as quickly as possible.

The Solution

I set up a "product squad" that included people from various departments to help me apply an evidence-based methodology involving end users throughout the creative process, in order to identify, conceptualise and design a solution that would be measurably easier to use, learn and remember.

Introducing the Design Process

It was clear that the previous solution lacked the essential user insights that would help the organisation achieve its goal. Without these key insights, the entire development team would be guessing at the best solution. To get out of this deadlock, I had to capture and understand customer needs.

In order to illustrate our design thinking process, I divided it into four distinct phases – Empathise, Reframe, Create and Prototype – following the Double Diamond approach that was invented by the British Design Council back in 2005.

Double Diamond Design Thinking Model


First diamond
Doing Right Things

Step 1
Defining Research Method

Design Thinking is a model rooted in physical activity: designing solutions. Understanding user needs is a semantic activity and demands a semantic model, so within the empathise phase I decided to embrace the idea of the customer's "job-to-be-done" and use the Outcome-Driven Innovation (ODI) framework, which puts jobs-to-be-done theory into practice.


Step 2 
Identifying Markets

By applying the ODI framework, I extracted the job executors and their core functional jobs from the initial problem statement provided by the business.

Step 3

Job Mapping

Identifying the core functional jobs allowed me to select the right participants for the end-user interviews. These interviews started the journey of quantifying exactly where users are underserved and where we should focus our efforts while developing our solution.

Survey structure to obtain importance and satisfaction data


Step 4

Survey (Quantitative research)

In order to validate the insights generated through the interviews, I prepared separate surveys for each market and sent them to over 350 users.

Participants were asked to rate all the outcome statements collected during the interviews by importance and current satisfaction level. 


Step 5
Opportunity Landscape

The opportunity scores obtained in this way for each desired outcome reveal where the market is underserved, where it is overserved, and what our priorities should be for improving the customer experience.
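As a minimal sketch of how such scores can be computed: Ulwick's commonly cited ODI formula is opportunity = importance + max(importance − satisfaction, 0), where importance and satisfaction come from the survey ratings described above. The outcome statements and numbers below are made up for illustration and are not the project's actual survey data.

```python
def opportunity_score(importance: float, satisfaction: float) -> float:
    """ODI opportunity score for one desired outcome (Ulwick's formula).

    Importance and satisfaction are on a 0-10 scale. Satisfaction above
    importance adds nothing (the max() floors the gap at zero), so an
    overserved outcome simply scores its importance.
    """
    return importance + max(importance - satisfaction, 0)

# Hypothetical outcome statements mapped to (importance, satisfaction).
outcomes = {
    "Minimise time to record core working hours": (8.6, 4.2),
    "Minimise errors when logging overtime": (7.9, 5.1),
    "Minimise effort to check remaining annual leave": (6.4, 7.0),
}

# Rank outcomes: high-importance, low-satisfaction first.
ranked = sorted(
    ((opportunity_score(i, s), name) for name, (i, s) in outcomes.items()),
    reverse=True,
)
for score, name in ranked:
    print(f"{score:4.1f}  {name}")
```

Outcomes scoring well above their importance (a large importance–satisfaction gap) mark the underserved areas; scores equal to importance mark adequately served or overserved ones.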


Step 6

How Might We Notes

We prioritised the outcomes that were highly important but poorly satisfied and turned them into How Might We questions.

We used the How Might We format because it suggests that a solution is possible and because it offered our team the chance to answer each question in a variety of ways.

Second diamond
Doing Things Right

Step 1
Comparable Problem

Looking at the issues articulated in the How Might We format, we knew there was no point in reinventing the wheel; sometimes the best ideas already exist.

We researched ideas from related industries and reviewed similar business problems and their solutions in order to extract the best of them.


Step 2

Crazy 8's

The How Might We notes became the starting point for divergent thinking and allowed us to run a Crazy 8's exercise.
It was a fast sketching exercise that challenged every member of our team to sketch eight distinct ideas in eight minutes.

The goal was to push beyond our first idea, frequently the least innovative, and to generate a wide variety of solutions to our challenges.


Step 3

Dot Voting

Each person had three minutes to present their solution while the whole team asked questions and discussed the details of the sketch.

Each team member then voted for the best ideas by placing their three dots.


Step 4

Rapid Prototyping

Because I had already established a comprehensive design system of assets, we could move into rapid prototyping almost instantly, producing clickable prototypes that could be validated with end users.

Validation by Usability Testing

In order to validate that our solution was measurably easier to use, learn and remember, we asked 8 participants (engineers) to try to accomplish 4 real-life scenarios while we recorded their screen, the way they handled their phone, and their face and body language.


We split participants into 2 groups:

  • The 1st group had never seen the application before.
  • The 2nd group had been using the previous version for the last 5 weeks and had complained about it.


The results of the Usability Session

Existing users went through all given scenarios without any issues, while the group made up of new users helped us to identify additional improvements.

"Quite user-friendly, quick to touch. You can go backward and forwards on dates very quickly.
Self-explanatory… is like riding the bike, you won't really forget it."
Participant of the Usability Testing session

Final thoughts

During the Usability Session, we’ve identified 35 problems. We’ve broken them down into 3 categories.

  • Business rules - 11 problems
  • UX improvements - 22 problems
  • Technical issues - 2 problems

The problems classified as business rules or UX improvements did not stop us from going live; the problems classified as technical issues were critical and were fixed before we rolled out the solution.


Thank you...

Richard Catterill for helping conduct user interviews and for providing daily support to connect all the dots that I couldn't see.