Camille Horstmann
UX designer

Improving Availyst Usability

Usability evaluation for a delivery management platform aimed at saving users money.


Availyst is a startup delivery management platform that shows local options for ordering food, groceries, takeout, convenience items, and spirits. Users can choose from an array of third-party delivery services, which is most convenient when they are trying to find the cheapest delivery fee for the service they need.

For this project, I contributed to brainstorming, scripting user tasks, identifying key issues to address, and presenting our findings to the client.

Time: 3 Weeks (June)

Role: User Testing

Type: Mobile Application

Tools: Figma, Miro

The Challenge

The beta Availyst app needed usability testing to find areas of improvement and opportunity, as well as usability successes.

Our team of five was tasked with completing 15 usability tests over a 3-week period, testing individuals from our own networks who used mobile food delivery platforms.

At the end of testing, we collected and narrowed down our findings, illustrating key ones with mockups to present to the client.

The Process

  1. Beta App Install & Familiarization/Initial Observations

  2. Brainstorm Usability Tasks and Create Script Questions

  3. Finalize Script and Receive Client Approval

  4. Complete Usability Testing 

  5. Regroup, Identify, and Compile Issues/Recommendations

  6. Present Findings to the Client

Initial Observations



  • Minor technical issues with the filtering and search features

  • Confusing experience when selecting a delivery option

  • Menus were included for some restaurant options, but not others

  • The home page appeared to change after the initial opening of the app

  • Difficulty in adding new addresses

  • Some users might already have subscribed to one or more of the delivery vendors, which could limit the need for the product. 

Brainstorming & Prioritizing User Tasks

We needed to create tasks that would allow the users to find and identify the limitations of the app. Together we compiled a list of potential tasks that we could draw on to create the testing script.

User Testing Script

Team Zoom Call


The final script consisted of four main tasks and a bonus task, with us as interviewers staying alert for any heuristic observations. The script was then sent to the client, who approved it with some small tweaks to the wording.

Task 1: Signing up

The sign-up process for any digital product is often a new user's first interaction. To make sure this was received well, we made it our first task.

Task 2: Identify Natural Flow

The next task was to select a restaurant and walk through the next steps of building an order. Our intentions behind this step were to make sure the objective of the app was clear as well as to see what was intuitive and what needed clarification.

Task 3: Selecting Filters

The app allowed for many filtering options, thus the next task was focused on how users responded to the various filters and their functions. 

Task 4: Selecting a Delivery Option

The final task was focused on the overall goal of the app, which is to help users choose the cheapest delivery platform available. This task helped us see how easily this could be done, as well as highlighting any sticking points that were still present.

User Testing
Visual 1: Trends, Pain Points, and Wins & Low to High Priority


We had all agreed at the first meeting to split the usability tests equally among the team, meaning each of us was responsible for conducting and recording three tests. Each user recorded their own screen and agreed to be recorded during the test.

We each gathered some great feedback and met as a team to discuss findings and recommendations for the client. Using Miro, we compiled our highest-priority observations and other notes, breaking them into three categories: Trends, Pain Points, and Wins (see visual 1).

Issues and Recommendations

Visual 2: Detailed Suggestions



Our data went into a table of Low to High Priority (see visual 1). We then detailed a final list of 16 recommendations based on the testing feedback (see visual 2). From these, we narrowed our suggestions down to the 5 main issues we felt most needed to be addressed in our presentation to the client.


  1. The app has a “swipe up” feature for showing delivery options to the user. Once an option was selected, the app sent the user to a third-party app or website where they could build their order.

  2. Users were able to read the menus of many restaurants but they could not build an order from the menu given.

  3. The initial login Home Page was not the same as when a user already in the app pressed the “Home” tab.

  4. The delivery options were labeled in name only. 

  5. Users weren’t able to deselect filters.

Reasons to Address

  1. Most of the users tested came across this feature by accident or missed it entirely and were confused about how to continue the task.

  2. Being transferred out of the app to a third-party site was confusing for many users.

  3. Although the initial Home Page and the pop-up had the same information, the visual differences between the two were confusing to some users.

  4. When seeing delivery options in name only, the names could appear to be tags/keywords rather than companies.

  5. Filters remained selected once chosen, and users seemed unclear on how to deselect them. This also meant that users who reentered the app already had preset filters limiting their choices from the beginning.


Recommendations

  1. Redesign the delivery option selection and clarify the “swipe up” feature.

  2. Allow users to add their food orders directly on the app.

  3. Keep the Home tab consistent with the initial login Home Page for selecting between services.

  4. Add vendor logos under delivery options.

  5. Add “clear/deselect” options to all filters, or offer the option to clear filters when returning to the app.


Presenting to the client

For our final presentation, we walked through each step of the process, sharing some of our initial observations, the demographics of the users we tested, wins, and areas of opportunity. We shared our drive containing recordings of all the interviews, as well as our plan of action, brainstorming, and final presentation. The rest of the time was dedicated to walking through the high-priority issues we recommended be addressed.

Visual on the left shows issue #4; visual on the right shows our recommendation.


Closing thoughts

The presentation went well, and the client was happy and open to hearing our feedback. She mentioned that some of the issues we had come across were technical bugs that had already been addressed; she planned to revisit the other issues we presented as soon as possible.

Our team worked very well together. Communication was clear and timelines were met throughout the entire project! 

The main takeaways from this project are: communicate, contribute where you can, be an active listener, ask questions of those with more experience, and have fun!