Ryo
Product Designer

Methodify

Transforming the product development process with user-centered design

Overview

Methodify is a market research platform where businesses create custom surveys to learn what their customers are thinking.

Problem: I was tasked with designing new features and redesigning the app's UX. However, I quickly discovered that there were fundamental problems in Methodify's development process, and misunderstandings about UX in general.

Solution: I introduced my user-centered approach — including UX research to test assumptions — before designing solutions that improved the app's usability and saved the company a great deal of time and money.

My Role: I led the UX research, served as the UX & UI designer, and acted as product manager for the Methodify 2.0 redesign.

My Team: I worked with eight developers and the director of development.

Tools: I used a variety of tools to complete this project:

  • Pen & Paper — Sketches & paper prototypes

  • Whimsical — App flow & wireframes

  • Inspectlet & Hotjar — Web analytics

  • Sketch — UX research visualizations, UI design

Context

What I was Told

When I started my UX work there, I was told that Methodify was yielding low revenue and had low customer retention. They weren't sure why, but they predicted that adding more features would be the solution.

However, it quickly became clear that the company did not have a firm grasp on what the problem was. In fact, there were misunderstandings about UX, and its role in product development. The best way to explain this is with the comic below.


The Role of UX

The misunderstandings about the role of UX meant that I initially had to conduct research outside of work hours, at least until its value was recognized. A large part of my job therefore became educating colleagues about the user-centered design process.

Throughout my time there, I ran workshops for my colleagues on topics such as the purpose and power of user research, how to utilize user personas, the role of user stories and acceptance criteria, and usability testing.

Research

User Research

With so few repeat customers, I had to supplement the qualitative data from my three user interviews with secondary research. I used a large market research industry report (the GRIT report) and web analytics to validate and expand on my primary research.

User Persona

There was enough data to make one detailed user persona, which was instrumental in helping the team communicate about the users. While I was tempted to create additional assumption personas and validate them with further data, I felt that this would confuse my colleagues, who had never seen or utilized one before.

Identifying the Problems

Three Assumptions

As the comic above indicated, there were 3 widely-held assumptions about the app:

  1. The app is used as a presentational tool

  2. The app is used on the go (i.e., the need for mobile)

  3. Users go through a 3-step process:

    1. Make questions

    2. Get answers

    3. Receive a report

From my user research, I concluded that each of these assumptions was false. Furthermore, making important design decisions based on these assumptions had been incredibly detrimental and costly for the company, as I explain in the following sub-sections.

Assumption #1 – Presentational Tool

The assumption that the app would be used as a presentational tool caused the team to create features no one ended up using (e.g., showing all charts in a small grid), while neglecting basic things that users actually wanted. Unfortunately...

  • The UX is extremely cumbersome – for example, in order to filter the data (e.g., by age or gender), the user had to watch a loading screen for 20-30 seconds

  • Users do not want to present from a website, especially when they cannot brand the charts (e.g., with their company colours/styles)

  • Users simply preferred PowerPoint over every other presentation tool

Assumption #2 – Mobile Users

The web analytics indicated that the vast majority of users were on Windows computers. And as it turns out, users do not want to spend thousands of dollars on surveys that they would have to type out on their phones.

Designs for web (left) & mobile (right)


This assumption caused two main problems:

First, an expensive design agency (hired prior to my employment) had doubled their work by making two versions of the app (web and mobile). The mobile version was worthless because it was never used.

Second, the decision to restrict the website's width to 940 pixels might have been justified if the site were actually responsive. Instead, it needlessly caused problems for later features, with no benefit.



Assumption #3 – Three-Step Process

The research showed that our app neglected a fourth step:

          4. Prepare data (for presentation/delivery)

In fact, this assumption led to more resources being allocated to the survey creation process instead of the report of the results, which is by far what users care about most.

This explained the low customer retention — the app was not meeting users' needs.


I made the User Scenario Map below to help convey the fourth step that the research indicated. Users were struggling through the filtering of data in the report, and they generally used PowerPoint to redesign the charts that we provided for them.

Scope of the Redesign

Having successfully demonstrated the value of my UX research, I was tasked with doing a complete redesign of the app's UX and UI.


But considering our limited resources and the need for a high ROI, I recommended that we first focus on the report. As you can see pictured here, there are many issues to address.

The research strongly suggests that improving this page would have the biggest impact.



Planning Methodify 2.0

The Plan

I wanted to be sure we were solving the right problem, so I made a four-point plan.

  1. Identify user pain points

  2. Strategize for our business needs

  3. Brainstorm & sketch ideas

  4. Prioritize features & make decisions on solutions

Identify User Pain Points

Pain points are opportunities to improve the UX. I made a User Journey Map to communicate them to the team.


Strategize for our business needs

We are currently unable to track users' behaviour as soon as they leave the app to use PowerPoint. This creates blind-spots for us, hindering our ability to understand and adapt to their needs going forward. Therefore, the redesign should keep users inside the app for longer, to enable us to gather data on how they actually use it.

Colleagues in sales, marketing, customer success, and other departments were also spending time teaching customers how to use the app. I wanted the UX to speak for itself, so as to minimize the time they spent manually answering questions.


Brainstorming & Sketching

I ran some design workshops with a cross-functional team (including management), in order to include others in the process, and generate ideas. 

This was very useful in determining what we did and didn't want to include. For example, we realized that we didn't want the app to become an elaborate scientific research platform, but more of a simple "do it yourself" tool.




User Stories

I created user stories for the user persona, and placed the 10 most critical ones inside a Red Routes Matrix. This helped determine what to focus on in my redesign.



Design & Validation

Solutions

I redesigned the report page, and created a second page that works in tandem with the report. These are represented by these two features:

  • Feature #1: Improved Data Manipulation

  • Feature #2: Story Builder

Feature # 1: Improved Data Manipulation

As demonstrated by the User Journey Map, filtering data is very problematic. Users must scroll to the top, open the filter menu, select filters, click "Submit Filters", wait 20-30 seconds for the page to reload, scroll back down to the charts, and click through a carousel to view each chart. Users must repeat this process every time they want to filter the data.


Vertical Scroll instead of Carousel

My redesign has users scrolling vertically through charts rather than clicking through a carousel, and I worked with the development team to remodel the data to allow real-time filtering — no more lengthy loading times.

When you click a filter, it affects all the data on screen, including each chart on the right.
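The idea behind the remodel can be sketched as follows. This is a minimal illustration in Python, assuming a simplified record-per-response data model; the field names and values are illustrative, not Methodify's actual schema. Because every response is held client-side, a filter is just an in-memory pass over the records, so all charts can update instantly instead of waiting on a 20-30 second server round trip.

```python
# Hypothetical remodelled data: one record per survey response,
# kept in memory so filters never require a page reload.
responses = [
    {"age": "18-34", "gender": "F", "question": "Q1", "answer": "Yes"},
    {"age": "35-54", "gender": "M", "question": "Q1", "answer": "No"},
    {"age": "18-34", "gender": "M", "question": "Q1", "answer": "Yes"},
]

def apply_filters(responses, filters):
    """Return only the responses matching every active filter."""
    return [
        r for r in responses
        if all(r.get(field) == value for field, value in filters.items())
    ]

# Clicking the "18-34" age filter re-derives every chart from this subset.
filtered = apply_filters(responses, {"age": "18-34"})
```

Each chart then recomputes its counts from `filtered`, which is why one click can affect all the data on screen at once.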


Dynamic Filter Bar

The filter bar is also fixed to the left side of the screen, with a human-readable sentence at the bottom that tells users exactly what is actively being filtered.
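The summary sentence can be generated mechanically from the active filters. Here is a minimal sketch, assuming filters are stored as field/value pairs (the wording and field names are illustrative, not the shipped copy):

```python
def filter_sentence(filters):
    """Build the human-readable summary shown at the bottom of the filter bar."""
    if not filters:
        return "Showing all responses."
    parts = [f"{field} is {value}" for field, value in filters.items()]
    return "Showing responses where " + " and ".join(parts) + "."

sentence = filter_sentence({"age": "18-34", "gender": "F"})
# e.g. "Showing responses where age is 18-34 and gender is F."
```

Regenerating the sentence on every filter change keeps it impossible for the summary to drift out of sync with what is actually being filtered.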



Custom Chart Creation

In addition to filtering data by specific answers to questions, you can use those filters to create your own chart by clicking the bar-chart icon in the filter bar (the new chart appears in a modal).

Feature #2: Story Builder

This is a presentation-making tool that is not meant to compete with or replace PowerPoint, but to capitalize on users' familiarity with it. Users create "Stories": slideshows they can use to present their data. Stories consist of data that users themselves select from the report page, which allows separate Stories to be made from the same data set.

Users will be able to select the layouts and colour schemes of slides — things that make it easier to prepare for presentations.

The flow for adding a chart to a new slide


As you can see, the strategic use of overlays is meant to guide users through the flow without burdening or confusing them with lengthy instructions.


Instead of users customizing their presentations inside PowerPoint every time, this UX enables them to do so once and then save their stylings for future Stories. This increases the value of future usage, which should help with customer retention. We don't yet have data to show whether this will be the case (it had not been implemented at the time of writing); but at least by keeping users in the app, we can observe their behaviour and make adjustments accordingly.

Various other screens related to merged, filtered, and imported charts


Validating the Designs

Two usability tests I conducted on the current app showed that users found the report UX to be cumbersome and "a bit painful." The redesigned UX enabled people to complete far more tasks, and was found to have substantially improved efficiency and user satisfaction.

Screenshots of recorded usability tests


Throughout the design process, I made paper prototypes to run usability tests. There were eleven test sessions in total, with roughly eight tasks per test.

Tasks included filtering data, adding charts to a story, customizing the slides, and downloading the presentation. Pictured here are some images from actual usability test footage.

Satisfaction

I made a seven-question survey, which I administered at the end of each usability test so I could compare the data across participants. After averaging the responses on a 5-point scale (from strongly disagree to strongly agree), the results were overwhelmingly positive:

  • This system was unnecessarily complex: Disagree

  • I thought it was easy to use: Strongly agree

  • The various functions were well integrated: Strongly agree

  • There’s too much inconsistency in it: Disagree

  • I think most people would learn to use it very quickly: Agree

  • It’s very cumbersome to use: Strongly disagree

  • I felt very confident using this system: Agree
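The per-question labels above come from averaging the participants' scores and snapping the mean back onto the scale. A minimal sketch of that calculation, with made-up example scores (the real per-participant responses are not reproduced here):

```python
# 5-point Likert scale, 1 = strongly disagree ... 5 = strongly agree
labels = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

def average_label(scores):
    """Average the scores for one question and round to the nearest scale label."""
    mean = sum(scores) / len(scores)
    return labels[round(mean) - 1]

# Hypothetical responses from eleven sessions to "I thought it was easy to use"
example = average_label([5, 5, 4, 5, 5, 4, 5, 5, 4, 5, 5])
```

Note that the negatively worded items ("unnecessarily complex", "too much inconsistency", "very cumbersome") are scored the same way, which is why a low average there reads as a positive result.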

Some specific feedback from tests led to small improvements in the experience; but the overall feedback was quite consistent, and highly positive. As one tester put it, “Once I understood how to use it, it was all really easy.”

Results
Report Page, with the new filter bar fixed on the left side


Realistic example filtered, with the user typing out their own analysis


Takeaways

What I learnt

  • Devs have great ideas for UX too: Even though I was the only designer on the team, working closely with the developers was very helpful, and they provided a lot of insight that informed my decisions

  • Don't make major design decisions based on assumptions: I am glad I did research upfront, because it let me avoid the mistakes the previous designers had made

  • Don't work in a silo (especially when there is low UX maturity): Try to include other stakeholders in the process so they understand what you're doing and why. This prevents push-back that may impede progress.

  • Implement analytics early: We did not have systems in place to get baseline levels of certain behaviours until later in the process, but it is always helpful to be able to monitor how design iterations influence behaviour over time.

My biggest challenge: My biggest challenge (and in retrospect, the most interesting) was discovering the true problems, since the information I was given turned out to be only a part of a much larger story.

Impact

Culture: First of all, I impacted the team's culture by changing our workflow, and the overall organization by demonstrating the value of UX research.

  • Virtually everyone started to understand the value of research, and management even wanted more personas for other departments as well

  • I had us integrate validation (i.e., user testing) into the development process

  • I influenced the way we communicated, by having people specifically refer to the User Persona by name, instead of "the user"

Usability: As shown above, my designs were substantial improvements, in terms of user satisfaction, task effectiveness and efficiency. They were also far more practical (i.e., directly addressing users' pain points).

ROI: Compared to the design agency, my design process was beneficial in several ways, mainly because I had the benefit of UX research, which helped me avoid the costly mistakes they made. Furthermore, my usability testing prevented a disastrous feature (which I did not design) from being launched.

  • 5x less expensive

  • 2x faster

  • 3x more practical

  • $45k+ saved for the company