Background
The advent of autonomous vehicles promises to increase mobility independence for individuals who currently lack the ability to drive. Concierge services are one method for connecting prospective passengers with autonomous vehicles. However, little work has examined inclusive autonomous vehicle concierge services for users with cognitive, motor, auditory, tactile, and visual impairments. This case study examines the process of designing an inclusive autonomous vehicle concierge application ecosystem using a remote form of the participatory design process.
Co-designers:
3 older adults (85+ years) who use assistive technology, living in Clemson, SC
My roles:
Interaction design, UX research


Literature Review
We reviewed extant academic literature focused on human interactions with concierge services (e.g., Uber and Lyft), autonomous vehicles, concierge applications, external and internal human-machine interfaces, vehicle interiors, and web-based platforms.
Archival data
We reviewed archival data from focus groups of blind and visually impaired (BVI) people discussing autonomous vehicles, and transformed their quotes into need statements.
Older adults:
8 sessions with older adults, 45 participants
52 statements about user needs
BVI:
8 sessions with BVI, 38 participants
57 statements about user needs
Interviews
I conducted in-depth interviews with each of the three co-designers to assess their perceptions of websites, mobile applications, autonomous cars, ride-sharing services, and a hypothetical ride-sharing service that used autonomous cars.
Summary
A total of 412 raw needs
A final total of 125 unique user needs

Finalize Key Needs
Affinity Diagram
Affinity diagramming of these needs revealed 22 primary need categories. For example, the two needs "The system drives safely near pedestrians" and "The system follows driving laws" were categorized under the primary need "The system is safe."
Importance Survey
Co-designers rated the importance of the 125 needs statements using an online survey. The results were aggregated and the needs list was rank-ordered by importance. We then used this needs hierarchy to guide the design ideation by prioritizing designing solutions for the most important needs first.
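The aggregation and rank-ordering step can be sketched as follows. This is a minimal illustration, not the actual survey pipeline: the need statements, rating scale, and scores below are hypothetical (only "The system is safe" appears in the text).

```python
from statistics import mean

# Hypothetical ratings: each of the three co-designers rated each need's
# importance (here on a 1-5 scale; the real survey's scale is not stated).
ratings = {
    "The system is safe": [5, 5, 5],
    "The system reduces cognitive load": [4, 5, 5],
    "The system is welcoming to users": [3, 4, 4],
}

# Aggregate each need's ratings (mean) and sort from most to least important.
hierarchy = sorted(ratings, key=lambda need: mean(ratings[need]), reverse=True)

for rank, need in enumerate(hierarchy, start=1):
    print(f"{rank}. {need} (mean importance {mean(ratings[need]):.2f})")
```

The resulting ordered list is the "needs hierarchy" used to prioritize which needs to design for first.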

Brainstorm
Each team member brainstormed a minimum of 6 design ideas for each system, and then we created an affinity diagram of the ideas in Miro and refined the ideas. The design ideas were translated into scenarios and storyboards.
Persona
The initial persona designs were presented to our co-designers for evaluation and feedback. A phone interview was scheduled with each co-designer after copies of the personas had been emailed in advance for review. While the co-designers suggested changes to the personas during the interviews, each indicated, or demonstrated, that they could answer questions and respond to situations from the point of view of the personas. We iterated on the personas based on our co-designers' feedback and suggestions, leading to the final personas.
Scenario & Storyboard
I combined the personas and the design ideas into scenarios and storyboards to tell and show the story of how our personas would use the design features we created. I then presented the scenarios and storyboards to our older adult co-designers; their feedback confirmed that we were on the right track.
Autonomous Car Concierge Service - Mobile App
Primary goals: enable users to book a ride, set up an account, and customize preferences to match their individual needs
Key elements
High contrast, minimized visual noise
Step-by-step account sign-up
Customizable interface and vehicle driving preferences
Voice interactions
Query-response-confirmation format
App asks preference
App confirms choice
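The query-response-confirmation format above can be sketched as a small dialog loop. This is a hypothetical illustration: the function, prompts, and options are invented, and the real app used speech input and output where this sketch substitutes scripted text.

```python
def confirm_preference(question, options, ask, listen):
    """Query-response-confirmation loop: ask a preference question,
    capture the user's answer, then read the choice back to confirm it."""
    while True:
        ask(question)                                  # query: app asks preference
        answer = listen()                              # response: user states a choice
        if answer not in options:
            ask(f"Sorry, I didn't catch that. Options: {', '.join(options)}.")
            continue
        ask(f"You chose {answer}. Is that correct?")   # confirmation: app confirms choice
        if listen().strip().lower() in ("yes", "y"):
            return answer

# Scripted replies stand in for speech recognition in this example.
replies = iter(["large text", "yes"])
choice = confirm_preference(
    "Would you like standard or large text?",
    options=["standard", "large text"],
    ask=print,
    listen=lambda: next(replies),
)
```

Reading the choice back before committing it gives users with visual or cognitive impairments a chance to catch recognition errors before they take effect.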
Autonomous Car Concierge Service - Web-based Platform
Key elements
Allows users to select alternate routes
Split costs with others
Interactive animations
Voice interactions
The platform is voice-enabled: users can book a ride, set up an account, edit preferences, or contact help entirely by voice.

RideEasy: in-vehicle human-machine interfaces (iHMI)
The iHMI was designed to give users control over their trips and also to serve as an entertainment service. It enables users to alter their destination mid-trip, pull over whenever they wish, and access help easily. To enhance trust in the vehicle, it provides constant visibility into the vehicle's intentions (e.g., current speed, alerts when the vehicle is about to overtake) along with live tracking of the trip. It also offers controls for temperature, seat position, and fan speed, and an integrated infotainment system.
RideEasy: Interior of Vehicle
Key Features
Ramp with handrail
Wi-Fi
First class plane-style seating
Extra leg room
Charging outlets
Self-securing wheelchair slots

RideEasy: external human-machine interfaces (eHMI)
The co-designers' feedback and suggestions were compiled into the final prototype. The final eHMI prototype consists of external spotlights, speakers, digital displays, video cameras, a keypad, color-changing door handles, proximity sensors, and an adjustable suspension. In addition, the vehicle is over 9 feet tall to allow ease of movement within the vehicle and to increase its visibility to those outside it. This final design fulfills many of the high-level user needs by creating a system that is safe, supports user independence, reduces cognitive load, is useful, and is welcoming to users.