Uber Eats App Design for Accessibility
An examination and redesign of the Uber Eats mobile application, focused on hands-free features.
Role
UX/UI Designer, UX Researcher, Prototyping, UX Developer
Software
Adobe XD, Adobe Illustrator, Photoshop, Atom
Duration
~ 1 Month
Context
This project focuses on users with temporary, situational, and permanent touch constraints when using the food delivery mobile application Uber Eats. Its purpose is to better understand users with these constraints and the potential benefits of a hands-free feature within the Uber Eats mobile application. Researching and analyzing the app's accessibility for users with touch constraints builds a clearer picture of what hands-free features could offer and how they might be implemented within the app itself.
Problem
For accessibility, the Uber Eats mobile application currently relies on external screen readers such as VoiceOver on iOS and TalkBack on Android. According to Uber's help page, these screen readers can read food options and screens aloud to users. However, users must still physically touch the screen and double-tap buttons, such as suggested addresses or "add to cart", in order to use the application successfully.
Therefore, Uber Eats does not yet offer a fully hands-free experience for users.
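To make that gap concrete, here is a minimal sketch in TypeScript for a hypothetical web view of the ordering flow; none of these identifiers come from Uber Eats' actual code. A screen reader can announce a labelled "add to cart" button, but activating it still requires a double-tap, whereas a hands-free path would map a recognized utterance directly to the same action.

```typescript
// Hypothetical sketch: screen-reader path vs. a voice-command path.
// All names here are illustrative, not Uber Eats' real code.

function addToCart(itemId: string): void {
  console.log(`Added ${itemId} to cart`);
}

// Screen-reader path: VoiceOver/TalkBack can *announce* this button through
// its accessible label, but the user must still double-tap to trigger it.
const button = document.createElement("button");
button.setAttribute("aria-label", "Add Spicy Ramen to cart");
button.addEventListener("click", () => addToCart("spicy-ramen"));
document.body.appendChild(button);

// Voice path: a recognized utterance maps straight to the same action,
// removing the touch step entirely.
const voiceCommands: Record<string, () => void> = {
  "add spicy ramen to my cart": () => addToCart("spicy-ramen"),
};

function handleUtterance(utterance: string): void {
  voiceCommands[utterance.toLowerCase().trim()]?.();
}
```

That second, voice-driven path is the experience this project explores.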
Problem Statement
How might we make the process of ordering food online through mobile applications, like Uber Eats, more hands-free for users?
Research
To better understand the user experience of the Uber Eats mobile application for those with touch constraints, I conducted a competitor analysis, several user interviews, and participant observations.
Competitor Analysis
One major competitor to Uber Eats is the food delivery company DoorDash. One of DoorDash's main features is its partnership with Google, which lets users place orders through Google services such as the voice assistant, Search, and Maps. Unlike Uber Eats, the interactive, hands-free option DoorDash provides through Google creates a more user-friendly, accessible experience for its users.
User Interviews & Observations
The interviews consisted of nine open-ended questions about the participants' experiences with and thoughts on the Uber Eats application and its accessibility. For the observations, each participant was asked to use the Uber Eats mobile application to order food (a minimum of three items) up to the checkout page. The target participants were people who had used Uber Eats at least once and had experienced, or currently have, touch constraints.
Key Insights:
People with touch constraints, especially more permanent touch disabilities, cannot use food delivery applications without someone to assist them
Current tools on mobile devices, like voice assistants and text-to-speech, are not enough to use Uber Eats and complete orders fully and efficiently without some form of touch
With more hands-free features, people with touch constraints would be more willing to use food delivery applications like Uber Eats
Define
Based on the gathered data and findings from my research, I created personas, context scenarios, and user journey maps for each type of user with a touch constraint.
Personas
Three user personas were created, one for each type of user with a touch constraint (situational, temporary, and permanent). I referred to the personas constantly throughout the redesign process to stay grounded in users' goals, motivations, and expectations.
Context Scenarios
In addition to the personas, a context scenario was created for each type of user with a touch constraint (situational, temporary, and permanent). The context scenarios helped me not only empathize further with users but also understand their goals and challenges in more depth through the power of storytelling.
User Journey Maps
Three user journey maps, one each for Gina, Sam, and Kathy, were also created to visualize the process users go through when ordering food in the Uber Eats mobile application with the hands-free voice feature.
Design
During the design stage, I first sketched out my ideas to explore the initial visual layout and the designs I wanted to include in the Uber Eats mobile application. I then created wireframes for user testing and for finalizing my designs.
Visual Design
Below are the colour schemes, icons, buttons, and typography I established for the redesign of the mobile application. I based these decisions on Uber Eats' brand guidelines so that my features would fit seamlessly with the rest of the application.
Low-Fidelity Wireframes
After sketching ideas, I created low-fidelity wireframes using basic greyscale shapes and text to visualize how I wanted each screen of the mobile application to look.
User Testing: Round 1
Using the low-fidelity wireframes, I tested my hands-free voice assistant feature with the same participants from the research phase.
After testing the wireframes, two of the three participants reported difficulty locating the voice assistant because it was not placed where voice functions typically appear. The participants understood the basic functions of the voice assistant and were pleased to hear the abilities the feature would (ideally) have. However, the feature's location, and the way it blocked part of an already busy screen, was not ideal.
User Testing: Round 2
After iterating on the design, I moved the filter button to the left of the Uber Eats address header and placed the voice assistant to its right.
During the second round of user testing, the participants could readily distinguish the feature on the screen and felt more inclined to use it because of its easy access. Through better organization and layout, and a clearer understanding of users and existing hands-free options, I was able to improve on my initial design.
High-Fidelity Wireframes
After iterating on the low-fidelity wireframes, I was able to finalize my design and create high-fidelity versions of the wireframes.
Prototype
With the high-fidelity wireframes, I created a prototype of my hands-free feature in Adobe XD, which can be tested below. Please press and hold the spacebar while talking to activate the feature. Holding down the spacebar simulates what the hands-free voice assistant would ideally do, without any touch, once fully developed and coded.
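For readers curious how the spacebar interaction maps to real behaviour, below is a minimal push-to-talk sketch in TypeScript using the browser's Web Speech API: hold the spacebar to listen, release to stop. This is my own illustration under those assumptions, not the prototype's actual implementation (Adobe XD handles the interaction internally).

```typescript
// Push-to-talk sketch: hold the spacebar to listen, release to stop.
// Assumes a browser exposing the (often webkit-prefixed) Web Speech API.

const SpeechRecognitionImpl =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

const recognition = new SpeechRecognitionImpl();
recognition.continuous = true;      // keep listening while the key is held
recognition.interimResults = false; // report only finalized phrases

recognition.onresult = (event: any) => {
  const result = event.results[event.results.length - 1];
  handlePhrase(result[0].transcript);
};

// Stand-in handler: a real app would route phrases to intents
// (search restaurants, add to cart, confirm order, and so on).
function handlePhrase(phrase: string): void {
  console.log(`Heard: ${phrase}`);
}

let listening = false;

document.addEventListener("keydown", (e) => {
  if (e.code === "Space" && !e.repeat && !listening) {
    listening = true;
    recognition.start(); // begin capturing speech while the key is down
  }
});

document.addEventListener("keyup", (e) => {
  if (e.code === "Space" && listening) {
    listening = false;
    recognition.stop(); // finalize and deliver the last result
  }
});
```

A fully hands-free version would replace the key events with a wake word or continuous listening; the spacebar here is only a stand-in for that trigger.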
Reflection
Through this case study, I learned a great deal about the time and number of iterations it takes to create a product or feature for a mobile application. At first, I believed my initial low-fidelity wireframes were user-friendly, but user testing and a better understanding of users showed me this was not the case. I saw how crucial it is to keep iterating and to be unafraid to make changes to designs.
The biggest struggle I experienced was ensuring that my voice assistant feature could help users hands-free while remaining minimal and straightforward to use. The feature supports a multitude of functions in the app, such as reading menus, searching for restaurants, and confirming orders. I had to be realistic about my ideas and accept that some features may require more time to implement later in the process. I learned to balance my own ambitions, my abilities, and the project goals so that the product could still help in a multitude of ways while staying simple and straightforward.