[Image: UI render]

iRobot App

MY ROLE

I led a team of 21 designers responsible for all digital and physical user experience; 16 of them focused on the iRobot Home app. I oversaw all of their work, their processes, and our user testing, and led the strategic vision of the app design roadmap.

RETHINKING THE APP'S JOB

When I started at iRobot, the app was a single-page interface focused on the robot. We redesigned it to be more home-centric, to match how our customers think.

Our research showed that the app's product-centered view didn't match people's mental model of cleaning their homes. Users tended to first consider which rooms needed to be cleaned, then how they wanted them cleaned, and only then the robot itself.

Before:

We decided to rearchitect the app to reduce our customers' cognitive burden. 

We started with storyboards rethinking our users' high-level home-cleaning journey.

In the example above we explored how a Roomba could keep track of which rooms needed to be cleaned on its own, automatically avoid disrupting its owner, and follow up on its goals autonomously to ensure the house got cleaned as the owner intended.

To achieve this we needed to track and display the cleanliness of the house, as well as capture some high-level information about our users' homes and goals.

Below are early wireframes of how that could work.

By understanding user goals and leveraging both robot and user data, we were able to start making reasonably accurate inferences about where to focus cleaning efforts.
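As a rough illustration of what that kind of inference could look like (a hypothetical sketch with made-up weights and inputs, not the model we actually shipped), a per-room score might combine time since the last clean, room traffic, and the user's stated priorities:

```swift
import Foundation

// Hypothetical per-room "needs cleaning" score. All weights, fields, and
// thresholds are illustrative placeholders, not production values.
struct Room {
    let name: String
    let hoursSinceLastClean: Double   // from robot mission history
    let trafficLevel: Double          // 0...1, e.g. inferred from user-reported usage
    let userPriority: Double          // 0...1, from onboarding goals ("keep the kitchen spotless")
}

func dirtinessScore(for room: Room) -> Double {
    // Recency saturates after roughly a week without cleaning.
    let recency = min(room.hoursSinceLastClean / 168.0, 1.0)
    // Busier, higher-priority rooms accumulate "dirt" faster.
    return recency * (0.2 + 0.5 * room.trafficLevel + 0.3 * room.userPriority)
}

let rooms = [
    Room(name: "Kitchen", hoursSinceLastClean: 90, trafficLevel: 0.9, userPriority: 1.0),
    Room(name: "Guest Room", hoursSinceLastClean: 200, trafficLevel: 0.1, userPriority: 0.2),
]

// Rooms above a threshold would surface as "dirty" on the home map.
let needsCleaning = rooms.filter { dirtinessScore(for: $0) > 0.4 }
print(needsCleaning.map(\.name))  // ["Kitchen"]
```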

Sometimes, though, there are situations where we need the user's help to ensure we can keep the house clean. Therefore, we designed notifications to provide clear and actionable requests for assistance.

After multiple rounds of refinement in user testing, v1.0 of this vision was shipped in late 2023.

 

It added a new "My Home" tab that showed a real-time map of the predicted cleanliness of the house, giving users a one-button way to clean dirty rooms.

 

It also unlocked the ability for the robot to decide automatically whether any rooms needed to be cleaned, removing the need for the user to manage cleaning frequency.


The robot was also now able to automatically prioritize rooms that were missed in previous missions due to interruptions.

This was an industry first, and a huge step toward a truly autonomous, goal-driven robot rather than an appliance that simply waits to be told what to do.

[Image: My Home final designs]

SETUP & ONBOARDING

Setup is the critical first step to owning a Roomba, and because of our broad product portfolio we invested significant time in making the process as smooth as possible.

We maintained a series of Master Files that kept track of all the variations of setup that existed across our product line. 

Our analytics and customer care calls told us many of our users were having problems with setup. 

We conducted dozens of user tests to get more information and pinpoint specific fixes.

Based on this data, we built and maintained a prioritized setup improvement chart. We then added these fixes to our roadmap and shipped them out to our users in regular release cycles.

Our products have a lot of moving parts, so education is a critical part of a good first time user experience.

 

We architected a new, modern, detailed, and animated design language to increase customer understanding, confidence, and pleasure during their setup journey. 

We created a Master File system to track, maintain, and improve this content for both our team and QA.

BETTER MAPS FOR ROOMBA

When I joined iRobot, the map in the app was very difficult for users to understand because it displayed only the part of the home the robot could access.

This meant that furniture, counters, islands, and anything else that limited robot movement showed up as mysterious holes in the map, and the rooms looked nothing like our users expected. 

My team worked closely with engineers to build a design system to interpret the robot's camera data so that we could show the actual walls of the house and the placement of furniture, counters, and appliances. 

Here is a small sample of the design spec.

The v1.0 implementation of the improved map showed true walls, doorways, and objects such as tables, chairs, ovens, couches, and beds. Improved display of counters and islands is planned for future versions as well as a larger collection of appliances and furniture. 

[Image: Maps before and after]

ROBOT HOME SECURITY

Since our robots relied on cameras to generate maps of the home and avoid obstacles on floors, we explored whether users would be interested in using their Roombas as autonomous mobile security devices.

Because having a camera driving around your home recording video could be unsettling to many users, we knew we had to start with a conservative and transparent experience.

Below are some early wireframes showing how users could define "waypoints" to send the robot to check on things like their oven or their pets.

Here is a version of the same experience in later, higher fidelity explorations. 

As we knew our customers were very privacy-conscious, we limited our initial release to manual driving and sending the robot to waypoints.

 

We discovered that designing an easy-to-use remote-control interface for Roomba was quite a challenge due to the robot's two-wheeled differential steering and the inevitable network latency.

We ended up building five completely different control algorithms and corresponding interfaces, testing them with thousands of beta users, before arriving at an interface that was intuitive and easy to control.
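To give a sense of why that steering style is tricky, here is a minimal sketch of the standard differential-drive mixing a remote-control interface has to build on; it is an illustration only, not one of the five algorithms we shipped, and it ignores the latency handling that made the real problem hard:

```swift
import Foundation

// Minimal differential-drive mixing: a joystick position in -1...1 becomes
// left/right wheel speeds. Purely illustrative.
struct WheelSpeeds {
    let left: Double
    let right: Double
}

func wheelSpeeds(joystickX: Double, joystickY: Double) -> WheelSpeeds {
    // Y axis drives forward/backward, X axis turns in place or arcs.
    let forward = joystickY
    let turn = joystickX
    var left = forward + turn
    var right = forward - turn
    // Normalize so neither wheel exceeds full speed.
    let maxMagnitude = max(abs(left), abs(right), 1.0)
    left /= maxMagnitude
    right /= maxMagnitude
    return WheelSpeeds(left: left, right: right)
}

// Pushing the stick half forward and slightly right arcs the robot gently right.
let speeds = wheelSpeeds(joystickX: 0.2, joystickY: 0.5)
print(speeds.left, speeds.right)  // left wheel faster than right, so the robot arcs right
```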

IN-APP STORE

We knew that our users found it hard to keep up with their robot's maintenance needs, so we built a robot health dashboard and replacement part store into the app.

As a bonus this netted us around $14 million a year.

Roller brushes, bags, filters, mopping pads, and batteries all wear out, and the symptoms of wear can be hard for users to recognize (and for the robot's sensors to detect), so we built a series of predictive algorithms to let users know when a part needs to be replaced.
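As a simplified illustration of the idea (placeholder lifetimes and thresholds, not the shipped algorithms), the app could estimate wear from accumulated runtime and prompt the user before performance degrades:

```swift
import Foundation

// Simplified sketch of usage-based wear prediction. Rated lifetimes and the
// replacement threshold are placeholder values.
struct ConsumablePart {
    let name: String
    let ratedRuntimeHours: Double        // expected useful life
    let accumulatedRuntimeHours: Double  // from mission telemetry
}

func shouldPromptReplacement(_ part: ConsumablePart, threshold: Double = 0.9) -> Bool {
    let wear = part.accumulatedRuntimeHours / part.ratedRuntimeHours
    return wear >= threshold
}

let parts = [
    ConsumablePart(name: "Filter", ratedRuntimeHours: 60, accumulatedRuntimeHours: 57),
    ConsumablePart(name: "Roller brush", ratedRuntimeHours: 300, accumulatedRuntimeHours: 120),
]

// Parts nearing end of life would surface in the robot health dashboard,
// linked to the matching item in the in-app store.
let dueForReplacement = parts.filter { shouldPromptReplacement($0) }
print(dueForReplacement.map(\.name))  // ["Filter"]
```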

We also built a store into the app that shows users the parts that fit their particular robot. 

VOICE ASSISTANTS

We built an easy way to set up and manage all three major voice assistants in our app.

The integrations with Alexa and Google Assistant were relatively straightforward, as seen in the subset of our design spec below.

Siri, however, was more challenging because HomeKit did not yet support robot vacuums. To get around this, we built a custom phrase generator using Siri Shortcuts that allowed users to target individual rooms or run custom cleaning routines.
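As a rough sketch of how that can work on iOS (the activity type string, phrasing, and room names here are hypothetical, not our actual implementation), each mapped room can be exposed to Siri as a shortcut with a suggested invocation phrase:

```swift
import Foundation
import Intents

// Hedged sketch: since HomeKit had no robot-vacuum category, per-room cleaning
// actions can be surfaced to Siri as Shortcuts instead.
func makeCleanRoomShortcut(roomName: String) -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.roomba.cleanRoom") // hypothetical identifier
    activity.title = "Clean the \(roomName)"
    activity.userInfo = ["room": roomName]
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true                          // lets iOS suggest it as a Shortcut
    activity.suggestedInvocationPhrase = "Clean the \(roomName)"     // phrase offered to the user
    return activity
}

// One shortcut per mapped room; once the user accepts or records the phrase,
// "Hey Siri, clean the kitchen" launches the corresponding cleaning job.
let shortcuts = ["kitchen", "living room", "hallway"].map(makeCleanRoomShortcut)
```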

AIR-PURIFIER INTEGRATION

When iRobot started making air purifiers, we redesigned the app to support their unique requirements.

We made the most common settings easily accessible on an expandable control panel, while focusing the rest of the screen space on showing the user real-time and historic data about the air quality of their home. 

PERSONALIZED TROUBLESHOOTING

One of the challenges with robots that operate in the chaos of a home is that errors are unavoidable. My team and I designed new techniques for making errors easier for our users to understand and fix.

We partnered with our Customer Care team to craft step-by-step guides with accompanying videos. If a user encountered the same error often, we then proactively prompted them to call us for more hands-on help. 

INTERACTION LIBRARIES

As part of our efforts to standardize our design systems and ensure clarity of communication with our engineering and QA friends, we built an interaction library. 

This library defined the standard interaction patterns we used throughout the app. Below is a small sample.
