Delightful conversational experiences are a key long-term invention arc for unlocking future growth at Amazon. Shopping through screen-based Echo devices is seen as the next frontier for how customers will shop for products.
Sr. UX Lead
Collaborators: Product Management, User Research, Voice Design, Engineering
I developed a foundational Browse and Discover framework, created a North Star vision, conducted customer research, aligned with leadership, and designed the end-to-end customer experiences. These experiences have a projected MAU of 150,000 in Q2 2022.
Smart home devices (lights, cameras, etc.) are a category of business interest for conversational shopping because of their traffic and relative newness.
Customers signal high interest in the smart home category but also have many questions about how to use these products. Market research indicated that smart lights and plugs are the typical gateway products for customers looking to upgrade to a smart home.
Based on customer research I had conducted, smart home customers showed two main browsing behaviors or mental models: “I don’t know what I want” and “I don’t really want anything”. Both mental models are characterized by high curiosity about features and low confidence in the actual benefits of a new technology. When these customers discover or encounter new products, they look for reliable, trustworthy ways to understand the basics and the “big picture” of why they should consider something new.
Based on the business objectives, the customer problem, and Alexa's capabilities, I created the following central value proposition:
Through rich informational exchange and search capabilities, Alexa will provide customers a faster way to learn about smart home devices
This helped gain alignment with the product team and leadership. It also provided design direction in what was a very ambiguous space with multiple possibilities. For the initial launch/MVP, I focused on making Alexa a trustworthy source for learning quickly about smart home devices, centered on answering customers' broad questions about the benefits of a new technology.
After analyzing current utterance data and Google auto-complete data, I categorized the question types customers have about smart home devices into the following five categories:
This entire experience was built on the Browse and Discover framework I had created from customer research data about fundamental shopping behaviors. The North Star involved creating a journey flow with a customer vignette.
This vision was grounded entirely in "first principles" I had derived from the browse behavior of customers shopping for a completely new or unfamiliar category of products:
Once leadership and partner teams were aligned on the vision, I developed the main principles that guided the design of the experience.
I iterated on low-fidelity wireframes, gathering feedback continuously to develop the final experience.
The smart home device shopping experience launched in November 2021 with an average of 2.4 conversation turns and a 44% positive follow-on rate. In addition, the components developed for the smart home use case were combined with another conversational shopping experience, gifting, which was led by my teammate. We collaborated to abstract new patterns, such as presenting web articles and videos, suggesting utterances, and showing novel answer formats like infographics. These patterns act as building blocks for future Alexa conversational shopping experiences.
Multi-modal design is challenging because so little is known about the user's environment, including where the device is placed. One of the biggest challenges on this project was deciding when to assume a customer would be looking at a screen and when not to. I had several debates on this. While there was little to no ethnographic data on the matter, my point of view was to assume by default that the device was placed somewhere inconvenient for touch interactions, and that most of the time people would only be listening to Alexa rather than looking at the screen. To address this, I added voice prompts like “I’m also showing you more information on the screen.” However, this is only a makeshift solution, as this way of shopping is a completely new human-computer interaction paradigm. Ideally, much more ethnographic research and low-fi testing would have helped build a deeper understanding of customer behavior. A purely Q&A-based experience without products, answers delivered in a personalized context, and a seamless hand-off to the Amazon mobile app are other directions I'd explore.
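To make the voice-first default concrete, here is a minimal sketch of how a skill might pair that spoken cue with on-screen content only when the device actually reports a screen. It assumes a raw Alexa Skills Kit JSON response with an APL render directive; the function names and the `smartHomeAnswer` token are hypothetical, and this is an illustration of the pattern, not the shipped implementation.

```python
# Hypothetical sketch: voice-first response that adds a screen layer
# (and the "I'm also showing you..." cue) only when APL is supported.

def has_screen(request_envelope: dict) -> bool:
    """Check the Alexa request envelope for APL (screen) support."""
    interfaces = (
        request_envelope.get("context", {})
        .get("System", {})
        .get("device", {})
        .get("supportedInterfaces", {})
    )
    return "Alexa.Presentation.APL" in interfaces


def build_answer_response(request_envelope: dict, answer_text: str) -> dict:
    """Build a raw Alexa Skills Kit JSON response for one Q&A turn."""
    speech = answer_text
    directives = []

    if has_screen(request_envelope):
        # Customers are assumed to be listening, not looking, so the
        # spoken cue explicitly points them to the screen.
        speech += " I'm also showing you more information on the screen."
        directives.append({
            "type": "Alexa.Presentation.APL.RenderDocument",
            "token": "smartHomeAnswer",  # hypothetical token
            "document": {
                "type": "APL",
                "version": "1.8",
                "mainTemplate": {
                    "items": [{
                        "type": "Text",
                        "text": answer_text,
                        "textAlign": "center",
                    }]
                },
            },
        })

    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "directives": directives,
            # Keep the session open to invite follow-on questions.
            "shouldEndSession": False,
        },
    }
```

The design choice this encodes is the voice-first default described above: the screen is treated as progressive enhancement, and the verbal cue is added only when something is actually rendered.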