I designed conversational experiences and led the user acquisition effort for AVA, a machine-learning chatbot for Asurion that resolves customer issues through natural conversation.

AVA

RESPONSIVE WEB


MY ROLE

Principal product designer

Duration

May 2018 - Feb 2019

Users

12k+ users, 2k+ enterprise users

Outcomes

17-point increase in user acquisition (from 3% to 20% of customers)


Major outcomes

We surpassed our OKRs a quarter early, increasing user acquisition from 3% of Asurion customers to 20%, and doubling the claims filed.

Anticipating a lower FPY (first-pass yield) and a higher bounce rate from the influx of new users, the team and I optimized the experience to hold both metrics steady. As part of this work, we also cut design and development time for A/B tests from 1.5 months to 1.5 weeks.

Problem space

Customer service and CRM are at the center of Asurion’s business. As a leading provider of device insurance, it has relatively low operational cost aside from the resources it dedicates to its customer service efforts.

The AVA product is a machine-learning, natural-language-processing chatbot meant to reduce Asurion’s operational cost while also providing a better customer experience. The product performed well as an MVP, proving it could handle basic tasks conversationally, such as filing claims, answering questions, and checking customers’ statuses.

However, AVA faced a variety of problems that prevented its growth. From a business perspective, AVA wasn’t attracting enough users to demonstrate that it could scale. Its conversational flows were also suboptimal, causing large user dropoff in important features. And its visual and interaction design was both overly simplistic and inconsistent, which made it look untrustworthy or “unfinished” to users.

To discover more about these problems, I conducted research with stakeholders, actual users, representative users, and team members.

Main problems

AVA had slow growth, suboptimal flows, and inconsistent design, all of which hurt both the user experience and business results.

Qualitative and quantitative research

Low user acquisition was a known problem when I first joined the team.

Asurion customers can opt in to use AVA in a variety of ways, but most of AVA’s traffic came from the web: 100% of Asurion customers who view the website on a mobile device see the option to use AVA.

However, only 3% of those users opted to use AVA - an extremely low share of traffic.

To identify why users weren’t opting in, I interviewed past AVA users to understand why they had chosen it. I also ran usability tests of the live website and AVA with users from representative demographics to see how someone without prior context would behave. Finally, I reviewed quantitative data to find the most popular paths through both the entry website and AVA itself.

A quick analysis of quantitative data for both the web entry point and AVA itself showed that starting a new claim and continuing an existing claim were overwhelmingly the most popular user choices.
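As an illustration, surfacing the most popular paths from session data can be as simple as a frequency count over each visitor's first action. The sketch below is generic, and the field and action names are hypothetical, not Asurion's actual schema:

```python
from collections import Counter

# Hypothetical event log: one record per session, with the first
# action each visitor took on the claims entry page.
sessions = [
    {"session_id": "a1", "first_action": "start_new_claim"},
    {"session_id": "a2", "first_action": "continue_existing_claim"},
    {"session_id": "a3", "first_action": "start_new_claim"},
    {"session_id": "a4", "first_action": "check_claim_status"},
]

# Count how often each entry path is taken, most popular first.
path_counts = Counter(s["first_action"] for s in sessions)
for action, count in path_counts.most_common():
    print(f"{action}: {count}")
```

Ranking paths this way made it easy to see which flows deserved the most design attention.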

Qualitative research with past users also revealed why users were not choosing AVA. The AVA option was displayed as a tertiary button labeled “Use Claims Assistant”. Users tended to ignore it because of its low visual hierarchy, and the label was too vague to convey anything useful. When asked what they could expect from the “Use Claims Assistant” button, answers ranged from a live agent chat to a forum - with 90% of guesses being inaccurate.

Once users entered the app, reactions varied, independent of scenario. About half of the users were neutral or positive; the other half reacted strongly negatively to the app’s look and feel. The simplistic design made users feel they had landed on an insecure or third-party website.

Design ideation & strategy

The quantitative and qualitative research pointed to three major problems that were preventing user acquisition from seeing any significant increase on the web.

User acquisition problems

The AVA option had (1) low visual hierarchy and (2) confusing content. Once users entered AVA, (3) they did not fully trust its design.

Although all three problems were important, the research led us to prioritize the first and second. Once more users were entering the app, we would monitor AVA’s bounce rate for those new users to assess the third problem’s priority.

I ideated with the team on minimal ways to validate these problems’ impact. Each idea did well in user testing, but development posed unknowns. Another team owned the website through which users entered AVA, and we didn’t know their development capabilities, priorities, or relationship with us. Our team had also never run an A/B test outside of AVA, so we didn’t know how testing on the web might affect other important metrics.

Risks and blockers

Our solution depended on another team, which posed many unknowns about development and the risks of implementation.

Ideation produced many ideas that I developed further. Based on user-testing results, each design had anticipated advantages, disadvantages, and risks. With the additional development risk in mind, we decided on a step-by-step approach.

We would start with the most minimal of the designs and direct 5% of web traffic to it. Once we were confident it wouldn’t severely affect an unanticipated metric, we would increase that to 25% and monitor performance.

Depending on each design’s performance, we would either progress to the next stage or reassess and pivot.
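One common way to implement this kind of staged rollout is to hash a stable user identifier into traffic buckets, so the same visitor always sees the same variant as the percentage grows. The sketch below is a generic illustration of that technique, not the team's actual implementation; the function and variant names are assumptions:

```python
import hashlib

def assign_variant(user_id: str, rollout_pct: int) -> str:
    """Deterministically bucket a user into the test or control group.

    Hashing the user ID keeps the assignment stable across sessions,
    so a returning visitor always sees the same experience even as
    rollout_pct is raised (e.g. from 5 to 25).
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 100  # map the hash to a 0-99 bucket
    return "new_entry_design" if bucket < rollout_pct else "control"

# Phase 1 starts by sending roughly 5% of web traffic to the new design.
variant = assign_variant("user-123", rollout_pct=5)
```

Because the bucket depends only on the user ID, raising the percentage later only adds new users to the test group; no one who already saw the new design gets switched back.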

Development and results

Despite its simplistic design, Phase 1 took 45 days to design and develop, with development taking up more than 60% of that time. Its low-risk design yielded the expected low-reward result: an increase from 3% to 8% of users choosing AVA.

The long development time was a result of the lack of front-end developers on the web team who could develop the UI, implement the A/B test, and implement tracking. The web team developers explained that developing the UI was the lengthiest part of the process.

Based on Phase 1, we ran Phase 2 in parallel with a different approach. Instead of handing off my design to the developers right away, I built a prototype of the UI myself, then paired with the web team’s developers (who acted as QA) to implement the A/B test on the web and monitor results.

Given how aggressive Phase 2 was, we anticipated both higher user acquisition and higher risk. Compared with Phase 1’s increase to 8%, Phase 2 acquired 20% of users on average. AVA’s bounce rate and FPY held steady across both phases, but the entry website’s bounce rate was negatively affected, increasing by 5%. In our next steps, we plan to iterate on both phases to maximize the positive results and minimize the negative.
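This trade-off between acquisition lift and guardrail metrics can be made explicit in a small evaluation step. The sketch below is a minimal illustration, assuming a hypothetical 3-point bounce-rate guardrail and an invented 40% baseline bounce rate; these thresholds and figures are not the team's actual targets:

```python
# Illustrative guardrail: the maximum tolerable increase in the
# entry site's bounce rate, expressed as a rate (3 points here).
BOUNCE_GUARDRAIL = 0.03

def evaluate_phase(acq_before: float, acq_after: float,
                   bounce_before: float, bounce_after: float) -> dict:
    """Summarize a phase: acquisition lift plus a bounce-rate guardrail check."""
    lift = acq_after - acq_before
    bounce_delta = bounce_after - bounce_before
    return {
        "acquisition_lift_pts": round(lift * 100, 1),
        "bounce_delta_pts": round(bounce_delta * 100, 1),
        "within_guardrail": bounce_delta <= BOUNCE_GUARDRAIL,
    }

# Phase 2: acquisition went from 3% to 20%, while the entry site's
# bounce rate rose by 5 points (baseline here is an assumption).
summary = evaluate_phase(0.03, 0.20, 0.40, 0.45)
```

Framing each phase as "lift gained versus guardrails breached" is what drives the decision in the next section to keep iterating rather than simply ship the highest-acquisition design.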

Next steps

The next step in the user acquisition effort is to advance both the Phase 1 and Phase 2 strategies to find the right balance between user acquisition and other important metrics (such as FPY and bounce rate).

Iterating on both can yield a better balance among all of these success metrics. Based on Phase 1.5 and Phase 2.5, the Phase 3 design can either build on the winner of those two ideas or be something completely new.
