PLAN ALERT | Usability Study

In the course "Usability Studies," I worked with a team of peers to conduct a series of usability tests for the startup PlanSpeak. We worked closely with the stakeholders to complete a thorough and holistic evaluation of their web application using various methods.

Role

Usability Researcher

Methods

Heuristic Evaluation
Moderated User Testing

Timeline

Jan 2019 - Mar 2019

Team

Marlo Huang Straub
Victor Wang
Quilla Valdez Graves

Project Background

PlanAlert is a web application optimized for a mobile experience. Users input the details of their current phone plan in order to uncover alternative plan options that would result in monthly savings. The website provides the user with all cheaper alternatives to their plan, including options from their current carrier's offerings as well as from other carriers.
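
To make the product's core behavior concrete, here is a minimal sketch of the kind of comparison described above. The data shapes and names are hypothetical illustrations, not PlanAlert's actual implementation.

from dataclasses import dataclass

@dataclass
class Plan:
    carrier: str
    name: str
    monthly_cost: float  # USD per month

def cheaper_alternatives(current: Plan, catalog: list[Plan]) -> list[Plan]:
    # Return every plan cheaper than the user's current plan,
    # including offerings from the user's own carrier.
    matches = [p for p in catalog if p.monthly_cost < current.monthly_cost]
    return sorted(matches, key=lambda p: p.monthly_cost)

# Example: a $70/month plan compared against a small catalog.
current = Plan("CarrierA", "Unlimited Plus", 70.0)
catalog = [
    Plan("CarrierA", "Unlimited Basic", 60.0),
    Plan("CarrierB", "10GB Saver", 45.0),
    Plan("CarrierC", "Premium Max", 85.0),
]
for plan in cheaper_alternatives(current, catalog):
    print(f"{plan.carrier} {plan.name}: ${plan.monthly_cost:.0f}/mo")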

Our team was tasked with evaluating the usability of the beta version of the PlanAlert web application. No usability testing had taken place prior to this exercise, so this test was intended to provide initial research that might assist the developers and design team in refining their product.

Flow Overview

Outlined below is the flow that we focused on testing for the PlanAlert website. This feature allows users to input their plan into the website and opt into email alerts that notify them whenever a cheaper alternative to their plan becomes available. This flow will be referred to as the "Savings Alert" flow.

The alternative flow that users could take is the "Shopping Alert" flow. This flow encourages users to actively shop for a new plan, and is geared towards users who intend to find a new phone plan in the near future. This flow was not directly tested in the usability study.

Test objectives and goals

Our primary goal was to test how easily users could understand and progress through PlanAlert's Savings Alert flow. Since PlanSpeak's main product goals were user account creation and successful new sign-ups, the testing team's key research focuses were usability and conversion: uncovering if and where users struggled or dropped off with the product.

Another research goal was testing the decision screen (seen on the right), in which users select between savings and shopping alerts. The testing team's goal was to learn whether users understood the decision they were making at this step in the product flow, and whether they knew what to do when faced with these two options.

Methodology

Participants tested the usability of the “Savings Alert” flow, which was broken into 7 sub-tasks, shown below:
• Task 1: Read Home Screen Text and Continue
• Task 2: Select between “Get Savings Alert” and “Get Shopping Alerts”
• Task 3: Create a savings alert
• Task 4: Configure a plan
• Task 5: Select a plan
• Task 6: Set alert preferences
• Task 7: Complete the Savings Alert process

Researchers employed two common usability test methods: Moderated User Testing and Expert Heuristic Evaluation. Using these two methods allowed the team to better validate findings by comparing results while staying within testing constraints.

Test Environment and Setup

Both tests were conducted in the AHO building on the University of Washington Seattle campus.

One pilot test was conducted prior to the formal testing. Moderated testing was conducted with 6 different participants. An iPhone 7 was used to conduct all testing, and the Safari browser was used to access the PlanAlert website.

Screen recording technology was used to monitor the screen, while two cameras were set up to capture video footage.

Test Overview

For the Moderated User Testing, each test session lasted around 60 minutes and used the "think aloud" format. Each test included a consent form, pre-test survey, think-aloud session, post-test survey, and debrief. Participants completed tasks as prompted by the moderator. During testing, one or two notetakers collected both qualitative and quantitative information while observing the participant.

For the Heuristic Evaluation portion of testing, the testing team used Nielsen’s 10 Heuristics framework. To complete the evaluation, the team carefully studied the product test site while also reviewing screenshots and flow maps.

Data Analysis

Raw data was collected via a Google Forms survey, with results extracted into a spreadsheet.

Collected data included structured yes/no responses and Likert-scale rankings, as well as unstructured text in notes fields. Additional raw notes taken during the usability testing were added to the spreadsheet as well. Results were analyzed manually by reviewing the spreadsheet. Unstructured responses and feedback notes were translated into structured rankings for more accurate comparisons and summaries. Notes fields were reviewed and parsed for qualitative findings and evidence of user attitudes. Data was then condensed and rolled up into high-level summaries and tables.
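
As a rough illustration of the roll-up step described above (a sketch only; the column names, scale, and values below are hypothetical, not the study's actual data), the structured responses could be summarized per task like this:

import pandas as pd

# Example rows as they might look after export from the survey form.
raw = pd.DataFrame({
    "participant": ["P1", "P1", "P2", "P2", "P3", "P3"],
    "task": ["Task 3", "Task 4", "Task 3", "Task 4", "Task 3", "Task 4"],
    "ease_rating": [2, 4, 1, 5, 3, 4],  # Likert scale: 1 = very hard, 5 = very easy
    "completed": ["yes", "yes", "no", "yes", "yes", "yes"],
})

# Convert yes/no responses to booleans, then roll up per task:
# mean ease rating, completion rate, and participant count.
raw["completed"] = raw["completed"].eq("yes")
summary = raw.groupby("task").agg(
    mean_ease=("ease_rating", "mean"),
    completion_rate=("completed", "mean"),
    n=("participant", "nunique"),
)
print(summary)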

Findings and recommendations

During testing, 31 unique usability issues were discovered in the Savings Alert flow. The 31 issues were reviewed thoroughly and distilled into 9 top findings. Each finding includes a severity rating and a recommendation from the usability team for resolution.

On the right is the severity rating scale that was used to assess each finding. Outlined below are a few of the key findings and recommendations we provided to the client.


Confusing CTA Content/Language

Severity Rating: 4

Users were universally confused about the difference between the Savings Alert and Shopping Alert options. When asked during the post-test interview, 100% of the users were unable to describe the difference between the two. This lack of understanding of the flow split resulted in frustration and delays in getting past the step of selecting an alert. Specifically, the language used to describe the different options was confusing to users. It is possible that some users would not have gone past this page in an unobserved testing environment, as the moderator had to prompt at least one user to keep going, and two users, after a delay of 10+ seconds, just said, “I guess I’ll just click this one.”

Participant Quotes

"What does “Shopping Alert” even mean?..I hope the explanation for the two paths could be more easy to understand through the information on the page.” -P5

"When I get to that page, I don’t know that I would want to sign up for either because I don’t know what I am signing up for.” - P2

“I still don’t know which one I’d choose to find out if my plan is actually the cheapest.” - P6

Recommendations

Using concise and intuitive language, articulate the value proposition and alert setup process as early as possible in the flow. Remove the “Get Savings Alert” and “Get Shopping Alert” language. Consider replacing the button text with something more intuitive and understandable to the user, such as “Find Cheaper Plans”. Ensure that any description text is contextually relevant and actionable, giving the user a clear understanding of what would happen if they clicked the button.

Working under the assumption that most U.S. adults (18+) already have a cell phone plan, we recommend making the Shopping path less visible/accessible. A phrase like “Don’t have a plan?” could be presented below the button in smaller text, similar to how sites show a smaller “Forgot Password?” link near a login prompt. Focusing users on a preferred path spares them a confusing decision-making step in the alert creation process.

Improvements

After providing recommendations to PlanSpeak, the company was able to implement some changes to improve the usability of their product as seen below: 

Before

After


Information Hierarchy and Text

Severity Rating: 4

66% of participants spent less than 2 seconds on the homepage. The remaining two users inspected the page up and down for a longer period of time, clearly reading the text. One user indicated that the page had so much text that she would close her browser and not proceed (P3).

Heuristic testing from a visual design standpoint uncovered that the homepage has a large amount of text, much of which is at the same font size. Within this body of text, the fonts also appear in two colors (potentially due to the gradient blue background). Additionally, there is no clear place to prioritize reading, outside of the “Welcome to Plan Alert” and “Click Here” call-outs. Thus, users either speed through the page without reading or slow down to spend extra time reading the text.

Participant Quotes

"There is just too much text here for me to figure out what this app is really for. If I just stumbled on it, I would just click check it out without seeing the text because it's too much for me." - P5

‍"The words [on the homepage] didn’t help too much. Just noise


Recommendations

Remove the color gradient on page backgrounds to improve text contrast and visibility. On the homepage, use font selection, font size, and/or weight to call attention to the one or two pieces of information that you want users to know, in addition to the “check it out” button. Reduce the total amount of text on this page in general. On the Alerts Decision page, also use font selection, size, weight, and/or color contrast to stress what consumers need to know so they can make the appropriate selection.


Improvements

After providing recommendations to PlanSpeak, the company was able to implement some changes to improve the usability of their product as seen below: 

Before

After


Sort Filter & Compare

Severity Rating: 3

When participants were presented with the plan cards, either to select a new plan or to review the cheaper plans surfaced by the tool, they universally struggled to understand, filter, and compare the information with ease. As a result, users moved through the process more slowly and expressed some degree of frustration. This was also a consensus issue in the heuristic evaluation. Specific issues related to the plan cards include:
• The amount of scrolling one has to do, which half of the users mentioned
• Specific plan names are not very legible or are not visible, whereas the carrier name takes up lots of space
• Confusion regarding total pricing; not all users understood what the numbers in circles meant
• Lack of a filter button
• Lack of details and information on the cards beyond price

Participant Quotes

“I’m scrolling so much... is there a filter button?” - P1

"I can't tell what the actual total. Do I add the numbers in addition to the base cost? How am I supposed to look at this information?" - P

"I'd rather the card say the plan name at the top instead of the carrier." - P2


Recommendations

Revisit the card design so that plan information is highlighted and more detailed information is provided to the user. Since color is already being used to identify the carrier, consider using header/title text to outline plan information. Also consider adding sort and filter tools to assist users with information consumption; with a high volume of cards carrying varied information, allowing users to sort by criteria like price and coverage will help improve the usability of the product. Users are also unable to identify the differences between their current plan and the new ones being offered to them. Consider adding a comparison tool that lets users see a new plan and their current plan side by side; this allows users to make a more informed and motivated decision about whether they would like to pursue a new plan.


Example

No changes were implemented in this particular area of the product; however, for reference, see the images below:

Impact

I found it really rewarding to work for a client when completing this usability study. Since our team was working with a startup, we got to make a huge impact - not only on their product, but also on the way they think about usability and user experience as a whole. It was really gratifying to see PlanSpeak implement some of the changes we suggested in such a short amount of time.

We got great experience planning and conducting a usability test, and had the opportunity to learn a lot because we were the only usability consultants working on this project. Since we were able to take the reins on this project with some guidance from our professor, we learned a great deal about usability testing. I am glad that we had the chance to explore various types of testing as well, and I have continued to use informal heuristic evaluation in my day-to-day work.

Another great result of this project was the consistent communication we had with the client. This gave me experience with building and managing client relations, as well as giving guidance and explanations about usability concerns to a client unfamiliar with the field.