
A Guide To Conducting A Mobile UX Diagnostic

November 4th, 2014

Today’s mobile users have rising expectations: they are intolerant of faults in their mobile experiences1, and they complain about bad experiences on social media and through word of mouth. How do you make sure that your mobile experience meets or exceeds users’ expectations?

One quick way to identify potential problems is to conduct a user experience diagnostic, in which a few mobile specialists inspect a mobile presence for trouble spots. A diagnostic can be done during design and development to ensure that the mobile website or app adheres to best practices and guidelines. It also serves as a great starting point for a redesign, identifying particular opportunities for improvement.

While a diagnostic can be done by a single evaluator, it is more effective when conducted by multiple evaluators with different strengths and backgrounds. These evaluators should be practitioners well versed in principles of user experience (UX) for mobile interfaces and in mobile platform guidelines, and they should not be closely involved with the design itself. A diagnostic is not a replacement for testing with end users, but rather is a quick method in a user-centered design process2.

This article will describe a process you can follow to evaluate a mobile UX, be it for an app or a website accessed on a mobile device. The steps in this process are:

  1. identify users and scenarios,
  2. conduct an evaluation,
  3. conduct a team review,
  4. document and report.

Alongside the explanation of each step, we’ll illustrate the step using the United States Postal Service as an unwitting real-world example.

Identify Users And Scenarios

A mobile UX diagnostic is conducted by expert evaluators who may or may not be active users of the mobile product. To help the evaluators walk a mile in the user’s shoes, select one to three personas based on the target audience, along with scenarios based on common user tasks and goals. Define the boundaries of the evaluation, and make it quick and efficient by asking the following questions:

  1. What should the evaluation focus on?
    Is it a website accessed on a mobile device, or a mobile app? If it’s an app, which platform?
  2. Which devices do your target users use?
    One way to find out is by looking at web traffic and analytics (see the sketch after this list). If that’s not available, then select popular devices based on market share.
  3. Which OS versions are being used?
    Base this on the platform and device.
  4. Who are the main competitors of the website or app?
  5. Is any relevant market research available?
    This could be industry trends, reports, etc. One example would be Forrester’s Customer Experience Index3.
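
If analytics are available, a quick tally of sessions by device makes the hardware choice concrete. Here is a minimal Swift sketch; the file name "analytics-devices.csv" and its "device,sessions" layout are assumptions for illustration, not a real analytics export format.

    import Foundation

    // Sketch: pick test devices from an analytics export.
    // The file name and "device,sessions" columns are assumed for illustration.
    let csv = (try? String(contentsOfFile: "analytics-devices.csv", encoding: .utf8)) ?? ""
    var sessionsByDevice: [String: Int] = [:]

    for line in csv.split(separator: "\n").dropFirst() {   // skip the header row
        let cols = line.split(separator: ",")
        guard cols.count == 2, let sessions = Int(cols[1]) else { continue }
        sessionsByDevice[String(cols[0]), default: 0] += sessions
    }

    // Test on the handful of devices that carry most of the traffic.
    for (device, sessions) in sessionsByDevice.sorted(by: { $0.value > $1.value }).prefix(5) {
        print("\(device): \(sessions) sessions")
    }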

We’ll evaluate the app for the United States Postal Service (USPS) — “over 2 million downloads!” — on an iPhone 5 running iOS 7.1. We’ll illustrate it through the eyes of Mary Jane, an average residential postal customer. (The persona and scenarios are made up for this article.)

Persona


Mary Jane is a 37-year-old working mother of two, married to a traveling consultant. She has a job with flexible working hours that align with her kids’ school hours, but juggling it all is no easy task. She shops online a lot and has depended on her iPhone for the past five years. Mary rarely sets foot in the post office, instead relying on USPS for her shopping deliveries, occasional bills and frequent mail-in rebates.

Scenarios

  • Track packages
    Mary frequently shops online and gets deliveries to her door. She likes being able to track her packages to make sure she receives everything as expected. She wants to be able to use her phone to check the status of pending deliveries.
  • Find location
    Mary is on her way to pick up her kids from school when she realizes that today is the deadline to postmark one of her rebates. She wants to find a nearby manned post office or a drop-off location with late pick-up hours.
  • Hold mail
    The family takes three to four mini-vacations a year, during which time she places a hold on her mail to prevent any packages from being left at her door in her absence. The family’s anniversary getaway is coming up in a few weeks, and she wants to place a hold on her mail.

Conduct The Evaluation

A best practice is to have two or more evaluators independently conduct the evaluation in three parts:

  1. scenarios and related UX,
  2. rapid competitive benchmarking,
  3. overall UX.

Scenarios and Related UX

The first part involves evaluating the UX using defined scenarios of use, followed by an inspection of other aspects of the UX.

Step 1: Pick a device and OS. Test “glanceability” with a five-second test. Launch the app or website and look at it for five seconds. Then, cover the screen and answer the following: What is being offered, and what can the user do? The app or website passes if your answer closely matches its core offering.

Step 2: Put on your “persona hat” and use the website or app to walk through the scenario. Look for and identify UX issues that the persona might face in the scenario — anything that would slow down or prevent them from completing their tasks. Document the issues by taking screenshots4 and making notes as you go. Where possible, use contextual testing in the field (i.e. outside of the office) to uncover issues that you might not have exposed otherwise (for example, spotty connectivity when using a travel or retail app, or contrast and glare).

Repeat step 2 until every scenario for each persona is completed.

Step 3: Chances are, the scenarios did not cover everything that the website or app has to offer. Switch from your “persona hat” to your “UX specialist hat” to evaluate key areas not yet covered. Use a framework such as the one detailed in “The Elements of the Mobile User Experience5” to organize the evaluation, continuing to document issues and take relevant screenshots. I find focusing on the problems to be more valuable, unless you are using a scorecard, such as Forrester’s6, or you specifically need to document strengths as well.

For an app, repeat steps 2 and 3 for the other identified platforms and devices to ensure that the app follows the guidelines and conventions of those platforms. For a website, verify that it renders as expected across devices.

For our example, I chose the “Find Location” scenario to evaluate USPS’ app for iOS.

Find Location: Mary is on her way to pick up her kids from school when she realizes that today is the deadline to postmark one of her rebates. She wants to find a nearby manned post office or a drop-off location with late pick-up hours.

Notes for “Find Location” Scenario

Here are some notes jotted down during the evaluation of the app in the “Find Location” scenario. Testing was conducted on USPS’ iOS app, version 3.8.5 (the app was updated 18 December 2013).

  • When the app launches, a splash screen appears for anywhere from a few seconds to over a minute on public Wi-Fi (simulating the guest Wi-Fi network at her children’s school).
  • The home screen does not have a logo or prominent USPS branding — just a screen with icons.
  • The screen titles do not assure Mary that she is heading down the right path. Tapping “Locations” leads to a screen titled “Search,” and the titles of subsequent screens don’t match either (one says “Enter” and then “Refine search”).
  • The “Location” screen does not have sufficient information, forcing Mary to tap “Show Details” to understand the different options. Why wasn’t this made the default view?
  • The same icon is used for “Post Offices” and “Pickup Services.”
  • Locating all services at once is not possible. Mary is forced to look them up one at a time (for example, first looking up “Post Office” locations, then going back and looking up “Approved Providers”).
  • Location services are not activated for the app, and there is no alert or reminder to turn them on to use the GPS. Mary is left with the impression that the feature simply does not work (see the sketch after this list).
  • No option exists to enter a search radius. Results from almost 50 kilometers away are returned.
  • The location results do not indicate whether a location is open or closed.
  • When Mary selects a location to view its details, she has to expand the boxes for “Retail Hours” and “Last Collection Time” individually to view that information.
  • Going back from the “Locations” screen crashes the app. Every. Single. Time. (Even after deleting the app and reinstalling.)
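
The location-services problem above is cheap to avoid. The following Swift sketch is hypothetical code (using current APIs, not the iOS 7-era ones the app was built against) showing the kind of check an app can make before silently returning nothing:

    import CoreLocation
    import UIKit

    // Sketch: check location authorization before a GPS-based search,
    // and tell the user what to do when it is off. Hypothetical code.
    func checkLocationAccess(from viewController: UIViewController,
                             manager: CLLocationManager) {
        switch manager.authorizationStatus {
        case .notDetermined:
            manager.requestWhenInUseAuthorization()   // ask on first use
        case .denied, .restricted:
            let alert = UIAlertController(
                title: "Location Is Off",
                message: "Turn on Location Services in Settings to find post offices near you.",
                preferredStyle: .alert)
            alert.addAction(UIAlertAction(title: "OK", style: .default))
            viewController.present(alert, animated: true)
        case .authorizedWhenInUse, .authorizedAlways:
            break                                     // safe to run the GPS search
        @unknown default:
            break
        }
    }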

Related UX Notes

  • The titles used in the app are not user-friendly, but rather oriented around features and functionality. For example, “Scan” (scan what? why?) and “Coupons” (get coupons? No. What coupons can one add? No clue is given).
  • Tapping the “Terms of Use” on the home screen results in a confirmation prompt to leave the app (taking users to the mobile website). Really?!
  • The input field for the ZIP code does not bring up the appropriate numeric keyboard. In the “Supplies” section, the keyboard that appears for the ZIP code is the alphabetical keyboard, not even the alphanumeric one (see the sketch after this list).
  • Many screens do not have titles (for example, try entering an address for “Supplies”).
  • The scanning experience is inconsistent: it took a few minutes for one item but was quicker the next time.
  • The app is missing expected functionality (such as an expected delivery date and notifications), and it offers fewer features than the mobile website (which has a change-of-address option, for example).
  • The screen to track a package has a “Scan” button, instead of the conventional camera icon.
  • Information is not shared between screens in the app, forcing the user to enter the same information in multiple places (for example, for “Next day pickup,” “Get supplies” and “Hold mail”).
  • Deleting a scheduled pickup in the app does not cancel the pickup, and no warning message is displayed either.
  • A minor issue: the “Terms of Use” link on the home screen does not align with the rest of the sentence.
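
Some of these issues are one-line fixes. As a hedged Swift sketch (the field is hypothetical, not an outlet from the USPS app), here is how a ZIP-code input gets the numeric keypad:

    import UIKit

    // Sketch: a ZIP-code field should bring up the numeric keypad.
    let zipField = UITextField()
    zipField.placeholder = "ZIP Code"
    zipField.keyboardType = .numberPad   // digits only, instead of the alphabetical keyboard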

Rapid Competitive Benchmarking

Rapid competitive benchmarking is a quick exercise to compare how your mobile UX stacks up against the competition’s. To do this, pick a couple of primary competitors or services that offer similar functionality, and complete similar scenarios, followed by a quick scan of their functionality. Look for areas where competitors offer a better user experience, and document them with notes and screenshots. For a more detailed analysis, compare features to those of key competitors (Harvey Balls7 do a good job of showing the relative completeness of features; a sketch follows the notes below).

Competitive Benchmarking: Notes for “Find Location” Scenario

UPS:

  • An option exists to view all types of locations, but with no way to distinguish between them.
  • Results are displayed only on a map (no list view).

FedEx has the best store-locator experience among the three:

  • When location services are turned off, the app gives clear instructions on how to turn them on.
  • A single screen contains both “Use current location” and search by ZIP code, with filters to show one or more types of locations.
  • Location results can be viewed as a list or map.
  • Location results show at a glance whether a location is open or closed.
  • Results show multiple types of locations and identify the type of each location.
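
To make such notes comparable at a glance, they can be condensed into Harvey-Ball-style ratings. The Swift sketch below does exactly that; the ratings are an illustrative reading of the notes above, not measured data:

    // Sketch: Harvey-Ball-style feature comparison built from the notes above.
    enum Completeness: String {
        case none = "○", partial = "◐", full = "●"
    }

    // Illustrative readings of the store-locator notes, not measured data.
    let features: [(name: String, usps: Completeness, ups: Completeness, fedex: Completeness)] = [
        ("Results as list and map", .partial, .partial, .full),
        ("Open/closed at a glance", .none,    .none,    .full),
        ("Filter by location type", .none,    .none,    .full),
    ]

    print("Feature                    USPS  UPS   FedEx")
    for f in features {
        let name = f.name.padding(toLength: 26, withPad: " ", startingAt: 0)
        print("\(name) \(f.usps.rawValue)     \(f.ups.rawValue)     \(f.fedex.rawValue)")
    }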

Overall UX Feedback

The final step in the individual evaluation is to step back and evaluate the big picture. To do this, review the following:

  • how the user installs the app or finds the website,
  • onboarding help if it’s an app,
  • the cross-channel experience (i.e. comparing the app to the mobile and desktop websites),
  • the cross-device experience,
  • reviews in app stores (for apps) and social networks (for websites and apps),
  • comments and feedback received by email (if you have access to this).

Overall UX Notes

  • When the app first launches, the user is forced to accept the terms and conditions to use the app. (I’ve fought my share of battles with legal departments on this topic as well — and lost many.) However, there are no terms and conditions to accept before using USPS’ mobile website.
  • The app has no onboarding help when first launched, and no help within the app either.

Here are the notes about the cross-channel experience (i.e. between the app, mobile website and desktop website):

  • The logo on the mobile website is low in resolution, with notable pixelation on “Retina” displays.
  • Branding across the three channels lacks consistency in look and feel.
  • Carrying over shipment-tracking or any personal information between the three channels is not possible.
  • The main functionality is not ordered consistently across channels, nor is key functionality available in all three channels.
  • Touch targets are too close together on the mobile website.

Here are the notes about the cross-device experience:

  • Branding appears on the home screen of the Android app, but not of the iOS app (even though it is shown in Apple’s App Store).

And here are the notes about reviews in Apple’s App Store (negative feedback abounds):

  • Location services are inaccurate, and results could be more relevant.
  • Scanning doesn’t always work.
  • The app freezes and crashes.

Conduct A Team Review

Conduct a team review session to compare, validate and aggregate the findings of the individual evaluations. Evaluators with diverse skills (for example, visual designer, usability analyst) tend to have different areas of focus when conducting evaluations, even though they are using common personas and scenarios and a common evaluation framework.

During the team review, one evaluator should facilitate the discussion, bringing up each problem, verifying whether the other evaluators identified it and agree, and then assigning a level of severity to it. The evaluators should also identify possible solutions for these issues. The result is a consolidated list of problems and potential solutions.

For an extended evaluation, invite other designers to the team review session, maybe over an extended catered lunch meeting or towards the end of the day over pizza and drinks. The other designers should have spent some time prior to the session (at least 30 minutes) familiarizing themselves with the website or app. This will enable everyone to explore the website or app together as a team, identify and discuss problems as they find them, and discuss possible solutions.

One evaluator should set the stage by outlining background information and the problems identified. This should be followed by a facilitated review of the website or app (often using a structure like the one outlined in “The Elements of the Mobile User Experience5” to guide the discussion). Assign a team member to document the session, including the problems identified, ideas, questions and solutions.
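
The documented output lends itself to a simple structure. Here is a hedged Swift sketch; the types and sample entries are hypothetical, loosely shaped after the evaluation spreadsheet linked below:

    // Sketch: one record per confirmed problem from the team review.
    enum Severity: Int, Comparable {
        case low = 1, medium, high, critical
        static func < (a: Severity, b: Severity) -> Bool { a.rawValue < b.rawValue }
    }

    struct Finding {
        let screen: String
        let problem: String
        let severity: Severity
        let agreedBy: Int            // how many evaluators confirmed the issue
        let recommendation: String
    }

    // Hypothetical entries, based on issues noted earlier in this article.
    let findings = [
        Finding(screen: "Locations", problem: "Going back crashes the app",
                severity: .critical, agreedBy: 2, recommendation: "Fix the crash"),
        Finding(screen: "Supplies", problem: "ZIP field shows alphabetical keyboard",
                severity: .medium, agreedBy: 2, recommendation: "Use the numeric keypad"),
    ]

    // Most severe, most-agreed-upon problems first.
    for f in findings.sorted(by: { ($0.severity, $0.agreedBy) > ($1.severity, $1.agreedBy) }) {
        print("[\(f.severity)] \(f.screen): \(f.problem) -> \(f.recommendation)")
    }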

Download the sample evaluation list9 (XLSX, 10 KB)

Document and Report

The evaluation spreadsheet is a nice way to capture and organize problems and recommendations, but communicating the issues visually is easier. I usually create a slide presentation, organized according to the framework in the article linked to above10. One slide is dedicated to each severe problem, with screenshots and callouts to elaborate. Less severe problems are grouped together according to the screens they appear on. Along with each problem and its impact, list actionable recommendations. For detailed evaluations, also mock up key recommendations that address the problems and incorporate best practices.

Begin the presentation with slides that set the context and explain the methodology and approach. Mention that the evaluation focuses on identifying problems, so that members of the design and development team do not start passing around antidepressants when they see the laundry list of problems they have to painstakingly work on.

Conclusion

A mobile UX diagnostic is not a replacement for testing with actual users, but rather is meant to quickly identify problems with a mobile website or app using trained eyes. A diagnostic will uncover most of the top usability problems11, and because it is relatively inexpensive and quick, it can be conducted at multiple points in a user-centered design process. Diagnostics go a long way to improving a mobile experience, reducing flaws and meeting users’ expectations.

Related Resources

  • “Summary of Usability Inspection Methods12,” Nielsen Norman Group
  • “How to Conduct a Heuristic Evaluation13,” Nielsen Norman Group
  • “Conducting Expert Reviews: What Works Best?14,” UXmatters
  • Download the sample slides15 (PDF)
  • Download the sample evaluation list16 (XLSX, 10 KB)

Footnotes

  1. https://econsultancy.com/blog/65041-making-the-most-of-mobile-moments-to-transform-the-customer-experience
  2. http://www.smashingmagazine.com/2011/05/02/a-user-centered-approach-to-web-design-for-mobile-devices/
  3. http://blogs.forrester.com/megan_burns/14-01-21-introducing_forresters_customer_experience_index_2014
  4. http://www.itworld.com/article/2832575/mobile/how-to-grab-a-screenshot-from-iphone--android--and-nearly-any-other-smartphone.html
  5. http://www.smashingmagazine.com/2012/07/12/elements-mobile-user-experience/
  6. http://blogs.forrester.com/adele_sage/10-01-13-announcing_forresters_web_site_user_experience_review_version_80
  7. http://en.wikipedia.org/wiki/Harvey_Balls
  8. http://www.smashingmagazine.com/2012/07/12/elements-mobile-user-experience/
  9. http://provide.smashingmagazine.com/evaluation-issue-list.xlsx
  10. http://www.smashingmagazine.com/2012/07/12/elements-mobile-user-experience/
  11. http://www.measuringusability.com/blog/effective-he.php
  12. http://www.nngroup.com/articles/summary-of-usability-inspection-methods/
  13. http://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/
  14. http://www.uxmatters.com/mt/archives/2014/01/conducting-expert-reviews-what-works-best.php
  15. http://www.smashingmagazine.com/wp-content/uploads/2014/08/mobile-user-experience-diagnostic-sample-slides2.pdf
  16. http://provide.smashingmagazine.com/evaluation-issue-list.xlsx
