Usability Test for Success

Is your company's Web site as good as its people? Take this usability test to find out.

Do your customers enjoy dealing with your company? When they call for information, are the marketing people helpful? When they place an order, do your salespeople make it a quick, pleasant experience? And when they have questions, does your customer support staff provide solid answers?

If you work for a successful company, great customer service is most likely a core competency. Behind the curtain, though, it may be a different story. There, you may be relying on archaic software to schedule presentations, provide answers, order products, and process cancellations: software that's hard to learn and use, but easy to crash. Of course, you don't let your customers see any of this. Instead, you endure the software, translate its output, and treat your customers to positive interactions.

But time is running out for setups like this. Companies that want to stay successful are quickly moving these interactions to the Web, where your customers will expect the same great treatment. Will your marketing information be interesting, understandable, and just a click or two away from the home page? Will placing an order online be a fast and pleasant experience? Will your customers quickly find solid answers to problems?

There's only one way to know for sure: put your new Web site through usability testing.

At its most basic, usability testing requires no more than a Web site, a usability tester, and an observer. The usability tester uses the Web site as a real user would while the observer watches, noting usability problems as they appear. This certainly works, but you'll get more bang for the buck by carefully crafting test scenarios and by getting additional testers, observers, and a facilitator involved.

The key to conducting a successful usability test is planning, starting with understanding the goals for both Web site and users. That understanding will enable you to develop meaningful test scenarios and pick representative testers.

Make sure you understand your company's goals for the site. Are you trying to attract new customers or provide better service to current ones? Do you need to reduce the cost of orders by eliminating phone sales and moving order entry to the Web? Do you want to improve customer support by offering current and complete documentation on the Web?

After you clarify your business goals, move your attention to your customers. Take note of their interests, motivations, and hot buttons. The better you understand your customers, the better job you can do when picking representative testers.

You also need to understand what you're testing. Early in the development cycle, it may be only a paper prototype or an online graphical mock-up. Later on you might be testing a partially completed Web site. The database might contain limited test data, and some features might be stubbed out. Only near the end of the development cycle will you be testing a working beta site. And if you're revising an existing site, you need to understand how the changes will fit into and impact the current site.

Focus your testing on problem areas

With this understanding, you're ready to create test scenarios. These describe a task and ask the tester to accomplish the task using the paper prototype, graphical mock-up, or actual Web site. Arrange for a few colleagues to try out the test scenarios, and then refine them as needed. Make sure they're absolutely understandable and not too much of a challenge to complete in the allotted test time.
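If it helps keep scenarios consistent across testers and sessions, you can capture each one in a simple structured record. A minimal sketch follows; the field names (task, success criterion, time limit) are illustrative choices, not a standard format:

```python
from dataclasses import dataclass

@dataclass
class TestScenario:
    """One usability test scenario. Field names are illustrative."""
    name: str            # short label for the scenario
    task: str            # what the tester is asked to accomplish
    success: str         # how the observer knows the task is complete
    time_limit_min: int  # keep it completable in the allotted test time

# A hypothetical scenario for an e-commerce site:
scenario = TestScenario(
    name="Find the return policy",
    task="You bought a sweater that doesn't fit. Find out how to return it.",
    success="Tester reaches the returns page and states the policy aloud.",
    time_limit_min=5,
)
print(scenario)
```

Writing the success criterion down ahead of time also makes the colleague dry-run easier: if your reviewers can't tell when they've finished a task, neither will your testers.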

Figure 1. BizTalk.com usability test scenarios. This sample test scenario specifies a realistic set of related tasks a user might perform using the BizTalk.com Web site. It's one of a series you'd develop to fully test the site's usability.

If time is an issue, you can plan to test only those parts of the design that especially worry you. Say you're concerned about how much emphasis to place on your Web search. You've put it at the bottom of each page, trying to make it available, but not distracting. For this situation, you'd design your test scenarios to reveal placement problems. Design one test scenario that requires use of the search tool and another that would be better handled without it. But don't be surprised if you learn more than you expected. Odds are more than half of your customers will be search-dominant and gravitate to the search tool, no matter how prominently you place the desired links on the page.

Now turn your attention to selecting the testers who will represent your most important kinds of customers. If your regulars will use your Web site repeatedly, try to select testers representing a broad range of skills and knowledge. In your initial tests you have a chance to observe how quickly people new to your site learn how to use it. If you can conduct multiple tests with the same users, you can also observe how well the site helps users make the transition from beginner to intermediate to expert. But even if you don't have time for multiple tests, you can get a sense of your site's learnability from using testers with a broad range of capabilities.

If you pick well, you won't need many testers to uncover most of your site's usability problems. Five or six testers will probably reveal more than 80 percent of the usability problems, while you'd have to enlist at least 15 testers to uncover 100 percent. So save time by keeping your test group small. Then put the dollars you save into running additional tests at each major stage in the Web site development cycle.
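These figures line up with a well-known model from the usability literature, the Nielsen-Landauer problem-discovery curve: the expected share of problems found by n testers is 1 - (1 - L)^n, where L is the probability that a single tester encounters a given problem. A quick sketch, assuming the 31 percent average rate Nielsen and Landauer reported (your site's rate may differ):

```python
# Nielsen-Landauer problem-discovery model: found(n) = 1 - (1 - L)**n.
# L is the per-tester discovery rate; 0.31 is the average reported in
# their studies, used here as an assumption, not a guarantee.

def fraction_found(n_testers, discovery_rate=0.31):
    """Expected fraction of usability problems found by n testers."""
    return 1 - (1 - discovery_rate) ** n_testers

for n in (1, 3, 5, 6, 15):
    print(f"{n:2d} testers -> {fraction_found(n):.0%} of problems found")
```

The curve flattens quickly, which is the arithmetic behind the advice above: each additional tester mostly rediscovers problems the earlier testers already hit, so several small rounds of testing beat one big one.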

You also need to select a facilitator and line up a set of observers.

Facilitators need top-notch people skills to put the testers at ease. A facilitator must convince the testers that their job is to uncover problems with the software's usability. Unless convinced otherwise, testers will assume you're testing their own skills and knowledge, not the software's usability, and when this isn't handled skillfully, testers can get defensive and uncommunicative.

I recommend involving the entire development team as observers. Watching a user struggle with your pet design feature makes quite an impression.

The tester's job requires an intense and exhausting level of concentration, so limit each person's test time to 30 minutes if possible. Plan time in the schedule for debriefing each tester, then each observer, after each test session.

You also need to select the test area. If your company has its own usability lab, you're in business. If not, you still have several viable options.

No equipment? No sweat!

If you're using video equipment, you can set up a simple, yet effective, test space with two back-to-back cubicles. The tester and facilitator can work in one cubicle, and the observers can watch video camera output, played back live on a television in the other cubicle. This inexpensive approach has the advantage of enabling several people to observe (quietly please!) without distracting the tester.

If cost is an issue, video equipment is not mandatory. Any office or conference room with space for a computer and three people—the tester, the facilitator, and an observer—is sufficient. Rotate observers so as many developers as possible get to learn firsthand about any usability stumbling blocks.

Other options include renting a portable test lab and renting time at an outside test facility.

Keep testing the tests

Now let's take a closer look at the testing process.

The facilitator will begin each test session by putting the tester at ease, encouraging honest criticism and asking the tester to think out loud. Unless the testers are experienced with the process, they need to hear about how usability testing works and how the findings will be used.

Facilitators should minimize interaction with testers once the test starts. When real customers use your Web site, they won't be able to ask questions of a facilitator. So tell testers they're on their own with the test scenarios. The facilitator should explain it's OK—even expected—for them to get stuck and not be able to complete one or more test scenarios. This just means they've helped find a spot where the Web site needs more work.

As a usability tester works through the scenarios one by one, the observers should document the UI features used for each task, noting any related usability issues. When you have more than one observer, consider assigning one the task of documenting the tester's emotional responses. A furrowed brow or wince can be revealing. Some testers are quieter than others; if necessary, the facilitator should ask questions occasionally to encourage testers to voice their thoughts on the usability of the site.
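A shared log format makes the observers' notes much easier to consolidate at debriefing time. One minimal sketch, writing observations as CSV rows; the column names here are illustrative, not a standard:

```python
import csv
import io

# Illustrative columns for an observer's log; adjust to your team's needs.
FIELDS = ["scenario", "ui_feature", "issue", "severity", "tester_reaction"]

# A hypothetical observation from a session:
rows = [
    {
        "scenario": "Find the return policy",
        "ui_feature": "footer search box",
        "issue": "tester scrolled past it twice before noticing it",
        "severity": "medium",
        "tester_reaction": "furrowed brow, audible sigh",
    },
]

buf = io.StringIO()  # swap in open("observations.csv", "w") for a real file
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Because every observer fills in the same columns, the final work session can sort the combined log by scenario or severity instead of reconciling free-form notes.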

Debriefing is an important part of the process.

A facilitator halts the testing when it's time to debrief a tester. The debrief session lets the tester provide general feedback about the Web site and answer observers' questions. When many observers are involved, they should consolidate their questions and pass them to the facilitator. After a tester leaves and before the next one arrives, the facilitator should debrief the observers. The team can then consolidate observers' and testers' findings while the test is still fresh.

Pull facilitator and observers together in a final work session after all the testing is done. The team can come to a common understanding about your Web site's usability problems by working together to analyze and report the findings. If you taped the tests, bring the videotape to the work session. This lets you review portions to clarify observations. Later, you can assemble a persuasive set of clips from the tape to illustrate and support your findings.

To drill down on different aspects of usability testing, consult the many good books and Web sites on the subject.


Make usability testing part of your standard Web design toolkit—especially if your company is opening critical business processes to direct customer access on the Web. You'll be glad you did.

Copyright © 1996 - 2014 SoftMedia Artisans, Inc. All Rights Reserved.