
Deciding Which Type of Test to Conduct

How formally or informally you conduct a usability test depends on three factors: your goals for usability testing, the type of data you want to have, and your philosophy about testing and interacting with users.

Goals for usability testing
Usability testing can be for:

  • diagnosing problems
  • comparing alternatives
  • verifying that you have met usability goals

Diagnostic usability testing. During development, the main goal of a usability test is usually "diagnostic": to find out what is working well about a site and what is not, so that you can keep what works and fix what does not. Most usability testing today is done during development with this diagnostic goal, and the earlier and more iteratively you test, the better.

Comparative usability testing. At any point, you may include a goal of "comparing." Instead of arguing about alternative designs, test them both. That's a comparative test. You may also want to compare your Web site to competitors' Web sites.

Verification usability testing. At the end of development, the goal may be "verification": to show that all is well, or to show how close you have come to preset usability goals.


The type of data you want to have
Data can be:

  • behavior or opinion
  • objective or subjective
  • qualitative or quantitative

Behavior or opinion. Usability testing is primarily about behavioral data — performance — what people actually do when working with a Web site (or any product). If you do not watch and listen to users working, you are not doing a usability test. The behavioral aspect makes usability testing different from other techniques, such as interviews and focus groups.

Usability testing often also includes opinions. For example, you might give users a questionnaire at the end that asks them to rate various aspects of the site or to tell you about their preferences. In a usability test, however, opinion is much less the focus than behavior.

Objective or subjective. Objective data is what you see or hear. Subjective data is what you infer from the observations. In a usability test, you are usually primarily interested in objective, behavioral data — what the participant actually does.

Part of a usability test, however, is also participants' subjective thoughts about why they are taking the paths they are, how they interpret what they see on the screen, etc. We ask participants to "think aloud" while they are working so that we can hear their rationales and interpretations.


Qualitative or quantitative. Your data may be just notes about what participants did and said. It may also include counting various aspects of users' behavior, such as:

  • time to complete a task
  • number of errors or problems in completing the task
  • number of requests for assistance

How quantitative you are in conducting usability tests depends on the other factors: your goals and your philosophy. When the goal is comparison or verification, you are likely to want to include some quantitative data. For diagnostic testing, quantitative data may be less important than a qualitative list of problems.

Qualitative and quantitative is not an either/or choice. Usability testing always includes qualitative notes. It almost always includes reporting some numbers, such as percentage of participants who had a particular problem or who took a particular path to completing a task. But it may or may not include the types of quantitative data that are traditional in psychology research, such as time and errors.
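
To make the counting concrete, here is a minimal sketch in Python of how such measures might be tallied from session notes. The participant records, field names, and the "navigation problem" flag are invented for illustration only; they are not from any actual study.

    # Minimal sketch: tallying quantitative results from a usability test.
    # The participant records and field names below are made up for
    # illustration; real studies would log these from session notes or
    # screen recordings.
    from statistics import mean, median

    # One record per participant for a single task.
    sessions = [
        {"participant": "P1", "seconds": 95,  "errors": 1, "asked_for_help": False, "hit_nav_problem": True},
        {"participant": "P2", "seconds": 140, "errors": 3, "asked_for_help": True,  "hit_nav_problem": True},
        {"participant": "P3", "seconds": 70,  "errors": 0, "asked_for_help": False, "hit_nav_problem": False},
        {"participant": "P4", "seconds": 180, "errors": 2, "asked_for_help": True,  "hit_nav_problem": True},
    ]

    times = [s["seconds"] for s in sessions]
    print(f"Time on task: mean {mean(times):.0f}s, median {median(times):.0f}s")
    print(f"Total errors: {sum(s['errors'] for s in sessions)}")
    print(f"Requests for assistance: {sum(s['asked_for_help'] for s in sessions)}")

    # Percentage of participants who had a particular problem -- the kind of
    # number that is reported even in a mostly qualitative test.
    pct = 100 * sum(s["hit_nav_problem"] for s in sessions) / len(sessions)
    print(f"Participants who hit the navigation problem: {pct:.0f}%")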

A qualitative usability test can be just as "rigorous" as a quantitative usability test. That is, if you have selected representative participants, had them do realistic and representative tasks, and taken careful and objective notes on what they did, you should be able to tell with confidence what is working well on the Web site and what needs to be fixed.


Your philosophy about testing and interacting with users
The third factor in deciding how formal or informal you want to be is whether you see usability testing more as "research" or as "partnering with users."

Usability testing draws people from many different backgrounds. Some people come to usability testing from doing academic psychology experiments in a very quantitative, hands-off approach. Others come to usability testing from anthropology, ethnography, sociology, and other fields that stress naturalistic, qualitative observations. Those who see usability testing as research tend to run more formal, quantitative tests; those who see it as partnering with users tend to run more informal, qualitative tests.

What about comparative testing? In a comparison, you are usually collecting quantitative data to show the differences between two or more alternatives. Therefore, comparative testing is usually formal.

Two points about comparative testing:

  • There are two ways to do comparisons.

    • Half of the people work with one version; half with the other. For a fair comparison here, the people in the two groups must be matched on all the relevant characteristics.
    • Everyone works on both versions. This avoids the problem of matching users, but introduces the possibility of a "practice effect." Therefore, you must alternate which version people use first (see the sketch at the end of this section).

  • You can combine comparative testing with diagnostic or verification testing. That is, the comparison might be just one of the issues in a usability test. The versions you are testing might differ only for some parts of the Web site.

If you are doing informal, diagnostic testing, the note takers might measure time and errors just for the scenarios that relate to the comparison. (You should do this only when the note taker is in a different room, as would happen in a usability lab, so that the participants are not aware that their time on task is now being measured.)
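
As a rough illustration of the "everyone works on both versions" approach, the Python sketch below alternates which version each participant sees first, so that any practice effect falls evenly on both alternatives. The participant IDs and version labels are invented for illustration; a real study would also keep task order and other conditions balanced.

    # Minimal sketch: counterbalancing version order when every participant
    # works with both versions.
    # Participant IDs and version labels are invented for illustration.
    participants = ["P1", "P2", "P3", "P4", "P5", "P6"]
    versions = ("Version A", "Version B")

    for i, participant in enumerate(participants):
        # Alternate which version is seen first so that any practice effect
        # is spread equally across the two versions.
        first, second = versions if i % 2 == 0 else versions[::-1]
        print(f"{participant}: {first} first, then {second}")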

