A conversation I often have:

Company X: “Our company is already customer-centered. We talk to our customers all the time. We’re already asking them what they like about our product, what they dislike, and what features they’d like. Why should we start doing Contextual Inquiry interviews?”

Karen: “We talk to our customers for a lot of reasons. But when we’re trying to gather requirements, we need to be sure we’re getting design data.”

When I came into the industry 20 years ago, it was the unusual company that talked to its customers as part of product design. Being customer-centered was simply unheard of. Today you would be hard pressed to find a company that does not claim it is customer-centered; few companies are going to publicly admit that they don’t listen to their customers. But how are they listening, and what questions are they asking? Are they using product design methods? Or are they trying to make marketing methods or usability testing do something they are not designed to do?

Pick a customer data collection process that matches your goals

When considering whether to use Contextual Design, many companies want to know if the first step—Contextual Inquiry—is necessary to collect customer data. They argue that they already use traditional methods for customer data collection, such as focus groups, surveys, usability testing, and domain experts. They want to understand what unique value Contextual Inquiry provides beyond those existing processes.

What they aren’t recognizing is that each methodology for data collection is geared toward a different purpose, enabling customers to articulate a specific kind of information that supports that purpose. In order to collect the right kind of data, you must use a methodology that matches your intent. If the goal is to design a product that doesn’t omit tasks or leave out individual steps, then it’s necessary to gather data that can generate a shared, detailed understanding of the nuances of the customers’ work.

Focus groups test sales points and reveal hot buttons

The focus group is one of the most popular marketing techniques. By “focus group” I’m also including meetings with customers at users’ groups, user conferences, trade shows, and the like. These meetings bring people together in a room to talk and bounce ideas off of each other. This environment doesn’t allow close attention to, or questioning of, any one person. Often companies will show product concepts, early mockups, or even running prototypes to see the customers’ response. This opens a dialogue, but in reality it only identifies hot buttons and sales points. It’s important to know hot buttons and sales points, but they do not provide detailed design data.

Even when you attempt to question customers closely, with questions such as “What are your problems?”, “What features would you like?”, or “What do you think of this design?”, you are asking them, completely out of the context of their jobs, to tell you how they really work. You are forcing them to give you their opinions. Opinions emerge from a myriad of self-observations and experiences; questions about why someone holds an opinion won’t elicit design data.

The designer wants to know, “What new concepts or features would make the system significantly more appropriate to the job at hand?” Asking this question in a focus group will not give the underlying detailed data and work context needed to correctly design those features.

Surveys address broad issues and can provide quantitative data

Surveys assume that the surveyor knows what questions to ask, and they tend to generate yes/no answers and quantitative data. But as soon as design starts, no one knows what will turn out to be important. “Installation is the #1 problem,” reports a customer survey (a marketing technique). But what is wrong with installation (a design question)? When do installations happen, and who does them? What information is available when they do them? Which of the many alternative fixes is best? When collecting detailed data about a job, there is no way to know beforehand, out of context, what tasks are involved or what constraints, whether physical or cultural, exist. Yes/no answers don’t provide any detail about why someone does a task in a particular way, or why an existing feature works or doesn’t work.

Surveys are useful for assessing a market’s potential: seeing what people will pay for a certain function, or finding out how many people own a specific product. They are good for large issues and directions, but they don’t provide the level of detail required for product design.

Usability tests the interface, after the fact

Recognizing that marketing data doesn’t provide detailed design data, companies will employ usability testing as a method for getting customer feedback on the design. The problem is that usability tests rely on made-up tasks in a laboratory environment; they don’t provide the grounding of a real work practice in a real work environment. Usability testing focuses on the user interface; it doesn’t test the validity of the underlying structure of a system, and so it doesn’t provide design data. Further, this testing requires a working prototype, which requires a significant development investment. Without collecting design data before investing in development, there is a real risk of significant rework and redesign down the road.

Domain experts are distanced from their old work

Many organizations, in particular IT departments, include “domain experts” on their teams for insight into the people they’re supporting. But a domain expert can provide only one person’s view of that work; they don’t represent a user population. Furthermore, because they’re distanced from the work they used to do, their ideas can quickly fall out of touch with how the work is done today. Removed from their old work environment, they frequently become technology people.

Contextual Inquiry elicits detailed work practice data that can be shared

To design products that fully support a work practice requires detailed data covering the tacit, explicit, and implicit details of that practice, captured in a form that is external, public, and sharable. This data must identify specifically what works and what doesn’t, to allow changes to existing technology or the introduction of new technology. It must include all the steps that workers take to perform their jobs, along with an understanding of their goals and intents, so that when the work practice is modified, even if steps are omitted, their purpose will still be met. Contextual Inquiry relies on a set of proven techniques that ensure that the right kind of data is collected to drive user-centered product design. It was developed specifically for product design, to enable designers to elicit and share detailed, reliable data on customers’ work practice and so drive the design of products that are both useful and usable.