Vol.28 No.3, July 1996 |
This workshop, held at CHI 95, focused on field research methods that allow us to incorporate a holistic understanding of users and their work and its context into the design process at the earliest stages. Although these techniques hold great promise and have attracted attention and interest in the CHI community, they have not been widely adopted or systematically discussed in the published literature. This workshop was developed to address this deficiency. We were fortunate in receiving a large set of position papers in response to our call for participation; we accepted 12 of about 20 papers submitted (and we both had a case to contribute, for a total of 14 cases).
We planned the workshop activities to allow for maximum interaction among the participants. So that we could all arrive already acquainted with each other's studies, we circulated the participants' position papers ahead of time. Our goals for the day-and-a-half workshop (extended by popular demand to include working lunches and extra time on the second day) were to develop a common understanding of the work done in each of the studies, to map the work done in each study to the stages of the design process where it fit best or had the greatest impact, and to come away with a systematic model of the terminology, methodology, and effectiveness of field research as represented in this set of cases. These goals were naturally larger than we could achieve in the short time allotted us, but the enthusiasm and dedication of the group has enabled us to continue to work on them since CHI 95 and will yield a published collection of case studies and essays this Fall.
Our first activity was to set up a bulletin board display that allotted room for each of the studies. We had a beginning work session in which each participant, based on his or her preparatory reading of the cases, could post questions to anybody else's study. The leaders of each study then took some time to answer everybody's questions. The purpose of this activity was to ensure that all the participants had a shared understanding of the research cases. (We also had a "parking lot" posting area where we could put up questions of general interest: issues having to do with developing a common terminology, development of a framework for field research, and ways to make field research a success.)
To provide structure for the following discussion, we set up a map of the stages in the design process. We then asked each participant to position his or her work in relation to that timeline. In walking through the timeline, we discussed numerous issues related to the pressures of doing this kind of research and translating the results into useful forms for product development.
We had set ourselves six goals for the workshop; below we comment on our ability to address each of them.
We will briefly summarize the workshop cases and then extract some general principles for conducting field research successfully.
The fourteen cases make up six groups: focus on roots in ethnography and participatory design (three studies); focus on the design process (two studies); two organizational case studies; focus on specifics of method (three studies); focus on contextual design (three studies); and one case of bringing the field into the lab. The cases are summarized briefly below.
In the first three studies, the researchers borrowed methods and techniques of field research from established practice in Ethnography and Participatory Design and then modified the methods so as to accommodate the specific demands and constraints of their product-design contexts:
Larry Wood of Brigham Young University described an approach to interviewing that he developed by synthesizing insights from the two fields of cognitive science and ethnography. The goal of the interviews was to produce a description of current work practice, derived from practitioners themselves, for use in later stages of the design effort.
As part of a larger effort to define the requirements for a proposed imaging workstation for diagnostic radiologists, Judy Ramey and her research team at the University of Washington conducted a study in which they combined the ethnographic "stream of behavior chronicle" and the retrospective verbal protocol ("stimulated recall") to capture both radiologists' task performance and their own expert commentary on it.
Michael J. Muller and Rebecca Carr of US West Technologies came up with a new use for two methods that originally were created to support the classic Participatory Design goal of direct end-user participation in design. Both CARD and PICTIVE rely on paper representations of system states that analysts and users can discuss and modify. In this application of the methods, the investigative team used the system images provided by CARD and PICTIVE to iteratively exercise, discuss, and critique detailed work scenarios.
The next two studies shift the emphasis from methods per se to the design process itself, how it can accommodate or respond to field usability data, and how field research fits into the overall flow of activities:
Dennis Wixon and his team were able to integrate a variety of field research methods into the design process. In defining the product concept, they conducted Contextual Inquiry and work-based interviews. In the second phase, defining product capabilities, they used surveys of customers and a computer-based tool for analyzing scores given possible product features. In the third phase, design of the user interface, the team returned to field research methods. During implementation, they did user-interface prototype tests and evaluations. And finally, during the formal product test, they conducted tests in which customers used the product for production work. Thus field methods were integrated with a suite of other methods, each used at the point in the process where it would be most effective.
Martin Rantzer describes a method meant to serve as a framework to link together existing usability tools and practices systematically to supplement the early phases of traditional software development. Called the Delta Method, the approach focuses on design of software interfaces (including user documentation), with the goal of supporting the system designers and technical communicators (neither of whom could be assumed to have formal education in human factors or usability) as they carry out the customer and user analysis and the design of a prototype of the interface.
The next two cases addressed the "politics" of field research in product design -- that is, the varying needs and goals of the partners in the research and the susceptibility of the research to larger business decisions.
Robert Graf of Dun and Bradstreet Software, Inc. (DBS) describes field research he did for a new release of an existing system intended to support the internal sales staff. When it became clear that user acceptance of the system was a major issue, Graf proposed a field study so that he could get a realistic picture of the intricacies of the sales process. Overcoming a number of situational challenges (participants unwilling to be videotaped, difficulty in synthesizing the data), he developed a summary report, three detailed task analyses (one for each of the critical audiences, with a "day in the life" approach), and two business models (a model of the idealized sales process and a model of all the information the salesperson needs to do his or her job).
David Rowley of Varian Chromatography Systems also had an organizational perspective on field research, but focused specifically on issues related to field research done by a cross-functional design team -- engineers, technical writers, marketing personnel -- that was tasked with the design of a specific product. Rowley finds that removing the barriers often found between functional departments results in improved communication and coordination within the team, but the lack of centralization may have reduced the impact of field study findings. Also, the amount of data generated was overwhelming; Rowley recommended a number of changes to their process so as to make the research findings more accessible and manageable.
The next three cases looked at the nuts and bolts of field research that can often determine its success or failure.
Susan Dray and Deborah Mrazek report on the field research they did for Hewlett Packard for the global home and family market. The researchers used a number of methods: naturalistic observation, contextual inquiry, ethnographic interviews, and artifact walkthroughs. The unifying idea, though, was that to gain insights as to how families use computer technology, they would go to the homes of representative families in the US and in Europe to see firsthand. Dray and Mrazek provide substantial detail about the logistics of arranging the family visits and the actual visits themselves.
The special challenges of doing field research in the medical equipment industry were the focus of Diane Brown's case. Brown's employer, ATL, Inc., builds medical ultrasound imaging systems. Historically, ATL's engineers designed in response to requirements statements and evaluations provided by former users now employed at ATL. But in response to customer complaints, management agreed to create a usability group. This group saw a need to conduct field studies; Brown described in detail the evolution of their approach from the first, unfocused site visits through the process of redefining and sharpening their focus, choosing new sites to visit based on more specific criteria, and improving and diversifying their methods.
Kristin Bauersfeld and Shannon Halgren of Claris Corporation presented three field study techniques, adopted from traditional methods (the condensed ethnographic interview, passive video observation, and the interactive feature conceptualization), that they designed to work with very short time frames for conducting research, interpreting the findings, and applying the results. This effort to do field research was motivated by an opportunity for the Interface Design Group to get involved at the conceptualization phase of design of several new products, a departure from the more typical pattern of coming in at the end to do lab-based usability tests.
Stan Page and his cross-disciplinary team at Novell Inc. did field research to get design input for the next generation word processing application. Their focus was broad: the work practice surrounding the making of documents. The team, composed of members from development, human factors, documentation, marketing, and usability testing, used the Contextual Design process as taught by Karen Holtzblatt and Hugh Beyer (1995). When the research effort showed promise, management decided to double the effort; one team continued the research into the making of documents and the other expanded its scope to include all business work practice. Page described the way the two teams organized themselves to conduct their work and communicated with each other and with the company at large.
Dianne Juhl of Microsoft Corp. used Holtzblatt and Beyer's Contextual Design process to determine the main activities and projects that people do at home, with special focus on understanding the integration or connections between activities. Since the research team was interested in home activities regardless of whether the activity was done on the computer, the team felt that going to the home environment and engaging consumers in a dialog would be more useful than traditional lab or focus group methods. During the 30-day survey of the activities in the home, the CI team observed 19 people in six households and collected over 2,000 separate data points. The data were analyzed using group data analysis techniques, especially the construction of affinity diagrams, work models, communication models, context models, etc.
Janette Coble of Washington University developed a number of interesting variations of the Contextual Design process for design work on a physician's medical workstation. Her approach began with a traditional Contextual Inquiry interview and data interpretation using several of the data representation models suggested by Contextual Design. From this process she generated a set of 542 requirements. She approached the problem of prioritizing the requirements by having physicians rate their importance. This quantitative approach neatly complemented the qualitative methods used in Contextual Design. Overall, this work produced a requirements document for a physician's workstation.
The last of the case studies presented an alternative to going out to the users' environment to gather data; instead, it brought the field into the lab.
Mary Beth Butler described a method in which she and her co-workers at Lotus Development Corp. have users bring samples of their work to the Lotus offices. Users sit with product team members around a conference table and use these samples (data files, sample applications, or hard copy printouts) to explain their work. The sessions, a form of artifact analysis, are focused on uses of specific product features or work patterns. Participants are encouraged to bring along a co-worker, from whom the team might get additional information. The discussion between the user/participant and the development team members is informal; however, the team does have an agenda of questions that they hope to cover during the course of the Roundtable.
Based on the workshop, Wixon (1995) has produced a general taxonomy of field research methods; see also taxonomies by Muller (1993) and Card (1996). Rather than recapitulate those ideas here, however, we will simply state some general principles for conducting field research successfully within a product development environment and provide examples from the workshop.
The overall process for completing field research in a product context is simple (as outlined by Ramey and others):
The first step contradicts the well-established principles of questionnaire design, experimental research, and usability engineering. In general, these methods rest on defining explicit issues to be investigated, establishing explicit data-gathering and measurement techniques, and stating a priori the desired or expected results. In contrast, field research begins with relatively few constraints and explicit questions. For example, David Rowley points out how his team's original explicit, detailed questions were combined into general, open-ended focus areas to avoid presumptions about how the work was organized.
This openness creates a challenge when structuring the data. In fact, several participants in the workshop reported feeling overwhelmed by the amount of data they had collected. This sense of a mass of data has been pointed out in other literature on field methods (Miles and Huberman, 1994). The challenge is compounded by the fact that development schedules usually require an extremely quick turnaround of results. Ways to address this problem include structuring the data as you gather it -- as is done in the CARD and PICTIVE methods discussed by Muller (1995) -- and having well-defined models into which you structure the data (Holtzblatt and Beyer). Finally, the need for structuring or presenting the data is minimized if a cross-functional team participates in the initial interviews and thus experiences the users' work directly. This experience often finds its way into design without reliance on formal approaches of specification and communication.
In the broadest sense, designers considering field research face the challenge of devising or adopting an approach that best addresses the following goals:
It's relatively simple to meet any one of these goals. In-depth understanding can be achieved through months of interviews and/or participant observation, but the challenges of impact and timeliness may go unmet. One can achieve a timely result that yields only superficial understanding and probably minimal impact. Finally, one can test a design that is close to completion. Such approaches have a demonstrated impact on the design, but do not generally produce (by themselves) much understanding of user work. The most effective approaches meet all these goals.
In some cases the overall environment made using field research methods easier. For example, Larry Wood reported on a well-planned and carefully conducted study carried out in a university research project. Similarly, Judy Ramey reported on studies conducted as part of a research project. David Rowley pointed out how the existence of small cross-functional teams made it easier to conduct field studies with the whole team. Similarly, projects done at Digital benefit from the positive image Contextual Inquiry has developed over the last 10 years. In other cases, researchers (Halgren and Bauersfeld, personal communication) used a "stealth" approach. When the marketing organization claimed that there would not be time for field research, Halgren simply said that they would be doing a specialized kind of situated focus group, talking to users about their work. Re-labeling the work made it palatable.
Ramey reports how field research methods were used to discover the fundamental characteristics of radiologists' work and thus to orient the overall design of a set of products.
In several examples (Microsoft -- Juhl, and Hewlett Packard -- Dray), companies turned to field research when entering new and undefined markets. In these cases the users and tasks are not simply undefined; rather, a general framework for understanding the market and design space does not yet exist. Interestingly, corporate management often approaches these areas with expectations derived from developing products in well-defined and highly competitive markets such as word processing or financial analysis.
In contrast, the Usability Roundtables that Mary Beth Butler has developed at Lotus provide input into a well-established product set with a well-understood user and market base. These studies are notable not only because the users come to the laboratory at Lotus, but also because they are designed to solicit relatively open-ended reactions to the design of new capabilities for well-established Lotus products.
In studying users of word processing systems Stan Page discovered that users needed a way to incorporate graphics into a fixed amount of space in a text document. The result was the "make it fit" capability of WordPerfect.
The disconfirmation of assumptions is often pointed to as one of the strengths of field research. In product development, simplifying assumptions, like using the requirements document from a related "Windows" product for a Macintosh product, may appear to save time. However, when such assumptions are disconfirmed by the early application of field methods with actual target customers, the findings can successfully redirect the product development effort (Wixon).
In researching the requirements for a system to be used by the sales force, Bob Graf of Dun and Bradstreet discovered that the metaphor of "the deal" could be used to generate a coherent design. Similarly, Diane Brown, when investigating users of an ultrasound system, found that a musical metaphor of staffs and measures could make the overall design coherent.
Rather than a well-defined and somewhat rigid set of methods, field research techniques represent a loose collection of approaches united by a few simple assumptions or guiding principles (Wixon, Holtzblatt, and Knox, 1990). As a result, these methods combine well with other techniques to produce hybrids that are more vigorous than either of their parents. Below are some examples:
Larry Wood presents compelling examples of how the general approaches and assumptions of cognitive science combine well with field oriented methods. A key assumption in the development of Wood's method was that the people currently doing the work should be respected as experts at it. Thus, Wood used the insights of cognitive science into the nature of expertise and the organization of expert knowledge: expert knowledge is generally organized hierarchically; it is stored as "chunks" of patterns with associated procedures, and thus can be viewed as "object" knowledge and "process" knowledge; and, to a large extent, it is automatic or tacit and thus may be difficult for the expert to articulate.
Participatory design approaches like CARD and PICTIVE, with their focus on user work and on uncovering the thinking of users through concrete scenarios, spring from a shared set of assumptions and values, such as the primacy of users and their situated knowledge of work.
In their design work, Wixon's team used early field research methods such as artifact walkthroughs to produce scenarios, which then formed the basis for several iterations of prototypes.
In designing a medical system, Janette Coble reports how field research methods generated a detailed rating survey of 542 requirements. Conventional test design would recommend against a 542-item survey as too cumbersome. However, since the physicians were already part of the design process and since the survey was based on the results of previous interviews, responses were complete, with additional comments added.
As part of the Delta Method, Martin Rantzer used field-based interview results as a basis for user tasks and usability goals, which were then validated by the users.
The situations to which field research has been applied are quite diverse. Only over time will a body of practice develop on which new work can be based. Some examples are:
Susan Dray faced the challenge of conducting interviews in people's homes throughout Europe. She found that several techniques -- bringing dinner, focusing on the children, collecting artifacts, and sending the participants pictures from the interview -- served to establish excellent rapport during and after the interview despite language barriers.
When attempting traditional Contextual Inquiry interviews, Mary Beth Butler found that users were reluctant to allow investigators into their work environments. However, they were willing to bring samples of their work to Lotus and participate in a roundtable. This approach created better rapport, reduced costs, and allowed for more targeted interviews. The potential loss of context was minimized, since for many spreadsheet users their context is carried in their spreadsheet models.
In interviewing ultrasound technicians, Diane Brown and Judy Ramey found that much of their work was cognitive and not directly observable. Traditional methods such as thinking out loud could not be applied, since the work was conducted under time pressure. The alternative approach was to videotape the users as they worked and then conduct a follow-up interview based on the video. This approach, a specific instance of stimulated recall, worked well.
One of the biggest challenges is sharing the data gathered from field research. Often the data is not easily condensed or explained. The interview team understands the results, is convinced of their validity, and can see how to apply them, but success may depend on others who did not help collect the data. Stan Page reports that the interview teams were challenged with sharing the data across many design teams. They employed several novel methods, including: liaisons -- where a member of the interview team would work with another team on an ongoing basis; mentors -- to get a new team up and running quickly; model sharing -- where consolidated work models were put online and shared electronically; open houses -- where interview teams would invite other teams to share their results; bulletin boards -- where weekly progress would be posted and open discussion could take place; and design reviews -- where specifications would be handed off using detailed walkthroughs of the design documents.
Halgren and Bauersfeld report on three methods developed for different purposes. The condensed ethnographic interview was designed to provide an understanding of the users' work environment and culture. Passive video recording was used to sample the flow of work without the disruptive presence of an interviewer. The interactive feature conceptualization asked users to rate the importance of features and group them conceptually.
Diane Brown classifies field methods into three basic types. The first type has a narrow focus, is conducted quickly, and produces data within a limited range. The second type is oriented toward understanding work tasks as they are currently performed, with the goal of incorporating them into system design; it often results in incremental improvement of an existing system. The final type aims for a complete rethinking of system design and thus probes more deeply into the work to be supported and the design possibilities provided by technology. These broad categories provide a valuable way for the practitioner to plan a field research study and structure the results.
One of the criticisms of field-based research methods is that they lack a well-defined output (Cusumano and Selby, 1995). A general recommendation from several of the participants was that the data analysis and presentation of the results should be planned at the outset. Several of the workshop cases provide well-defined deliverables:
At Varian, design implications drawn from the site visits were fed directly into the change-control system. This allows recommendations to be systematically tracked, and it uses existing mechanisms to provide that tracking.
At Microsoft, Claris, and WordPerfect, the findings were entered into a database where they could be easily accessed and shared. Since the findings are based on user work or home life, they tend to have long-term applicability to design issues.
In Contextual Design, multiple representations are produced from the data. These include sequence models, which simply trace user actions; context models, which represent the organizational environment of work; physical models, which represent the physical work environment; flow models, which represent actions and interdependencies over time; and artifact models, which are used to guide the creation of new artifacts.
The problem of representation and hand-off can be avoided altogether if the entire design team and their management can actually experience user work. Several teams made videotapes that were then shown to those who could not be on the visit. Audio transcripts can serve a similar purpose. It's often best if other members of the team can take part in the visits themselves. However, success will depend on the team members' motivation and skill.
Wixon and Brown used a prototype to demonstrate how a system that reflected user work would look. Other methods such as Contextual Design or CARD and PICTIVE produce intermediate deliverables that can be easily structured into a design.
We expect field research to be more extensively used in product and tool development. The methods will become more standardized, widely shared, and better understood. Tools from ethnographic research (Miles and Huberman, 1994) may well be incorporated into design. The documentation of "success cases" such as the origins of WordPerfect or Quicken will provide an incentive for adoption. If the decade from 1985 to 1995 could be called "the decade of the usability lab," the decade from 1995 to 2005 may become "the decade of field research." Since we believe these methods in their many forms constitute some of the most effective ways to meet the twin goals of making technology better serve people and providing companies with profitable products and cost-saving tools, we welcome such growth.
Judy Ramey joined the faculty of the Department of Technical Communication, College of Engineering, University of Washington, in 1983. In 1989 she founded the UW Laboratory for Usability Testing and Evaluation (LUTE), and since then has served as its director. Through LUTE, she has conducted a number of corporate-sponsored research projects.
Dennis Wixon joined Digital Equipment Corporation in 1981. He currently is program manager for usability at Digital. He has been active in the CHI community and has published numerous papers on the interface design process.