Communications of the ACM

Organizing Usability Work to Fit the Full Product Range

In May 1998, Microsoft's usability organization celebrated its 10th anniversary with a party at the Microsoft Museum. We welcome this opportunity to reflect on our growth and history. At Microsoft, usability work is performed by a group of roughly 100 people, including full-time employees, temporary employees, and contract staff. The staff includes 63 full-time usability engineers and 28 contract usability specialists, supported by five usability recruiters, 14 usability schedulers, two data entry clerks, and one gratuity coordinator. Usability staff are closely aligned with product organizations, sharing their successes and failures through a common reporting structure. This alignment is apparent in the goals of the usability organization, which are defined in terms of product success:

  • Develop good product designs by helping product teams to understand all relevant information about user behavior at each stage of the design process.
  • Measure the success of product designs so product teams will have an accurate measure of how well their product performs for the user.
  • Improve the design process by informing product teams about empirical methods and by creating ways to make these methods a routine part of design.

Following recent trends in large software companies, Microsoft's usability groups are embedded within product organizations. A central group serves the needs of some small product teams and also provides company-wide leadership, as well as general support for all of the usability groups through a staff of usability coordinators and technical staff. Usability coordinators and support staff reserve lab space, schedule and coordinate site visits, recruit participants fitting a very wide spectrum of profiles from all over the world, and manage the distribution of gratuities to participants. The lab technicians keep the laboratory tools and equipment in working order and at the state of the art. Together, the staff conducts an average of 74 studies per month, involving an average of 670 participants, as well as an average of 15 site visits per month involving an average of 57 site visit participants.

Product work is done by separate usability organizations in many different areas including children's products, Web products, hand-held devices, office products, and operating system platforms. Finally, there is a recently formed team of usability experts in Microsoft Research.

A Wide Product Range

Microsoft products are generally intended for a very wide diversity of customers. This range presents challenges different from those faced by usability engineers working on products with a narrowly focused target user group:

  • While Microsoft product teams have tried to understand their customers and to develop software that meets consumers' needs, we are always working with an unpredictable plurality of goals, tasks, and needs. This generality makes it extremely difficult to pin down the right set of product characteristics for any given task or user domain. Through home and business visits, Microsoft teams, often led by usability engineers, have watched users carry out their daily tasks and activities in order to better understand how to design software that integrates more neatly into consumers' lifestyles. If consumers cannot understand how to install, set up, and master products in this class, they will simply not use them, or will use them in a very limited manner.
  • Microsoft develops some software products to be components within other manufacturers' products. In these cases, such as hand-held computers or computers in car radios (AutoPC), manufacturers add value by designing their own hardware packages, and sometimes additional software functionality. Usability staff must work in a way that is specific about the Microsoft components of the product and generic about any additional components and the hardware. At times this may require working across corporate boundaries and with different corporate cultures toward a common usability goal.

Product Cycle

Usability fits within Microsoft's well-defined product cycle model as one of a number of ongoing, parallel, interrelated activities. Usability staff often visit customer sites, sometimes in collaboration with other product-team members. Usability engineers may write a usability specification, which is then incorporated into the overall product specification. Key factors in the specification process are the specification review, prototyping, and evaluation work carried out by the usability engineer on the product team. Performed properly, this work can completely reshape the specification prior to "freeze" (the point in the product cycle after which it is difficult to change the specification). However, the usability engineer's role is not limited to this time frame. Working with other product team members, the usability engineer works on design at various levels of specificity, including the following:

  • User experience
  • Conceptual organization
  • Task flow
  • Detailed design

Usability evaluation begins early and continues on an as-needed basis, sometimes on a cycle of less than two weeks. Prior to specifying the product, teams will perform contextual inquiry [1] with usability engineers. Early user interface evaluations may take the form of paper prototypes (see [5, 9, 10]), or user interface inspections, such as heuristic evaluation [8]. Later evaluations almost always involve users, either as test subjects (usability testing) or as co-inspectors (participatory heuristic evaluation, [6]). Usability testing is most often done to identify problems, rather than to evaluate a product or a design against quantitative goals. Small sample sizes are the general rule. However, if there is a usability problem that requires quantitative analysis (for example, comparison with previous benchmarks or competitors' products), then larger-scale tests are conducted with inferential statistical analysis.
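To illustrate the quantitative case in concrete terms: one common inferential approach to comparing task-completion times against a benchmark is Welch's two-sample t-test. The sketch below is purely illustrative; the data, sample sizes, and the choice of this particular test are our assumptions, not a description of any specific Microsoft study.

```python
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    na, nb = len(a), len(b)
    # variance() from the statistics module is the sample variance (n-1).
    se = sqrt(variance(a) / na + variance(b) / nb)
    return (mean(a) - mean(b)) / se

# Hypothetical task-completion times in seconds: the current design
# vs. a competitor benchmark. Real studies would use larger samples.
current   = [41, 38, 45, 52, 39, 47, 44, 40]
benchmark = [55, 49, 58, 61, 50, 57, 53, 60]

t = welch_t(current, benchmark)  # negative t: current design is faster
```

A t statistic well beyond the critical value for the relevant degrees of freedom (roughly ±2.1 at the 0.05 level for samples this size) would indicate a reliable difference; in practice such comparisons are reported with confidence intervals as well.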

Finally, usability engineers organize additional usability evaluations during beta testing. These can be benchmark studies [3], field studies, or longitudinal studies, often in the field [7]. One of these later procedures, called Beta Buddies, involves pairs of product team members visiting users' sites. These beta-testing activities are often the only opportunity for product team members to see users and their environments.

As is true in most usability organizations, much of the work is informal, and time is a critical factor. This is, of course, both a strength and a weakness. Care is taken, however, to provide a written record of the usability methods and findings, and these usability reports are used to share the usability engineer's recommendations. The reports are made available on a company-wide intranet site, which is widely used to review prior findings and design decisions so that previous work is not repeated, and to help improve the design decision process. At times studies are replicated using different products or user interface designs, leveraging our corporate usability resources to abstract broader design principles.


While the range of activities and methods used by the usability engineers at Microsoft is broad and varied, our ability to influence design effectively depends on when in the product life cycle we gather our user data and give feedback to the team. We educate teams to embrace user-centered design (UCD) as early as possible in the life cycle of the product. One obvious reason for doing this is borrowed from software engineering: the earlier a problem is identified and fixed, the lower the cost. This is true for usability problems as well as for other software problems, and it is an argument that makes sense to project managers, who are responsible for product budgets.

A second reason is that Microsoft teams must share the cost of customer support calls for their products, which comes out of their annual budgets. Including UCD work up front will offset charges to the team after the product ships, and so is a strong motivator to include usability as early as possible in the product cycle.

Finally, the number of usability activities that can be carried out early in the cycle is larger than those that can be done later in development, approaching beta. This means that the team members have more options in terms of how they collect user data to inform their design decisions, including many inexpensive research alternatives. All of these arguments combine nicely to make a strong case for using user-centered methods at the earliest possible point in the design of a product.

Naturally, there is still much more that could be done to ensure that every product at Microsoft is designed from a user-centered perspective. Still, it is gratifying to see management occasionally recommend user studies as new user interface ideas are considered. Recently, one large team formally stated its number-one goal as simplifying and improving the user experience with its product, and to this end cross-divisional usability activity has been incorporated into the team's daily working atmosphere. This may not seem like a radical shift in attitude, but consider the competitive environment in which most products are developed today, and the constant pressure to add features rather than simplify the product. It is an encouraging trend.


References

1. Beyer, H. and Holtzblatt, K. Contextual Design: Defining Customer-Centered Systems. Morgan Kaufmann, San Francisco, 1998.

2. Dumas, J.S. and Redish, J.C. A Practical Guide to Usability Testing. Ablex, Norwood, NJ, 1993.

3. Kulik, C.C. and Kulik, J.A. Effectiveness of computer-based instruction: An updated analysis. Comput. Hum. Behav. 7 (1991), 75–94.

4. Lieberman, D.A. Interactive video games for health promotion: Effects on knowledge, self-efficacy, social support, and health. In R.L. Street, Jr., W.R. Gold, and T. Manning, Eds., Health Promotion and Interactive Technology: Theoretical Applications and Future Directions. Lawrence Erlbaum, Mahwah, NJ, 1997.

5. Muller, M.J. Retrospective on a year of participatory design using the PICTIVE technique. In Proceedings of CHI'92 (Monterey, Calif. 1992).

6. Muller, M.J., McClard, A., Bell, B., Dooley, S., Meiskey, L., Meskill, J.A., Sparks, R., and Tellam, D. Validating an extension to participatory heuristic evaluation: Quality of work and quality of work life. In CHI'95 Conference Companion (Denver, CO 1995).

7. Nardi, B. Some reflections on scenarios. In J. Carroll, Ed. Scenario-based Design: Envisioning Work and Technology in System Development. Wiley, NY, 1995.

8. Nielsen, J. Scenarios in discount usability engineering. In J. Carroll, Ed., Scenario-Based Design: Envisioning Work and Technology in System Development. Wiley, NY, 1995.

9. Rettig, M. Prototyping for tiny fingers. Commun. ACM 37, 4 (Jun. 1994), 21–27.

10. Tudor, L.G., Muller, M.J., Dayton, T., and Root, R.W. A participatory design technique for high-level task analysis, critique, and redesign: The CARD method. In Proceedings of the Human Factors and Ergonomics Society 1993 Meeting (Seattle, Wash. Oct. 1993).


Michael Muller is a research scientist at Lotus Development Corporation in Cambridge, Mass., and previously worked as a usability manager at Microsoft.

Mary P. Czerwinski is a cognitive psychologist at Microsoft Research in Redmond, Wash.


©1999 ACM  0002-0782/99/0500  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 1999 ACM, Inc.

