Vol.28 No.4, October 1996 |
"Schedule, function, and cost: those are the tickets to a good evaluation. Usability is gravy, earns little and risks much." Speaking under the veil of anonymity, a software development manager puts forth this insightful, but not surprising, observation. Another manager volunteers his view: "Usability personnel have been regarded as a necessary evil ... designers seldom listen to their suggestions." Armed with powerful commentary such as this and results from a usability leadership assessment, a small team of skilled assessors can help even the most recalcitrant software development management team improve their software development process. With help, you can do the same for your organization.
The Plan for this SIG
In a SIG conducted at the Denver CHI '95 conference, participants were provided with a summary of findings from 53 assessments, shown the assessment process and background, asked to complete a self-assessment of their organization, and invited to discuss their self-assessment results.
For the past seven years, teams of senior usability/human factors professionals and highly experienced software developers have been evolving an assessment process and honing their evaluation skills through 53 usability leadership assessments in 28 different organizations worldwide. The teams use four inputs and two tools during each week-long assessment.
Table 1: Assessment Types and Locations

                                N. America   Europe   Japan
  Applications & Systems            35          3       4
  Microcode                          2          -       -
  Multi-location solutions           2          1       -
  Assisted Self-assessments          5          -       -
The majority of assessments were conducted in North America and focused on application software being developed by a single organization. Other assessments were conducted for multi-location products, microcode, and even assisted self-assessments. Additional assessments were conducted in Europe and Japan. Table 1 shows the distribution of assessment types and locations.
Table 2: Assessment Program Results

                                        Maturity Stage
                                    5     4     3     2     1
  Organization
    Awareness                       4    19    18     9     3
    Activities                     13    23    14     3     -
    Improvement actions             9    27    14     3     -
  Skills
    Character, vitality, impact     5    27    18     3     -
    Resources                       4    23    20     6     -
  Process
    Early/continual user focus     13    24    14     2     -
    Integrated design               8    26    17     2     -
    Early/continual user tests      3    20    22     8     -
    Iterative design                7    24    19     3     -
  Overall                           4    30    16     3     0
  -------------------------------------------------------------
  Participant rating              43%   35%   22%
A summary of the assessment program results is shown in Table 2: the distribution of ratings across the nine assessment categories, plus an overall rating, at each maturity stage.
These assessment results were of high interest to SIG participants. The greatest interest, however, came in the discussion of characteristics shared by the highest-rated organizations, along with comments from individuals in those organizations. Typical of those comments are:
=> "When you meet (executive name) in the hall he may ask about function or performance, but you will never meet him when he doesn't ask about usability."
=> "Management actions (usability) are visible and effective."
=> "Usability has been the guideline for every decision."
=> "The project was redirected a few times and fine-tuned many times for usability."
=> "This team showed more commitment to usability than any project I've worked on"
=> "End users were set up as design partners .... they had veto rights..."
=> "Every step of development, from functional design through implementation and beyond, has included direct user participation."
=> "Users are in the driver's seat, our management formed this project on that basis."
Noticeably absent is any mention of usability skills or of a usability process. While present in many organizations, and critical to their success, these did not appear to be determining characteristics of a higher usability management maturity level. The two most successful organizations, as defined by assessment results, had no explicit usability activities identified in their documented development process and no one specifically assigned to usability engineering. Rather, these organizations possessed an incredible "user-centered culture" that was at the core of all they did. They practiced good usability engineering out of organizational desire, not management or process fiat. Interestingly, both of these organizations decided, at the completion of the assessment, to assign a person as a usability focal point in an attempt to further improve usability effectiveness.
The self-assessment was facilitated by a booklet organized into the nine assessment categories, each containing category attributes and a rating scale. The categories, descriptions, and rating scale were similar to those used in the formal assessments described above. Each participant rated his/her organization in the booklet and on a worksheet, which was collected and summarized for the attendees. Because of the differences in the processes used, direct comparisons between the Usability Leadership assessments and the results of the SIG self-assessment (Table 3) should not be attempted. As in the assessments, overall ratings were very low, management understanding rated better than management actions and activities, HCI skills, impact, and resources ratings varied widely, and process-related categories fared better than the other areas assessed.
Table 3: SIG Self-Assessment

                                        Maturity Stage
                                    5     4     3     2     1
  Organization
    Awareness                      11    14    10     6     3
    Activities                     19    12     8     4     1
    Improvement actions            18    17     7     1     1
  Skills
    Character, vitality, impact     7     9    16    10     2
    Resources                      14    11     9     6     4
  Process
    Early/continual user focus      8    14    13     6     3
    Integrated design               9    21    11     2     1
    Early/continual user tests     10    12     9    10     3
    Iterative design                8    19    13     2     2
  Overall                          12    18    11     1     2
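The stage distributions in Tables 2 and 3 are simple tallies of individual ratings. For readers who want to produce the same kind of summary from their own self-assessment worksheets, the following minimal Python sketch shows one way to do it; the worksheet format, the sample ratings, and the summarize helper are illustrative assumptions, not part of the actual SIG materials.

    # Tally per-category self-assessment ratings into a stage
    # distribution like the rows of Table 3. All names and sample
    # data below are hypothetical illustrations, not SIG data.
    from collections import Counter

    CATEGORIES = [
        "Awareness", "Activities", "Improvement actions",
        "Character, vitality, impact", "Resources",
        "Early/continual user focus", "Integrated design",
        "Early/continual user tests", "Iterative design", "Overall",
    ]
    STAGES = [5, 4, 3, 2, 1]  # column order used in the tables

    def summarize(worksheets):
        """Each worksheet maps a category name to a stage rating (1-5)."""
        tallies = {c: Counter() for c in CATEGORIES}
        for sheet in worksheets:
            for category, stage in sheet.items():
                tallies[category][stage] += 1
        return tallies

    # Three made-up participant worksheets.
    sheets = [
        {"Awareness": 4, "Overall": 3},
        {"Awareness": 4, "Overall": 4},
        {"Awareness": 2, "Overall": 3},
    ]
    for category, counts in summarize(sheets).items():
        if counts:  # print only categories that received ratings
            row = "  ".join(str(counts.get(s, 0)) for s in STAGES)
            print(f"{category:30s} {row}")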
SIG participants were genuinely interested in the assessment results and open to the idea of organizing a formal Usability Leadership Assessment or sponsoring a self-assessment for their organization. Several noted that presentation of the assessment results alone could have a positive effect on executive management. However, most felt that there was greater potential value in the formal assessment because of the perceived objectivity of outside consultants.
George A. Flanagan
Senior Consultant
IBM Consulting Group
10508 Whitestone Road
Raleigh, NC 27615 USA
Telephone: +1-919-847-3954
Email: gaf@vnet.ibm.com
Mr. Flanagan is a senior consultant with the IBM Consulting Group's Application Development Effectiveness consulting practice. He has more than 30 years of software development experience. He has managed large, complex development projects for both mainframe and desktop platforms and has consulted with companies worldwide in the areas of application development process, software usability, and GUI design. He has been a frequent speaker at education seminars and industry conferences on these topics.