SIGCHI Bulletin
Vol.28 No.4, October 1996

The Missing Link: Hypermedia Usability Research & The Web

Report on the British HCI Group Symposium, 1st May, 1996

Simon Buckingham Shum

Introduction

Since the early 1980s when the concepts of hypertext and hypermedia became buzzwords, researchers have been investigating the usability and usefulness of hypermedia across a wide spectrum of domains. Specialist conferences and journals were launched, and countless research papers published the results of theoretical analyses and empirical evaluations of hypermedia systems in use.

Then the World-Wide Web arrived. Hypermedia has gone global.

Suddenly everyone is a hypermedia designer, making assumptions and decisions about non-linear structuring, users' needs, and the use of different media to communicate.

The contributions to this symposium were submitted in response to a call which asked how existing hypermedia usability research and HCI knowledge can inform design for the Web; papers were sought which discussed these questions.

This symposium also had the advantage of being able to build on two preceding ACM workshops at Hypertext 96 and CHI 96. Intriguingly, these events were conceived completely independently of this symposium to address almost exactly the same problems, confirming a growing recognition of the need to forge closer links between the Web and existing knowledge of HCI and hypermedia design. Following these two workshops and the one reported here, a central Web HCI resource site has been set up at: http://www.acm.org/sigchi/webhci/

The symposium, held at The Open University's Knowledge Media Institute, was very well attended, with 49 delegates drawn from industry and academia, primarily in the UK, but also from Belgium, The Netherlands and Brazil. 15 position papers were submitted in response to a public call for papers, and reviewed by the programme committee, leading to the acceptance of seven in the final programme.

The morning was spent exploring how the Web relates to pre-Web hypermedia systems, and the afternoon then considered a number of proposed tools and methods for developing user-centred Web sites. During lunchtime delegates browsed and discussed several demonstrations and web sites (see end for list). The final session was spent in discussion (chaired by the author), picking up issues which had arisen during the day. Abstracts of all the papers are appended at the end of this report; the full position papers are available at the Missing Link Web site (see end).

The remainder of this report notes some of the main issues which arose during the course of the day, particularly in the final discussion period.

The Agenda for User-Centred Web R&D

Four questions were posed at the start of the day, and revisited in the final 90 minutes of open discussion:

  1. What do we know about hypermedia usability?
  2. How is the Web different from previous hypermedia systems?
  3. How can we make the biggest user-centred impact on the Web community?
  4. What should be on the hotlist for future user-centred Web R&D?

Obviously we were unable to give any of these the kind of detailed treatment they deserve---each is the potential subject of a whole article---but the main goal was to table ideas to guide the agenda for future work. This report presents my personal reflections on the day.

What Do We Know about Hypermedia Usability?

What do we know about...? is a dangerous question to ask in almost any field, since it has the nasty habit of revealing that even `generally accepted views' are rather less generally accepted than one assumed. In design particularly, the nature of the work is so open ended that declaring principles of any kind invites criticism.

HCI, a relatively young field, faces the tricky problem of trying to be a rigorous design discipline at the same time as taking into account the richness and complexity of human work. Efforts to codify any design principles necessarily trade depth against breadth, with the core qualifier being context---users, tasks, environment. "Well, it all depends" is not an uncommon response from HCI specialists when asked to comment on a design principle, and this became something of a theme at this symposium. This theme is returned to shortly in the context of communicating HCI design principles.

Pre-Web hypermedia research (roughly the equivalent of "B.C." in world history!) consistently identified user orientation and navigation as an issue, under the banner of "lost in hyperspace." How do you keep users from getting lost in the `liberating' net of links now on offer? The most popular interface techniques include: bird's-eye view maps/graphs, with footprints to show which nodes have been visited; `breadcrumbing' pages to leave a trail which can later be retraced; hotlists of valued pages; salient visual cues (e.g. red pages mean "advanced level"); always providing `go back' buttons (= Undo); always providing `go to top' buttons (= start again); using a metaphor to organise material (e.g. block of flats; library; the human body); and providing sequences of nodes which can only be traversed in a fixed order (`guided tours').

Jakob Nielsen (1995) and several of the presenters at this and the ACM workshops have documented which of the most popular features are currently used in Web browsers, as well as those not yet supported. But what is badly needed at present is data on actual user problems on the Web. Large scale surveys such as the 4th GVU Survey (Pitkow & Kehoe, 1995) are one resource for tracking broad trends and user demographics, but more detailed studies are needed to begin to understand the interaction between users, tasks and tools. Interestingly, in her position paper, Davies highlighted the GVU Survey finding that disorientation is not a significant problem for Web users.

In response to the question `What do we know...?', perhaps the most precise claim made at the symposium was in Smith & Newman's position paper, which notes that there is good evidence that search tasks are best supported by hierarchical hypertext structures, and browsing by network structures.

The impression I gained at the symposium was that we were generally aware, from hypertext research, of the problem of disorientation and of the variety of navigation mechanisms that can be deployed to assist users, and that richer node/link semantics are one feature sorely missing from most Web sites (see the Hotlist for Future User-Centred Web R&D section for discussion of solutions to this last point). A comprehensive analysis of the strength of evidence for these and other phenomena not listed here would be most useful, in order to set the Web in context. As the next section discusses, the Web does differ from earlier systems in some respects that matter for usability analysis.

How is the Web Different from Previous Hypermedia Systems?

The awe and wonder which the Web inspires in so many are due to its unprecedented scale, its simplicity, and its break from the boring ASCII text which dominated the net beforehand. However, we are now realising that we suspended our judgement in this honeymoon phase, and that things need to improve. The Web is now in a secondary phase, playing `catch-up' with the rest of the interactive world. For instance, in the non-Web world, we do not expect large static images that can only respond to our actions by loading a new version, but rather ones we can directly manipulate. Similarly, most of us neither want nor need to format documents by manually inserting markup codes; tools have evolved from such a mode of working to today's page-layout environments (but see Thimbleby's paper for a critique of how Web authoring tools need to move beyond the page-layout model of those currently on offer).

The symposium brought out how the Web is different from previous hypertext systems in a number of important respects, discussed below.

Speed

This is probably the issue that most people think of first. The Web is a global hypertext infrastructure, with all the bandwidth problems that one experiences and reads about every day. Performance can be excellent at certain times of the day, considering the scale involved, but clearly, the internet does not generally deliver the response times to which one is accustomed on standalone machines or smaller networks. User frustration is the order of the day, particularly when the media being downloaded move beyond text to graphics, audio and video (see also Johnson's (1996) analysis of time-based usability issues on the Web). Web designers must therefore prioritise different criteria from those they might use in designing a smaller scale hypertext or multimedia CD-ROM, in order to balance interactivity with acceptable speed of access. If a page takes too long, it won't be visited. Views vary between pessimism that adequate bandwidth will never arrive (the syndrome of opening a new motorway only to see it jam up immediately) and optimism that a brighter future awaits us, once we've installed fibre-optic...

Interactivity

The Web is currently impoverished in the interactivity it offers. Direct manipulation of objects and networks is only now beginning to be possible with the advent of such technologies as Macromedia's Shockwave plug-in (for Director animations) and Sun's Java and its offshoots. (In this context, two pieces of KMi work were demonstrated: Stadium, a Java-based application for large-scale telepresence, and the Virtual Microscope, which has trialled both Shockwave and QuickTime to provide interactivity previously only accessible on standalone machines.) Richer functionality of course means greater authoring complexity -- simple text and images are fine for publishing a wide variety of material, and this simplicity is the key to the Web's success to date. However, to use the Web in collaborative work such as design or learning, it needs to support the kinds of functionality that CSCW research has been wrestling with in non-Web systems for many years.

Commercialisation

In his position paper to the CHI 96 HCI & The Web workshop, Austin Henderson (Apple) made the following comment:

The big industrial efforts will struggle to retain control (...) so that they can force everyone through their turnstyle. They will do this by moving faster than everyone else. Which also means that the direction they move will not necessarily be well thought out; indeed it may tend to force things to be badly, but expediently defined. Deep changes (...) are not likely to be considered; you can't stay ahead and be thoughtful too.

This is a clear challenge for user-centred Web research---to engage in deeper reflection than the companies have time for. The Web is one of the fastest growing commercial markets, with companies striving to make sure that their products are the most widely used. This is both good and bad. New ideas stand a chance of being implemented as real products more quickly (though it is not clear how many of these are good ideas). Defenders of the net contend, however, that good ideas tend to float to the top and news of useful tools spreads quickly, so that the net is in some senses its own judge of what's worthwhile. The W3 Consortium has the important role of trying to maintain an even keel as the different companies strain to carve out a unique niche for themselves, but greater reflection invariably means slower movement. There seems to be no way to stop these two forces pulling in opposite directions, except that straying too far from the core lingua franca is presumably in no-one's interest. Two obvious examples of this tension are the extensions to HTML which Netscape introduced to enable tighter control over page appearance (and which probably gained them a major share of a market used to having such control over corporate image), and Netscape's Frames, which provide some useful new functionality, but in the process violate all user expectations about navigation (the dominant back/forward controls on Web browsers no longer control the material which is the main focus of the user's attention).

Variable End-user Environment

Never before has there been a hypertext system so large that no-one can be sure what hardware or software the end-users might be using. The user interface design community has had to get to grips with the concept of designing under this uncertainty. Graphic/typographic designers have had to jettison the basic assumption underpinning their training: that they have complete control over the placement and form of text and graphic elements. One never knows how big the user's window will be, whether they have tables or frames capability, or a particular plug-in, what speed of connection they have, or what colours will be available. Again, HTML standards are beginning to address some of this by allowing tighter control over document styles, but before the Web, it was not even an issue.

How To Make the Biggest User-Centred Impact on the Web Community?

Henderson (cited above) also made the following comment:

Suppose the workshop comes up with something that would help the world. How will we make it have some effect? (...) Can we figure a way to break the hegemony, and provide for the same kind of open-ended experimenting that has characterized the internet so far? Or can we figure a way to inject better ideas into the fast-track creation processes that are driving the standards now? (...) Indeed, the route to influence might interact with the thing invented, so possibly it could be a framing notion.

How does one make a splash in the ocean that is now the Web? Yes, everyone can publish, but that simply shifts the economics to those of attention---how do I get people to visit me and use my ideas? It's all very well to publish neat ideas in the research literature, but in the world of the Web, only one thing counts---whether it gets implemented and used widely. Henderson suggests above that the route we identify as having the greatest potential clout could, or even should, shape which ideas we invest resources in developing---there's no point developing a newer and better approach which will for some reason be incompatible with the existing user base.

A number of ideas were suggested in response to this challenge.

Quality Sites Will Speak for Themselves

If we build sites which put into practice the best user interfaces we can create, these will become `Sites of Excellence' that serve as design case studies for others. The challenge is a design education problem, and faces precisely the same difficulties as HCI design training: how to pass on the skills to contextualise usability principles to one's own design problem. Sites which put principles into practice could include supplementary "design masterclass" pages that go behind the scenes to discuss how the ideas came together, the kinds of trade-offs that had to be made, how it was/is being evaluated, and so forth.

User-centred Web Guidelines

UI guidelines have been around for a long time, and have well-known problems associated with them, deriving from their breadth at the expense of context-specific depth. However, this seems to be a very popular form in which to codify Web design knowledge at the moment, and it is complementary to the above suggestion of instantiating design principles as concrete examples rather than simply stating them in the abstract, as guidelines.

A Usability Standard Kitemark

Analogous to the British Standard kitemark that appears on some products, or the various `Top 5% site' awards that have been set up on the Web, this would be an award to show that a site had satisfied a number of basic user-centred requirements. The big issue of course is: what requirements, and specified by whom? Example requirements might be that the site had been evaluated in some form, or that a task analysis of some sort had been carried out, or that all images have their sizes marked to warn users about download time. Would a concerned subset of the HCI community be capable of producing such a standard?
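
The last requirement is simple to satisfy in today's HTML. As a rough illustration (a hypothetical fragment, not part of any proposed standard), an author can state the approximate file size of a large image next to the link that fetches it, and give inline images explicit dimensions so the page lays out before the graphics arrive:

    <!-- hypothetical example: warning users about download cost before they commit -->
    <A HREF="campus-map.gif">Campus map (GIF, approx. 250K)</A>
    <IMG SRC="map-thumbnail.gif" WIDTH="80" HEIGHT="60" ALT="Campus map thumbnail">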

New Features List

A list could be created of new usability-features that would be particularly useful to have in a Web-based work environment. A simple template for this might be:

Problem: users have the following problem under the following circumstances (preferably backed up by a link to hard evidence that this was indeed a problem)

Proposed solution: an idea of how this might be tackled---this might suggest immediate solutions using adaptations of existing tools, or longer term solutions (see next section).

Such a list has already been proposed following the Hypertext 96 workshop: `Opportunities for Improving the WWW' lists numerous user-centred extensions which could be made, and calls on interested parties to adopt and develop those that interest them (http://www.cs.bgsu.edu/hrweb/opportunities/).

Such a list could be a valuable resource for third party software developers who are often looking for a particularly useful utility or extension which will augment the functionality of standard, widely used tools. The shareware archives are full of such extensions. Research students might also find useful ideas in such a list for short term development and evaluation projects. Such a list could (i) be a forum for disseminating new design requirements and ideas which were backed up by user-centred analysis, and (ii) serve as a clearing house to link problems to solutions, which might in practice mean linking researchers to commercial developers.

Hotlist for Future User-Centred Web R&D...

A number of interesting topics arose in response to this question.

Long-term Solutions vs. Quick Fixes

Lots of new Web interface features have been, and will be, proposed. There's always something to fix (e.g. see the SmartBrowser site, which allows individuals to propose useful new functionality: http://www.smartbrowser.com/idea/). Sometimes, solutions can be provided in the short term by a workaround, or by use of existing functionality, although in the longer term, deeper changes (e.g. at the level of HTTP) may be required to implement them properly. Example: HTML currently lacks the facility, offered by older hypertext systems (e.g. OWL), for users to bring up a pop-up box which defines a term. One has to go to a different page to find such a definition, which is disruptive. However, the functional equivalent of a pop-up mechanism can be provided using current Netscape extensions, which allow the author to set up multiple frames in a given browser window, each of which is treated as essentially a separate browser, or alternatively, to direct output explicitly to another window. Thus, on clicking on the term, its definition can be shown in an adjacent area, without losing the main context and focus of attention. This is not an ideal solution (being restricted to one proprietary browser for a start), but until such functionality can be introduced at a deeper, more elegant level, it is still possible to provide the facility. It was felt that similar kinds of fix may make it possible to provide useful extensions to standard functionality. The `Opportunities for Improving the WWW' resource listed above also proposes a very useful template, which highlights the above and other issues which need to be considered when proposing new functionality.
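
As a rough sketch of how this workaround looks in practice (hypothetical markup, not taken from any of the symposium papers), the author declares a small auxiliary frame and directs glossary links into it using the TARGET attribute from Netscape's frame extensions:

    <!-- layout.html: a two-frame layout, main text above a small definitions pane -->
    <FRAMESET ROWS="80%,20%">
      <FRAME SRC="article.html" NAME="main">
      <FRAME SRC="blank.html" NAME="definitions">
    </FRAMESET>

    <!-- in article.html: clicking the term loads its definition into the
         small pane, leaving the surrounding text and scroll position intact -->
    The incoming wave undergoes
    <A HREF="glossary.html#refraction" TARGET="definitions">refraction</A>
    as it enters shallower water.

Using a fresh window name (or TARGET="_blank") instead of a frame name would send the definition to a separate browser window rather than an adjacent pane.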

Usability of Search Agents and Engines

The sheer size of the Web has pushed to the forefront technologies for managing and searching vast quantities of information. Various search engines such as Jazz and Excite offer more than simple indexing - how effective are these, and how easy are they to understand? (See Wired magazine, May 1996, for one review of the different approaches they take). Other solutions are based around intelligent agents which seek out information for you (e.g. work at MIT). Relevant issues here include: How easy are these to program and maintain? How do users manage large numbers of agents?

Towards the World Wide Knowledge Web

Several times, stimulated by Clibbon & Callaghan's paper, the issue of enriching the semantics of node and link types arose, which would then support much richer computation over Web structures, and filtering/visualisations from different perspectives (e.g. show me all demo links from wave refraction nodes). In several influential hypermedia research systems, system-interpretable node and link types played a central role: for analysts seeking to develop an appropriate schema to analyse a domain (a common use for pre-Web systems such as NoteCards and Aquanet), for mediating and recording argumentation (e.g. the gIBIS system - now available as QuestMap), or for structuring material according to different pedagogical models (e.g. the IDE system, and as described in Benyon et al's paper). At the Hypertext 96 workshop, it was emphasised that the HTML specification does in fact support the typing of links (Connolly, 1996), but this is little known and hence not used much or recognised by browsers yet. Newman and Smith described a novel feature in this respect---their system automatically displays different link types in different colours, making it easy to pick out relationships of interest by browsing the document.
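
As a small illustration of what such typing might look like in practice (a hypothetical fragment using the REL attribute that Connolly points to, with made-up type names), an author can label the relationship each link expresses, which a link-type-aware browser or indexing tool could then use for filtering, visualisation or colour-coding:

    <!-- hypothetical sketch: labelling link relationships with REL -->
    <A HREF="refraction-demo.html" REL="demo">Wave refraction demo</A>
    <A HREF="halasz88.html" REL="citation">Halasz (1988)</A>

    <!-- document-level relationships declared in the HEAD -->
    <LINK REL="glossary" HREF="glossary.html">
    <LINK REL="next" HREF="chapter2.html">

Current browsers simply ignore the attribute, which is precisely the point made at the Hypertext 96 workshop: the hooks exist in the specification, but are neither widely known nor widely exploited.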

Taking Links Seriously

This is related to the above. It was argued that we need to break away from the view that a Web page is a fixed structure defined by the author. Instead, greater emphasis should be placed on exploring the potential of computing virtual pages tailored to users' needs (as with typed nodes and links, this was one of Frank Halasz's (1988) Seven Issues for Hypertext, revisited at the 1991 conference). Smith & Newman pointed to their GENIE system as one which constructs pages flexibly in this way. Reference was also made to the Microcosm open hypertext system, which stores links in separate `linkbases', allowing arbitrary linking between any application (in Windows) at any granularity. A version of this is now available for the WWW (Carr et al, 1995).

We Need New Metaphors

Certain kinds of spatial metaphor came in for some criticism in Davies' paper, but it was suggested that they are not all necessarily inappropriate. There are many different kinds of geographical space (e.g. contrast a desert with a city), which might profitably be used to enrich Web sites with meaning that could assist navigation. One idea was that the relationship between sites might be expressed using a cogwheel metaphor, to capture the extent to which sites `meshed'. The PICS system (from W3C) was mentioned as one spatial metaphor-based example to look at. `Cyberspace architects' are increasingly exploring concepts from physical architecture as they seek ways to present large-scale virtual environments coherently (e.g. Benedikt, 1991). On this theme, Alexander's work on architectural patterns for cities ("A City is not a Tree") might hold promise for web site structures (the Royal Society of Arts Web pages follow a lattice structure suggested by Alexander).

Content is More Important than Usability?

Let's face it, useful sites sometimes get used despite poor usability---content and style win. Of course, content and style should not be set up in opposition to usability! But as noted earlier, the Web reflects very closely the style-driven, commercially-driven priorities with which we're familiar from other glossy broadcast and print media. This is what people are tuned to, and what looks cool or shows off the latest plug-in will often draw in the crowds. This is the trend in lightweight surfing, anyway. But perhaps work sites, which one visits regularly to do serious business, are the ones where usability issues matter more. Combining user-centred design criteria with excellence in multimedia production should be the name of the game. The Highfive site (www.highfive.com), which awards (one person's view of) excellent Web design, is an interesting case in point. It critiques the quality of interaction offered by sites at a fine grain of detail without explicitly mentioning usability, although usability is implicitly among the criteria. This would be a useful site for HCI specialists to visit, to see to what extent the judgements made accord with their own. It should be said that Highfive focuses primarily on commercial sites which you'll browse around. Which brings us to the next question...

What is Browsing? (or: I'm an HCI Analyst -- What's the Task?)

User-centred design depends on understanding the work of the user. The task is the unit of currency for many HCI design approaches. But what's the task for Web users? Obviously, tasks vary, but browsing is the primary mode in which most Web users work. The browser interfaces, and the network structure of undifferentiated nodes and links, support this kind of navigation. "Browsing" sounds terribly vague and fuzzy as a task, and some suggested that this made it very hard for user-centred design. However, it was suggested that we should not simply doom "browsing" to be a mode of working which we cannot seriously analyse or better understand. It may be possible to break it down into generic subtasks which can then be supported more specifically (preliminary work on this is under way at Middlesex University). But browsing is (hopefully) not the be-all and end-all of the Web. What about other uses of it? Perhaps...

We Shouldn't Talk about `The Web'; It's About Tools to Support Tasks

The Web is increasingly being integrated with the tools that we already use---it's just another network. As the initial wonder of being able to browse around the planet fades, it will be all too clear that the user-centred task is the same as ever---understand work practices (cognitive and social) and integrate new tools into these practices. It's just that some of these tools will utilise the Web, with all its strengths and limitations. The HECTOR system described in Picking's paper reflected this perspective.

A Task Model for Web-User Interaction

Towards the end of the discussion, a particularly promising theme began to emerge. Clearly, it is always possible to think of new interface fix-its. But the hardest thing is to teach awareness of the interactional levels at which usability can be designed. I stated at the beginning of the symposium that by "usability", we weren't just talking about "choosing the right screen colours." Instead, I proposed a working definition of "supporting users by providing the right information and tools in the right form at the right time." It's about providing appropriate, accessible functionality which serves the current task at hand. Obviously, screen colours can make a screen illegible, but this is only one of several levels at which one should consider usability.

As HCI researchers discussed the different levels of interaction at the symposium, one software developer from industry asked directly for guidance on what the different levels of interaction were that should be considered in developing a Web application. My feeling is that the need for this kind of guidance is widespread, and that the Web, as the fastest growing interactive system in the world, offers a golden opportunity for HCI to make a difference. HCI has a solid grounding which should be exploited to serve Web designers in thinking about what a user's goals and tasks are, and how these are to be accomplished. Scenario-based design offers a strategy for imagining future interactions in rich detail. Work on task analysis emphasises that goals and tasks vary in their importance and level of abstraction, from `get the new upgrade' to `find the right directory' to `click the right button'. At each of these levels, the Web interface can support or impede the user. As indicated above, as the Web becomes an everyday extension to existing tools, all the disciplines of HCI/CSCW need to be applied, as with any other work technology.

So, a Challenge Now Faces the HCI Community...

We need to understand our market, why certain innovations catch on, and who has the most influence on the uptake of new ideas. We need to deliver design support in appropriately packaged, tailored forms, which themselves are examples of best practice. Somebody should be tailoring existing user-centred design methods for the Web design culture. This could be done for free by willing researchers, or commercially by HCI consultants. Depends on who gets there first.

Abstracts for Symposium Position Papers

The abstracts are collated here for convenience. The full position papers are accessible from the symposium's Final Programme (http://kmi.open.ac.uk/~simonb/missing-link/ml-prog.html).

The Web - Hyperspace, Hypermedia or Just Hyped?

Clare Davies

International Institute for Electronic Library Research, De Montfort University, Hammerwood Gate, Kents Hill, Milton Keynes MK7 6HP, U.K. E-mail: cdavies@dmu.ac.uk

Is the Web hypermedia? The term implies more than hypertext: it implies the use of multimedia such as images, video, and sound. While most Web sites include some graphics, these are often merely as text illustrations or colourful formatting techniques (such as multi-coloured bullet points): few sites have really exploited graphics, sound or video to a useful extent. We may be on the verge of a Java-inspired multimedia explosion; again, we may not, as the Web becomes less like a superhighway and more like a rush-hour traffic jam in which additional obstructions will be especially unwelcome.

Is the Web even `true' hypertext? Most literature on hypertext and hypermedia systems has focused on systems for teaching and learning. Such systems tend to contain only selected material, rather than letting users roam freely across information created by anyone anywhere. They attach great importance to semantically appropriate links between text passages, whereas links between Web pages range from the functional (e.g. `mailto') to the useless (deleted pages, trivial information, personal `home' pages containing nothing but a person's name).

This paper will examine the assumptions behind viewing the Web as a hypermedia system (or a set of linked hypermedia systems), given what we already know of Web users' problems, and will describe a small-scale survey of Web users within two contrasting academic establishments. The survey, currently being planned and piloted, will attempt to elicit not only the problems experienced by Web users at various experience levels, but also the importance of different problems to their effective use of the Web. Problems drawn from the hypermedia literature will be included, alongside network and browser issues.

Beyond Halasz's Hypertext Research Agenda - The WWW?

Kelvin Clibbon and Mike Callaghan

LUTCHI Research Centre, Department of Computer Studies, Loughborough University, Loughborough, LE11 3TU UK E-mail: K.C.Clibbon@lut.ac.uk

Department of Computer Science, De Montfort University, Leicester, LE1 9BH UK. E-mail: jmc@dmu.ac.uk

The emergence of the World Wide Web (WWW) over the past three years has provided extraordinary challenges, and unprecedented opportunities, for the hypertext research community. Hypertext research, at the birth of the WWW, could indeed have been described itself as a rich "web" of ideas, developing freely in many directions, with a variety of paradigms, a strong degree of openness and a distinct lack of dogma. It was characterised by a productive mix of theory and implementation, and a strong desire to fully explore the foundations of hypertext, which remained a rather elusive concept. It could equally be said that this very openness was a disadvantage, leading to a proliferation of incompatible research prototypes and working systems, and preventing the synergistic effects of a more focused community effort.

In a few short years, it would be fair to say that hypertext research has been turned inside out by the explosion of interest and activity based around the WWW. We have even reached the situation where the significance of any piece of hypertext research is measured by some in terms of its actual or potential application in the WWW environment. In any period of such revolutionary change, it can be expected that there will be casualties. This paper examines the consequences of the shift in direction in hypertext activity, using the well-established criteria arising out of Halasz's research agenda (Halasz, 1988,1991) as a foil against which to assess the WWW version of hypertext. The current status of each agenda item is discussed from a hypertext perspective and in terms of its perceived importance in research based around the World Wide Web. In the course of this examination, some attention is necessarily focused on the hypertext model which underlies the WWW, and the significance of this in determining the success of the WWW (and the importance of `usability' in this), and in generating a whole new field of WWW-related hypertext research dedicated to overcoming some of its weaknesses. An important issue raised is the lack of fundamental support in the model for knowledge-based models of hypertext (Clibbon, 1995a).

Applying Usability Research to the Web:
Virtual Hypermedia Domains and Virtual Search Hierarchies

Pauline Smith and Ian Newman

Dept. of Computer Studies, Loughborough University, Loughborough, Leics LE11 3TU. E-mail: P.A.Smith@lut.ac.uk

This paper presents a framework in which existing hypermedia usability research can be applied to the World Wide Web (WWW or Web) to assist individuals to find the information they need and to help Web authors present their information more effectively. The paper first examines the relationship between the assumptions underlying usability research and the way in which the Web is built, maintained and used in practice. It is concluded that the Web cannot realistically be considered as a single hypermedia structure to which usability research can directly be applied. Nonetheless, it is argued that, because many people perceive the Web to be a major information source for the future, it is essential that the Web be made more accessible/usable (i.e. that the results obtained from the usability research be applied in so far as this is appropriate). One possible way in which this could be done is presented using, as a case study, a system which has been developed to provide information for Global Environmental Change researchers and which uses the Web as a delivery mechanism.

Web in a Spin

Richard Picking

School of Computing, Staffordshire University, Stafford ST18 0DG. E-mail: r.picking@soc.staffs.ac.uk

In his seminal survey of hypertext, Conklin wrote:

... hypertext, far from being an end in itself, is just a first crude step toward the time when the computer is a direct and powerful extension of the human mind, just as Vannevar Bush envisioned when he introduced his Memex four decades ago. Conklin, 1987 (40)

The present author suggests that World Wide Web (WWW) interfaces are still taking this "first crude step", even after almost another decade. A number of usability problems are evident, but in particular, attention is drawn to the following criticisms:

  1. Web interfaces provide inadequate tools to support authoring.
  2. Web interfaces do not encourage the semantic organisation of work.
  3. The WWW does not integrate sufficiently with other media.

These problems are expanded on and a way forward, in the form of an ongoing hypermedia project, will briefly be described.

A Link-oriented Tool for Evaluation of Hyperdocuments

Renata Pontin de Mattos Fortes(1) and Alvaro Garcia Neto (2)

1 Departamento de Ciências de Computação e Estatística, Instituto de Ciências Matemáticas de São Carlos USP - Brasil. E-mail: renata@icmsc.sc.usp.br

2 Departamento de Física e Informática, Instituto de Física de São Carlos USP - Brasil. E-mail: alvaro@uspfsc.ifq.sc.usp.br

This paper presents a framework to peruse the hyperdocument structure. This framework has been used to evaluate the ICMSC WWW server, which aims to help hyperdocument authors. This evaluation framework is part of a wider project, which intends to investigate and to propose ways to support the authoring process for improving hyperdocument quality.

Systematic Web Authoring

Harold Thimbleby

Middlesex University, Bounds Green Road, London, N11 2NQ, UK. E-mail: harold@mdx.ac.uk

GenTL is a tool for authoring large multi-page, evolving web documents. It provides many features to manage the development of documents that are revised and change structure over time; it provides features such as `reminders' so that it, rather than the author, can keep track of the multiple development threads that arise. It provides features for accurate "what's new" pages. It provides many diagnostic features. And so on. This paper explains why tools like GenTL are necessary, rather than optional. It also shows that helping the author also helps the reader of the document, and that many "reader" features are mirrors of "author" features. GenTL is also a research tool: it can be used in conjunction with Mathematica to evaluate or analyse hypertext document design and its use.

A Student-Centred Approach to Networked Multimedia Courseware

David Benyon, Simon Holland, Debbie Stone and Mark Woodroffe

Department of Computing, The Open University, Milton Keynes, MK7 6AA, UK; E-mail: D.R.Benyon@open.ac.uk

The demand for high quality, distance learning instructional material is increasing. In the context of the UK's Open University this means providing courses which can be taken from anywhere in the world. Many of our students change location whilst taking the course and providing remote access to material is, thus, an important consideration. In this paper we describe our approach to developing a hypermedia version of our course in human-computer interaction, M867 User Interface Design and Development. The course will be implemented using the Hyper-G internet server (Byte, 1995; Maurer, 1996) which provides a number of facilities for organising and structuring HTML material.

References

Benedikt, M. (Ed.) (1991)
Cyberspace: First Steps, The MIT Press: Cambridge, MA.
Carr L, De Roure D, Hall W, Hill G (1995)
The Distributed Link Service: A Tool for Publishers, Authors and Readers, The Web Revolution: Fourth International World Wide Web Conference, December 11-14, 1995, Boston, Massachusetts, USA: http://www.w3.org/pub/Conferences/WWW4/Papers/178/
Connolly, D. (1996)
An Evaluation of the WWW as a Platform for Electronic Commerce. Position Paper at Hypermedia Research & The WWW, Workshop at ACM Hypertext 96: http://www.cs.bgsu.edu/hrweb/papers/connolly.html
Halasz, F. G. (1988)
Reflections on Notecards: Seven Issues for the Next Generation of Hypermedia Systems, Communications of the ACM, 31, 836-852. Updated in: Seven Issues: Revisited, Keynote Address, Hypertext'91 Conference, San Antonio, Texas (December 18, 1991): http://www.parc.xerox.com/spl/projects/halasz-keynote/
Heller, H. and Rivers, D. (1996)
So you wanna design for the Web. ACM Interactions, March 1996, pp. 19-23.
Henderson, A. (1996)
Position Paper, CHI 96 Workshop: HCI and the Web: http://www.acm.org/sigchi/webhci/chi96workshop/papers/henderson.html
Johnson C. (1996)
Time Travel On The Web: http://www.dcs.gla.ac.uk/~johnson/talk.html
Nielsen, J. (1995)
Features for the Next Generation of Web Browsers. SunSoft's Alert Box Column, July 1995. http://www.sun.com/950701/columns/alertbox/
Pitkow, J. and Kehoe, C. (1995)
GVU's 4th WWW User Survey. http://www.cc.gatech.edu/gvu/user_surveys/survey-10-1995/
The Royal Society of Arts Web site
(organised using a lattice structure taken from the architect Alexander): http://www.cs.mdx.ac.uk/rsa/

Web Resources

Symposium papers, this report, and other Web usability resources are at:

http://kmi.open.ac.uk/~simonb/missing-link/

Related workshops to this one:

Hypermedia Research & The WWW, Workshop at ACM Hypertext 96, March, 1996. http://www.cs.bgsu.edu/hrweb/report.html

in particular, check out:

Opportunities for Improving the WWW---a resource to inspire user-centred Web developers to extend Web functionality in useful ways: http://www.cs.bgsu.edu/hrweb/opportunities/
HCI & The WWW, Workshop at ACM CHI 96, April, 1996; SIGCHI Bulletin Vol 28, No. 4, October 1996; http://www.cs.bgsu.edu/usable-web/chi96/

Following this series of workshops on informing Web design with HCI/Hypermedia research, a new resource has been created by Keith Instone which documents the above events, and will track ongoing developments in Web Usability:

HCI & The Web: http://www.acm.org/sigchi/webhci/

The `Highfive' site awards "excellence in web design" -- interesting criteria for HCI professionals to consider:

http://www.highfive.com/

The `Microcosm' system offers one route to more flexible linking possibilities:

http://bedrock.ecs.soton.ac.uk/

A special issue of the International Journal of Human-Computer Studies is currently in preparation, focusing on HCI and The Web.

Author's Address

Knowledge Media Institute
The Open University
Milton Keynes
MK7 6AA, U.K.
Email: S.Buckingham.Shum@open.ac.uk
WWW: http://kmi.open.ac.uk/~simonb
