SIGCHI Bulletin
Vol.30 No.1, January 1998

Time and the Web

Alan Dix

Graphics, information, waiting, ... java, multimedia, waiting, ...
hypertext, waiting, ... global networking, waiting, ... waiting, ... waiting ...

"Time and the Web" was a workshop of the British HCI Group held on 19th June 1997 at the Octagon, Staffordshire University. The organizers were Dave Clarke, Devina Ramduny, Dave Trepess and myself. I'll try to give a flavor of the day in this report, but for the full papers see the web site:

http://www.hiraeth.com/web97/

For me this workshop brought together two fascinating areas in HCI, first of all the study of temporal issues of user interaction, a focus of my own research for many years, and second the web which has affected us all dramatically.

Modern user interface paradigms depend on direct manipulation, rapid response and immediate semantic feedback. Up until the early 80s, response time was a recognized problem. But, with the advent of personal computing and graphical interfaces, user interface designers have often assumed that machines will be fast enough. Response delays were no longer seen as an interesting problem; ever faster computers would make them go away. I have previously called this assumption the "myth of the infinitely fast machine" [1].

The web has given the lie to this assumption -- exponential growth in traffic has led to ever-increasing network delays and bottlenecks at over-used servers. Even if we imagine that network capacity could overtake growth in usage, we are ultimately faced with the fundamental limitation of the speed of light. Delays are here to stay.

All this has highlighted the role of temporal issues in human-computer interaction.

The workshop began to address these issues, building on the growing interest in this area, in particular the popular workshop on Temporal Aspects of Usability held in Glasgow in 1995 [2], and recent meetings on Hypermedia Usability [3] and on user interfaces and CSCW for the web [4].

The day was split into three main parts:

  1. Setting the agenda: studies and issues -- with studies and analysis defining and establishing problems for time and the web.
  2. Attacking the problem: theory and mechanisms -- with papers more concerned with potential solutions.
  3. A panel discussion, led by Richard Bentley (Rank Xerox) and Nigel Birch (EPSRC).

Of course, as with all such divisions, the rationale was as much governed by the timetable as by the content, but in general the day did move from problem statement, through theories, to partial solutions.

Setting The Agenda: Studies And Issues

Chair: Dave Trepess

The Use of Critical Parameters in the Design of Web-based Interactive Systems

William Newman
Rank Xerox Research Centre, 61 Regent Street, Cambridge, UK
newman@cambridge.rxrc.xerox.com

Critical parameters are performance measures which by common agreement can be used as a basis of assessment. For example, adverts for cars may quote fuel consumption at 70 km/hour or even drag coefficient; countries are often discussed in terms of their GNP, population size or land area. Such critical parameters cannot capture the full subtlety of design trade-offs, but they do allow the design space to be represented more succinctly and hence more tractably.

William gave examples of the use of critical parameters in applications, including airline reservation, medical record-keeping and calendar maintenance. He discussed the potential effect of porting applications to the Web in terms of critical parameters. He did not attempt to define the full set of critical parameters for the web, thus leaving a challenge to the workshop participants.

Some critical temporal parameters at a low-level are obvious: network bandwidth and latency. Those familiar with my own work will know that I would say that often the most critical temporal parameter is pace, that is, the rate at which users can act and expect to receive some response to their actions [5].

Is Time out to be the Big Issue?

Anthony Byrne and Richard Picking
Staffordshire University, Beaconside, Stafford, UK
R.Picking@soc.staffs.ac.uk

This paper reported the results of a web-based questionnaire, created from a large initial set of candidate evaluation criteria using sorting techniques. The questionnaire was made available to the web community, enabling users to give their views on the usability of four nominated web sites after browsing each one. The survey data were analyzed and the results interpreted to place the perceived importance of delay in context with other usability issues in web environments. The survey aimed to establish the subjective views of users performing browsing activities, rather than to measure users performing allocated tasks.

The sorting exercise ascertained that delay was regarded as important by users, although issues of navigation and web page design were still dominant. The survey also revealed that delays provoked criticism from users, although delays appeared to be acceptable for local web sites, where download times are faster.

I found it interesting to contrast these results with a similar small-scale study discussed in a short paper at HCI'97, "Heuristic Evaluation of Web Site Usability" by Jones and Hewitt [6]. In their paper delays were found not to be important. Perhaps this reflects the difference in evaluation styles (user questionnaire vs. heuristic evaluation), or perhaps simply the different web sites used in the evaluations. Whichever is the case, it clearly demonstrates the difficulty of even quantifying the problem of web delays.

Compensatory Actions for Time Delays

Barbara McManus
Department of Computing, University of Central Lancashire, Preston, UK
b.mcmanus@uclan.ac.uk

This paper described observations of students engaged in web-design assignments during a period of poor network performance. Users were observed employing various types of "compensatory actions", that is, techniques which alleviate the effects of slow response times. For example, some users kept several browser windows open so that they could work in one window whilst other windows were loading; some expert users extended the browser's cache size; and other users simply avoided graphics-rich sites. The type of compensatory action depended on the expertise of the user (expert or novice) and the kind of task (directed or exploratory).

In my own work, I have observed users adopting similar techniques to deal with a variety of time-based problems and called these "coping strategies". At first these often start as breakdown situations where the user explicitly acts to compensate for delays or unexpected behavior. Later the actions become automatic and users are often unaware that they are using them, but of course the additional cognitive and physical loads remain.

Temporal Usability and Disturbance Management in Interaction

Helen Parker
Computing Research Centre, Sheffield Hallam University, Sheffield, UK
h.parker@shu.ac.uk

This paper was based on Helen's extensive review of work on temporal issues within the HCI and Psychology literature. She focused on two definitions of "just right" timing in the user interface:

  1. The timing of behavior conforms to users' expectations based on prior experience or current status information.
  2. The experienced user never has to devote conscious attention-directed awareness to the timing of interface behavior.

The first definition emphasizes the importance of "temporal affordances": ways of making the user aware of likely and on-going delays. In a networked system like the web this may mean deliberately not being location transparent -- Helen talked of "spatial" navigational aids to aid temporal awareness.
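
A progress indicator with a time estimate is perhaps the simplest temporal affordance. As a rough sketch of the idea (my own illustration, not from Helen's paper), in Python:

    def estimate_remaining(bytes_done, bytes_total, elapsed_s):
        """Estimate seconds remaining, assuming the average
        transfer rate so far continues."""
        if bytes_done <= 0 or elapsed_s <= 0:
            return None  # no rate information yet: better "unknown" than a guess
        rate = bytes_done / elapsed_s              # bytes per second so far
        return (bytes_total - bytes_done) / rate

    # e.g. 40 KB of a 200 KB page fetched in 8 seconds:
    print(estimate_remaining(40000, 200000, 8.0))  # -> 32.0 seconds to go

Crude as it is, such an estimate tells the user how long the on-going delay is likely to last, which is precisely what the first definition asks for.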

The second definition emphasizes the importance of "disturbance management" techniques whereby delays in one activity can be filled with another (as in the case of multiple browser windows as observed by Barbara) and whereby the original activity can be resumed with minimal effort.

Attacking the Problem: Theory and Mechanisms

Chair: Devina Ramduny

What's the Web Worth? The Impact of Retrieval Delays on the Value of Distributed Information

Chris Johnson
Department of Computing Science, University of Glasgow, UK
johnson@dcs.glasgow.ac.uk

Linking ideas from different fields often leads to new and powerful insight. Chris has taken aspects of utility theory, an important branch of economics, and applied them to the way retrieval delays affect users' perceptions of the value of information. This can both give potential measures of these effects and also suggest design directions, for example helping users estimate the utility and retrieval cost (delay) of as yet unseen information.
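
The flavor of the approach can be captured in a toy model (my own sketch, deliberately simpler than Chris's actual analysis): treat waiting as a cost set against the expected utility of a page, and retrieve only when the expected net value is positive.

    def net_value(expected_utility, expected_delay_s, cost_per_second):
        """Expected benefit of retrieving a page, less the cost of waiting.
        A toy linear model: real treatments of marginal utility are subtler."""
        return expected_utility - cost_per_second * expected_delay_s

    def worth_fetching(expected_utility, expected_delay_s, cost_per_second=1.0):
        return net_value(expected_utility, expected_delay_s, cost_per_second) > 0

    # A page thought to be worth 30 "units" behind a 45 second download
    # is not worth fetching if waiting costs 1 unit per second:
    print(worth_fetching(30, 45))   # -> False

Design aids that expose estimated delay and estimated utility before a link is followed would let the user make this trade-off explicitly.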

Chris' work in this area is expanded further in his HCI '97 paper "The impact of marginal utility and time on distributed information retrieval" [7]. Also there are parallels in Grudin's use of cost-benefit analysis for discussing CSCW success factors [8] which I have used myself in assessing the success of the web as a CSCW infrastructure [9]. Another similar approach is the Xerox PARC work on "information foraging theory" which takes an ecological rather than economic metaphor [10]. A problem that still has to be addressed by both the economic and ecological models is how to accommodate the rather different and strange behavior of information compared to real solid food and goods.

An Adaptive Caching and Replication Mechanism for WWW

Cristian Ionitoiu
Computer Science Department, University of Warwick, Coventry, UK
chrisi@dcs.warwick.ac.uk

Maria Angi
Computer Science Department, "Politehnica" University of Timisoara, 2 Vasile Parvan, 1900 Timisoara, Romania
maria@cs.utt.ro

One of the few redeeming features of web browsers when dealing with slow networks or servers is that they use caching -- local copies of recently visited pages. So, although the first visit to a page may take some time, subsequent reloads need only read the copy from your own disk.

Typically caching takes place at two levels. First, the browser itself keeps copies on your local disk. Second, the browser may use a "proxy", that is, it accesses pages through an intermediate machine. Proxy servers can allocate much more space to caching. Also, because the proxy is used by many web clients, there is a good chance the page you want has recently been accessed by another user and is in the proxy's cache.

Cristian and Maria described a caching mechanism based on more levels of proxy-like servers. They propose that the hierarchy of proxies should follow the DNS hierarchy given by the domain name of the machine. For example, my local machine might initially access a "soc.staffs.ac.uk" proxy, which itself may ask for pages from a "staffs.ac.uk" server, then an "ac.uk" server, and so on. The advantage of this structure is that users within the same domain name grouping are likely to have similar access requirements.
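
As a rough sketch (my own illustration of the idea, not the authors' implementation), the chain of caches to consult can be read straight off the client's domain name:

    def proxy_chain(hostname):
        """Candidate cache levels for a host, most specific first, e.g.
        "pc1.soc.staffs.ac.uk" -> ["soc.staffs.ac.uk", "staffs.ac.uk", "ac.uk"].
        A miss at one level is passed up to the next; only the top
        level fetches from the origin server."""
        labels = hostname.split(".")
        # drop the host itself and stop before the bare top-level domain
        return [".".join(labels[i:]) for i in range(1, len(labels) - 1)]

    print(proxy_chain("pc1.soc.staffs.ac.uk"))
    # -> ['soc.staffs.ac.uk', 'staffs.ac.uk', 'ac.uk']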

Quality of Service Requirements for Multimedia Communications

Xinping Guo, Colin Pattinson
School of Computing, Leeds Metropolitan University, The Grange, Beckett Park, Leeds, UK
{X.Guo, C.Pattinson}@lmu.ac.uk

Different media demand different levels of timeliness and accuracy. If a set of company accounts is being transmitted, it must be accurate (no wrong figures), but a delay of a few seconds mid-transmission is no real problem. In contrast, a delay of even a few hundred milliseconds in the middle of an orchestral performance would not be acceptable, although an occasional loss of sound quality may be. Even within the same medium there are different demands, say for video-conferencing as compared with television broadcasting, or speech transmission compared with music.

These complex user-level demands give rise to the concept of Quality of Service (QoS) at the network level. The paper particularly addressed translation of QoS demands between levels, starting at the user level (Perceptual QoS) which leads to application QoS demands upon the lower levels of communication software, such as the TCP/IP stack used by the Internet, and finally the QoS demands upon the actual physical networks.
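
To give a feel for what a perceptual QoS profile might look like, here is a small sketch (the structure and every figure in it are my own illustrative guesses, not taken from the paper):

    # Illustrative perceptual QoS profiles: delay bounds and tolerable
    # loss per medium. All figures are ballpark guesses for illustration.
    QOS_PROFILES = {
        "file_transfer": {"max_delay_ms": None, "max_loss": 0.0},   # accuracy vital, delay harmless
        "music":         {"max_delay_ms": 200,  "max_loss": 0.001}, # timing vital, tiny loss tolerable
        "speech":        {"max_delay_ms": 400,  "max_loss": 0.02},  # conversational turn-taking
        "video_conf":    {"max_delay_ms": 300,  "max_loss": 0.01},
    }

    def acceptable(medium, measured_delay_ms, measured_loss):
        """Does a measured network path meet the medium's perceptual demands?"""
        p = QOS_PROFILES[medium]
        delay_ok = p["max_delay_ms"] is None or measured_delay_ms <= p["max_delay_ms"]
        return delay_ok and measured_loss <= p["max_loss"]

    print(acceptable("speech", 250, 0.01))   # -> True
    print(acceptable("music", 250, 0.0))     # -> False: too late for live music

Translating such profiles downwards then means choosing protocol and network parameters (buffer sizes, retransmission policies, reserved bandwidth) that keep the measured path within the stated bounds.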

At present the web protocol (HTTP) does not support such levels of QoS. Indeed, during discussion it was noted that even different web media types, such as text and images, should be treated differently for caching purposes (arguably the use of progressive images makes some moves in this direction). Systems on the web sending other media use lower-level Internet protocols (such as UDP), although for home use many Internet service providers optimize their dial-up connections in ways which conflict with UDP. The next generation of low-level Internet protocols will have better provision for specifying QoS, but to use this to best advantage, clear models of the required behavior at the user level are essential.

Panel-Led Discussion

Chair: Alan Dix

Panelists:

Richard Bentley
Rank Xerox Research Cambridge, UK.
bentley@cambridge.rxrc.xerox.com

Nigel Birch
Human Factors, IT & Computer Science Programme, EPSRC, Swindon, UK
nhb1@wpo.epsrc.ac.uk

Richard was one of the chief architects of GMD's BSCW (Basic Support for Cooperative Work) Shared Workspace system, one of the best known and most widely used CSCW systems on the web. He gave a brief overview of BSCW and then used the experiences from the project to suggest some general lessons to initiate discussion.

One of the initial reasons BSCW was built on the web infrastructure is that it offers a homogeneous virtual platform -- no more PC/Mac/UNIX versions! Unfortunately, this is only partially successful -- differences between browsers clearly cause problems, but different network characteristics can also make a web interface usable or unusable.

A non-BSCW example of this is the oft-cited advice to make web pages fit onto one or two screens. (i) Whose screen? We now have hand-held computers with web browsers! (ii) This isn't good advice when the latency is high but the bandwidth is also high (as is often the case for trans-Atlantic connections). In these cases it is best to get a reasonable amount downloaded in one go and then use the rapid, delay-free interaction with the scrollbar!
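
The arithmetic behind this is worth spelling out (my own back-of-the-envelope model, with made-up but plausible figures): each page request pays the round-trip latency once, so splitting content across many small pages multiplies the latency cost.

    def fetch_time(pages, bytes_per_page, latency_s, bandwidth_Bps):
        """Total time to read content split across sequential page requests."""
        return pages * (latency_s + bytes_per_page / bandwidth_Bps)

    # 100 KB of content over a high-latency, high-bandwidth link:
    # 0.5 s round trip, 100 KB/s effective bandwidth (illustrative figures).
    print(fetch_time(1, 100000, 0.5, 100000))  # one long page:    1.5 s
    print(fetch_time(5, 20000, 0.5, 100000))   # five short pages: 3.5 s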

Nigel is responsible for Human Factors in the IT & Computer Science Programme at EPSRC, the UK research council which funds university research in computing. One of his most interesting remarks was that, despite the large amount of excitement surrounding the web, there were few high-quality grant applications in this area. Those that did arrive often addressed short-term issues rather than deeper theoretical understanding.

Possibly the web highlights a general problem in HCI: that of defining a discipline which is closely tied to technology yet must transcend the short-term aspects of that technology. Academic research on the web cannot outstrip Netscape and Microsoft in building web applications, but should instead use these technologies as ways of examining deeper, fundamental problems which will continue to be applicable when the next wave of technology hits us.

References

1. Dix, A.J., 1987. The myth of the infinitely fast machine. In People and Computers III -- Proceedings of HCI'87. Cambridge University Press. p. 215-228. http://www.soc.staffs.ac.uk/~cmtajd/papers/hci87/

2. Johnson, C. and P. Gray, 1996. Workshop Report: Temporal Aspects of Usability (Glasgow, June 1995). SIGCHI Bulletin. 28(2).

3. Buckingham Shum, S. and C. McKnight, 1997. Special Issue on World Wide Web Usability. International Journal of Human-Computer Studies. 47(1): p. 1-222. http://kmi.open.ac.uk/~simonb/missing-link

4. Busbach, U., D. Kerr and K. Sikkel (ed.), 1996. CSCW and the Web -- Proceedings of the 5th ERCIM/W4G Workshop. Arbeitspapiere der GMD 984. GMD: Sankt Augustin. http://orgwis.gmd.de/projects/W4G/

5. Dix, A.J., 1992. Pace and interaction. In Proceedings of HCI'92: People and Computers VII. Cambridge University Press. p. 193-207. http://www.soc.staffs.ac.uk/~cmtajd/papers/pace/

6. Jones, S. and J. Hewitt, 1997. Heuristic Evaluation of Web Site Usability: Experience from Two Case Studies. In HCI'97 Conference Companion. Bristol, UK. p. 23-25.

7. Johnson, C., 1997. The impact of marginal utility and time on distributed information retrieval. In People and Computers XII -- Proceedings of HCI'97. Bristol, UK: Springer. p. 191-204.

8. Grudin, J., 1988. Why CSCW applications fail: problems in the design and evaluation of organisational interfaces. In CSCW'88 Proceedings of the Conference on Computer-Supported Cooperative Work. ACM SIGCHI & SIGOIS. p. 85-89.

9. Dix, A., 1997. Challenges for Cooperative Work on the Web: An analytical approach. Computer-Supported Cooperative Work: The Journal of Collaborative Computing. 6: p. 135-156. (Also published in Groupware and the World Wide Web, R. Bentley, U. Busbach, D. Kerr, and K. Sikkel, Editors. Kluwer.) http://www.soc.staffs.ac.uk/~cmtajd/topics/webarch/

10. Pirolli, P. and S. Card, 1995. Information foraging in information access environments. In Proceedings of CHI'95. ACM Press. p. 51-58.

About the Author

Alan Dix is Professor of Computing and Associate Dean at the School of Computing, Staffordshire University, Stafford, UK. His research interests include CSCW, aspects of time in interface design, applications of formal methods in HCI and just about anything.

A.J.Dix@soc.staffs.ac.uk
http://www.soc.staffs.ac.uk/~cmtajd/
