
Publication


Featured research published by Stephen A. Brewster.


Human Factors in Computing Systems | 2008

Investigating the effectiveness of tactile feedback for mobile touchscreens

Eve E. Hoggan; Stephen A. Brewster; Jody Johnston

This paper presents a study of finger-based text entry for mobile devices with touchscreens. Many devices are now coming to market that have no physical keyboards (the Apple iPhone being a very popular example). Touchscreen keyboards lack any tactile feedback and this may cause problems for entering text and phone numbers. We ran an experiment to compare devices with a physical keyboard, a standard touchscreen and a touchscreen with tactile feedback added. We tested this in both static and mobile environments. The results showed that the addition of tactile feedback to the touchscreen significantly improved finger-based text entry, bringing it close to the performance of a real physical keyboard. A second experiment showed that higher specification tactile actuators could improve performance even further. The results suggest that manufacturers should use tactile feedback in their touchscreen devices to regain some of the feeling lost when interacting on a touchscreen with a finger.


Ubiquitous Computing | 2002

Overcoming the Lack of Screen Space on Mobile Computers

Stephen A. Brewster

One difficulty for interface design on mobile computers is the lack of screen space caused by their small size. This paper describes a small pilot study and two formal experiments that investigate the usability of sonically-enhanced buttons of different sizes. The underlying hypothesis was that presenting information about the buttons in sound would increase their usability and allow their size to be reduced. An experimental interface was created that ran on a 3Com Palm III mobile computer and used a simple calculator-style interface to enter data. The buttons of the calculator were changed in size between 4×4, 8×8 and 16×16 pixels and used a range of different types of sound from basic to complex. Results showed that sounds significantly improved usability for both standard and small button sizes: more data could be entered with sonically-enhanced buttons and subjective workload was reduced. More sophisticated sounds that presented more information about the state of the buttons were shown to be more effective than the standard Palm III sounds. The results showed that if sound was added to buttons then they could be reduced in size from 16×16 to 8×8 pixels without much loss in quantitative performance. This reduction in size, however, caused a significant increase in subjective workload. Results also showed that when a mobile device was used in a more realistic situation (whilst walking outside), usability was significantly reduced (with increased workload and less data entered) compared to when it was used in a usability laboratory. These studies show that sound can be beneficial for usability and that care must be taken to do testing in realistic environments to get a good measure of mobile device usability.


Human Factors in Computing Systems | 2002

Gestural and audio metaphors as a means of control for mobile devices

Antti Pirhonen; Stephen A. Brewster; Christopher Holguin

This paper discusses the use of gesture and non-speech audio as ways to improve the user interface of a mobile music player. Their key advantages mean that users could use a player without having to look at its controls when on the move. Two very different evaluations of the player took place: one based on a standard usability experiment (comparing the new player to a standard design) and the other a video analysis of the player in use. Both of these showed significant usability improvements for the gesture/audio-based interface over a standard visual/pen-based display. The similarities and differences in the results produced by the two studies are discussed.


Human Factors in Computing Systems | 2007

Tactile feedback for mobile interactions

Stephen A. Brewster; Faraz Chohan; Lorna M. Brown

We present a study investigating the use of vibrotactile feedback for touch-screen keyboards on PDAs. Such keyboards are hard to use when mobile as keys are very small. We conducted a laboratory study comparing standard buttons to ones with tactile feedback added. Results showed that with tactile feedback users entered significantly more text, made fewer errors and corrected more of the errors they did make. We ran the study again with users seated on an underground train to see if the positive effects transferred to realistic use. There were fewer beneficial effects, with only the number of errors corrected significantly improved by the tactile feedback. However, we found strong subjective feedback in favour of the tactile display. The results suggest that tactile feedback has a key role to play in improving interactions with touch screens.


Human Factors in Computing Systems | 1993

An evaluation of earcons for use in auditory human-computer interfaces

Stephen A. Brewster; Peter C. Wright; Alistair D. N. Edwards

An evaluation of earcons was carried out to see whether they are an effective means of communicating information in sound. An initial experiment showed that earcons were better than unstructured bursts of sound and that musical timbres were more effective than simple tones. A second experiment was then carried out which improved upon some of the weaknesses shown up in Experiment 1 to give a significant improvement in recognition. From the results of these experiments some guidelines were drawn up for use in the creation of earcons. Earcons have been shown to be an effective method for communicating information in a human-computer interface.


Human Factors in Computing Systems | 2000

Putting the feel in 'look and feel'

Ian Oakley; Marilyn Rose McGee; Stephen A. Brewster; Philip D. Gray

Haptic devices are now commercially available and thus touch has become a potentially realistic solution to a variety of interaction design challenges. We report on an investigation of the use of touch as a way of reducing visual overload in the conventional desktop. In a two-phase study, we investigated the use of the PHANToM haptic device as a means of interacting with a conventional graphical user interface. The first experiment compared the effects of four different haptic augmentations on usability in a simple targeting task. The second experiment involved a more ecologically-oriented searching and scrolling task. Results indicated that the haptic effects did not improve users' performance in terms of task completion time. However, the number of errors made was significantly reduced. Subjective workload measures showed that participants perceived many aspects of workload as significantly less with haptics. The results are described and the implications for the use of haptics in user interface design are discussed.


Human Factors in Computing Systems | 2010

Usable gestures for mobile interfaces: evaluating social acceptability

Julie Rico; Stephen A. Brewster

Gesture-based mobile interfaces require users to change the way they use technology in public settings. Since mobile phones are part of our public appearance, designers must integrate gestures that users perceive as acceptable for public use. This topic has received little attention in the literature so far. The studies described in this paper begin to look at the social acceptability of a set of gestures with respect to location and audience in order to investigate possible ways of measuring social acceptability. The results of the initial survey showed that location and audience had a significant impact on a user's willingness to perform gestures. These results were further examined through a user study where participants were asked to perform gestures in different settings (including a busy street) over repeated trials. The results of this work provide gesture design recommendations as well as social acceptability evaluation guidelines.


Human-Computer Interaction with Mobile Devices and Services | 2006

Multidimensional tactons for non-visual information presentation in mobile devices

Lorna M. Brown; Stephen A. Brewster; Helen C. Purchase

Tactons are structured vibrotactile messages which can be used for non-visual information presentation when visual displays are limited, unavailable or inappropriate, such as in mobile phones and other mobile devices. Little is yet known about how to design them effectively. Previous studies have investigated the perception of Tactons which encode two dimensions of information using two different vibrotactile parameters (rhythm and roughness) and found recognition rates of around 70%. When more dimensions of information are required it may be necessary to extend the parameter space of these Tactons. Therefore this study investigates recognition rates for Tactons which encode a third dimension of information using spatial location. The results show that the identification rate for three-parameter Tactons is just 48%, but that this can be increased to 81% by reducing the number of values of one of the parameters. These results will help designers select suitable Tactons for use when designing mobile displays.
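As a rough illustration of the design space the abstract describes (the parameter values below are invented for illustration, not the stimuli actually used in the study), a three-parameter Tacton can be modelled as one combination of rhythm, roughness and spatial location, and the size of the message set is the product of the per-parameter value counts:

```python
from itertools import product

# Hypothetical parameter values; the study's actual rhythms,
# roughness levels and body locations differ.
RHYTHMS = ["short-short", "long", "short-long"]  # temporal pattern
ROUGHNESS = ["smooth", "rough"]                  # amplitude modulation
LOCATIONS = ["index", "middle", "ring"]          # spatial location

def tacton_space(rhythms, roughness, locations):
    """Enumerate every distinct three-parameter Tacton."""
    return list(product(rhythms, roughness, locations))

space = tacton_space(RHYTHMS, ROUGHNESS, LOCATIONS)
print(len(space))  # 3 * 2 * 3 = 18 distinct messages
```

This makes the paper's trade-off concrete: dropping one value from a parameter shrinks the number of distinguishable messages, but (as the results show) can substantially raise the rate at which users identify them.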


ACM Transactions on Computer-Human Interaction | 1998

Using nonspeech sounds to provide navigation cues

Stephen A. Brewster

This article describes three experiments that investigate the possibility of using structured nonspeech audio messages called earcons to provide navigational cues in a menu hierarchy. A hierarchy of 27 nodes and 4 levels was created with an earcon for each node. Rules were defined for the creation of hierarchical earcons at each node. Participants had to identify their location in the hierarchy by listening to an earcon. Results of the first experiment showed that participants could identify their location with 81.5% accuracy, indicating that earcons were a powerful method of communicating hierarchy information. One proposed use for such navigation cues is in telephone-based interfaces (TBIs), where navigation is a problem. The first experiment did not address the particular problems of earcons in TBIs, such as "does the lower quality of sound over the telephone lower recall rates?", "can users remember earcons over a period of time?" and "what effect does training type have on recall?" An experiment was conducted and results showed that sound quality did lower the recall of earcons. However, redesign of the earcons overcame this problem, with 73% recalled correctly. Participants could still recall earcons at this level after a week had passed. Training type also affected recall: with personal training participants recalled 73% of the earcons, but with purely textual training results were significantly lower. These results show that earcons can provide good navigation cues for TBIs. The final experiment used compound, rather than hierarchical, earcons to represent the hierarchy from the first experiment. Results showed that with sounds constructed in this way participants could recall 97% of the earcons. These experiments have developed our general understanding of earcons. A hierarchy three times larger than any previously created was tested, and this was also the first test of the recall of earcons over time.
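The core idea behind hierarchical earcons — that each node's sound inherits the parent's and adds one new element, so a single earcon reveals the whole path from the root — can be sketched as follows. The motif naming here is invented for illustration; the paper defines its own construction rules in terms of timbre, rhythm and pitch:

```python
# Illustrative sketch of hierarchical earcon construction. Each node's
# earcon is the ordered list of motifs along its path, so a deeper
# node's earcon always begins with its parent's earcon.

def earcon_for(path):
    """Build an earcon as the ordered list of motifs along the path.

    path: sequence of node labels from the root, e.g. ("files", "open").
    """
    motifs = []
    for depth, node in enumerate(path):
        # One motif per level; deeper levels add to the inherited sound.
        motifs.append(f"level{depth}:{node}")
    return motifs

root = earcon_for(("files",))
child = earcon_for(("files", "open"))
# The child's earcon begins with the parent's, mirroring inheritance.
print(child[:len(root)] == root)  # True
```

This inheritance is what lets a listener decode their location: identifying each successive motif narrows the position down one level at a time, which is how participants in the first experiment could report where they were in the 27-node hierarchy.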


Ubiquitous Computing | 2002

The Challenge of Mobile Devices for Human Computer Interaction

Mark D. Dunlop; Stephen A. Brewster


Collaboration


Dive into Stephen A. Brewster's collaborations.

Top Co-Authors


Martin Halvey

University of Strathclyde
