
Publication


Featured research published by Stephen Vickers.


Eye Tracking Research & Applications | 2010

Designing gaze gestures for gaming: an investigation of performance

Howell O. Istance; Aulikki Hyrskykari; Lauri Immonen; Santtu Mansikkamaa; Stephen Vickers

To enable people with motor impairments to use gaze control to play online games and take part in virtual communities, new interaction techniques are needed that overcome the limitations of dwell clicking on icons in the games interface. We have investigated gaze gestures as a means of achieving this. We report the results of an experiment with 24 participants that examined performance differences between different gestures. We were able to predict the effect on performance of the numbers of legs in the gesture and the primary direction of eye movement in a gesture. We also report the outcomes of user trials in which 12 experienced gamers used the gaze gesture interface to play World of Warcraft. All participants were able to move around and engage other characters in fighting episodes successfully. Gestures were good for issuing specific commands such as spell casting, and less good for continuous control of movement compared with other gaze interaction techniques we have developed.
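The gestures studied here are built from "legs": straight-line eye movements in a primary direction. As a minimal illustrative sketch (not the authors' implementation; the function names, the four-direction quantisation, and the 40-pixel stroke threshold are assumptions), a gaze path can be collapsed into its legs like this:

```python
# Sketch: recognise a gaze gesture as a sequence of directional strokes ("legs").
from math import atan2, degrees, hypot

def stroke_direction(dx, dy):
    """Quantise a movement vector into one of four primary directions."""
    angle = degrees(atan2(dy, dx)) % 360
    if angle < 45 or angle >= 315:
        return "right"
    if angle < 135:
        return "down"   # screen y grows downwards
    if angle < 225:
        return "left"
    return "up"

def gesture_legs(points, min_stroke=40):
    """Collapse a gaze path into legs: successive movements in the same
    primary direction, ignoring movements shorter than min_stroke pixels."""
    legs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if hypot(dx, dy) < min_stroke:
            continue  # movement too small to count as a leg
        d = stroke_direction(dx, dy)
        if not legs or legs[-1] != d:
            legs.append(d)
    return legs

# A two-leg gesture: right, then down.
path = [(0, 0), (100, 5), (105, 110)]
print(gesture_legs(path))  # ['right', 'down']
```

Counting legs this way makes the performance prediction in the abstract concrete: more legs means more saccades per command, so longer gesture completion times.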


Eye Tracking Research & Applications | 2012

Gaze gestures or dwell-based interaction?

Aulikki Hyrskykari; Howell O. Istance; Stephen Vickers

The two cardinal problems recognized with gaze-based interaction techniques are: how to avoid unintentional commands, and how to overcome the limited accuracy of eye tracking. Gaze gestures are a relatively new technique for giving commands, which has the potential to overcome these problems. We present a study that compares gaze gestures with dwell selection as an interaction technique. The study involved 12 participants and was performed in the context of using an actual application. The participants gave commands to a 3D immersive game using gaze gestures and dwell icons. We found that gaze gestures are not only a feasible means of issuing commands in the course of game play, but they also exhibited performance that was at least as good as or better than dwell selections. The gesture condition produced fewer than half as many errors as the dwell condition. The study shows that gestures provide a robust alternative to dwell-based interaction, with the reliance on positional accuracy substantially reduced.
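For contrast with gestures, dwell selection triggers a command once gaze has rested on a target for a fixed duration. A minimal sketch of the idea (assumed names and a 0.5 s threshold, not code from the paper):

```python
# Sketch: dwell selection fires after gaze rests on one target long enough.
class DwellSelector:
    def __init__(self, dwell_time=0.5):
        self.dwell_time = dwell_time  # seconds gaze must rest on a target
        self.current = None           # target currently under gaze
        self.elapsed = 0.0

    def update(self, target, dt):
        """Feed one gaze sample: the target under gaze (or None) and the
        time since the last sample. Returns the selected target or None."""
        if target != self.current:
            self.current, self.elapsed = target, 0.0  # gaze moved: restart timer
            return None
        if target is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell_time:
            self.elapsed = 0.0  # reset so the target can be selected again
            return target
        return None

sel = DwellSelector(dwell_time=0.5)
samples = [("cast_spell", 0.2)] * 4  # gaze rests on one icon for 0.8 s
fired = [sel.update(t, dt) for t, dt in samples]
print(fired)  # [None, None, None, 'cast_spell']
```

The sketch makes the trade-off in the abstract visible: dwell depends on gaze staying accurately inside one target for the whole threshold, which is exactly the positional-accuracy reliance that gestures reduce.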


International Conference on Human-Computer Interaction | 2009

For Your Eyes Only: Controlling 3D Online Games by Eye-Gaze

Howell O. Istance; Aulikki Hyrskykari; Stephen Vickers; Thiago Chaves

Massively multiplayer online role-playing games, such as World of Warcraft, have become the most widespread 3D graphical environments with millions of active subscribers worldwide. People with severe motor impairments should be able to take part in these games without the extent of their disability being apparent to others online. Eye gaze is a high bandwidth modality that can support this. We have developed a software device that uses gaze input in different modes for emulating mouse and keyboard events appropriate for interacting with on-line games. We report an evaluation study that investigated gaze-based interaction with World of Warcraft using the device. We have found that it is feasible to carry out tasks representative of game play at a beginner's skill level using gaze alone. The results from the locomotion task part of the study show similar performance for gaze-based interaction compared with a keyboard and mouse. We discuss the usability issues that arose when completing three types of tasks in the game and the implications of these for playing this type of game using gaze as the only input modality.


Human Factors in Computing Systems | 2009

Gaze-based interaction with massively multiplayer on-line games

Howell O. Istance; Stephen Vickers; Aulikki Hyrskykari

People with motor impairments can benefit greatly from being able to take part in Massively Multiplayer On-line Games, such as World of Warcraft. We are investigating how to use eye gaze as a high bandwidth input modality for the range of tasks necessary to participate in the game. We approach this from two directions: in the bottom-up approach we iteratively implement and evaluate various gaze-interaction techniques, and in the top-down approach we analyze the interaction in MMOGs and develop a theory to map game tasks to gaze-based interaction techniques. We present preliminary results from a recently conducted set of trials which studied how well tasks in World of Warcraft can be carried out using gaze only. We describe this in the context of the whole project.


Universal Access in the Information Society | 2010

Gaze interaction with virtual on-line communities: levelling the playing field for disabled users

Richard Bates; Stephen Vickers; Howell O. Istance

This paper introduces the concept of enabling gaze-based interaction for users with high-level motor disabilities to control an avatar in a first-person perspective on-line community. An example community, Second Life, is introduced that could offer disabled users the same virtual freedom as any other user, allowing them to be able-bodied (should they wish) within the virtual world. A survey of the control demands of Second Life and a subsequent preliminary experiment show that gaze control has inherent problems, particularly for locomotion and camera movement. These problems make effective gaze control of Second Life impractical and indicate that disabled users who interact using gaze will have difficulty controlling Second Life (and similar environments). Such users could once again become disabled in the virtual world by the difficulty of effectively controlling their avatars, and their ‘disability privacy’, the right to control an avatar as effectively as an able-bodied user and so appear virtually able-bodied, will be compromised. Gaze-aware on-screen assistive tools could overcome these difficulties, but games manufacturers must design inclusively, so that disabled users may have the right to disability privacy in their Second (virtual) Lives.


Advances in Computer Entertainment Technology | 2010

EyeGuitar: making rhythm based music video games accessible using only eye movements

Stephen Vickers; Howell O. Istance; Matthew Smalley

Rhythm-based music games such as Guitar Hero are hugely popular and allow gamers to take on the role of a famous musician. To play such games, players must press keys on a virtual guitar in various combinations in time with the music. Gamers with severe physical disabilities cannot always use traditional input devices, so alternative methods of input are required to play such games. Eye-gaze is a high bandwidth modality that can support this if suitable interaction techniques exist. By analysing actual gameplay, a suitable eye-gaze interaction technique is designed for a Guitar Hero style game. We present results from a user study demonstrating that users are able to score higher with the gaze technique than using a keyboard for game input, albeit at the cost of gameplay. We conclude with a case study in which a young person with physical disabilities was able to successfully play the game using only eye movements.


ACM Transactions on Accessible Computing | 2013

Performing Locomotion Tasks in Immersive Computer Games with an Adapted Eye-Tracking Interface

Stephen Vickers; Howell O. Istance; Aulikki Hyrskykari

Young people with severe physical disabilities may benefit greatly from participating in immersive computer games. In-game tasks can be fun, engaging, educational, and socially interactive. But for those who are unable to use traditional methods of computer input such as a mouse and keyboard, there is a barrier to interaction that they must first overcome. Eye-gaze interaction is one method of input that can potentially achieve the levels of interaction required for these games. How we use eye-gaze or the gaze interaction technique depends upon the task being performed, the individual performing it, and the equipment available. To fully realize the impact of participation in these environments, techniques need to be adapted to the person’s abilities. We describe an approach to designing and adapting a gaze interaction technique to support locomotion, a task central to immersive game playing. This is evaluated by a group of young people with cerebral palsy and muscular dystrophy. The results show that by adapting the interaction technique, participants are able to significantly improve their in-game character control.


Human Factors in Computing Systems | 2013

Accessible gaming for people with physical and cognitive disabilities: a framework for dynamic adaptation

Stephen Vickers; Howell O. Istance; Michael James Heron

Current approaches to enabling access to computer games are typically fragmentary, and may involve manual expert configuration of the game, or of the input or output devices used. We present work towards a comprehensive software framework to facilitate dynamic adaptation of computer games to different levels of physical and cognitive ability. The framework is grounded in a task analysis of gameplay by expert players, and integrates automatic modification of game tasks, interaction techniques, and input device configuration according to a profile of user abilities.
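The shape of such a profile-driven adaptation can be sketched as a simple mapping from ability fields to interaction settings. This is a hypothetical illustration only: the profile fields (`can_use_hands`, `gaze_accuracy`, `cognitive_load`) and the derived settings are invented for the example and are not the framework's actual schema.

```python
# Sketch: derive interaction settings from a simple user-ability profile.
def adapt_settings(profile):
    """Map a dict of (hypothetical) ability fields to game/input settings."""
    settings = {"input": "keyboard+mouse", "dwell_time": None, "task_hints": False}
    if not profile.get("can_use_hands", True):
        settings["input"] = "gaze"
        # Lower pointing accuracy -> longer dwell times and larger targets.
        accurate = profile.get("gaze_accuracy", "high") == "high"
        settings["dwell_time"] = 0.5 if accurate else 1.0
        settings["target_scale"] = 1.0 if accurate else 1.5
    if profile.get("cognitive_load", "low") == "high":
        settings["task_hints"] = True  # simplify game tasks with in-game hints
    return settings

print(adapt_settings({"can_use_hands": False, "gaze_accuracy": "low"}))
```

The point of the sketch is the architecture, not the values: a single ability profile drives input device choice, interaction-technique parameters, and task modification together, rather than each being configured by hand.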


Eye Tracking Research & Applications | 2012

The validity of using non-representative users in gaze communication research

Howell O. Istance; Stephen Vickers; Aulikki Hyrskykari

Gaze-based interaction techniques have been investigated for the last two decades, and in many cases the evaluation of these has been based on trials with able-bodied users and conventional usability criteria, mainly speed and accuracy. The target user group of many of the gaze-based techniques investigated is, however, people with different types of physical disabilities. We present the outcomes of two studies that compare the performance of two groups of participants with a type of physical disability (one being cerebral palsy and the other muscular dystrophy) with that of a control group of able-bodied participants doing a task using a particular gaze interaction technique. One study used a task based on dwell-time selection, and the other used a task based on gaze gestures. In both studies, the groups of participants with physical disabilities performed significantly worse than the able-bodied control participants. We question the ecological validity of research into gaze interaction intended for people with physical disabilities that only uses able-bodied participants in evaluation studies without any testing using members of the target user population.


Archive | 2008

Gaze Interaction with Virtual On-line Communities

Richard Bates; Howell O. Istance; Stephen Vickers

On-line real-time ‘immersive’ communities, such as SecondLife, are becoming increasingly popular as a means of interacting and doing things together with friends and new acquaintances. These communities represent users as avatars, through which a person may be represented by a virtual self of any shape, size, colour or other appearance, with interaction taking place in a virtual 3-dimensional world. A disabled user may construct an avatar which can reveal, or hide, their disability from other people in the virtual community, and offer a different experience from the real community.
