
Publication

Featured research published by Aulikki Hyrskykari.


Human Factors in Computing Systems | 1998

101 spots, or how do users read menus?

Antti Aaltonen; Aulikki Hyrskykari; Kari-Jouko Räihä

In modern graphical user interfaces, pull-down menus are among the most frequently used components. Yet even after years of research, there is no clear evidence of how users carry out the visual search process in pull-down menus. Several models have been proposed for predicting selection times; however, most observations are based only on execution times and therefore cannot explain where the time is spent. The few models that are based on eye movement research are conflicting. In this study we present an experiment in which eye movement data was gathered during a menu usage task. By analyzing the scan paths of the eye, we found that menus are read in sequential sweeps. This may explain why the best models produced by previous research are hybrid models that combine systematic reading behavior with random reading behavior.
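As a rough illustration of what such a hybrid model means, the following Python sketch simulates menu search as a mixture of systematic top-down sweeping and random relocations. The fixation duration and mixing probability are invented parameters for illustration, not values from the paper.

import random

# Hypothetical parameters, not values from the paper.
FIXATION_MS = 230          # assumed mean fixation duration
P_SYSTEMATIC = 0.7         # assumed probability of continuing a top-down sweep

def simulate_selection_time(menu_len: int, target: int) -> int:
    """Simulate time (ms) to find `target` in a menu of `menu_len` items
    using a hybrid of systematic sweeping and random jumps."""
    pos, time_ms = 0, 0
    while True:
        time_ms += FIXATION_MS
        if pos == target:
            return time_ms
        if random.random() < P_SYSTEMATIC:
            pos = (pos + 1) % menu_len        # systematic sweep: next item
        else:
            pos = random.randrange(menu_len)  # random relocation

# Averaging over many trials gives a predicted selection time per target position.
trials = [simulate_selection_time(10, 7) for _ in range(10_000)]
print(sum(trials) / len(trials))

A purely systematic model would predict times growing linearly with target position; mixing in random relocations flattens that curve, which is the qualitative behavior hybrid models are meant to capture.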


Eye Tracking Research & Applications | 2000

Design issues of iDICT: a gaze-assisted translation aid

Aulikki Hyrskykari; Päivi Majaranta; Antti Aaltonen; Kari-Jouko Räihä

Eye-aware applications have existed for a long time, but mostly for very special and restricted target populations. We have designed and are currently implementing an eye-aware application, called iDict, which is a general-purpose translation aid aimed at mass markets. iDict monitors the user's gaze path while he or she is reading text written in a foreign language. When the reader encounters difficulties, iDict steps in and provides assistance with the translation. To accomplish this, the system makes use of information obtained from reading research, a language model, and the user profile. This paper describes the idea of the iDict application, the design problems, and the key solutions for resolving these problems.
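To make the triggering idea concrete, here is a minimal Python sketch of one plausible difficulty detector: a word is flagged when total gaze time on it is unusually long or the reader keeps returning to it. The thresholds and the fixation format are illustrative assumptions; the actual iDict combines reading research, a language model, and a user profile.

from collections import defaultdict

# Assumed thresholds for illustration only.
LONG_GAZE_MS = 600      # total gaze time suggesting difficulty
REGRESSIONS = 2         # re-readings of the same word suggesting difficulty

def difficult_words(fixations):
    """fixations: list of (word_index, duration_ms) in reading order."""
    gaze_time = defaultdict(int)
    visits = defaultdict(int)
    last = -1
    for word, dur in fixations:
        gaze_time[word] += dur
        if word != last:          # count each arrival at the word once
            visits[word] += 1
        last = word
    return sorted(w for w in gaze_time
                  if gaze_time[w] >= LONG_GAZE_MS or visits[w] > REGRESSIONS)

# Example: the reader lingers on and regresses to word 5.
print(difficult_words([(3, 180), (4, 210), (5, 420), (6, 190), (5, 260)]))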


Eye Tracking Research & Applications | 2010

Designing gaze gestures for gaming: an investigation of performance

Howell O. Istance; Aulikki Hyrskykari; Lauri Immonen; Santtu Mansikkamaa; Stephen Vickers

To enable people with motor impairments to use gaze control to play online games and take part in virtual communities, new interaction techniques are needed that overcome the limitations of dwell clicking on icons in the game's interface. We have investigated gaze gestures as a means of achieving this. We report the results of an experiment with 24 participants that examined performance differences between different gestures. We were able to predict the effect on performance of the number of legs in a gesture and the primary direction of eye movement in a gesture. We also report the outcomes of user trials in which 12 experienced gamers used the gaze gesture interface to play World of Warcraft. All participants were able to move around and engage other characters in fighting episodes successfully. Gestures were good for issuing specific commands such as spell casting, but less good for continuous control of movement compared with other gaze interaction techniques we have developed.
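The notion of gesture "legs" can be sketched as follows: a leg is a sufficiently long, roughly straight eye movement in one of four cardinal directions, and a gesture is a sequence of such legs. The quantization scheme, minimum leg length, and gesture templates below are illustrative assumptions, not the gesture set from the study.

# Quantize a gaze trace into directional "legs" and match against
# hypothetical gesture templates (not gestures from the paper).
GESTURES = {("E", "S"): "cast_spell", ("W", "N", "E"): "open_map"}

def leg_direction(dx: float, dy: float) -> str:
    # Screen coordinates: y grows downward.
    return ("E" if dx > 0 else "W") if abs(dx) >= abs(dy) \
        else ("S" if dy > 0 else "N")

def recognize(points, min_leg_px: float = 80.0):
    """points: list of (x, y) fixation positions in order."""
    legs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if (dx * dx + dy * dy) ** 0.5 < min_leg_px:
            continue                      # too short: noise, not a leg
        d = leg_direction(dx, dy)
        if not legs or legs[-1] != d:     # merge consecutive same-direction legs
            legs.append(d)
    return GESTURES.get(tuple(legs))

# Two-leg gesture: right, then down.
print(recognize([(100, 100), (300, 110), (310, 320)]))   # cast_spell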


Eye Tracking Research & Applications | 2012

Gaze gestures or dwell-based interaction?

Aulikki Hyrskykari; Howell O. Istance; Stephen Vickers

The two cardinal problems recognized with gaze-based interaction techniques are how to avoid unintentional commands and how to overcome the limited accuracy of eye tracking. Gaze gestures are a relatively new technique for giving commands that has the potential to overcome these problems. We present a study that compares gaze gestures with dwell selection as an interaction technique. The study involved 12 participants and was performed in the context of using an actual application: the participants gave commands to a 3D immersive game using gaze gestures and dwell icons. We found that gaze gestures are not only a feasible means of issuing commands in the course of game play, but also exhibited performance at least as good as that of dwell selections. The gesture condition produced fewer than half as many errors as the dwell condition. The study shows that gestures provide a robust alternative to dwell-based interaction, substantially reducing the reliance on positional accuracy.
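For contrast with gestures, dwell selection can be sketched in a few lines: a command fires once the gaze has stayed within a small radius of one spot for a dwell threshold. The radius and threshold below are illustrative assumptions; in practice both are tuned per user and application.

import math

# Assumed values for illustration only.
DWELL_MS = 500        # time gaze must stay on target to trigger a selection
RADIUS_PX = 40        # tolerance around the dwell anchor

class DwellSelector:
    def __init__(self):
        self.anchor = None    # (x, y) where the current dwell started
        self.start_ms = 0
        self.fired = False

    def update(self, x, y, t_ms):
        """Feed gaze samples; returns True once per completed dwell."""
        if self.anchor and math.dist(self.anchor, (x, y)) <= RADIUS_PX:
            if not self.fired and t_ms - self.start_ms >= DWELL_MS:
                self.fired = True
                return True
        else:                 # gaze moved away: restart the dwell timer
            self.anchor, self.start_ms, self.fired = (x, y), t_ms, False
        return False

sel = DwellSelector()
for t in range(0, 700, 50):                    # 50 ms gaze samples on one spot
    if sel.update(200, 150, t):
        print("selection at", t, "ms")         # fires at 500 ms

The sketch also shows why dwell is sensitive to positional accuracy: if tracker noise pushes samples outside RADIUS_PX, the timer restarts and the selection never fires, whereas a gesture recognizer cares only about relative movement.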


Computers in Human Behavior | 2006

Utilizing eye movements: Overcoming inaccuracy while tracking the focus of attention during reading

Aulikki Hyrskykari

Even though eye movements during reading have been studied intensively for decades, applications that track the reading of longer passages of text in real time are rare. The problems encountered in developing such an application (the reading aid iDict) and the solutions to those problems are described. Some of the issues are general and concern the broad family of Attention Aware Systems; others are specific to the modality of interest, eye gaze. One of the most difficult problems when using eye tracking to identify the focus of visual attention is the inaccuracy of the eye trackers used to measure the point of gaze. This inaccuracy inevitably affects the design decisions of any application exploiting the point of gaze for localizing the point of visual attention. The problem is demonstrated with examples from our experiments. The principles of the drift correction algorithms that automatically correct the vertical inaccuracy are presented, and the performance of the algorithms is evaluated.
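The core idea behind such vertical drift correction can be sketched as follows: because a reader's fixations must lie on some text line, the vertical residual between a reported fixation and the nearest line can be fed back as a running drift estimate for subsequent samples. The line geometry and smoothing factor below are illustrative assumptions; the paper's algorithms are more elaborate.

# Assumed text geometry for illustration.
LINE_HEIGHT = 24
FIRST_LINE_Y = 100          # y of the first text line
ALPHA = 0.3                 # smoothing factor for the running drift estimate

def correct_vertical_drift(fixation_ys):
    """Snap noisy fixation y-values to text lines, tracking slow drift."""
    drift = 0.0
    corrected = []
    for y in fixation_ys:
        y_adj = y - drift                      # undo estimated drift so far
        line = round((y_adj - FIRST_LINE_Y) / LINE_HEIGHT)
        line_y = FIRST_LINE_Y + line * LINE_HEIGHT
        drift += ALPHA * (y_adj - line_y)      # residual updates the estimate
        corrected.append(line_y)
    return corrected

# Fixations slowly drifting downward off the same text line.
print(correct_vertical_drift([101, 104, 107, 110, 113, 116]))
# All snap to 100; without the running estimate, the later samples
# would be assigned to the wrong line.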


International Conference on Human-Computer Interaction | 2009

For Your Eyes Only: Controlling 3D Online Games by Eye-Gaze

Howell O. Istance; Aulikki Hyrskykari; Stephen Vickers; Thiago Chaves

Massively multiplayer online role-playing games, such as World of Warcraft, have become the most widespread 3D graphical environments, with millions of active subscribers worldwide. People with severe motor impairments should be able to take part in these games without the extent of their disability being apparent to others online. Eye gaze is a high-bandwidth modality that can support this. We have developed a software device that uses gaze input in different modes to emulate the mouse and keyboard events appropriate for interacting with online games. We report an evaluation study that investigated gaze-based interaction with World of Warcraft using the device. We have found that it is feasible to carry out tasks representative of game play at a beginner's skill level using gaze alone. The results from the locomotion task part of the study show similar performance for gaze-based interaction compared with a keyboard and mouse. We discuss the usability issues that arose when completing three types of tasks in the game and the implications of these for playing this type of game using gaze as the only input modality.
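A minimal sketch of the mode-based emulation idea: the same gaze stream produces different emulated events depending on the active mode. The mode names, screen-edge regions, and key bindings below are illustrative assumptions, not the bindings of the device described in the paper.

# Hypothetical mode-based gaze-to-input mapping, for illustration only.
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float
    y: float

SCREEN_W, SCREEN_H = 1920, 1080

def emulate(sample: GazeSample, mode: str):
    """Translate one gaze sample into an emulated input event name."""
    if mode == "pointer":
        return ("move_mouse", sample.x, sample.y)       # gaze drives the cursor
    if mode == "locomotion":
        if sample.x < SCREEN_W * 0.2:
            return ("hold_key", "A")                    # look at left edge: turn left
        if sample.x > SCREEN_W * 0.8:
            return ("hold_key", "D")                    # look at right edge: turn right
        return ("hold_key", "W")                        # look ahead: move forward
    return ("noop",)

print(emulate(GazeSample(1800, 500), "locomotion"))     # ('hold_key', 'D')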


Human Factors in Computing Systems | 2009

Gaze-based interaction with massively multiplayer on-line games

Howell O. Istance; Stephen Vickers; Aulikki Hyrskykari

People with motor impairments can benefit greatly from being able to take part in Massively Multiplayer On-line Games, such as World of Warcraft. We are investigating how to use eye gaze as a high-bandwidth input modality for the range of tasks necessary to participate in the game. We approach this from two directions: in the bottom-up approach we iteratively implement and evaluate various gaze-interaction techniques, and in the top-down approach we analyze the interaction in MMOGs and develop a theory to map game tasks to gaze-based interaction techniques. We present preliminary results from a recently conducted set of trials that studied how well tasks in World of Warcraft can be carried out using gaze only. We describe this in the context of the whole project.


Universal Access in the Information Society | 2009

Attentive interfaces for users with disabilities: eye gaze for intention and uncertainty estimation

Helmut Prendinger; Aulikki Hyrskykari; Minoru Nakayama; Howell O. Istance; Nikolaus Bee; Yosiyuki Takahasi

Attentive user interfaces (AUIs) capitalize on the rich information that can be obtained from users' gaze behavior in order to infer relevant aspects of their cognitive state. Eye gaze is an excellent clue not only to states of interest and intention, but also to preference and confidence in comprehension. AUIs are built with the aim of adapting the interface to the user's current information need and thus reducing the workload of interaction. Given these characteristics, it is believed that AUIs can have particular benefits for users with severe disabilities, for whom operating a physical device (like a mouse pointer) might be very strenuous or infeasible. This paper presents three studies that attempt to gauge uncertainty and intention on the part of the user from gaze data, and compares the success of each approach. The paper discusses how applying the approaches adopted in each study to user interfaces can support users with severe disabilities.


Book | 2011

Gaze-Aware Systems and Attentive Applications

Howell O. Istance; Aulikki Hyrskykari

Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies focuses on interactive communication and control tools based on gaze tracking, including eye typing, computer control, and gaming, with special attention to assistive technologies. For researchers and practitioners interested in the applied use of gaze tracking, the book offers instructions for building a basic eye tracker from off-the-shelf components, gives practical hints on building interactive applications, presents smooth and efficient interaction techniques, and summarizes the results of effective research on cutting-edge gaze interaction applications.


ACM Transactions on Accessible Computing | 2013

Performing Locomotion Tasks in Immersive Computer Games with an Adapted Eye-Tracking Interface

Stephen Vickers; Howell O. Istance; Aulikki Hyrskykari

Young people with severe physical disabilities may benefit greatly from participating in immersive computer games. In-game tasks can be fun, engaging, educational, and socially interactive. But for those who are unable to use traditional methods of computer input, such as a mouse and keyboard, there is a barrier to interaction that they must first overcome. Eye-gaze interaction is one method of input that can potentially achieve the levels of interaction required for these games. How we use eye gaze, and which gaze interaction technique is appropriate, depends upon the task being performed, the individual performing it, and the equipment available. To fully realize the impact of participation in these environments, techniques need to be adapted to the person's abilities. We describe an approach to designing and adapting a gaze interaction technique to support locomotion, a task central to immersive game playing. The approach is evaluated with a group of young people with cerebral palsy and muscular dystrophy. The results show that by adapting the interaction technique, participants are able to significantly improve their in-game character control.
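One way to picture such adaptation in code: represent the technique's free parameters explicitly, then widen activation regions or lengthen dwell times for users whose control is less precise. The parameters and the adaptation rule below are illustrative assumptions, not the paper's design.

# Illustrative per-user adaptation of a gaze locomotion interface.
from dataclasses import dataclass

@dataclass
class LocomotionConfig:
    turn_zone_frac: float = 0.2    # screen fraction used for turn regions
    dwell_ms: int = 400            # dwell needed before a region activates

def adapt(config: LocomotionConfig, error_rate: float) -> LocomotionConfig:
    """Widen regions and lengthen dwell when unintended activations are frequent."""
    if error_rate > 0.25:          # assumed threshold for "too many errors"
        return LocomotionConfig(
            turn_zone_frac=min(0.35, config.turn_zone_frac + 0.05),
            dwell_ms=min(800, config.dwell_ms + 100),
        )
    return config

cfg = adapt(LocomotionConfig(), error_rate=0.4)
print(cfg)   # larger zones and longer dwell for this user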

Collaboration

Dive into Aulikki Hyrskykari's collaboration.

Top Co-Authors

Qiang Ji

Rensselaer Polytechnic Institute
