Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Oleg Špakov is active.

Publication


Featured research published by Oleg Špakov.


Human Factors in Computing Systems | 2009

Fast gaze typing with an adjustable dwell time

Päivi Majaranta; Ulla-Kaija Ahola; Oleg Špakov

Previous research shows that text entry by gaze using dwell time is slow, about 5-10 words per minute (wpm). These results are based on experiments with novices using a constant dwell time, typically between 450 and 1000 ms. We conducted a longitudinal study to find out how fast novices learn to type by gaze using an adjustable dwell time. Our results show that the text entry rate increased from 6.9 wpm in the first session to 19.9 wpm in the tenth session. Correspondingly, the dwell time decreased from an average of 876 ms to 282 ms, and the error rates decreased from 1.28% to 0.36%. The achieved typing speed of nearly 20 wpm is comparable with the result of 17.3 wpm achieved in an earlier, similar study with Dasher.


Universal Access in the Information Society | 2009

Gaze controlled games

Poika Isokoski; M. Joos; Oleg Špakov; Benoît Martin

The quality and availability of eye tracking equipment has been increasing while costs have been decreasing. These trends increase the possibility of using eye trackers for entertainment purposes. Games that can be controlled solely through movement of the eyes would be accessible to persons with decreased limb mobility or control. On the other hand, use of eye tracking can change the gaming experience for all players, by offering richer input and enabling attention-aware games. Eye tracking is not currently widely supported in gaming, and games specifically developed for use with an eye tracker are rare. This paper reviews past work on eye tracker gaming and charts future development possibilities in its different sub-domains. It argues that, based on the user input requirements and gaming contexts, conventional computer games can be classified into groups that offer fundamentally different opportunities for eye tracker input. In addition to the inherent design issues, there are challenges and varying levels of support for eye tracker use in the technical implementations of the games.


Eye Tracking Research & Applications | 2012

Comparison of eye movement filters used in HCI

Oleg Špakov

We compared various real-time filters designed to denoise eye movement data from low-sampling-rate devices. Most of the filters found in the literature were implemented and tested on data gathered in a previous study. An improvement was proposed for one of the filters. The parameters of each filter were adjusted to ensure their best performance. Four estimation parameters were proposed as criteria for comparison. The output from the filters was compared against two idealized signals (the signals denoised offline). The study revealed that FIR filters with triangular or Gaussian kernel (weighting) functions and parameters dependent on the signal state show the best performance.
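The best-performing filter family above can be illustrated with a minimal sketch: a causal FIR smoother over the most recent gaze samples with a triangular or Gaussian weighting, newest sample weighted highest. The window length and sigma here are illustrative assumptions with fixed (not signal-state-dependent) parameters, unlike the adaptive variants the paper favours.

```python
import numpy as np

def fir_denoise(samples, kernel="gaussian", width=7, sigma=1.5):
    """Smooth a 1-D gaze coordinate trace with a causal FIR filter.

    samples: sequence of raw gaze coordinates (e.g. x positions in pixels).
    kernel:  'triangular' or 'gaussian' weighting over the last `width`
             samples, with the most recent sample weighted highest.
    """
    if kernel == "triangular":
        w = np.arange(1, width + 1, dtype=float)  # 1, 2, ..., width
    else:  # Gaussian, peaked at the newest sample
        idx = np.arange(width, dtype=float)
        w = np.exp(-((idx - (width - 1)) ** 2) / (2 * sigma ** 2))
    w /= w.sum()

    out, buf = [], []
    for s in samples:
        buf.append(float(s))
        if len(buf) > width:
            buf.pop(0)
        ww = w[-len(buf):]                      # truncated window at start-up
        out.append(float(np.dot(ww, buf) / ww.sum()))
    return out
```

A constant input passes through unchanged, while isolated noise spikes are spread over the window and attenuated.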


Human Factors in Computing Systems | 2009

Disambiguating ninja cursors with eye gaze

Kari-Jouko Räihä; Oleg Špakov

Ninja cursors aim to speed up target selection on large or multiple monitors. Several cursors are displayed on the screen with one of them selected as the active cursor. Eye tracking is used to choose the active cursor. An experiment with 13 participants showed that multiple cursors speed up the selection over long distances, but not over short distances. Participants felt the technique was fastest with 4 cursors per monitor, but still preferred to have only 1 cursor per monitor for their own use.


International Conference on Multimodal Interfaces | 2005

Gaze-based selection of standard-size menu items

Oleg Špakov; Darius Miniotas

With recent advances in eye tracking technology, eye gaze is gradually gaining acceptance as a pointing modality. Its relatively low accuracy, however, necessitates enlarged controls in eye-based interfaces, making their design rather peculiar. Another factor impairing pointing performance is the deficient robustness of an eye tracker's calibration. To facilitate pointing at standard-size menus, we developed a technique that uses dynamic target expansion for on-line correction of the eye tracker's calibration. Correction is based on the relative change in the gaze point location upon the expansion. A user study suggests that the technique affords a dramatic six-fold improvement in selection accuracy. This is traded off against a much smaller reduction in performance speed (39%). The technique is thus believed to contribute to the development of universal-access solutions supporting navigation through standard menus by eye gaze alone.


Ubiquitous Computing | 2012

Enhanced gaze interaction using simple head gestures

Oleg Špakov; Päivi Majaranta

We propose a combination of gaze pointing and head gestures for enhanced hands-free interaction. Instead of the traditional dwell-time selection method, we experimented with five simple head gestures: nodding, turning left/right, and tilting left/right. The gestures were detected from the eye-tracking data by a range-based algorithm, which was found accurate enough in recognizing nodding and left-directed gestures. The gaze estimation accuracy did not noticeably suffer from the quick head motions. Participants pointed to nodding as the best gesture for occasional selection tasks and rated the other gestures as promising methods for navigation (turning) and functional mode switching (tilting). In general, dwell time works well for repeated tasks such as eye typing. However, considering multimodal games or transient interactions in pervasive and mobile environments, we believe a combination of gaze and head interaction could potentially provide a natural and more accurate interaction method.
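As an illustration of the kind of range-based detection described above, here is a minimal sketch of a nod detector over a vertical eye-position trace. The thresholds, window length, and return-to-baseline test are assumptions made for the example, not the algorithm from the paper.

```python
def detect_nod(y_trace, excursion=30.0, window=12):
    """Range-based nod detector over a vertical eye-position trace.

    Flags a nod when, within `window` consecutive samples, the signal
    departs at least `excursion` from its starting level and then returns
    close to that level (image y grows downward, so a nod raises y).
    All thresholds are illustrative, not taken from the paper.
    """
    for i in range(len(y_trace) - window):
        seg = y_trace[i:i + window]
        base = seg[0]
        peak = max(seg)
        returned = abs(seg[-1] - base) < excursion / 3
        if peak - base >= excursion and returned:
            return True
    return False
```

A down-and-back excursion within the window triggers detection; a flat trace or a sustained offset (e.g. simply looking lower) does not.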


Eye Tracking Research & Applications | 2008

Effects of time pressure and text complexity on translators' fixations

Selina Sharmin; Oleg Špakov; Kari-Jouko Räihä; Arnt Lykke Jakobsen

We tracked the eye movements of 18 students as they translated three short texts with different complexity levels under three different time constraints. Participants with touch typing skills were found to attend more to on-screen text than participants without touch typing skills. Time pressure was found to mainly affect fixations on the source text, and text complexity was found to only affect the number of fixations on the source text. Overall, it was found that average fixation duration was longer in the target text area than in the source text area.


Eye Tracking Research & Applications | 2014

Look and lean: accurate head-assisted eye pointing

Oleg Špakov; Poika Isokoski; Päivi Majaranta

Compared to the mouse, eye pointing is inaccurate. As a consequence, small objects are difficult to point at by gaze alone. We suggest using a combination of eye pointing and subtle head movements to achieve accurate hands-free pointing in a conventional desktop computing environment. For tracking the head movements, we exploited information about the eye position in the eye tracker's camera view. We conducted a series of three experiments to study the potential caveats and benefits of using head movements to adjust gaze cursor position. Results showed that head-assisted eye pointing significantly improves pointing accuracy without a negative impact on pointing time. In some cases participants were able to point almost 3 times closer to the target's center compared to eye pointing alone (7 vs. 19 pixels). We conclude that head-assisted eye pointing is a comfortable and potentially very efficient alternative to other assistive methods in eye pointing, such as zooming.
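The core idea, using the eye's displacement in the tracker's camera frame as a proxy for head movement, can be sketched as a simple cursor offset. The function name, the reference-capture scheme, and the gain value are assumptions for illustration, not the paper's implementation.

```python
def head_adjusted_cursor(gaze_xy, eye_cam_xy, eye_cam_ref, gain=4.0):
    """Nudge the gaze cursor with subtle head movement (illustrative sketch).

    gaze_xy:     gaze-estimated cursor position on screen (pixels).
    eye_cam_xy:  current eye position in the tracker's camera image.
    eye_cam_ref: eye position captured when the user first fixated the target.

    Head motion shifts the eye within the camera frame; that shift, scaled
    by `gain`, fine-tunes the cursor position. The gain is an assumption.
    """
    dx = (eye_cam_xy[0] - eye_cam_ref[0]) * gain
    dy = (eye_cam_xy[1] - eye_cam_ref[1]) * gain
    return (gaze_xy[0] + dx, gaze_xy[1] + dy)
```

With the head still the cursor stays at the raw gaze estimate; leaning slightly shifts the eye in the camera image and moves the cursor proportionally.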


Workshop on Applications of Computer Vision | 2012

Face typing: Vision-based perceptual interface for hands-free text entry with a scrollable virtual keyboard

Yulia Gizatdinova; Oleg Špakov; Veikko Surakka

We present a novel vision-based perceptual user interface for hands-free text entry that utilizes face detection and visual gesture detection to manipulate a scrollable virtual keyboard. A thorough experiment was undertaken to quantitatively evaluate the performance of the interface in hands-free pointing, selection and scrolling tasks. The experiments were conducted with nine participants in laboratory conditions. Several face and head gestures were examined for detection robustness and user convenience. The system gave a reasonable performance in terms of a high gesture detection rate and a small false alarm rate. The participants reported that the new interface was easy to understand and operate. Encouraged by these results, we discuss advantages and constraints of the interface and suggest possibilities for design improvements.


Proceedings of the 1st Conference on Novel Gaze-Controlled Applications | 2011

Comparison of gaze-to-objects mapping algorithms

Oleg Špakov

Gaze data processing is an important and necessary step in gaze-based applications. This study focuses on the comparison of several gaze-to-object mapping algorithms using various dwell times for selection, with targets of several types and sizes. Seven algorithms found in the literature were compared against two newly designed algorithms. The study revealed that a fractional mapping algorithm (known from the literature) produced the highest rate of correct selections and the fastest selection times, but also the highest rate of incorrect selections. The dynamic competing algorithm (newly designed) showed the next best result, but also a high rate of incorrect selections. Target type had only a small impact on the calculated statistics. Strictly centered gazing helped to increase the rate of correct selections for all algorithms and target types. Directions for further improvement of the mapping algorithms and future investigation are discussed.
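A fractional mapping scheme of the kind compared above can be sketched as follows: each gaze sample distributes one unit of score among candidate objects, weighted by distance, and the object accumulating the most score over the dwell window is selected. The Gaussian weighting, the `sigma` value, and the data layout are assumptions for the example, not the paper's exact formulation.

```python
import math

def fractional_map(gaze_points, objects, sigma=40.0):
    """Fractional gaze-to-object mapping (illustrative sketch).

    gaze_points: list of (x, y) gaze samples over the dwell window.
    objects:     dict mapping an object id to its (x, y) centre in pixels.

    Each sample splits one unit of score among all objects in proportion
    to a Gaussian of the distance to each object's centre; the object with
    the highest accumulated score wins.
    """
    scores = {oid: 0.0 for oid in objects}
    for gx, gy in gaze_points:
        weights = {}
        for oid, (ox, oy) in objects.items():
            d2 = (gx - ox) ** 2 + (gy - oy) ** 2
            weights[oid] = math.exp(-d2 / (2 * sigma ** 2))
        total = sum(weights.values())
        if total > 0:
            for oid, w in weights.items():
                scores[oid] += w / total    # each sample contributes 1 unit
    return max(scores, key=scores.get)
```

Because every sample contributes fractionally rather than to a single nearest object, noisy samples near a boundary shift scores gradually instead of flipping the winner outright.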

Collaboration


Dive into Oleg Špakov's collaborations.

Top Co-Authors
