
Publication


Featured research published by Päivi Majaranta.


Eye Tracking Research & Applications | 2002

Twenty years of eye typing: systems and design issues

Päivi Majaranta; Kari-Jouko Räihä

Eye typing provides a means of communication for severely handicapped people, even those who are only capable of moving their eyes. This paper considers the features, functionality, and methods used in the eye typing systems developed in the last twenty years. Primarily concerned with text production, the paper also addresses other communication-related issues, among them customization and voice output.


Human Factors in Computing Systems | 2009

Fast gaze typing with an adjustable dwell time

Päivi Majaranta; Ulla-Kaija Ahola; Oleg Špakov

Previous research shows that text entry by gaze using dwell time is slow, about 5-10 words per minute (wpm). These results are based on experiments with novices using a constant dwell time, typically between 450 and 1000 ms. We conducted a longitudinal study to find out how fast novices learn to type by gaze using an adjustable dwell time. Our results show that the text entry rate increased from 6.9 wpm in the first session to 19.9 wpm in the tenth session. Correspondingly, the dwell time decreased from an average of 876 ms to 282 ms, and the error rates decreased from 1.28% to .36%. The achieved typing speed of nearly 20 wpm is comparable with the result of 17.3 wpm achieved in an earlier, similar study with Dasher.
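
To make the reported numbers concrete, the short sketch below (an illustration, not code from the study) computes entry rate in words per minute using the usual five-characters-per-word convention, and shows how a dwell time might be shortened in steps between sessions; the step size and lower bound are assumed values.

    # Minimal sketch (not from the study): wpm calculation and a hypothetical
    # between-session dwell-time adjustment.

    def words_per_minute(chars_typed: int, seconds: float) -> float:
        """Text entry rate using the standard convention of 5 characters per word."""
        return (chars_typed / 5.0) / (seconds / 60.0)

    def shorten_dwell(current_ms: int, step_ms: int = 100, floor_ms: int = 150) -> int:
        """Hypothetical adjustment: the typist shortens the dwell time in steps
        as they gain experience, down to a practical lower bound."""
        return max(floor_ms, current_ms - step_ms)

    # Roughly 100 characters entered in one minute corresponds to 20 wpm.
    print(round(words_per_minute(100, 60.0), 1))   # 20.0
    print(shorten_dwell(876))                      # 776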


Archive | 2014

Eye Tracking and Eye-Based Human–Computer Interaction

Päivi Majaranta; Andreas Bulling

Eye tracking has a long history in medical and psychological research as a tool for recording and studying human visual behavior. Real-time gaze-based text entry can also be a powerful means of communication and control for people with physical disabilities. Following recent technological advances and the advent of affordable eye trackers, there is a growing interest in pervasive attention-aware systems and interfaces that have the potential to revolutionize mainstream human-technology interaction. In this chapter, we provide an introduction to the state of the art in eye tracking technology and gaze estimation. We discuss challenges involved in using a perceptual organ, the eye, as an input modality. Examples of real-life applications are reviewed, together with design solutions derived from research results. We also discuss how to match the user requirements and key features of different eye tracking systems to find the best system for each task and application.


International Conference on Human-Computer Interaction | 2005

Eye-tracking reveals the personal styles for search result evaluation

Anne Aula; Päivi Majaranta; Kari-Jouko Räihä

We used eye-tracking to study 28 users when they evaluated result lists produced by web search engines. Based on their different evaluation styles, the users were divided into economic and exhaustive evaluators. Economic evaluators made their decision about the next action (e.g., query re-formulation, following a link) faster and based on less information than exhaustive evaluators. The economic evaluation style was especially beneficial when most of the results in the result page were relevant. In these tasks, the task times were significantly shorter for economic than for exhaustive evaluators. The results suggested that economic evaluators were more experienced with computers than exhaustive evaluators. Thus, the result evaluation style seems to evolve towards a more economic style as the users gain more experience.


Universal Access in the Information Society | 2006

Effects of feedback and dwell time on eye typing speed and accuracy

Päivi Majaranta; I. Scott MacKenzie; Anne Aula; Kari-Jouko Räihä

Eye typing provides a means of communication that is especially useful for people with disabilities. However, most related research addresses technical issues in eye typing systems, and largely ignores design issues. This paper reports experiments studying the impact of auditory and visual feedback on user performance and experience. Results show that feedback impacts typing speed, accuracy, gaze behavior, and subjective experience. Also, the feedback should be matched with the dwell time. Short dwell times require simplified feedback to support the typing rhythm, whereas long dwell times allow extra information on the eye typing process. Both short and long dwell times benefit from combined visual and auditory feedback. Six guidelines for designing feedback for gaze-based text entry are provided.
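
As a hedged illustration of the guideline that feedback should be matched with the dwell time (not code from the paper), the sketch below picks a feedback configuration from the dwell time; the 500 ms boundary and the configuration fields are assumptions.

    # Illustrative only: choosing feedback to match the dwell time. The
    # threshold and configuration fields are assumptions, not from the paper.
    from dataclasses import dataclass

    @dataclass
    class Feedback:
        highlight_key: bool      # visual highlight on the focused key
        show_progress: bool      # animated indicator of the elapsed dwell
        click_sound: bool        # short auditory click on selection
        speak_letter: bool       # spoken feedback for the selected letter

    def feedback_for(dwell_ms: int) -> Feedback:
        """Short dwell: terse combined visual and auditory feedback that
        supports the typing rhythm. Long dwell: room for extra information
        on the eye typing process."""
        if dwell_ms <= 500:                      # assumed boundary for "short"
            return Feedback(highlight_key=True, show_progress=False,
                            click_sound=True, speak_letter=False)
        return Feedback(highlight_key=True, show_progress=True,
                        click_sound=True, speak_letter=True)

    print(feedback_for(450))
    print(feedback_for(900))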


Eye Tracking Research & Applications | 2000

Design issues of iDICT: a gaze-assisted translation aid

Aulikki Hyrskykari; Päivi Majaranta; Antti Aaltonen; Kari-Jouko Räihä

Eye-aware applications have existed for a long time, but mostly for very special and restricted target populations. We have designed and are currently implementing an eye-aware application, called iDict, which is a general-purpose translation aid aimed at mass markets. iDict monitors the user's gaze path while they read text written in a foreign language. When the reader encounters difficulties, iDict steps in and provides assistance with the translation. To accomplish this, the system makes use of information obtained from reading research, a language model, and the user profile. This paper describes the idea of the iDict application, the design problems, and the key solutions for resolving these problems.
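
As a rough sketch of the underlying idea (not the iDict implementation), the snippet below flags words whose fixations are unusually long or repeated, which a translation aid could use as its trigger for stepping in; the thresholds and the gaze-to-word mapping are assumptions.

    # Sketch only: a crude "reader in difficulty" heuristic of the kind a
    # gaze-assisted translation aid might use. Thresholds are assumptions.
    from collections import defaultdict
    from typing import NamedTuple

    class Fixation(NamedTuple):
        word: str          # word under the fixation (gaze-to-text mapping assumed)
        duration_ms: int   # fixation duration in milliseconds

    def words_needing_help(fixations: list[Fixation],
                           long_fixation_ms: int = 600,
                           max_visits: int = 2) -> set[str]:
        """Flag words with an unusually long fixation or repeated revisits,
        a simple stand-in for 'the reader encounters difficulties'."""
        visits: dict[str, int] = defaultdict(int)
        flagged: set[str] = set()
        for f in fixations:
            visits[f.word] += 1
            if f.duration_ms >= long_fixation_ms or visits[f.word] > max_visits:
                flagged.add(f.word)
        return flagged

    gaze = [Fixation("the", 180), Fixation("ubiquitous", 720),
            Fixation("tracker", 240), Fixation("ubiquitous", 300)]
    print(words_needing_help(gaze))   # {'ubiquitous'}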


Text Entry Systems: Mobility, Accessibility, Universality | 2007

CHAPTER 9 – Text Entry by Gaze: Utilizing Eye Tracking

Päivi Majaranta; Kari-Jouko Räihä

This chapter reveals that for understanding the prospects and problems of text entry by gaze, it is instrumental to know how eye-tracking devices work and to understand their limitations. Text entry by gaze is intended for users with disabilities. There are also other gaze-controlled applications intended for the same user group. In one sense, text entry by eye gaze is quite similar to any screen-based text entry technique, such as the on-screen keyboards used with tablet PCs. The interface is more or less the same; only the interaction technique for pointing and selecting changes. Instead of a stylus or other pointing device, eye gaze is used. The most common way to use gaze for text entry is direct pointing by looking at the desired letter. A typical setup has an on-screen keyboard with a static layout, an eye tracking device that tracks the user's gaze, and a computer that analyzes the user's gaze behavior.
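
The typical setup described above can be pictured as a small loop that maps each gaze sample to the key under it and "presses" the key once gaze has dwelt there long enough. The sketch below is an illustrative reconstruction rather than code from the chapter; the layout, key size, sample interval, and dwell threshold are assumed values.

    # Illustrative dwell-selection loop for a gaze-driven on-screen keyboard.
    # Layout, key size, sample rate and threshold are assumptions.

    KEY_W, KEY_H = 100, 100                          # key size in pixels (assumed)
    LAYOUT = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]  # static on-screen layout

    def key_at(x: float, y: float) -> str | None:
        """Map a gaze point (screen pixels) to the key under it, if any."""
        row, col = int(y // KEY_H), int(x // KEY_W)
        if 0 <= row < len(LAYOUT) and 0 <= col < len(LAYOUT[row]):
            return LAYOUT[row][col]
        return None

    def dwell_type(samples, dwell_ms: int = 450, sample_ms: int = 10) -> str:
        """Select a key once gaze has rested on it for dwell_ms milliseconds.
        samples is an iterable of (x, y) gaze points at sample_ms intervals."""
        typed, focus, held = [], None, 0
        for x, y in samples:
            key = key_at(x, y)
            if key == focus and key is not None:
                held += sample_ms
                if held >= dwell_ms:
                    typed.append(key)       # dwell threshold reached: "press" key
                    held = 0                # require a fresh dwell for a repeat
            else:
                focus, held = key, 0        # gaze moved elsewhere: restart dwell
        return "".join(typed)

    # 50 samples (500 ms) on 'h', then 50 samples on 'i' -> "hi"
    gaze = [(550, 150)] * 50 + [(750, 50)] * 50
    print(dwell_type(gaze))

The loop also shows where the technique differs from stylus input: there is no separate select event, so the dwell threshold has to stand in for the "click".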


Eye Tracking Research & Applications | 2008

Now Dasher! Dash away!: longitudinal study of fast text entry by Eye Gaze

Outi Tuisku; Päivi Majaranta; Poika Isokoski; Kari-Jouko Räihä

Dasher is one of the best known inventions in the area of text entry in recent years. It can be used with many input devices, but studies on user performance with it are still scarce. We ran a longitudinal study where 12 participants transcribed Finnish text with Dasher in ten 15-minute sessions using a Tobii 1750 eye tracker as a pointing device. The mean text entry rate was 2.5 wpm during the first session and 17.3 wpm during the tenth session. Our results show that very high text entry rates can be achieved with eye-operated Dasher, but only after several hours of training.


Eye Tracking Research & Applications | 2004

Effects of feedback on eye typing with a short dwell time

Päivi Majaranta; Anne Aula; Kari-Jouko Räihä

Eye typing provides a means of communication, especially for people with severe disabilities. Recent research indicates that the type of feedback impacts typing speed, error rate, and the user's need to switch their gaze between the on-screen keyboard and the typed text field. The current study focuses on the issues of feedback when a short dwell time (450 ms vs. 900 ms in a previous study) is used. Results show that the findings obtained using longer dwell times only partly apply to shorter dwell times. For example, with a short dwell time, spoken feedback results in slower text entry speed and double entry errors. A short dwell time requires sharp and clear feedback that supports the typing rhythm.


Human Factors in Computing Systems | 2014

Gaze gestures and haptic feedback in mobile devices

Jari Kangas; Deepak Akkil; Jussi Rantala; Poika Isokoski; Päivi Majaranta; Roope Raisamo

Anticipating the emergence of gaze tracking capable mobile devices, we are investigating the use of gaze as an input modality in handheld mobile devices. We conducted a study of combining gaze gestures with vibrotactile feedback. Gaze gestures were used as an input method in a mobile device and vibrotactile feedback as a new alternative way to give confirmation of interaction events. Our results show that vibrotactile feedback significantly improved the use of gaze gestures. The tasks were completed faster and rated easier and more comfortable when vibrotactile feedback was provided.
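
As a hedged illustration of the interaction pattern (not the study's implementation), the sketch below matches the last two coarse gaze strokes against a small gesture table and fires a vibrotactile pulse when a gesture is recognized; the gesture vocabulary and the vibrate() hook are hypothetical.

    # Sketch only: recognize a simple two-stroke gaze gesture and confirm it
    # with a vibrotactile pulse. The gesture set and vibrate() hook are
    # hypothetical, not taken from the study.

    GESTURES = {
        ("left", "right"): "answer_call",     # glance left, then back right
        ("up", "down"): "mute",               # glance up, then back down
    }

    def vibrate(duration_ms: int) -> None:
        """Placeholder for the device's haptic actuator (hypothetical API)."""
        print(f"vibrate for {duration_ms} ms")

    def handle_strokes(strokes: list[str]) -> str | None:
        """Match the last two coarse gaze strokes against the gesture table
        and give immediate vibrotactile confirmation on recognition."""
        command = GESTURES.get(tuple(strokes[-2:]))
        if command is not None:
            vibrate(100)          # confirmation pulse for the recognized gesture
            return command
        return None

    print(handle_strokes(["down", "left", "right"]))   # vibrates, 'answer_call'
    print(handle_strokes(["left", "up"]))              # no match -> None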

Collaboration


Dive into Päivi Majaranta's collaborations.

Top Co-Authors

Anne Aula, University of Tampere
