Publication


Featured research published by Javier San Agustin.


Eye Tracking Research & Applications | 2010

Evaluation of a low-cost open-source gaze tracker

Javier San Agustin; Henrik H. T. Skovsgaard; Emilie Møllenbach; Maria Barret; Martin Tall; Dan Witzner Hansen; John Paulin Hansen

This paper presents a low-cost gaze tracking system based on a webcam mounted close to the user's eye. The performance of the gaze tracker was evaluated in an eye-typing task using two different typing applications. Participants could type between 3.56 and 6.78 words per minute, depending on the typing system used. A pilot study to assess the usability of the system was also carried out in the home of a user with severe motor impairments. The user successfully typed on a wall-projected interface using his eye movements.
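Typing rates such as the 3.56–6.78 words per minute reported above are conventionally computed by treating five characters (including spaces) as one word. A minimal sketch of that metric; the function name and example values are illustrative, not taken from the paper:

```python
def words_per_minute(chars_typed: int, seconds: float) -> float:
    """Typing speed in words per minute, with 1 word = 5 characters."""
    return (chars_typed / 5.0) / (seconds / 60.0)

# Example: 100 characters typed in 5 minutes -> 4.0 WPM,
# within the 3.56-6.78 WPM range reported in the study.
print(words_per_minute(100, 300))  # → 4.0
```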


Human Factors in Computing Systems | 2009

Low-cost gaze interaction: ready to deliver the promises

Javier San Agustin; Henrik H. T. Skovsgaard; John Paulin Hansen; Dan Witzner Hansen

Eye movements are the only means of communication for some severely disabled people. However, the high prices of commercial eye tracking systems limit access to this technology. In this pilot study we compare the performance of a low-cost, webcam-based gaze tracker that we have developed with two commercial trackers in two different tasks: target acquisition and eye typing. From analyses of throughput, words per minute and error rates we conclude that a low-cost solution can be as efficient as expensive commercial systems.


Eye Tracking Research & Applications | 2010

Homography normalization for robust gaze estimation in uncalibrated setups

Dan Witzner Hansen; Javier San Agustin; Arantxa Villanueva

Homography normalization is presented as a novel gaze estimation method for uncalibrated setups. The method applies when head movements are present but requires neither camera calibration nor geometric calibration. The method is geometrically and empirically demonstrated to be robust to head pose changes, and despite being less constrained than cross-ratio methods, it consistently performs favorably, by several degrees, on both simulated data and data from physical setups. The physical setups include the use of off-the-shelf web cameras with infrared light (night vision) and standard cameras with and without infrared light. The benefits of homography normalization, and of uncalibrated setups in general, are also demonstrated by obtaining gaze estimates (in the visible spectrum) using only the screen reflections on the cornea.
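The core idea can be sketched as follows: a homography estimated from four corneal reflections (glints) of known screen points maps the pupil position into a normalized space that is largely invariant to head pose; a second homography, learned from a few calibration targets, would then map that normalized position to screen coordinates. This is an illustrative reconstruction with invented coordinates, not the authors' implementation:

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography mapping src points to dst points (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)   # null-space vector reshaped to a matrix

def apply_h(H, p):
    """Apply homography H to 2D point p (homogeneous divide)."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Four glints (corneal reflections of known screen points) in image space;
# the coordinates are made up for illustration.
glints = [(10.0, 12.0), (30.0, 11.0), (31.0, 28.0), (9.0, 29.0)]
unit_square = [(0, 0), (1, 0), (1, 1), (0, 1)]

H_norm = homography(glints, unit_square)   # image -> normalized space
pupil = (20.0, 20.0)
normalized_pupil = apply_h(H_norm, pupil)  # largely head-pose-invariant
# A second, calibrated homography would map normalized_pupil to the screen.
```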


Eye Tracking Research & Applications | 2012

Gaze input for mobile devices by dwell and gestures

Morten Lund Dybdal; Javier San Agustin; John Paulin Hansen

This paper investigates whether it is feasible to interact with the small screen of a smartphone using eye movements only. Two of the most common gaze-based selection strategies, dwell-time selection and gaze gestures, are compared in a target selection experiment. Finger strokes and accelerometer-based interaction, i.e. tilting, are also considered. In an experiment with 11 subjects we found gaze interaction to have a lower performance than touch interaction, but comparable to accelerometer (i.e. tilt) interaction in error rate and completion time. Gaze gestures had a lower error rate and were faster than dwell selections by gaze, especially for small targets, suggesting that this method may be the best option for hands-free gaze control of smartphones.
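Dwell-time selection, one of the two strategies compared above, can be sketched as a loop that fires once the gaze has rested inside a target for a fixed threshold. The threshold, sampling rate, and helper names below are illustrative assumptions, not values from the paper:

```python
DWELL_TIME = 0.5  # seconds the gaze must rest on a target (assumed value)

def dwell_select(samples, target_hit, dwell_time=DWELL_TIME):
    """samples: iterable of (timestamp, x, y) gaze points.
    target_hit: predicate telling whether (x, y) lies inside the target.
    Returns the timestamp at which a selection fires, or None."""
    enter_time = None
    for t, x, y in samples:
        if target_hit(x, y):
            if enter_time is None:
                enter_time = t            # gaze just entered the target
            elif t - enter_time >= dwell_time:
                return t                  # dwelled long enough: select
        else:
            enter_time = None             # gaze left the target: reset
    return None

# 60 Hz gaze samples resting inside a circular target around (100, 100).
inside = lambda x, y: (x - 100) ** 2 + (y - 100) ** 2 <= 40 ** 2
samples = [(i / 60.0, 100.0, 100.0) for i in range(60)]
print(dwell_select(samples, inside))  # → 0.5
```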


Human Factors in Computing Systems | 2008

Gaze beats mouse: hands-free selection by combining gaze and EMG

Julio C. Mateo; Javier San Agustin; John Paulin Hansen

Facial EMG for selection is fast, easy and, combined with gaze pointing, it can provide completely hands-free interaction. In this pilot study, 5 participants performed a simple point-and-select task using mouse or gaze for pointing and a mouse button or a facial-EMG switch for selection. Gaze pointing was faster than mouse pointing, while maintaining a similar error rate. EMG and mouse-button selection had a comparable performance. From analyses of completion time, throughput and error rates, we concluded that the combination of gaze and facial EMG holds potential for outperforming the mouse.
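The throughput measure analyzed in studies like this one is conventionally the Fitts'-law index of difficulty divided by movement time. A minimal sketch using the common Shannon formulation; the distances and times below are invented for illustration:

```python
import math

def throughput(distance: float, width: float, movement_time: float) -> float:
    """Fitts'-law throughput: ID = log2(D / W + 1) bits, TP = ID / MT (bits/s)."""
    index_of_difficulty = math.log2(distance / width + 1.0)
    return index_of_difficulty / movement_time

# Example: a 300 px movement to a 60 px wide target completed in 0.8 s.
print(round(throughput(300, 60, 0.8), 2))  # → 3.23
```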


Human Factors in Computing Systems | 2011

Low cost vs. high-end eye tracking for usability testing

Sune Alstrup Johansen; Javier San Agustin; Henrik H. T. Skovsgaard; John Paulin Hansen; Martin Tall

Accuracy of an open-source remote eye tracking system and a state-of-the-art commercial eye tracker was measured 4 times during a usability test. Results from 9 participants showed both devices to be fairly stable over time, but the commercial tracker was more accurate, with a mean error of 31 pixels against 59 pixels for the low-cost system. This suggests that low-cost eye tracking can become a viable alternative when usability studies need not distinguish between, for instance, particular words or menu items that participants are looking at, but only between larger areas of interest they pay attention to.
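A mean gaze error in pixels, as reported above, is typically the average Euclidean distance between estimated gaze points and the true target positions. A minimal sketch with fabricated sample points, not the study's data:

```python
import math

def mean_pixel_error(estimates, targets):
    """Mean Euclidean distance (in pixels) between estimates and targets."""
    return sum(math.dist(e, t) for e, t in zip(estimates, targets)) / len(targets)

targets = [(100, 100), (500, 100), (300, 400)]
estimates = [(110, 100), (500, 120), (270, 400)]
print(mean_pixel_error(estimates, targets))  # → 20.0
```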


Human Factors in Computing Systems | 2009

Low-cost gaze pointing and EMG clicking

Javier San Agustin; John Paulin Hansen; Dan Witzner Hansen; Henrik H. T. Skovsgaard

Some severely disabled people are excluded from using gaze interaction because gaze trackers are usually expensive (above 10.000). In this paper we present a low-cost gaze pointer, which we have tested in combination with a desktop monitor and a wearable display. It is not as accurate as commercial gaze trackers, and walking while pointing with gaze on a wearable display turned out to be particularly difficult. However, in front of a desktop monitor it is precise enough to support communication. Supplemented with a commercial EMG switch it offers a complete hands-free, gaze-and-click control for less than 200.


Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications | 2016

Wrist-worn pervasive gaze interaction

John Paulin Hansen; Haakon Lund; Florian Biermann; Emilie Møllenbach; Sebastian Sztuk; Javier San Agustin

This paper addresses gaze interaction for smart home control, conducted from a wrist-worn unit. First we asked ten people to enact the gaze movements they would propose for e.g. opening a door or adjusting the room temperature. On the basis of their suggestions we built and tested different versions of a prototype applying off-screen stroke input. Command prompts were given to twenty participants by text or arrow displays. The success rate achieved by the end of their first encounter with the system was 46% on average; it took them 1.28 seconds to connect with the system and 1.29 seconds to make a correct selection. Their subjective evaluations were positive with regard to the speed of the interaction. We conclude that gaze gesture input seems feasible for fast and brief remote control of smart home technology, provided that the robustness of tracking is improved.


International Symposium on Wearable Computers | 2015

A gaze interactive textual smartwatch interface

John Paulin Hansen; Florian Biermann; Janus Askø Madsen; Morten Jonassen; Haakon Lund; Javier San Agustin; Sebastian Sztuk


Proceedings of the 1st Conference on Novel Gaze-Controlled Applications | 2011

Gaze interaction from bed

John Paulin Hansen; Javier San Agustin; Henrik H. T. Skovsgaard

Collaboration


Dive into Javier San Agustin's collaborations.

Top Co-Authors

John Paulin Hansen
Technical University of Denmark

Dan Witzner Hansen
IT University of Copenhagen

Florian Biermann
IT University of Copenhagen

Haakon Lund
University of Copenhagen