
Publication


Featured research published by Henrik H. T. Skovsgaard.


eye tracking research & application | 2010

Evaluation of a low-cost open-source gaze tracker

Javier San Agustin; Henrik H. T. Skovsgaard; Emilie Møllenbach; Maria Barret; Martin Tall; Dan Witzner Hansen; John Paulin Hansen

This paper presents a low-cost gaze tracking system that is based on a webcam mounted close to the user's eye. The performance of the gaze tracker was evaluated in an eye-typing task using two different typing applications. Participants could type between 3.56 and 6.78 words per minute, depending on the typing system used. A pilot study to assess the usability of the system was also carried out in the home of a user with severe motor impairments. The user successfully typed on a wall-projected interface using his eye movements.
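
The words-per-minute figures above follow the usual convention in text-entry research: one "word" is five characters of transcribed text. A minimal sketch of that computation; the session data below is illustrative, not taken from the study.

```python
def words_per_minute(transcribed_chars: int, seconds: float) -> float:
    """Text-entry rate, with one 'word' defined as five characters."""
    return (transcribed_chars / 5) * (60.0 / seconds)

# Hypothetical eye-typing session: 85 characters in 4 minutes.
print(f"{words_per_minute(85, 240):.2f} wpm")  # -> 4.25 wpm
```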


human factors in computing systems | 2009

Low-cost gaze interaction: ready to deliver the promises

Javier San Agustin; Henrik H. T. Skovsgaard; John Paulin Hansen; Dan Witzner Hansen

Eye movements are the only means of communication for some severely disabled people. However, the high prices of commercial eye tracking systems limit access to this technology. In this pilot study we compare the performance of a low-cost, webcam-based gaze tracker that we have developed with two commercial trackers in two different tasks: target acquisition and eye typing. From analyses of throughput, words per minute and error rates we conclude that a low-cost solution can be as efficient as expensive commercial systems.
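
Throughput in target-acquisition studies of this kind is commonly computed in the ISO 9241-9 style: the effective index of difficulty divided by movement time, with the nominal target width replaced by an "effective" width derived from the spread of selection endpoints. A sketch of that calculation, with illustrative numbers rather than data from the paper:

```python
import math
from statistics import mean, stdev

def throughput_bits_per_s(distance_px: float, endpoints_px: list[float],
                          movement_times_s: list[float]) -> float:
    """ISO 9241-9 style throughput for one distance condition."""
    w_e = 4.133 * stdev(endpoints_px)        # effective target width
    id_e = math.log2(distance_px / w_e + 1)  # effective index of difficulty
    return id_e / mean(movement_times_s)

# Illustrative trials: selection endpoints along the movement axis
# and the movement time of each trial, in seconds.
hits = [498.0, 503.0, 510.0, 492.0, 505.0, 500.0]
times = [0.90, 1.10, 1.00, 1.20, 0.95, 1.05]
print(f"{throughput_bits_per_s(500, hits, times):.2f} bits/s")
```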


eye tracking research & application | 2008

Noise tolerant selection by gaze-controlled pan and zoom in 3D

Dan Witzner Hansen; Henrik H. T. Skovsgaard; John Paulin Hansen; Emilie Møllenbach

This paper presents StarGazer - a new 3D interface for gaze-based interaction and target selection using continuous pan and zoom. Through StarGazer we address the issues of interacting with graph-structured data and applications (i.e. gaze typing systems) using low-resolution eye trackers or small displays. We show that it is possible to make robust selections even with a large number of selectable items on the screen and noisy gaze trackers. A test with 48 subjects demonstrated that users who had never tried gaze interaction before could rapidly adapt to the navigation principles of StarGazer. We tested three different display sizes (down to PDA-sized displays) and found that large screens are faster to navigate than small displays and that the error rate is higher for the smallest display. Half of the subjects were deliberately exposed to severe noise added to the cursor positions. We found that this had a negative impact on efficiency. However, the users remained in control and the noise did not seem to affect the error rate. Additionally, three subjects tested the effects of adding temporal noise to simulate latency in the gaze tracker. Even with a significant latency (about 200 ms) the subjects were able to type at acceptable rates. In a second test, seven subjects were allowed to adjust the zooming speed themselves. They achieved typing rates of more than eight words per minute without using language modeling. We conclude that StarGazer is an intuitive 3D interface for gaze navigation, allowing more selectable objects to be displayed on the screen than the accuracy of the gaze trackers would otherwise permit.
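
The noise tolerance comes from a simple trade-off: each zoom step only has to discriminate between a few large regions, so selecting one of N items takes on the order of log_k(N) steps when the tracker noise permits k distinguishable regions per step. A back-of-the-envelope sketch; the display size, noise level and safety margin are assumptions, not values from the paper.

```python
import math

def regions_per_step(screen_px: float, noise_px: float, margin: float = 4.0) -> int:
    """Regions distinguishable in one zoom step, if each region
    must be `margin` times wider than the gaze noise."""
    return max(2, int(screen_px / (margin * noise_px)))

def zoom_steps(n_items: int, screen_px: float, noise_px: float) -> int:
    """Zoom steps needed to single out one of n_items objects."""
    k = regions_per_step(screen_px, noise_px)
    return math.ceil(math.log(n_items, k))

# Assumed setup: 1024 px display, 50 px of added gaze noise, 36 keys.
print(zoom_steps(36, 1024, 50))  # -> 3 steps (k = 5 regions per step)
```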


human factors in computing systems | 2011

Low cost vs. high-end eye tracking for usability testing

Sune Alstrup Johansen; Javier San Agustin; Henrik H. T. Skovsgaard; John Paulin Hansen; Martin Tall

Accuracy of an open-source remote eye tracking system and a state-of-the-art commercial eye tracker was measured four times during a usability test. Results from 9 participants showed both devices to be fairly stable over time, but the commercial tracker was more accurate, with a mean error of 31 pixels against 59 pixels for the low-cost system. This suggests that low-cost eye tracking can become a viable alternative when usability studies need not distinguish between, for instance, particular words or menu items that participants are looking at, but only between the larger areas of interest they pay attention to.
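
Whether a tracker is accurate enough for a given study then reduces to comparing its mean error with the size of the areas of interest it must tell apart. A rough feasibility check using the two error figures above; the AOI sizes and the safety margin are assumptions for illustration.

```python
def aoi_feasible(aoi_px: float, error_px: float, margin: float = 2.0) -> bool:
    """True if gaze can plausibly be attributed to an AOI of this size,
    requiring the AOI to be `margin` times larger than the mean error."""
    return aoi_px >= margin * error_px

for name, err in [("high-end", 31), ("low-cost", 59)]:
    print(name,
          "menu item (40 px):", aoi_feasible(40, err),
          "| page region (300 px):", aoi_feasible(300, err))
# high-end menu item (40 px): False | page region (300 px): True
# low-cost menu item (40 px): False | page region (300 px): True
```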


eye tracking research & application | 2010

Small-target selection with gaze alone

Henrik H. T. Skovsgaard; Julio C. Mateo; John M. Flach; John Paulin Hansen

Accessing the smallest targets in mainstream interfaces using gaze alone is difficult, but interface tools that effectively increase the size of selectable objects can help. In this paper, we propose a conceptual framework to organize existing tools and guide the development of new tools. We designed a discrete zoom tool and conducted a proof-of-concept experiment to test the potential of the framework and the tool. Our tool was as fast as and more accurate than the currently available two-step magnification tool. Our framework shows potential to guide the design, development, and testing of zoom tools to facilitate the accessibility of mainstream interfaces for gaze users.


human factors in computing systems | 2009

Low-cost gaze pointing and EMG clicking

Javier San Agustin; John Paulin Hansen; Dan Witzner Hansen; Henrik H. T. Skovsgaard

Some severely disabled people are excluded from using gaze interaction because gaze trackers are usually expensive (above 10.000). In this paper we present a low-cost gaze pointer, which we have tested in combination with a desktop monitor and a wearable display. It is not as accurate as commercial gaze trackers, and walking while pointing with gaze on a wearable display turned out to be particularly difficult. However, in front of a desktop monitor it is precise enough to support communication. Supplemented with a commercial EMG switch it offers a complete hands-free, gaze-and-click control for less than 200.
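
The division of labour in such a setup is straightforward: the gaze tracker supplies a continuous (x, y) pointing signal, while the EMG switch supplies a binary click event. A minimal event-loop sketch; read_gaze, switch_fired and the click callback are hypothetical stand-ins, not the APIs used in the paper.

```python
import time

def read_gaze() -> tuple[float, float]:
    """Hypothetical stand-in for the gaze tracker: current (x, y) in pixels."""
    return (512.0, 384.0)

def switch_fired() -> bool:
    """Hypothetical stand-in for the EMG switch: True on a muscle activation."""
    return False

def run(click, hz: float = 30.0) -> None:
    """Point with gaze continuously; click wherever the user is
    looking at the moment the EMG switch fires."""
    while True:
        x, y = read_gaze()
        if switch_fired():
            click(x, y)
        time.sleep(1.0 / hz)
```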


Proceedings of the 1st Conference on Novel Gaze-Controlled Applications | 2011

Gaze interaction from bed

John Paulin Hansen; Javier San Agustin; Henrik H. T. Skovsgaard

This paper presents a low-cost gaze tracking solution for bedbound people composed of freeware tracking software and commodity hardware. Gaze interaction is done on a large wall-projected image, visible to all people present in the room. The hardware equipment leaves physical space free to assist the person. Accuracy and precision of the tracking system were tested in an experiment with 12 subjects. We obtained a tracking quality that is sufficiently good to control applications designed for gaze interaction. The best tracking conditions were achieved when people were sitting up compared to lying down. Also, gaze tracking in the bottom part of the image was found to be more precise than in the top part.
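
Accuracy and precision capture different failure modes: accuracy is the mean offset between recorded gaze and the true target, while precision is the sample-to-sample dispersion regardless of offset. A sketch of both measures as they are commonly defined in eye-tracking evaluations; the gaze samples below are illustrative.

```python
import math

def accuracy_px(samples, target):
    """Mean Euclidean offset between gaze samples and the target."""
    tx, ty = target
    return sum(math.hypot(x - tx, y - ty) for x, y in samples) / len(samples)

def precision_rms_px(samples):
    """Root-mean-square of successive sample-to-sample distances."""
    steps = [(x2 - x1) ** 2 + (y2 - y1) ** 2
             for (x1, y1), (x2, y2) in zip(samples, samples[1:])]
    return math.sqrt(sum(steps) / len(steps))

# Illustrative fixation on a target shown at (400, 300).
gaze = [(405, 310), (398, 306), (402, 312), (396, 304), (404, 309)]
print(f"accuracy  {accuracy_px(gaze, (400, 300)):.1f} px")
print(f"precision {precision_rms_px(gaze):.1f} px")
```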


Proceedings of the 1st Conference on Novel Gaze-Controlled Applications | 2011

Evaluation of a remote webcam-based eye tracker

Henrik H. T. Skovsgaard; Javier San Agustin; Sune Alstrup Johansen; John Paulin Hansen; Martin Tall

In this paper we assess the performance of an open-source gaze tracker in a remote (i.e. table-mounted) setup and compare it with two commercial eye trackers. An experiment with 5 subjects showed the open-source eye tracker to have a significantly higher level of accuracy than one of the commercial systems, a Mirametrix S1, but also a higher error rate than the other, a Tobii T60. We conclude that the web-camera solution may be viable for people who need a substitute for mouse input but cannot afford a commercial system.
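
Pixel errors are only comparable across setups once converted to degrees of visual angle, which depends on pixel pitch and viewing distance. A conversion sketch applied to the 31 px and 59 px errors reported in the usability-testing study above; the monitor geometry is assumed, not reported in either paper.

```python
import math

def px_to_deg(error_px: float, pixel_pitch_mm: float, distance_mm: float) -> float:
    """On-screen error in pixels -> degrees of visual angle."""
    size_mm = error_px * pixel_pitch_mm
    return math.degrees(2 * math.atan(size_mm / (2 * distance_mm)))

# Assumed geometry: 0.28 mm pixel pitch, 600 mm viewing distance.
for px in (31, 59):
    print(f"{px} px ~ {px_to_deg(px, 0.28, 600):.2f} deg")  # 0.83 / 1.58 deg
```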


Behaviour & Information Technology | 2011

Evaluating gaze-based interface tools to facilitate point-and-select tasks with small targets

Henrik H. T. Skovsgaard; Julio C. Mateo; John Paulin Hansen



human factors in computing systems | 2009

Gaze-controlled driving

Martin Tall; Alexandre Alapetite; Javier San Agustin; Henrik H. T. Skovsgaard; John Paulin Hansen; Dan Witzner Hansen; Emilie Møllenbach


Collaboration


Dive into Henrik H. T. Skovsgaard's collaborations.

Top Co-Authors

John Paulin Hansen

Technical University of Denmark

Javier San Agustin

IT University of Copenhagen

Dan Witzner Hansen

IT University of Copenhagen

Alexandre Alapetite

Technical University of Denmark