Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Emilie Møllenbach is active.

Publication


Featured research published by Emilie Møllenbach.


Eye Tracking Research & Applications | 2010

Evaluation of a low-cost open-source gaze tracker

Javier San Agustin; Henrik H. T. Skovsgaard; Emilie Møllenbach; Maria Barret; Martin Tall; Dan Witzner Hansen; John Paulin Hansen

This paper presents a low-cost gaze tracking system based on a webcam mounted close to the user's eye. The performance of the gaze tracker was evaluated in an eye-typing task using two different typing applications. Participants could type between 3.56 and 6.78 words per minute, depending on the typing system used. A pilot study to assess the usability of the system was also carried out in the home of a user with severe motor impairments. The user successfully typed on a wall-projected interface using his eye movements.
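
For context, eye-typing studies such as this one conventionally count one word as five characters when reporting words per minute. A minimal sketch of that calculation (the numbers below are illustrative, not taken from the paper):

    # Standard text-entry convention: one "word" = 5 characters.
    def words_per_minute(chars_typed, seconds_elapsed):
        return (chars_typed / 5.0) / (seconds_elapsed / 60.0)

    # Example: 170 characters typed in 300 seconds -> 6.8 WPM,
    # near the upper end of the range reported above.
    print(words_per_minute(170, 300))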


Eye Tracking Research & Applications | 2008

Noise tolerant selection by gaze-controlled pan and zoom in 3D

Dan Witzner Hansen; Henrik H. T. Skovsgaard; John Paulin Hansen; Emilie Møllenbach

This paper presents StarGazer - a new 3D interface for gaze-based interaction and target selection using continuous pan and zoom. Through StarGazer we address the issues of interacting with graph-structured data and applications (i.e. gaze typing systems) using low-resolution eye trackers or small-size displays. We show that it is possible to make robust selections even with a large number of selectable items on the screen and noisy gaze trackers. A test with 48 subjects demonstrated that users who had never tried gaze interaction before could rapidly adapt to the navigation principles of StarGazer. We tested three different display sizes (down to PDA-sized displays) and found that large screens are faster to navigate than small displays and that the error rate is higher for the smallest display. Half of the subjects were exposed to severe noise deliberately added to the cursor positions. We found that this had a negative impact on efficiency. However, the users remained in control and the noise did not seem to affect the error rate. Additionally, three subjects tested the effects of adding temporal noise to simulate latency in the gaze tracker. Even with a significant latency (about 200 ms) the subjects were able to type at acceptable rates. In a second test, seven subjects were allowed to adjust the zooming speed themselves. They achieved typing rates of more than eight words per minute without using language modeling. We conclude that the StarGazer application is an intuitive 3D interface for gaze navigation, allowing more selectable objects to be displayed on the screen than the accuracy of the gaze trackers would otherwise permit.
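
The paper's own code is not reproduced here; the following is a minimal sketch, under assumed gain values, of the general principle the abstract describes: each frame, the view centre is pulled toward a smoothed gaze point while the scene zooms at a constant rate, so the item being looked at drifts to the centre and enlarges until it can be selected reliably.

    # Illustrative gaze-driven pan-and-zoom loop (not the StarGazer source).
    PAN_GAIN = 0.1      # fraction of the gaze offset corrected per frame (assumed)
    ZOOM_RATE = 1.02    # multiplicative zoom per frame (assumed)
    SMOOTHING = 0.8     # exponential smoothing factor to damp tracker noise (assumed)

    def update_view(view_x, view_y, zoom, smoothed, raw_gaze):
        # Exponentially smooth raw gaze samples to tolerate tracker noise.
        sx = SMOOTHING * smoothed[0] + (1 - SMOOTHING) * raw_gaze[0]
        sy = SMOOTHING * smoothed[1] + (1 - SMOOTHING) * raw_gaze[1]
        # Pan the view centre toward the smoothed gaze point.
        view_x += PAN_GAIN * (sx - view_x)
        view_y += PAN_GAIN * (sy - view_y)
        # Zoom in at a constant rate; a selection could fire once zoom
        # passes a threshold while gaze stays on one item.
        zoom *= ZOOM_RATE
        return view_x, view_y, zoom, (sx, sy)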


Eye Tracking Research & Applications | 2010

Single gaze gestures

Emilie Møllenbach; Martin Lillholm; Alastair G. Gale; John Paulin Hansen

This paper examines gaze gestures and their applicability as a generic selection method for gaze-only controlled interfaces. The method explored here is the Single Gaze Gesture (SGG), i.e. gestures consisting of a single point-to-point eye movement. Horizontal and vertical, long and short SGGs were evaluated on two eye tracking devices (Tobii and QuickGlance). The main findings show that there is a significant difference in selection times between long and short SGGs, between vertical and horizontal selections, as well as between the different tracking systems.
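
As a rough illustration of what classifying an SGG involves (this is not the authors' implementation, and the pixel thresholds are invented), a single point-to-point movement can be labelled by its dominant axis and its length:

    import math

    # Illustrative thresholds in pixels; the paper does not specify these values.
    SHORT_MIN, LONG_MIN = 100, 400

    def classify_sgg(start, end):
        """Label a single point-to-point eye movement, e.g. 'long horizontal'."""
        dx, dy = end[0] - start[0], end[1] - start[1]
        axis = "horizontal" if abs(dx) >= abs(dy) else "vertical"
        length = math.hypot(dx, dy)
        if length < SHORT_MIN:
            return None  # too small: treat as fixation jitter, not a gesture
        size = "long" if length >= LONG_MIN else "short"
        return f"{size} {axis}"

    print(classify_sgg((100, 300), (700, 320)))  # -> 'long horizontal'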


Human Factors in Computing Systems | 2009

Single stroke gaze gestures

Emilie Møllenbach; John Paulin Hansen; Martin Lillholm; Alastair G. Gale

This paper introduces and explains the concept of single stroke gaze gestures. Preliminary results indicate the potential efficiency of this interaction method, and we show how it could be implemented for the benefit of disabled users and, more generally, how it could be integrated with gaze dwell to create a new dimension in gaze-controlled interfaces.
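
To make the dwell side of that combination concrete, here is a minimal sketch of a dwell-time selector (an illustration under an assumed 500 ms threshold, not the implementation described in the paper):

    # Illustrative dwell selector: fires when gaze stays on one target long enough.
    DWELL_MS = 500  # assumed threshold; not specified in the paper

    class DwellSelector:
        def __init__(self):
            self.target = None
            self.entered_at = None

        def update(self, target_under_gaze, now_ms):
            """Return the target to select once gaze has dwelt on it for DWELL_MS."""
            if target_under_gaze != self.target:
                # Gaze moved to a new target (or off all targets): restart the timer.
                self.target, self.entered_at = target_under_gaze, now_ms
                return None
            if self.target is not None and now_ms - self.entered_at >= DWELL_MS:
                self.entered_at = now_ms  # re-arm so the target fires once per dwell
                return self.target
            return None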


Intelligent User Interfaces | 2008

All eyes on the monitor: gaze based interaction in zoomable, multi-scaled information-spaces

Emilie Møllenbach; Thorarinn Stefansson; John Paulin Hansen

The experiment described in this paper used a test environment with two information spaces: one large, with 2000 nodes ordered in semi-structured groups, in which participants performed search and browse tasks; the other smaller and designed for precision zooming, in which subjects performed target selection simulation tasks. For both tasks, modes of gaze- and mouse-controlled navigation were compared. The results of the browse and search tasks showed that the performances of the most efficient mouse and gaze implementations were indistinguishable. However, in the target selection simulation tasks the most efficient gaze control proved to be about 16% faster than the most efficient mouse control. The results indicate that gaze-controlled pan/zoom navigation is a viable alternative to mouse control in inspection and target exploration of large, multi-scale environments. However, supplementing mouse control with gaze navigation also holds interesting potential for interface and interaction design.


Human-Computer Interaction with Mobile Devices and Services | 2015

A GazeWatch Prototype

John Paulin Hansen; Florian Biermann; Emilie Møllenbach; Haakon Lund; Javier San Agustin; Sebastian Sztuk

We demonstrate the potential of adding a gaze tracking unit to a smartwatch, allowing hands-free interaction with the watch itself and control of the environment. Users give commands via gaze gestures, i.e. looking away and back to the GazeWatch. Rapid presentation of single words on the watch display provides a rich and effective textual interface. Finally, we exemplify how the GazeWatch can be used as a ubiquitous pointer on large displays.
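
A rough sketch of the "look away and back" gesture logic the abstract mentions (the recogniser below and its timings are assumptions for illustration; the demo's actual implementation is not published in the abstract):

    # Illustrative recogniser: gaze must leave the watch display and return
    # within a time window. Both timing constants are assumed values.
    AWAY_MIN_MS, RETURN_MAX_MS = 100, 1000

    def make_recognizer():
        state = {"left_at": None}

        def on_sample(gaze_on_watch, now_ms):
            if not gaze_on_watch:
                if state["left_at"] is None:
                    state["left_at"] = now_ms  # gaze just left the display
                return False
            left_at = state["left_at"]
            state["left_at"] = None
            # Gesture fires if gaze was away long enough but came back in time.
            return (left_at is not None
                    and AWAY_MIN_MS <= now_ms - left_at <= RETURN_MAX_MS)

        return on_sample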


Human Factors in Computing Systems | 2012

HCI and sustainability: the role of macrostructures

Emilie Møllenbach; Jens Hoff; Kasper Hornbæk

Sustained behavior changes are required to reduce the impact of human society on the environment. Much research on how HCI may help do so focuses on changing behavior by providing information directed at an individual or a microstructure (e.g., household). We propose societal macrostructures (e.g., municipalities) and their interaction with microstructures as a focus for HCI aimed at designing behavior change. We present two ongoing case studies involving municipalities in Denmark and discuss how and why macrostructures may be used in the design of HCI for behavior based environmental sustainability.


Nordic Conference on Human-Computer Interaction | 2014

Teaching to tinker: making as an educational strategy

Daniel Cermak-Sassenrath; Emilie Møllenbach

Maker communities and hacker spaces engaged in tangible computing are popping up in and outside the academic setting, driven by curiosity and a desire to learn. This workshop is concerned with how making can be and has been used in an academic setting. Making shifts the focus of education from prescribed tasks towards what people want to know or do.


International Conference on Computers Helping People with Special Needs | 2008

Eye, Me and the Environment

Fangmin Shi; Alastair G. Gale; Emilie Møllenbach

In order to broaden the applicability of eye-tracking based assistive technology, the available environmental control systems are reviewed. Their advantages and limitations are discussed with respect to their usage with eye tracking technology for aiding people with special needs. It is concluded that each system has its own distinct advantages for this task, linked also to availability, ease of use and cost. Consequently, a modular design approach is advocated for eye tracking control technologies in this domain to make them as generically applicable as possible.


EAI Endorsed Transactions on Ambient Systems | 2016

Classroom Habit(us) and Physical Co-presence in a Blended Learning Environment

Valeria Borsotti; Emilie Møllenbach

In this exploratory case study we map the educational practice of teachers and students in a professional master's programme in Interaction Design. Through a grounded analysis of the context we describe and reflect on: 1) the use of digital learning tools in a blended learning environment, and 2) co-presence as an educational parameter. We use the concept of habitus (Bourdieu, 1977) to engage with the empirical context, and we adopt the Reggio Emilia perspective of viewing space, both physical and social, as the third teacher (Edwards et al., 1998). This investigation has led to insights into the existing practice of educators and students, as well as the identification of emerging themes for future research.

Collaboration


Dive into Emilie Møllenbach's collaboration.

Top Co-Authors

John Paulin Hansen
Technical University of Denmark

Martin Lillholm
University College London

Dan Witzner Hansen
IT University of Copenhagen

Javier San Agustin
IT University of Copenhagen

Alexandre Alapetite
Technical University of Denmark

Valeria Borsotti
IT University of Copenhagen