Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Hans Gellersen is active.

Publication


Featured research published by Hans Gellersen.


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2011

Eye Movement Analysis for Activity Recognition Using Electrooculography

Andreas Bulling; Jamie A. Ward; Hans Gellersen; Gerhard Tröster

In this work, we investigate eye movement analysis as a new sensing modality for activity recognition. Eye movement data were recorded using an electrooculography (EOG) system. We first describe and evaluate algorithms for detecting three eye movement characteristics from EOG signals (saccades, fixations, and blinks) and propose a method for assessing repetitive patterns of eye movements. We then devise 90 different features based on these characteristics and select a subset of them using minimum redundancy maximum relevance (mRMR) feature selection. We validate the method using an eight-participant study in an office environment with an example set of five activity classes: copying a text, reading a printed paper, taking handwritten notes, watching a video, and browsing the Web. We also include periods with no specific activity (the NULL class). Using a support vector machine (SVM) classifier and person-independent (leave-one-person-out) training, we obtain an average precision of 76.1 percent and recall of 70.5 percent over all classes and participants. The work demonstrates the promise of eye-based activity recognition (EAR) and opens up discussion on the wider applicability of EAR to other activities that are difficult, or even impossible, to detect using common sensing modalities.
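The person-independent evaluation described above can be sketched as leave-one-participant-out cross-validation with an SVM. The sketch below uses synthetic random features as stand-ins for the paper's 90 mRMR-selected EOG features, and all dimensions (30 windows per participant, 10 features) are illustrative assumptions, not values from the study:

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import SVC
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)

# Synthetic stand-ins: 8 participants x 30 windows x 10 features,
# 6 classes (five activities plus NULL). Real inputs would be the
# saccade/fixation/blink statistics selected by mRMR.
n_participants, n_windows, n_features, n_classes = 8, 30, 10, 6
X = rng.normal(size=(n_participants * n_windows, n_features))
y = rng.integers(0, n_classes, size=n_participants * n_windows)
groups = np.repeat(np.arange(n_participants), n_windows)

# Make classes weakly separable so the classifier has signal to find.
X += y[:, None] * 0.5

# Leave-one-person-out: each fold trains on 7 participants, tests on the 8th.
logo = LeaveOneGroupOut()
precisions, recalls = [], []
for train_idx, test_idx in logo.split(X, y, groups):
    clf = SVC(kernel="rbf", gamma="scale").fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    precisions.append(precision_score(y[test_idx], pred,
                                      average="macro", zero_division=0))
    recalls.append(recall_score(y[test_idx], pred,
                                average="macro", zero_division=0))

print(f"mean precision {np.mean(precisions):.2f}, "
      f"mean recall {np.mean(recalls):.2f}")
```

Grouping windows by participant is what makes the evaluation person-independent: no data from the test participant ever appears in training, which is why the reported numbers reflect generalization to unseen users.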


International Conference on Pervasive Computing | 2007

Shake well before use: authentication based on accelerometer data

Rene Mayrhofer; Hans Gellersen

Small, mobile devices without user interfaces, such as Bluetooth headsets, often need to communicate securely over wireless networks. Active attacks can only be prevented by authenticating wireless communication, which is problematic when devices do not have any a priori information about each other. We introduce a new method for device-to-device authentication by shaking devices together. This paper describes two protocols for combining cryptographic authentication techniques with known methods of accelerometer data analysis to the effect of generating authenticated, secret keys. The protocols differ in their design, one being more conservative from a security point of view, while the other allows more dynamic interactions. Three experiments are used to optimize and validate our proposed authentication method.
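The core idea (shared movement as a shared secret) rests on testing whether two accelerometer traces are similar enough to have come from devices shaken together. A minimal sketch of such a similarity test, using Pearson correlation of acceleration magnitudes: the `accept_pairing` function, the 0.8 threshold, and the noise model are illustrative assumptions, and the paper's actual protocols additionally combine this kind of test with cryptographic key agreement:

```python
import numpy as np

def accept_pairing(acc_a, acc_b, threshold=0.8):
    """Toy similarity check: Pearson correlation of acceleration
    magnitudes from two devices. Threshold is an illustrative choice."""
    mag_a = np.linalg.norm(acc_a, axis=1)  # |a| per sample, orientation-free
    mag_b = np.linalg.norm(acc_b, axis=1)
    r = np.corrcoef(mag_a, mag_b)[0, 1]
    return r >= threshold

rng = np.random.default_rng(1)
shared = rng.normal(size=(200, 3))                          # common shaking motion
dev_a = shared + rng.normal(scale=0.1, size=shared.shape)   # sensor noise, device A
dev_b = shared + rng.normal(scale=0.1, size=shared.shape)   # sensor noise, device B
other = rng.normal(size=(200, 3))                           # independent movement

print(accept_pairing(dev_a, dev_b))   # devices shaken together
print(accept_pairing(dev_a, other))   # unrelated movement
```

Using magnitudes rather than raw axes makes the comparison insensitive to how each device is oriented in the user's hand, which matters because the two devices are gripped arbitrarily while being shaken.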


IEEE Transactions on Mobile Computing | 2009

Shake Well Before Use: Intuitive and Secure Pairing of Mobile Devices

Rene Mayrhofer; Hans Gellersen

A challenge in facilitating spontaneous mobile interactions is to provide pairing methods that are both intuitive and secure. Simultaneous shaking is proposed as a novel and easy-to-use mechanism for pairing of small mobile devices. The underlying principle is to use common movement as a secret that the involved devices share for mutual authentication. We present two concrete methods, ShaVe and ShaCK, in which sensing and analysis of shaking movement is combined with cryptographic protocols for secure authentication. ShaVe is based on initial key exchange followed by exchange and comparison of sensor data for verification of key authenticity. ShaCK, in contrast, is based on matching features extracted from the sensor data to construct a cryptographic key. The classification algorithms used in our approach are shown to robustly separate simultaneous shaking of two devices from other concurrent movement of a pair of devices, with a false negative rate of under 12 percent. A user study confirms that the method is intuitive and easy to use, as users can shake devices in an arbitrary pattern.


IEEE Pervasive Computing | 2010

Location and Navigation Support for Emergency Responders: A Survey

Carl Fischer; Hans Gellersen

As this overview of products and projects shows, preinstalled location systems, wireless sensor networks, and inertial sensing all have benefits and drawbacks when considering emergency response requirements.


ACM Transactions on Computer-Human Interaction | 2005

Expected, sensed, and desired: A framework for designing sensing-based interaction

Steve Benford; Holger Schnädelbach; Boriana Koleva; Rob Anastasi; Chris Greenhalgh; Tom Rodden; Jonathan Green; Ahmed Ghali; Tony P. Pridmore; Bill Gaver; Andy Boucher; Brendan Walker; Sarah Pennington; Albrecht Schmidt; Hans Gellersen; Anthony Steed

Movements of interfaces can be analyzed in terms of whether they are expected, sensed, and desired. Expected movements are those that users naturally perform; sensed are those that can be measured by a computer; and desired movements are those that are required by a given application. We show how a systematic comparison of expected, sensed, and desired movements, especially with regard to how they do not precisely overlap, can reveal potential problems with an interface and also inspire new features. We describe how this approach has been applied to the design of three interfaces: pointing flashlights at walls and posters in order to play sounds; the Augurscope II, a mobile augmented reality interface for outdoors; and the Drift Table, an item of furniture that uses load sensing to control the display of aerial photographs. We propose that this approach can help to build a bridge between the analytic and inspirational approaches to design and can help designers meet the challenges raised by a diversification of sensing technologies and interface forms, increased mobility, and an emerging focus on technologies for everyday life.


Ubiquitous Computing | 2013

Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets

Mélodie Vidal; Andreas Bulling; Hans Gellersen

Although gaze is an attractive modality for pervasive interactions, the real-world implementation of eye-based interfaces poses significant challenges, such as calibration. We present Pursuits, an innovative interaction technique that enables truly spontaneous interaction with eye-based interfaces. A user can simply walk up to the screen and readily interact with moving targets. Instead of being based on gaze location, Pursuits correlates eye pursuit movements with objects dynamically moving on the interface. We evaluate the influence of target speed, number, and trajectory and develop guidelines for designing Pursuits-based interfaces. We then describe six realistic usage scenarios and implement three of them to evaluate the method in a usability study and a field study. Our results show that Pursuits is a versatile and robust technique and that users can interact with Pursuits-based interfaces without prior knowledge or a preparation phase.
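The correlation principle behind Pursuits can be sketched as follows: correlate the gaze trace with each on-screen target's trajectory and select the target whose motion the eyes are following. The function name, the 0.8 threshold, and the trajectories below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def select_target(gaze_xy, targets_xy, threshold=0.8):
    """Pick the moving target whose trajectory best correlates with the
    gaze trace over a window; return None if nothing exceeds the threshold.
    gaze_xy: (n, 2) array; targets_xy: dict name -> (n, 2) array."""
    best, best_r = None, threshold
    for name, t_xy in targets_xy.items():
        rx = np.corrcoef(gaze_xy[:, 0], t_xy[:, 0])[0, 1]
        ry = np.corrcoef(gaze_xy[:, 1], t_xy[:, 1])[0, 1]
        r = min(rx, ry)  # require both axes to correlate
        if r > best_r:
            best, best_r = name, r
    return best

n = 120
t = np.linspace(0, 2 * np.pi, n)
targets = {
    "circle":   np.column_stack([np.cos(t), np.sin(t)]),
    "diagonal": np.column_stack([t, t]),
}
rng = np.random.default_rng(2)
# Noisy gaze trace following the circular target.
gaze = targets["circle"] + rng.normal(scale=0.05, size=(n, 2))

print(select_target(gaze, targets))
```

Because only the *shape* of the motion is compared, not absolute screen positions, this kind of matching works without per-user calibration, which is what allows the spontaneous walk-up interaction described above.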


IEEE Computer Graphics and Applications | 2004

Building intelligent environments with Smart-Its

Lars Erik Holmquist; Hans Gellersen; Gerd Kortuem; Stavros Antifakos; Florian Michahelles; Bernt Schiele; Michael Beigl; Ramia Mazé

Smart-Its are self-contained, stick-on computers that attach to everyday objects. These augmented objects become soft media, enabling dynamic digital relationships with users and each other. In the Smart-Its project, we are developing technology to realize a vision of computation everywhere, where computer technology seamlessly integrates into everyday life, supporting users in their daily tasks. By embedding sensors, computation, and communication into common artifacts, future computing applications can adapt to human users rather than the other way around. However, it is currently difficult to develop this type of ubiquitous computing system because of the lack of toolkits that integrate both the required hardware and software. Therefore, we are creating a class of small computers - called Smart-Its - equipped with wireless communication and sensors to make it possible to create smart artifacts with little overhead.


IEEE Pervasive Computing | 2004

Physical prototyping with Smart-Its

Hans Gellersen; Gerd Kortuem; Albrecht Schmidt; Michael Beigl

Exploring novel ubiquitous computing systems and applications inevitably requires prototyping physical components. Smart-Its are hardware and software components that augment physical objects with embedded processing and interaction to address this need. Our work addresses the need to create embedded interactive systems that disappear from the foreground, becoming secondary to the physical objects with which people interact during everyday activities. Such systems create new design challenges related to prototyping with embedded technologies and require careful consideration of the physical design context.


IEEE Pervasive Computing | 2010

Toward Mobile Eye-Based Human-Computer Interaction

Andreas Bulling; Hans Gellersen

Current research on eye-based interfaces mostly focuses on stationary settings. However, advances in mobile eye-tracking equipment and automated eye-movement analysis now allow for investigating eye movements during natural behavior and promise to bring eye-based interaction into people's everyday lives. Recent developments in mobile eye-tracking equipment point the way toward unobtrusive human-computer interfaces that will become pervasively usable in everyday life. The potential applications of this capability to track and analyze eye movements anywhere and anytime call for new research to develop and understand eye-based interaction in mobile daily-life settings.


User Interface Software and Technology | 2010

PhoneTouch: a technique for direct phone interaction on surfaces

Dominik Schmidt; Fadi Chehimi; Enrico Rukzio; Hans Gellersen

PhoneTouch is a novel technique for the integration of mobile phones and interactive surfaces. The technique enables the use of phones to select targets on the surface by direct touch, facilitating, for instance, pick&drop-style transfer of objects between phone and surface. The technique is based on separate detection of phone touch events by the surface, which determines the location of the touch, and by the phone, which contributes the device identity. The device-level observations are merged based on correlation in time. We describe a proof-of-concept implementation of the technique, using vision for touch detection on the surface (including discrimination of finger versus phone touch) and acceleration features for detection by the phone.
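The merging of device-level observations "based on correlation in time" can be sketched as matching each surface touch to a phone whose self-detected bump falls within a small time window. The function, the 50 ms window, and the finger-touch fallback are illustrative assumptions rather than details of the paper's implementation:

```python
def match_touches(surface_events, phone_events, window=0.05):
    """Merge device-level observations by time proximity: a surface touch
    (timestamp, location) is attributed to a phone whose bump timestamp
    falls within `window` seconds; otherwise it is treated as a finger
    touch. The 50 ms window is an illustrative choice."""
    matches = []
    for ts, location in surface_events:
        for phone_id, tp in phone_events:
            if abs(ts - tp) <= window:
                matches.append((phone_id, location))  # identity + location merged
                break
        else:
            matches.append(("finger", location))  # no phone reported a bump
    return matches

surface = [(1.002, (120, 340)), (2.500, (600, 90))]  # touches seen by the surface
phones = [("phoneA", 1.010)]                         # bump reported by a phone

print(match_touches(surface, phones))
# first touch attributed to phoneA; second treated as a finger touch
```

The split of responsibilities is the key design point: the surface knows *where* a touch happened but not *which* device caused it, while the phone knows *who* it is but not *where* it touched; time correlation joins the two observations into one identified touch event.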

Collaboration


Dive into Hans Gellersen's collaborations.

Top Co-Authors


Rene Mayrhofer

Johannes Kepler University of Linz
