Publication


Featured research published by Andreas Bulling.


ACM Computing Surveys | 2014

A tutorial on human activity recognition using body-worn inertial sensors

Andreas Bulling; Ulf Blanke; Bernt Schiele

The last 20 years have seen ever-increasing research activity in the field of human activity recognition. As the field has considerably matured, so has the number of challenges in designing, implementing, and evaluating activity recognition systems. This tutorial aims to provide a comprehensive hands-on introduction for newcomers to the field of human activity recognition. It specifically focuses on activity recognition using on-body inertial sensors. We first discuss the key research challenges that human activity recognition shares with general pattern recognition and identify those challenges that are specific to human activity recognition. We then describe the concept of an Activity Recognition Chain (ARC) as a general-purpose framework for designing and evaluating activity recognition systems. We detail each component of the framework, provide references to related research, and introduce the best practice methods developed by the activity recognition research community. We conclude with the educational example problem of recognizing different hand gestures from inertial sensors attached to the upper and lower arm. We illustrate how each component of this framework can be implemented for this specific activity recognition problem and demonstrate how different implementations compare and how they impact overall recognition performance.
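
To make the chain concrete, a minimal sketch of one possible ARC instantiation follows: sliding-window segmentation, simple statistical features, and an off-the-shelf classifier. The window length, feature set, classifier choice, and synthetic data are illustrative assumptions, not the tutorial's prescribed settings.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def sliding_windows(signal, size, step):
    """Segment an (n_samples, n_axes) signal into fixed-length windows."""
    return np.stack([signal[i:i + size]
                     for i in range(0, len(signal) - size + 1, step)])

def extract_features(windows):
    """Per-window, per-axis statistics: mean, std, mean absolute delta."""
    return np.hstack([windows.mean(axis=1),
                      windows.std(axis=1),
                      np.abs(np.diff(windows, axis=1)).mean(axis=1)])

# Synthetic 3-axis accelerometer stream with per-sample activity labels.
rng = np.random.default_rng(0)
acc = rng.normal(size=(5000, 3))
labels = rng.integers(0, 3, size=5000)

win, step = 128, 64
X = extract_features(sliding_windows(acc, win, step))
# The majority label within each window serves as its ground truth.
y = np.array([np.bincount(labels[i:i + win]).argmax()
              for i in range(0, len(acc) - win + 1, step)])

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)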


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2011

Eye Movement Analysis for Activity Recognition Using Electrooculography

Andreas Bulling; Jamie A. Ward; Hans Gellersen; Gerhard Tröster

In this work, we investigate eye movement analysis as a new sensing modality for activity recognition. Eye movement data were recorded using an electrooculography (EOG) system. We first describe and evaluate algorithms for detecting three eye movement characteristics from EOG signals (saccades, fixations, and blinks) and propose a method for assessing repetitive patterns of eye movements. We then devise 90 different features based on these characteristics and select a subset of them using minimum redundancy maximum relevance (mRMR) feature selection. We validate the method in an eight-participant study in an office environment using an example set of five activity classes: copying a text, reading a printed paper, taking handwritten notes, watching a video, and browsing the Web. We also include periods with no specific activity (the NULL class). Using a support vector machine (SVM) classifier and person-independent (leave-one-person-out) training, we obtain an average precision of 76.1 percent and recall of 70.5 percent over all classes and participants. The work demonstrates the promise of eye-based activity recognition (EAR) and opens up discussion on the wider applicability of EAR to other activities that are difficult, or even impossible, to detect using common sensing modalities.
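
As an illustration of the detection step, saccades can be found in an EOG channel with a simple velocity threshold, and fixations taken as the gaps between them. The algorithm, threshold, sampling rate, and signal below are illustrative assumptions rather than the paper's actual detectors, and the mRMR and SVM stages are omitted.

import numpy as np

def detect_saccades(eog, fs, vel_thresh=50.0):
    """Return (start, end) sample-index pairs of supra-threshold velocity runs."""
    velocity = np.abs(np.diff(eog)) * fs               # signal units per second
    above = np.concatenate(([False], velocity > vel_thresh, [False]))
    edges = np.flatnonzero(np.diff(above.astype(int)))
    return list(zip(edges[::2], edges[1::2]))          # rising/falling pairs

def detect_fixations(saccades, n_samples):
    """Fixations are simply the stretches between detected saccades."""
    bounds = [0] + [i for pair in saccades for i in pair] + [n_samples]
    return list(zip(bounds[::2], bounds[1::2]))

fs = 128                                               # assumed sampling rate
eog = np.repeat([0.0, 100.0, 0.0, -100.0], fs)         # step-like gaze shifts
saccades = detect_saccades(eog, fs)
print(saccades, detect_fixations(saccades, len(eog)))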


Ubiquitous Computing | 2014

Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction

Moritz Kassner; William Rhoades Patera; Andreas Bulling

In this paper we present Pupil: an accessible, affordable, and extensible open source platform for pervasive eye tracking and gaze-based interaction. Pupil comprises 1) a lightweight eye tracking headset, 2) an open source software framework for mobile eye tracking, as well as 3) a graphical user interface to play back and visualize video and gaze data. Pupil features high-resolution scene and eye cameras for monocular and binocular gaze estimation. The software and GUI are platform-independent and include state-of-the-art algorithms for real-time pupil detection and tracking, calibration, and accurate gaze estimation. Results of a performance evaluation show that Pupil can provide an average gaze estimation accuracy of 0.6 degrees of visual angle (0.08 degrees precision) with a processing pipeline latency of only 0.045 seconds.
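
The accuracy and precision reported above are angular quantities. The following sketch shows one common way to compute them from estimated gaze direction vectors; the data is synthetic, and defining precision as angular spread about the sample centroid is an assumption, not necessarily the paper's exact protocol.

import numpy as np

def normalize(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def angle_deg(v, w):
    """Angle between unit gaze vectors, in degrees of visual angle."""
    cos = np.clip(np.sum(v * w, axis=-1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

# Estimated gaze rays for repeated fixations of one known target direction.
target = normalize(np.array([0.0, 0.0, 1.0]))
rng = np.random.default_rng(1)
gaze = normalize(target + rng.normal(scale=0.01, size=(100, 3)))

accuracy = angle_deg(gaze, target).mean()        # mean angular offset from truth
centroid = normalize(gaze.mean(axis=0))
precision = angle_deg(gaze, centroid).std()      # angular spread of the samples
print(f"accuracy {accuracy:.2f} deg, precision {precision:.3f} deg")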


Computer Vision and Pattern Recognition | 2015

Appearance-based gaze estimation in the wild

Xucong Zhang; Yusuke Sugano; Mario Fritz; Andreas Bulling

Appearance-based gaze estimation is believed to work well in real-world settings, but existing datasets have been collected under controlled laboratory conditions and methods have not been evaluated across multiple datasets. In this work we study appearance-based gaze estimation in the wild. We present the MPIIGaze dataset that contains 213,659 images we collected from 15 participants during natural everyday laptop use over more than three months. Our dataset is significantly more variable than existing ones with respect to appearance and illumination. We also present a method for in-the-wild appearance-based gaze estimation using multimodal convolutional neural networks that significantly outperforms state-of-the-art methods in the most challenging cross-dataset evaluation. We present an extensive evaluation of several state-of-the-art image-based gaze estimation algorithms on three current datasets, including our own. This evaluation provides clear insights and allows us to identify key research challenges of gaze estimation in the wild.
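
A minimal PyTorch sketch of the multimodal idea follows: a convolutional branch over a normalized eye image whose features are concatenated with head-pose angles before regressing two gaze angles. Input resolution, layer sizes, and the fusion point are illustrative assumptions, not the paper's exact architecture.

import torch
import torch.nn as nn

class GazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 20, 5), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(20, 50, 5), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
        )
        # A 36x60 grayscale eye patch yields a 50 x 6 x 12 conv output.
        self.fc = nn.Sequential(nn.Linear(50 * 6 * 12, 500), nn.ReLU())
        self.head = nn.Linear(500 + 2, 2)    # fuse head pose, predict yaw/pitch

    def forward(self, eye, head_pose):
        x = self.fc(self.conv(eye))
        return self.head(torch.cat([x, head_pose], dim=1))

net = GazeNet()
eye = torch.randn(8, 1, 36, 60)              # batch of normalized eye patches
pose = torch.randn(8, 2)                     # head pose (yaw, pitch) per sample
print(net(eye, pose).shape)                  # torch.Size([8, 2])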


Ubiquitous Computing | 2013

Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets

Mélodie Vidal; Andreas Bulling; Hans Gellersen

Although gaze is an attractive modality for pervasive interactions, the real-world implementation of eye-based interfaces poses significant challenges, such as calibration. We present Pursuits, an innovative interaction technique that enables truly spontaneous interaction with eye-based interfaces. A user can simply walk up to the screen and readily interact with moving targets. Instead of being based on gaze location, Pursuits correlates eye pursuit movements with objects dynamically moving on the interface. We evaluate the influence of target speed, number, and trajectory and develop guidelines for designing Pursuits-based interfaces. We then describe six realistic usage scenarios and implement three of them to evaluate the method in a usability study and a field study. Our results show that Pursuits is a versatile and robust technique and that users can interact with Pursuits-based interfaces without prior knowledge or a preparation phase.
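
The core mechanism can be sketched as trajectory correlation: over a sliding window, compare the raw, uncalibrated gaze trace against each moving target's known trajectory and select the best match. The Pearson measure, the min-over-axes combination, the threshold, and the window length below are illustrative assumptions, not the paper's exact parameters.

import numpy as np

def pearson(a, b):
    a, b = a - a.mean(), b - b.mean()
    return (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

def match_target(gaze_xy, targets_xy, thresh=0.8):
    """gaze_xy: (n, 2) window; targets_xy: dict of name -> (n, 2) trajectories."""
    scores = {name: min(pearson(gaze_xy[:, 0], t[:, 0]),
                        pearson(gaze_xy[:, 1], t[:, 1]))
              for name, t in targets_xy.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > thresh else None   # None: no confident match

t = np.linspace(0, 2 * np.pi, 60)                    # one window of samples
circle = np.column_stack([np.cos(t), np.sin(t)])
line = np.column_stack([t, np.zeros_like(t)])
gaze = circle + np.random.default_rng(2).normal(scale=0.05, size=circle.shape)
print(match_target(gaze, {"circle": circle, "line": line}))   # -> circle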


IEEE Pervasive Computing | 2010

Toward Mobile Eye-Based Human-Computer Interaction

Andreas Bulling; Hans Gellersen

Current research on eye-based interfaces mostly focuses on stationary settings. However, advances in mobile eye tracking equipment and automated eye movement analysis now allow for investigating eye movements during natural behavior and promise to bring eye-based interaction into people's everyday lives. Recent developments in mobile eye tracking equipment point the way toward unobtrusive human-computer interfaces that will become pervasively usable in everyday life. The potential applications of the capability to track and analyze eye movements anywhere and anytime call for new research to develop and understand eye-based interaction in mobile daily-life settings.


Book chapter | 2014

Eye Tracking and Eye-Based Human–Computer Interaction

Päivi Majaranta; Andreas Bulling

Eye tracking has a long history in medical and psychological research as a tool for recording and studying human visual behavior. Real-time gaze-based text entry can also be a powerful means of communication and control for people with physical disabilities. Following recent technological advances and the advent of affordable eye trackers, there is a growing interest in pervasive attention-aware systems and interfaces that have the potential to revolutionize mainstream human-technology interaction. In this chapter, we provide an introduction to the state of the art in eye tracking technology and gaze estimation. We discuss challenges involved in using a perceptual organ, the eye, as an input modality. Examples of real-life applications are reviewed, together with design solutions derived from research results. We also discuss how to match user requirements and key features of different eye tracking systems to find the best system for each task and application.


Ambient Intelligence | 2009

Wearable EOG goggles: Seamless sensing and context-awareness in everyday environments

Andreas Bulling; Daniel Roggen; Gerhard Tröster

In this article we introduce the analysis of eye motion as a new input modality for activity recognition, context-awareness and mobile HCI applications. We describe a novel embedded eye tracker that, in contrast to common systems using video cameras, relies on electrooculography (EOG). This self-contained wearable device consists of goggles with dry electrodes integrated into the frame and a small pocket-worn component with a DSP for real-time EOG signal processing. It can store data locally for long-term recordings or stream processed EOG signals to a remote device over Bluetooth. We show how challenges associated with wearability, eye motion analysis and signal artefacts caused by physical activity can be addressed with a combination of a special mechanical design, optimised algorithms for eye movement detection and adaptive signal processing. In two case studies, we demonstrate that EOG is a suitable measurement technique for the recognition of reading activity and eye-based human-computer interaction. Ultimately, wearable EOG goggles may pave the way for seamless eye movement analysis in everyday environments and new forms of context-awareness not possible today.
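
As a stand-in for the adaptive processing mentioned above, one simple way to suppress motion-induced baseline drift in an EOG channel is to subtract a sliding-median baseline. This is an illustrative simplification, not the goggles' actual DSP pipeline, and the sampling rate and window length are assumptions.

import numpy as np
from scipy.signal import medfilt

def remove_drift(eog, fs, win_s=1.0):
    """Subtract a sliding-median baseline (kernel forced to odd length)."""
    k = int(win_s * fs) | 1
    return eog - medfilt(eog, kernel_size=k)

fs = 128
t = np.arange(0, 4, 1 / fs)
drift = 0.5 * np.sin(0.5 * np.pi * t)                # slow artefact from motion
eog = np.repeat([0.0, 1.0, 0.0, -1.0], len(t) // 4) + drift
clean = remove_drift(eog, fs)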


Eye Tracking Research & Applications | 2014

EyeTab: model-based gaze estimation on unmodified tablet computers

Erroll Wood; Andreas Bulling

Despite the widespread use of mobile phones and tablets, hand-held portable devices have only recently been identified as a promising platform for gaze-aware applications. Estimating gaze on portable devices is challenging given their limited computational resources, low-quality integrated front-facing RGB cameras, and small screens to which gaze is mapped. In this paper we present EyeTab, a model-based approach for binocular gaze estimation that runs entirely on an unmodified tablet. EyeTab builds on a set of established image processing and computer vision algorithms and adapts them for robust and near-realtime gaze estimation. A technical prototype evaluation with eight participants in a normal indoor office setting shows that EyeTab achieves an average gaze estimation accuracy of 6.88° of visual angle at 12 frames per second.
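
To put the accuracy figure in context, an angular error converts to an on-screen offset via the viewing distance. The sketch below assumes a hypothetical 30 cm tablet viewing distance, which is not stated in the source.

import math

def angular_error_to_cm(error_deg, distance_cm):
    """On-screen offset corresponding to an angular gaze error."""
    return distance_cm * math.tan(math.radians(error_deg))

print(f"{angular_error_to_cm(6.88, 30.0):.1f} cm")   # about 3.6 cm at 30 cm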


Eye Tracking Research & Applications | 2012

Robust real-time pupil tracking in highly off-axis images

Lech Świrski; Andreas Bulling; Neil A. Dodgson

Robust, accurate, real-time pupil tracking is a key component for online gaze estimation. On head-mounted eye trackers, existing algorithms that rely on circular pupils or contiguous pupil regions fail to detect or accurately track the pupil. This is because the pupil ellipse is often highly eccentric and partially occluded by eyelashes. We present a novel, real-time dark-pupil tracking algorithm that is robust under such conditions. Our approach uses a Haar-like feature detector to roughly estimate the pupil location, performs a k-means segmentation on the surrounding region to refine the pupil centre, and fits an ellipse to the pupil using a novel image-aware Random Sample Consensus (RANSAC) ellipse fitting. We compare our approach against existing real-time pupil tracking implementations, using a set of manually labelled infra-red dark-pupil eye images. We show that our technique has a higher pupil detection rate and greater pupil tracking accuracy.
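
A plain geometric sketch of the RANSAC stage follows; the paper's image-aware variant additionally scores candidates against image gradients, which is omitted here, and the iteration budget, inlier threshold, and synthetic test data are illustrative assumptions.

import numpy as np
import cv2

def ransac_ellipse(points, iters=200, inlier_px=2.0, rng=None):
    """Fit an ellipse to (n, 2) float32 edge points, tolerating outliers."""
    rng = rng or np.random.default_rng()
    best, best_inliers = None, 0
    for _ in range(iters):
        sample = points[rng.choice(len(points), 5, replace=False)]
        try:
            ellipse = cv2.fitEllipse(sample)         # minimal five-point fit
        except cv2.error:
            continue                                 # degenerate sample
        (cx, cy), (w, h), angle = ellipse
        if not np.isfinite([cx, cy, w, h, angle]).all():
            continue
        # Rasterise the candidate and measure point-to-curve distances.
        poly = cv2.ellipse2Poly((int(cx), int(cy)), (int(w / 2), int(h / 2)),
                                int(angle), 0, 360, 5)
        dist = np.array([abs(cv2.pointPolygonTest(poly, (float(x), float(y)), True))
                         for x, y in points])
        inliers = int((dist < inlier_px).sum())
        if inliers > best_inliers:
            best, best_inliers = ellipse, inliers
    return best

# Synthetic pupil-edge points from a known ellipse, plus eyelash-like outliers.
rng = np.random.default_rng(3)
th = rng.uniform(0, 2 * np.pi, 80)
pts = np.column_stack([100 + 40 * np.cos(th), 80 + 25 * np.sin(th)])
pts[:15] += rng.normal(scale=15.0, size=(15, 2))
print(ransac_ellipse(pts.astype(np.float32)))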

Collaboration


Dive into Andreas Bulling's collaborations.

Top Co-Authors

Florian Alt

Munich University of Applied Sciences
