Publications


Featured research published by Stijn De Beugher.


International Conference on Computer Vision Theory and Applications | 2014

Automatic analysis of in-the-wild mobile eye-tracking experiments using object, face and person detection

Stijn De Beugher; Geert Brône; Toon Goedemé

In this paper we present a novel method for the automatic analysis of mobile eye-tracking data in natural environments. Mobile eye-trackers generate large amounts of data, making manual analysis very time-consuming. Available solutions, such as marker-based analysis, minimize the manual labour but require experimental control, making real-life experiments practically infeasible. We present a novel method for processing this mobile eye-tracking data by applying object, face and person detection algorithms. Furthermore, we present a temporal smoothing technique to improve the detection rate, and we trained a new detection model for occluded persons and faces. This enables the analysis to be performed at the object level rather than the traditionally used coordinate level. We present speed and accuracy results of our novel detection scheme on challenging, large-scale real-life experiments.
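
The core idea of moving from coordinate-level to object-level analysis is to map each gaze sample onto the bounding box of a detected object and then smooth the resulting label sequence over time. The sketch below illustrates that idea under simplified assumptions; the data structures and the sliding-window majority vote are hypothetical stand-ins, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): mapping gaze coordinates to detected
# objects and smoothing the per-frame labels over time. Detector output is
# assumed to be a list of (label, bounding box) pairs per frame; all names
# below are hypothetical.
from collections import Counter
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Detection:
    label: str                      # e.g. "face", "person", "object"
    box: Tuple[int, int, int, int]  # (x, y, width, height) in pixels

def gaze_to_label(gaze: Tuple[int, int], detections: List[Detection]) -> Optional[str]:
    """Return the label of the first detection whose box contains the gaze point."""
    gx, gy = gaze
    for det in detections:
        x, y, w, h = det.box
        if x <= gx <= x + w and y <= gy <= y + h:
            return det.label
    return None  # gaze did not land on any detected region

def smooth_labels(labels: List[Optional[str]], window: int = 5) -> List[Optional[str]]:
    """Majority vote in a sliding window to bridge isolated missed detections."""
    half = window // 2
    smoothed = []
    for i in range(len(labels)):
        votes = [l for l in labels[max(0, i - half):i + half + 1] if l is not None]
        smoothed.append(Counter(votes).most_common(1)[0][0] if votes else None)
    return smoothed

# Example: one gaze sample per video frame, plus per-frame detector output.
frames = [
    ((120, 80), [Detection("face", (100, 60, 50, 50))]),
    ((122, 82), []),                                      # detector missed the face
    ((121, 81), [Detection("face", (101, 61, 50, 50))]),
]
raw = [gaze_to_label(g, d) for g, d in frames]
print(smooth_labels(raw))  # the dropout in frame 2 is filled by its neighbours
```

A majority vote over neighbouring frames is one simple way to bridge frames in which the detector briefly loses an object, which is the kind of dropout the paper's temporal smoothing addresses.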


International Conference on Computer Vision Theory and Applications | 2015

Semi-automatic Hand Detection - A Case Study on Real Life Mobile Eye-tracker Data

Stijn De Beugher; Geert Brône; Toon Goedemé

In this paper we present a highly accurate algorithm for the detection of human hands in real-life 2D image sequences. Current state-of-the-art algorithms show relatively poor detection accuracy on unconstrained, challenging images. To overcome this, we introduce a detection scheme in which we combine several well-known detection techniques with an advanced elimination mechanism to reduce false detections. Furthermore, we present a novel (semi-)automatic framework achieving detection rates of up to 100% with only minimal manual input. This is a useful tool in supervised applications where an error-free detection result is required at the cost of a limited amount of manual effort. As an application, this paper focuses on the analysis of video data of human-human interaction, collected with the scene camera of mobile eye-tracking glasses. This type of data is typically annotated manually for relevant features (e.g. visual fixations on gestures), which is a time-consuming, tedious and error-prone task. Our semi-automatic approach reduces the amount of manual analysis dramatically. We also present a new, fully annotated benchmark dataset for this application, which we have made publicly available.
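
As a rough illustration of combining several detectors with an elimination step, the sketch below pools candidate boxes from multiple (here dummy) detectors and keeps only candidates that enough detectors agree on and that pass a simple size check. All function and parameter names are hypothetical assumptions; the published elimination mechanism is more elaborate than this.

```python
# Sketch under assumptions (not the published algorithm): pooling candidate hand
# boxes from several detectors and eliminating likely false positives. The
# individual detectors are stand-ins; only boxes confirmed by more than one
# detector, and of plausible size, survive.
from typing import Callable, List, Tuple

Box = Tuple[int, int, int, int]  # (x, y, width, height)

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two boxes."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def combine_and_eliminate(image,
                          detectors: List[Callable[[object], List[Box]]],
                          min_agree: int = 2,
                          min_area: int = 400,
                          iou_thresh: float = 0.5) -> List[Box]:
    """Keep a candidate only if enough detectors agree on it and it is large enough."""
    per_detector = [det(image) for det in detectors]
    kept = []
    for i, boxes in enumerate(per_detector):
        for box in boxes:
            x, y, w, h = box
            if w * h < min_area:
                continue  # eliminate implausibly small detections
            agree = sum(
                any(iou(box, other) >= iou_thresh for other in other_boxes)
                for j, other_boxes in enumerate(per_detector) if j != i
            ) + 1  # the detector that proposed the box counts as one vote
            if agree >= min_agree and not any(iou(box, k) >= iou_thresh for k in kept):
                kept.append(box)
    return kept

# Usage with two dummy detectors standing in for real ones:
skin_detector = lambda img: [(50, 60, 40, 40), (5, 5, 8, 8)]   # second box is noise
shape_detector = lambda img: [(52, 58, 38, 42)]
print(combine_and_eliminate(None, [skin_detector, shape_detector]))
```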


International Conference on Computer Vision Theory and Applications | 2016

Semi-automatic Hand Annotation Making Human-human Interaction Analysis Fast and Accurate

Stijn De Beugher; Geert Brône; Toon Goedemé

The detection of human hands is of great importance in a variety of domains, including research on human-computer interaction, human-human interaction, sign language and physiotherapy. Within this field of research, one is interested in relevant items in recordings, such as faces, human bodies or hands. However, this annotation is nowadays mostly done manually, which makes the task extremely time-consuming. In this paper, we present a semi-automatic alternative to the manual labeling of recordings. Our system automatically searches for hands in images and asks for manual intervention if the confidence of a detection is too low. Most existing approaches rely on complex and computationally intensive models to achieve accurate hand detections, while our approach is based on segmentation techniques, smart tracking mechanisms and knowledge of human pose context. This makes our approach substantially faster than existing approaches. In this paper we apply our semi-automatic hand detection to the annotation of mobile eye-tracker recordings of human-human interaction. Our system makes the analysis of such data tremendously faster (244×) while maintaining an average accuracy of 93.68% on the tested datasets.
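
The confidence-gated loop described above can be sketched in a few lines: accept detections whose confidence clears a threshold and route everything else to a human annotator. The detector, the annotation callback and the threshold value below are hypothetical placeholders, not the authors' code.

```python
# Minimal sketch, assuming a detector that returns a box plus a confidence score:
# frames whose best detection falls below a confidence threshold are routed to a
# human annotator, everything else is accepted automatically.
from typing import Callable, List, Optional, Tuple

Box = Tuple[int, int, int, int]  # (x, y, width, height)

def semi_automatic_annotation(frames: List[object],
                              detect: Callable[[object], Tuple[Optional[Box], float]],
                              ask_human: Callable[[object], Optional[Box]],
                              conf_thresh: float = 0.6) -> List[Optional[Box]]:
    """Annotate each frame automatically; fall back to manual input on low confidence."""
    annotations = []
    manual = 0
    for frame in frames:
        box, confidence = detect(frame)
        if box is None or confidence < conf_thresh:
            box = ask_human(frame)  # e.g. a GUI prompt; here any callable will do
            manual += 1
        annotations.append(box)
    print(f"manual interventions: {manual}/{len(frames)}")
    return annotations

# Usage with stub callables standing in for a real detector and a labelling GUI:
frames = ["frame_0001", "frame_0002", "frame_0003"]
fake_detect = lambda f: ((10, 10, 30, 30), 0.9) if f != "frame_0002" else (None, 0.0)
fake_human = lambda f: (12, 11, 28, 31)
print(semi_automatic_annotation(frames, fake_detect, fake_human))
```

Raising or lowering the confidence threshold trades annotation speed against the amount of manual effort, which is the central trade-off of such a semi-automatic tool.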


International Joint Conference on Computer Vision, Imaging and Computer Graphics | 2015

Semi-automatic Hand Annotation of Egocentric Recordings

Stijn De Beugher; Geert Brône; Toon Goedemé

We present a fast and accurate algorithm for the detection of human hands in real-life 2D image sequences. We focus on a specific application of hand detection, viz. the annotation of egocentric recordings. A well-known type of egocentric camera is the mobile eye-tracker, which is often used in research on human-human interaction. Nowadays, this type of data is typically annotated manually for relevant features (e.g. visual fixations on gestures), which is a time-consuming and error-prone task. We present a semi-automatic approach for the detection of human hands in images. Such an approach reduces the amount of manual analysis drastically while guaranteeing high accuracy. In our algorithm we combine several well-known detection techniques with an advanced elimination scheme to reduce false detections. We validate our approach using a challenging dataset containing over 4300 hand instances. This validation allows us to explore the capabilities and boundaries of our approach.
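
Validation against an annotated dataset typically means matching detections to ground-truth boxes with an intersection-over-union criterion and reporting precision and recall. The following is a generic evaluation scaffold under that common convention, not the evaluation code used in the paper; all names are assumptions for illustration.

```python
# Hedged sketch: scoring detections against annotated ground-truth hand boxes
# with an IoU >= 0.5 match criterion, as is common for detection benchmarks.
from typing import Dict, List, Tuple

Box = Tuple[int, int, int, int]  # (x, y, width, height)

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two boxes."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def precision_recall(detections: Dict[str, List[Box]],
                     ground_truth: Dict[str, List[Box]],
                     thresh: float = 0.5) -> Tuple[float, float]:
    """Greedy one-to-one matching per image, then dataset-level precision/recall."""
    tp = fp = fn = 0
    for image_id, gt_boxes in ground_truth.items():
        unmatched = list(gt_boxes)
        for det in detections.get(image_id, []):
            best = max(unmatched, key=lambda g: iou(det, g), default=None)
            if best is not None and iou(det, best) >= thresh:
                unmatched.remove(best)
                tp += 1
            else:
                fp += 1
        fn += len(unmatched)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Tiny usage example with two images:
gt = {"img1": [(10, 10, 40, 40)], "img2": [(5, 5, 20, 20)]}
det = {"img1": [(12, 11, 38, 41), (200, 200, 10, 10)], "img2": []}
print(precision_recall(det, gt))  # -> (0.5, 0.5)
```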


International Conference on Pervasive and Embedded Computing and Communication Systems | 2014

The battle of the giants: a case study of GPU vs FPGA optimisation for real-time image processing

Lars Struyf; Stijn De Beugher; Dong Hoon Van Uytsel; Frans Kanters; Toon Goedemé


Language Resources and Evaluation | 2018

A semi-automatic annotation tool for unobtrusive gesture analysis

Stijn De Beugher; Geert Brône; Toon Goedemé


SAGA 2015 Proceedings | 2015

Semi-automatic annotation of eye-tracking recordings in terms of human torso, face and hands

Stijn De Beugher; Geert Brône; Toon Goedemé


Archive | 2016

Studying musicians’ gaze behaviour in the light of synchronisation issues in ensemble playing

Sarah Vandemoortele; Stijn De Beugher; Geert Brône; Kurt Feyaerts; Toon Goedemé; Thomas De Baets; Stijn Vervliet


Archive | 2016

Into the Wild. Muzikale interactie in ensembles: een multimodale studie met eye-trackers [Musical interaction in ensembles: a multimodal study with eye-trackers]

Sarah Vandemoortele; Stijn De Beugher; Geert Brône; Kurt Feyaerts; Toon Goedemé; Thomas De Baets; Stijn Vervliet


Archive | 2016

Automatic analysis of in-the-wild mobile eye-tracking experiments

Stijn De Beugher; Geert Brône; Toon Goedemé

Collaboration


Dive into Stijn De Beugher's collaborations.

Top Co-Authors

Geert Brône
Katholieke Universiteit Leuven

Kurt Feyaerts
Katholieke Universiteit Leuven

Tinne Tuytelaars
Katholieke Universiteit Leuven

Lars Struyf
Katholieke Universiteit Leuven