Per Bækgaard
Technical University of Denmark
Publications
Featured research published by Per Bækgaard.
PLOS ONE | 2017
Andrea Cuttone; Per Bækgaard; Vedran Sekara; Håkan Jonsson; Jakob Eg Larsen; Sune Lehmann
We propose a Bayesian model for extracting sleep patterns from smartphone events. Our method is able to identify individuals’ daily sleep periods and their evolution over time, and provides an estimate of the probability of sleep and wake transitions. The model is fitted to more than 400 participants from two different datasets, and we verify the results against ground truth from dedicated armband sleep trackers. We show that the model is able to produce reliable sleep estimates with an accuracy of 0.89, both at the individual and at the collective level. Moreover, the Bayesian model is able to quantify uncertainty and encode prior knowledge about sleep patterns. Compared with existing smartphone-based systems, our method requires only screen on/off events, and is therefore much less intrusive in terms of privacy and more battery-efficient.
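The core idea, screen on/off events as a proxy for sleep, lends itself to a simple per-bin Bayes update. Below is a minimal sketch of that idea, not the authors' published model: a circadian prior on sleep is combined with the likelihood of observing screen activity in each time bin. The prior shape, activity rates, and bin size are all illustrative assumptions.

```python
# Minimal sketch of sleep inference from screen on/off events.
# NOT the paper's model: a per-bin Bayes update combining a circadian
# prior with a two-rate activity likelihood. Prior shape and the
# activity rates below are illustrative assumptions.
import numpy as np

def sleep_posterior(event_hours, bin_minutes=30):
    """Posterior probability of sleep for each time-of-day bin,
    given screen-event timestamps in hours-of-day (0-24)."""
    n_bins = 24 * 60 // bin_minutes
    centres = (np.arange(n_bins) + 0.5) * bin_minutes / 60.0
    # Circadian prior: sleep most likely around 3 am (wrapped Gaussian).
    d = np.minimum(np.abs(centres - 3.0), 24.0 - np.abs(centres - 3.0))
    prior = 0.9 * np.exp(-0.5 * (d / 2.5) ** 2)
    # Likelihood: was there any screen event in the bin?
    counts, _ = np.histogram(np.asarray(event_hours) % 24.0,
                             bins=n_bins, range=(0.0, 24.0))
    active = counts > 0
    p_act_sleep, p_act_wake = 0.05, 0.6        # assumed activity rates
    like_sleep = np.where(active, p_act_sleep, 1.0 - p_act_sleep)
    like_wake = np.where(active, p_act_wake, 1.0 - p_act_wake)
    return centres, prior * like_sleep / (
        prior * like_sleep + (1.0 - prior) * like_wake)

# Hypothetical events: phone used in the evening and from 7:30 am on.
hours, p = sleep_posterior([21.2, 22.5, 23.7, 7.5, 8.1, 12.3, 18.0])
print(hours[p > 0.5])   # bins most likely slept through (late night hours)
```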
4th International Workshop on Cognitive Information Processing (CIP) | 2014
Per Bækgaard; Michael Kai Petersen; Jakob Eg Larsen
Achieving robust adaptive synchronization of multimodal biometric inputs: The recent arrival of wireless EEG headsets that enable mobile real-time 3D brain imaging on smartphones, and low-cost eye trackers that provide gaze control of tablets, will radically change how biometric sensors might be integrated into next-generation user interfaces. In experimental lab settings, EEG neuroimaging and eye-tracking data are traditionally combined using external triggers to synchronize the signals. However, with biometric sensors increasingly being applied in everyday usage scenarios, there will be a need for solutions providing a continuous alignment of signals. In the present paper we propose using spontaneous eye blinks as a means to achieve near real-time synchronization of EEG and eye tracking. Analyzing key parameters that define eye-blink signatures across the two domains, we outline a probability-function-based algorithm to correlate the signals. Comparing the accuracy of the method against the state-of-the-art EYE-EEG plug-in for offline analysis of EEG and eye-tracking data, we propose that our approach could be applied for robust synchronization of biometric sensor data collected in a mobile context.
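As a rough illustration of blink-based alignment (not the paper's probability-function algorithm), the sketch below cross-correlates two blink-event trains, one detected in the EEG and one in the eye tracker, to estimate the clock offset between the recordings. Blink detection itself, the working sampling rate, and the smoothing window are assumptions made here.

```python
# Sketch of estimating the clock offset between EEG and eye-tracking
# recordings from independently detected blink times. Plain smoothed
# cross-correlation, not the paper's algorithm; blink detection is
# assumed done upstream in each modality.
import numpy as np

def estimate_offset(eeg_blinks_s, gaze_blinks_s, fs=100, max_lag_s=5.0):
    """Return the lag (seconds) to add to gaze timestamps so the two
    blink trains align, searched within +/- max_lag_s."""
    t_max = max(max(eeg_blinks_s), max(gaze_blinks_s)) + max_lag_s
    n = int(t_max * fs) + 1
    a, b = np.zeros(n), np.zeros(n)
    a[(np.asarray(eeg_blinks_s) * fs).astype(int)] = 1.0
    b[(np.asarray(gaze_blinks_s) * fs).astype(int)] = 1.0
    # Smooth the impulse trains so near-misses still correlate.
    k = np.hanning(int(0.3 * fs))                # ~300 ms window
    a, b = np.convolve(a, k, "same"), np.convolve(b, k, "same")
    max_lag = int(max_lag_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    # np.roll wraps around; fine for events well inside the recording.
    scores = [np.dot(a, np.roll(b, lag)) for lag in lags]
    return lags[int(np.argmax(scores))] / fs

# Hypothetical usage: the gaze clock runs 1.2 s behind the EEG clock.
eeg = [3.0, 7.4, 12.1, 18.9]
gaze = [t + 1.2 for t in eeg]
print(estimate_offset(eeg, gaze))   # ~ -1.2: shift gaze back by 1.2 s
```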
International Conference on Multimedia and Expo | 2014
David Kristian Laundav; Camilla Birgitte Falk Jensen; Per Bækgaard; Michael Kai Petersen; Jakob Eg Larsen
Estimating emotional responses to pictures based on heart rate measurements: Variations in heart rate serve as an important clinical health indicator, but potentially also as a window into cognitive reactions to presented stimuli, as a function of the stimuli, the context, and the previous cognitive state. This study looks at single-trial time-domain mean heart rate (HR) and frequency-domain heart rate variability (HRV) measured while subjects were passively viewing emotionally charged images, comparing short random presentations with grouped sequences of either neutral, highly arousing pleasant, or highly arousing unpleasant pictures. With only a few users, we were not able to demonstrate HRV variations that correlated with randomly presented emotional content, due to the inherent noise in the signal. Nor could we reproduce results from earlier studies which, based on values averaged over many subjects, revealed small changes in mean HR only seconds after the presentation of emotional images. However, for longer sequences of pleasant and unpleasant images, we found a trend in mean HR that could correlate with the emotional content of the images. This suggests a potential for using HR in single-user Quantified Self applications to assess fluctuations in emotional state over longer periods, rather than dynamic responses to emotional stimuli.
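For concreteness, the time-domain quantities mentioned above can be computed directly from a series of RR (inter-beat) intervals. The sketch below shows mean HR and RMSSD, a common time-domain HRV measure; it follows standard conventions in the field and is not necessarily the exact pipeline of the study.

```python
# Standard time-domain heart measures from RR (inter-beat) intervals,
# in milliseconds. Generic textbook definitions, not necessarily the
# study's exact pipeline.
import numpy as np

def mean_hr(rr_ms):
    """Mean heart rate in beats per minute."""
    return 60000.0 / np.mean(rr_ms)

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR differences, a common
    time-domain HRV measure of beat-to-beat variability."""
    diffs = np.diff(np.asarray(rr_ms, dtype=float))
    return np.sqrt(np.mean(diffs ** 2))

# Hypothetical RR series (ms) around a 75 bpm resting rate.
rr = [800, 810, 790, 805, 795, 820, 780]
print(f"mean HR = {mean_hr(rr):.1f} bpm")   # 75.0 bpm for this series
print(f"RMSSD   = {rmssd(rr):.1f} ms")
```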
Applied Ergonomics | 2019
Per Bækgaard; Shahram Jalaliniya; John Paulin Hansen
We conducted an empirical study of 57 children using a printed booklet and a digital tablet instruction for LEGO® construction while they wore a head-mounted gaze tracker. Booklets caused a particularly strong pupil dilation when encountered as the first medium. Subjective responses confirmed the booklet to be more difficult to use. The children who were least productive and asked for assistance more often had a significantly different pupil pattern than the rest. Our findings suggest that it is possible to collect pupil size data in unconstrained work scenarios, providing insight into task effort and difficulties.
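Comparing task-evoked pupil dilation across conditions usually starts with per-trial baseline correction. The sketch below shows that common preprocessing convention; the sampling rate, baseline window, and blink handling are illustrative assumptions, not the preprocessing reported in the paper.

```python
# Minimal pupillometry preprocessing sketch: baseline-correct each
# trial so task-evoked dilation can be compared across conditions.
# A common convention, not necessarily the study's exact method.
import numpy as np

def evoked_dilation(pupil, fs=60, baseline_s=0.5):
    """Subtract the mean pupil size in the pre-task baseline window
    from the whole trace; returns dilation relative to baseline.
    NaN samples (blinks/tracking loss) are ignored in the baseline."""
    pupil = np.asarray(pupil, dtype=float)
    base = np.nanmean(pupil[: int(baseline_s * fs)])
    return pupil - base

# Hypothetical trace: 0.5 s baseline at ~3.0 mm, then a task-evoked rise.
trace = np.concatenate([np.full(30, 3.0), np.linspace(3.0, 3.4, 60)])
print(evoked_dilation(trace, fs=60).max())   # ~0.4 mm peak dilation
```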
Behavior Research Methods | 2018
John Paulin Hansen; Diako Mardanbegi; Florian Biermann; Per Bækgaard
This paper presents a study of a gaze interactive digital assembly instruction that provides concurrent logging of pupil data in a realistic task setting. The instruction allows hands-free gaze dwells as a substitute for finger clicks, and supports image rotation as well as image zooming by head movements. A user study in two LEGO toy stores with 72 children showed it to be immediately usable by 64 of them. Data logging of view-times and pupil dilations was possible for 59 participants. On average, the children spent half of the time attending to the instruction (S.D. 10.9%). The recorded pupil size showed a decrease throughout the building process, except when the child had to back-step: a regression was found to be followed by a pupil dilation. The main contribution of this study is to demonstrate gaze-tracking technology capable of supporting both robust interaction and concurrent, non-intrusive recording of gaze and pupil data in the wild. Previous research has found pupil dilation to be associated with changes in task effort. However, other factors like fatigue, head motion, or ambient light may also have an impact. The final section summarizes our approach to the complexity of real-task pupil data collection and makes suggestions for how future applications may utilize pupil information.
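The "gaze dwell as a substitute for finger clicks" mechanism is easy to make concrete: a selection fires once the gaze has stayed inside a target for a set duration. The sketch below is a minimal version of that idea; the class, thresholds, and gaze-sample interface are illustrative assumptions, not the paper's implementation.

```python
# Sketch of dwell-time selection: a gaze "click" fires once the gaze
# has remained inside a target for dwell_s seconds. Illustrative only.
import time

class DwellButton:
    def __init__(self, rect, dwell_s=0.8):
        self.rect = rect            # (x, y, w, h) in screen pixels
        self.dwell_s = dwell_s
        self._enter_t = None        # time gaze entered the target
        self._fired = False         # one click per continuous dwell

    def contains(self, gx, gy):
        x, y, w, h = self.rect
        return x <= gx <= x + w and y <= gy <= y + h

    def update(self, gx, gy, now=None):
        """Feed one gaze sample; returns True exactly once per dwell."""
        now = time.monotonic() if now is None else now
        if self.contains(gx, gy):
            if self._enter_t is None:
                self._enter_t = now
            if not self._fired and now - self._enter_t >= self.dwell_s:
                self._fired = True
                return True
        else:
            self._enter_t, self._fired = None, False
        return False

# Hypothetical usage with synthetic timestamps:
btn = DwellButton(rect=(100, 100, 80, 40))
for t in [0.0, 0.3, 0.6, 0.9]:
    if btn.update(140, 120, now=t):
        print(f"dwell click at t={t:.1f}s")   # fires at t=0.9
```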
International Conference on Human-Computer Interaction | 2015
Per Bækgaard; Michael Kai Petersen; Jakob Eg Larsen
The emergence of low-cost eye tracking devices will make quantified self (QS) monitoring of eye movements attainable on next-generation mobile devices, potentially allowing us to infer reactions related to fatigue or emotional responses on a continuous basis when interacting with the screens of smartphones and tablets. In the current study we explore whether consumer-grade eye trackers, despite their reduced spatio-temporal resolution, are able to monitor fixations as well as frequencies of saccades and blinks that may characterize aspects of attention, and to identify consistent individual patterns that may be modulated by our overall level of engagement.
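Extracting fixations of the kind described above is often done with the classic dispersion-threshold (I-DT) algorithm. The sketch below implements I-DT over raw gaze samples; the sampling rate and the duration/dispersion thresholds are typical illustrative values, not those of the study.

```python
# Classic I-DT (dispersion-threshold) fixation detection, shown as an
# illustration of how fixations can be extracted from low-cost
# eye-tracker samples. Thresholds are typical values, not the study's.
import numpy as np

def detect_fixations(x, y, fs=30, min_dur_s=0.1, max_disp=1.0):
    """Return (start, end) sample-index pairs of fixations. `max_disp`
    is the allowed dispersion in the same units as x/y (e.g. degrees)."""
    x, y = np.asarray(x, float), np.asarray(y, float)

    def disp(a, b):   # dispersion of the window [a, b)
        return (x[a:b].max() - x[a:b].min()) + (y[a:b].max() - y[a:b].min())

    n, win = len(x), int(min_dur_s * fs)
    fixations, i = [], 0
    while i + win <= n:
        j = i + win
        if disp(i, j) <= max_disp:
            while j < n and disp(i, j + 1) <= max_disp:
                j += 1                      # grow window while compact
            fixations.append((i, j))
            i = j
        else:
            i += 1                          # slide past noisy/saccade samples
    return fixations

# Hypothetical 1 s of samples: fixation, short saccade, fixation.
x = np.concatenate([np.full(15, 5.0), np.linspace(7, 13, 4), np.full(10, 15.0)])
y = np.zeros_like(x)
print(detect_fixations(x, y))   # [(0, 15), (19, 29)]
```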
Proceedings of the Workshop on Communication by Gaze Interaction | 2018
John Paulin Hansen; Vijay Rajanna; I. Scott MacKenzie; Per Bækgaard
Archive | 2017
Per Bækgaard; Michael Kai Petersen; Jakob Eg Larsen
International Conference on Universal Access in Human-Computer Interaction | 2016
Per Bækgaard; Michael Kai Petersen; Jakob Eg Larsen