Diako Mardanbegi
IT University of Copenhagen
Publications
Featured research published by Diako Mardanbegi.
Proceedings of the 1st Conference on Novel Gaze-Controlled Applications | 2011
Diako Mardanbegi; Dan Witzner Hansen
Head-mounted eye trackers can be used for mobile interaction as well as gaze estimation purposes. This paper presents a method that enables the user to interact with any planar digital display in a 3D environment using a head-mounted eye tracker. An effective method for identifying the screens in the user's field of view is also presented; it applies to the general scenario in which multiple users interact with multiple screens. A particular application of the technique is implemented in a home environment with two large screens and a mobile phone, where a user was able to interact with these screens using a wireless head-mounted eye tracker.
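The gaze-mapping step such a system needs can be illustrated with a short sketch. Assuming the four corners of a detected display are known in the scene-camera image (the corner values and function names below are hypothetical, not the paper's implementation), a homography carries the gaze point into screen coordinates:

```python
import cv2
import numpy as np

# Corners of the detected screen in the scene-camera image (pixels),
# ordered top-left, top-right, bottom-right, bottom-left.
# These values are illustrative placeholders.
screen_in_image = np.array([[212, 148], [498, 141], [505, 339], [219, 350]],
                           dtype=np.float32)

# The same corners in screen coordinates for a 1920x1080 display.
screen_coords = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080]],
                         dtype=np.float32)

# Homography from the scene-camera image plane to the planar display.
H, _ = cv2.findHomography(screen_in_image, screen_coords)

def gaze_to_screen(gaze_xy):
    """Map a gaze point in scene-camera pixels to screen pixels."""
    pt = np.array([[gaze_xy]], dtype=np.float32)   # shape (1, 1, 2)
    mapped = cv2.perspectiveTransform(pt, H)
    return tuple(mapped[0, 0])

print(gaze_to_screen((350, 240)))   # lands somewhere near screen centre
```

Deciding *which* screen was detected would be handled separately, e.g. by visual markers, before the homography is applied.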
Eye Tracking Research & Applications | 2012
Diako Mardanbegi; Dan Witzner Hansen; Thomas Pederson
This paper proposes a novel method for video-based head gesture recognition using eye information measured by an eye tracker. The method combines gaze and eye movement to infer head gestures. Compared to other gesture-based methods, a major advantage is that the user keeps their gaze on the interaction object while interacting. The method has been implemented on a head-mounted eye tracker for detecting a set of predefined head gestures. The accuracy of the gesture classifier is evaluated and verified for gaze-based interaction in applications intended for both large public displays and small mobile phone screens. The user study shows that the method detects the set of defined gestures reliably.
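The underlying signal is easy to picture: during a fixation, the vestibulo-ocular reflex makes the eye counter-rotate against the head, so a head gesture leaves a mirrored trace in the eye-in-head signal. A minimal sketch of a "nod" detector along these lines (the thresholds and structure are our assumptions, not the paper's classifier):

```python
import numpy as np

def detect_nod(eye_y, gaze_xy, min_amp=15.0, max_gaze_disp=1.5):
    """
    Toy head-nod detector from eye signals (a sketch, not the paper's
    method). While the user fixates a target, the vestibulo-ocular
    reflex counter-rotates the eye against the head, so a nod shows up
    as a down-up excursion in vertical eye-in-head position while the
    gaze point on the stimulus stays put.

    eye_y         : vertical eye-in-head position (deg), 1-D array
    gaze_xy       : gaze point on the stimulus (deg), shape (N, 2)
    min_amp       : minimum excursion amplitude to count as a nod (deg)
    max_gaze_disp : gaze must stay within this radius (deg)
    """
    gaze_disp = np.linalg.norm(gaze_xy - gaze_xy.mean(axis=0), axis=1)
    if gaze_disp.max() > max_gaze_disp:
        return False                     # gaze moved: not a fixation
    excursion = eye_y.max() - eye_y.min()
    return excursion >= min_amp          # large counter-roll: nod
```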
Ubiquitous Computing | 2012
Diako Mardanbegi; Dan Witzner Hansen
This paper investigates the parallax error, a common problem of many video-based monocular mobile gaze trackers. The parallax error is defined and described using the epipolar geometry of a stereo camera setup. The main parameters that influence the error are introduced, and it is shown how each parameter affects the error's magnitude and direction. The optimum distribution of the error in the field of view varies across applications, but the results can be used to find the parameters needed for designing a head-mounted gaze tracker (HMGT). It is shown that the difference between the visual and optical axes does not have a significant effect on the parallax error, and that the epipolar geometry can be used to describe the parallax error in the HMGT.
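A first-order version of the geometry is easy to write down (our own small-angle simplification, not the paper's full epipolar model): with a baseline b between the scene camera and the eye, a tracker calibrated at depth d0 misestimates gaze at depth d by roughly b * |1/d - 1/d0| radians:

```python
import math

def parallax_error_deg(baseline_m, calib_depth_m, view_depth_m):
    """Small-angle estimate of parallax error for a head-mounted gaze
    tracker (a simplification of the paper's epipolar formulation).
    baseline_m    : offset between scene camera and eye nodal point
    calib_depth_m : depth at which the tracker was calibrated
    view_depth_m  : depth of the currently fixated target
    """
    err_rad = baseline_m * abs(1.0 / view_depth_m - 1.0 / calib_depth_m)
    return math.degrees(err_rad)

# Calibrated at 1 m with a 30 mm camera-eye offset:
print(parallax_error_deg(0.03, 1.0, 0.4))   # ~2.6 deg at 40 cm
print(parallax_error_deg(0.03, 1.0, 5.0))   # ~1.4 deg at 5 m
```

The 1/d dependence is why the error grows quickly for targets closer than the calibration depth but saturates for distant ones.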
Human Factors in Computing Systems | 2016
Shahram Jalaliniya; Diako Mardanbegi
EyeGrip proposes a novel yet simple technique for analysing eye movements to automatically detect the user's objects of interest in a sequence of visual stimuli moving horizontally or vertically in front of the user's view. We assess the viability of this technique in a scenario where the user looks at a sequence of images moving horizontally on the display while the user's eye movements are tracked by an eye tracker. We conducted an experiment that shows the performance of the proposed approach, and investigated the influence of the speed and the maximum number of visible images on the screen on the accuracy of EyeGrip. Based on the experimental results, we propose guidelines for designing EyeGrip-based interfaces. EyeGrip can be considered an implicit gaze interaction technique with potential use in a broad range of applications such as large screens, mobile devices, and eyewear computers. In this paper, we demonstrate the rich capabilities of EyeGrip with two example applications: 1) a mind-reading game and 2) a picture selection system. Our study shows that with an appropriate speed and maximum number of visible images on the screen, the proposed method can be used in a fast scrolling task where the system accurately (87%) detects the moving images that are visually appealing to the user, stops the scrolling, and brings the item(s) of interest back to the screen.
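The mechanism can be sketched in a few lines: while the user tracks a scrolling item, the slow-pursuit phase of the eye matches the scroll velocity, and a sustained match signals interest. The following toy selector (the velocity tolerance and duration threshold are our assumptions, not values from the paper) returns when such a "grip" began:

```python
import numpy as np

def eyegrip_select(gaze_x, t, scroll_speed, dur_thresh=0.4, vel_tol=0.3):
    """
    Minimal sketch of an OKN-style selector in the spirit of EyeGrip
    (thresholds and structure are illustrative). Items scroll
    horizontally at scroll_speed (px/s); a sustained pursuit segment
    whose eye velocity matches the scroll indicates interest.
    Returns the onset time of the first such segment, or None.
    """
    vel = np.gradient(gaze_x, t)                 # horizontal eye velocity
    tracking = np.abs(vel - scroll_speed) < vel_tol * abs(scroll_speed)
    start = None
    for i, on in enumerate(tracking):
        if on and start is None:
            start = i                            # pursuit segment begins
        elif not on and start is not None:
            if t[i - 1] - t[start] >= dur_thresh:
                return t[start]                  # long pursuit: interest
            start = None
    if start is not None and t[-1] - t[start] >= dur_thresh:
        return t[start]                          # segment ran to the end
    return None
```

The caller can then map that onset time back to whichever item was under the gaze point and halt the scroll.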
International Symposium on Wearable Computers | 2015
Shahram Jalaliniya; Diako Mardanbegi; Ioannis Sintos; Daniel Garcia Garcia
In this paper we report on the development and evaluation of a video-based mobile gaze tracker for eyewear computers. Unlike most previous work, our system performs all of its processing on an Android device and sends the coordinates of the gaze point to an eyewear device over a wireless connection. We propose a lightweight software architecture for Android that increases the efficiency of the image processing needed for eye tracking. The evaluation of the system indicated an accuracy of 1.06 degrees and a battery lifetime of approximately 4.5 hours.
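The phone-to-eyewear link can be pictured with a small sketch (shown in Python for brevity, although the system itself runs on Android; the address, port, and JSON framing are our assumptions, since the paper does not specify a wire format):

```python
import json
import socket
import time

# Hypothetical endpoint for the eyewear device.
EYEWEAR_ADDR = ("192.168.1.42", 9000)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_gaze(x, y):
    """Ship one gaze sample (normalised display coordinates) to the
    eyewear display, with a timestamp for latency bookkeeping."""
    packet = json.dumps({"x": x, "y": y, "t": time.time()}).encode()
    sock.sendto(packet, EYEWEAR_ADDR)

send_gaze(0.51, 0.37)
```

UDP fits this use case because a lost gaze sample is cheaper to drop than to retransmit late.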
Symposium on Spatial User Interaction | 2017
Ken Pfeuffer; Benedikt Mayer; Diako Mardanbegi; Hans Gellersen
Virtual reality affords experimentation with human abilities beyond what is possible in the real world, toward novel senses of interaction. In many interactions, the eyes naturally point at objects of interest while the hands skilfully manipulate them in 3D space. We explore a particular combination for virtual reality, the Gaze + Pinch interaction technique. It integrates eye gaze to select targets and indirect freehand gestures to manipulate them. This keeps the gestures intuitive, like direct physical manipulation, but the gesture's effect can be applied to any object the user looks at, whether located near or far. In this paper, we describe novel interaction concepts and an experimental system prototype that bring together interaction technique variants, menu interfaces, and applications into one unified virtual experience. Proof-of-concept application examples were developed and informally tested, such as 3D manipulation, scene navigation, and image zooming, illustrating a range of advanced interaction capabilities on targets at any distance, without relying on extra controller devices.
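The division of labour is simple to express in code. A skeletal event loop in the spirit of Gaze + Pinch (the names and callbacks below are ours, not the authors' prototype): the eyes choose the target at pinch onset, and subsequent hand motion is applied to it indirectly:

```python
from dataclasses import dataclass

@dataclass
class Object3D:
    name: str
    scale: float = 1.0

selected = None   # object grabbed by the current pinch, if any

def on_pinch_start(gazed_object):
    """Pinch down: grab whatever object the gaze ray currently hits."""
    global selected
    selected = gazed_object

def on_pinch_move(hand_scale_delta):
    """While pinching, hand motion is applied to the gazed target,
    however far away it is, e.g. a two-handed spread to scale it."""
    if selected is not None:
        selected.scale *= hand_scale_delta

def on_pinch_end():
    """Release the pinch: the object is no longer being manipulated."""
    global selected
    selected = None
```

Because selection happens only at pinch onset, the user is free to look elsewhere mid-gesture without losing the grabbed object.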
Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications | 2016
Shahram Jalaliniya; Diako Mardanbegi
In this paper we investigate the utility of an eye-based interaction technique (EyeGrip) for seamless interaction with scrolling content on eyewear computers. EyeGrip uses optokinetic nystagmus (OKN) eye movements to detect the object of interest among a set of scrolling contents and automatically stops scrolling for the user. We empirically evaluated the usability of EyeGrip in two different applications for eyewear computers: 1) a menu scroll viewer and 2) a Facebook newsfeed reader. The results of our study showed that the EyeGrip technique performs as well as the keyboard, which has long been a well-established input device. Moreover, the accuracy of the EyeGrip method for menu item selection was higher, while in the Facebook study participants found the keyboard more accurate.
Vision Research | 2018
Diako Mardanbegi; Rebecca Killick; Baiqiang Xia; Thomas Wilcockson; Hans Gellersen; Peter Sawyer; Trevor J. Crawford
Recent research has shown that the eye movement data measured by an eye tracker do not necessarily reflect the exact rotations of the eyeball. For example, post-saccadic eye movements may reflect the relative movement between the pupil and the iris rather than oscillations of the eyeball. Since accurate measurement of eye movements is important in many studies, it is crucial to identify the different factors that influence the dynamics of the eye movements measured by an eye tracker. Previous studies have shown that deformation of the internal structure of the iris and the size of the pupil directly affect the amplitude of the post-saccadic oscillations measured by pupil-based video eye trackers. In this paper, we look at the effect of aging on post-saccadic oscillations. We recorded eye movements from a group of 43 young and 22 older participants during an abstract and a more natural viewing task. The recording was conducted with a video-based eye tracker using the pupil center and corneal reflection. We anticipated that changes in muscle strength as an effect of aging might affect, directly or indirectly, the post-saccadic oscillations. Results showed that the post-saccadic oscillations were significantly larger for our older group. The results suggest that aging has to be considered an important factor when studying post-saccadic eye movements.
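For readers unfamiliar with how oscillation size is quantified, one crude illustrative measure (our own, not the paper's analysis pipeline) is the peak-to-peak excursion of the gaze trace in a short window after saccade offset:

```python
import numpy as np

def pso_amplitude(pos, saccade_end, fs=1000, window_ms=80):
    """
    Rough estimate of post-saccadic oscillation size: the peak-to-peak
    excursion of the position signal shortly after saccade offset.
    pos         : 1-D gaze position trace (deg), pupil-based signal
    saccade_end : sample index of the detected saccade offset
    fs          : sampling rate (Hz)
    window_ms   : analysis window after the saccade (ms)
    """
    n = int(fs * window_ms / 1000)
    segment = pos[saccade_end:saccade_end + n]
    return segment.max() - segment.min()
```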
International Symposium on Wearable Computers | 2017
Shahram Jalaliniya; Thomas Pederson; Diako Mardanbegi
In this position paper we stress the need to consider the nature of human attention when designing future, potentially interruptive IoT devices, and propose letting IoT devices share attention-related data and collaborate on the task of drawing human attention in order to achieve higher-quality attention management with fewer overall system resources. Finally, we categorize some existing strategies for drawing people's attention according to a simple symbiotic (human-machine) attention management framework.
Behavior Research Methods | 2018
John Paulin Hansen; Diako Mardanbegi; Florian Biermann; Per Bækgaard
This paper presents a study of a gaze-interactive digital assembly instruction that provides concurrent logging of pupil data in a realistic task setting. The instruction allows hands-free gaze dwells as a substitute for finger clicks, and supports image rotation as well as image zooming by head movements. A user study in two LEGO toy stores with 72 children showed it to be immediately usable by 64 of them. Data logging of view-times and pupil dilations was possible for 59 participants. On average, the children spent half of the time attending to the instruction (S.D. 10.9%). The recorded pupil size showed a decrease throughout the building process, except when a child had to back-step: a regression was found to be followed by a pupil dilation. The main contribution of this study is to demonstrate gaze-tracking technology capable of supporting both robust interaction and concurrent, non-intrusive recording of gaze and pupil data in the wild. Previous research has found pupil dilation to be associated with changes in task effort; however, other factors like fatigue, head motion, or ambient light may also have an impact. The final section summarizes our approach to this complexity of real-task pupil data collection and makes suggestions for how future applications may utilize pupil information.
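The dwell-for-click substitution the instruction relies on reduces to a few lines of state. A minimal sketch (the class and threshold below are illustrative, not taken from the paper):

```python
import time

class DwellButton:
    """Minimal dwell-click logic of the kind used in place of finger
    clicks: gaze must rest on the button for a fixed duration."""

    def __init__(self, dwell_s=0.8):
        self.dwell_s = dwell_s      # required dwell time (seconds)
        self.enter_time = None      # when gaze entered the button

    def update(self, gaze_inside):
        """Feed one gaze sample; returns True once the dwell completes,
        i.e. gaze has stayed on the button for dwell_s seconds."""
        now = time.monotonic()
        if not gaze_inside:
            self.enter_time = None  # gaze left: reset the timer
            return False
        if self.enter_time is None:
            self.enter_time = now   # gaze just arrived: start timing
        return now - self.enter_time >= self.dwell_s
```

The same update loop extends naturally to visual feedback, e.g. filling a progress ring as the dwell accumulates.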