Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jayson Turner is active.

Publication


Featured research published by Jayson Turner.


User Interface Software and Technology | 2013

Pursuit calibration: making gaze calibration less tedious and more flexible

Ken Pfeuffer; Mélodie Vidal; Jayson Turner; Andreas Bulling; Hans Gellersen

Eye gaze is a compelling interaction modality but requires user calibration before interaction can commence. State-of-the-art procedures require the user to fixate on a succession of calibration markers, a task that is often experienced as difficult and tedious. We present pursuit calibration, a novel approach that, unlike existing methods, is able to detect the user's attention to a calibration target. This is achieved by using moving targets and correlating eye movement with the target trajectory, implicitly exploiting smooth pursuit eye movement. Data for calibration is then only sampled when the user is attending to the target. Because of its ability to detect user attention, pursuit calibration can be performed implicitly, which enables more flexible designs of the calibration task. We demonstrate this in application examples and user studies, and show that pursuit calibration is tolerant to interruption, can blend naturally with applications, and is able to calibrate users without their awareness.
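The abstract's core idea, sampling calibration data only while the eye trajectory correlates with the moving target's trajectory, can be sketched as follows. The function name, the per-axis Pearson correlation, and the 0.8 threshold are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def pursuit_attention(eye_xy, target_xy, threshold=0.8):
    """Return True when the eye trajectory correlates with the moving
    target's trajectory over a sliding window, i.e. the user is likely
    smoothly pursuing the target and samples may be used for calibration.

    eye_xy, target_xy: arrays of shape (n, 2) holding (x, y) positions
    collected over the same time window.
    """
    # Correlate each axis independently and require both to match.
    corr_x = np.corrcoef(eye_xy[:, 0], target_xy[:, 0])[0, 1]
    corr_y = np.corrcoef(eye_xy[:, 1], target_xy[:, 1])[0, 1]
    return bool(min(corr_x, corr_y) > threshold)
```

In a calibration loop, only windows for which this gate returns True would contribute point pairs to the gaze-mapping fit.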


Computer Communications | 2012

Wearable eye tracking for mental health monitoring

Mélodie Vidal; Jayson Turner; Andreas Bulling; Hans Gellersen

Pervasive healthcare is a promising field of research as small and unobtrusive on-body sensors become available. However, despite considerable advances in the field, current systems are limited in terms of the pathologies they can detect, particularly regarding mental disorders. In this work we propose wearable eye tracking as a new method for mental health monitoring. We provide two reviews: one of the state of the art in wearable eye tracking equipment, and one of the work in experimental psychology and clinical research on the link between eye movements and cognition. Both reviews show significant potential for wearable eye tracking in mental health monitoring in daily life settings. This finding calls for further research on unobtrusive sensing equipment and novel algorithms for automated analysis of long-term eye movement data.


International Conference on Human-Computer Interaction | 2013

Eye Pull, Eye Push: Moving Objects between Large Screens and Personal Devices with Gaze and Touch

Jayson Turner; Jason Alexander; Andreas Bulling; Dominik Schmidt; Hans Gellersen

Previous work has validated the eyes and mobile input as a viable approach for pointing at and selecting out-of-reach objects. This work presents Eye Pull, Eye Push, a novel interaction concept for content transfer between public and personal devices using gaze and touch. We present three techniques that enable this interaction: Eye Cut & Paste, Eye Drag & Drop, and Eye Summon & Cast. We outline and discuss several scenarios in which these techniques can be used. In a user study we found that participants responded well to the visual feedback provided by Eye Drag & Drop during object movement. In contrast, we found that although Eye Summon & Cast significantly improved performance, participants had difficulty coordinating their hands and eyes during interaction.


Eye Tracking Research & Application | 2014

Cross-device gaze-supported point-to-point content transfer

Jayson Turner; Andreas Bulling; Jason Alexander; Hans Gellersen

Within a pervasive computing environment, we see content on shared displays that we wish to acquire and use in a specific way, i.e., with an application on a personal device, transferring from point to point. The eyes as input can indicate intention to interact with a service, providing implicit pointing as a result. In this paper we investigate the use of gaze and manual input for the positioning of gaze-acquired content on personal devices. We evaluate two main techniques: (1) Gaze Positioning, where content is transferred using gaze with manual input to confirm actions, and (2) Manual Positioning, where content is selected with gaze but final positioning is performed by manual input, involving a switch of modalities from gaze to manual input. A first user study compares these techniques applied to direct and indirect manual input configurations: a tablet with touch input and a laptop with mouse input. A second study evaluated our techniques in an application scenario involving distractor targets. Our overall results showed general acceptance and understanding of all conditions, although there were clear individual user preferences dependent on familiarity and preference toward gaze, touch, or mouse input.


Proceedings of the 1st International Workshop on Pervasive Eye Tracking & Mobile Eye-Based Interaction | 2011

Combining gaze with manual interaction to extend physical reach

Jayson Turner; Andreas Bulling; Hans Gellersen

Situated public displays and interactive surfaces are becoming ubiquitous in our daily lives. Issues arise with these devices when attempting to interact over a distance or with content that is physically out of reach. In this paper we outline three techniques that combine gaze with manual hand-controlled input to move objects. We demonstrate and discuss how these techniques could be applied to two scenarios involving (1) a multi-touch surface and (2) a public display and a mobile device.


Human Factors in Computing Systems | 2015

Gaze+RST: Integrating Gaze and Multitouch for Remote Rotate-Scale-Translate Tasks

Jayson Turner; Jason Alexander; Andreas Bulling; Hans Gellersen

Our work investigates the use of gaze and multitouch to fluidly perform rotate-scale-translate (RST) tasks on large displays. The work specifically aims to understand whether gaze can provide a benefit in such a task, how task complexity affects performance, and how gaze and multitouch can be combined to create an integral input structure suited to the task of RST. We present four techniques that each strike a different balance between gaze-based and touch-based translation while maintaining concurrent rotation and scaling operations. A 16-participant empirical evaluation revealed that three of our four techniques present viable options for this scenario, and that larger distances and rotation/scaling operations can significantly affect a gaze-based translation configuration. Furthermore, we uncover new insights regarding multimodal integrality, finding that gaze and touch can be combined into configurations that pertain to integral or separable input structures.
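One plausible reading of such a combined input structure is a per-frame update in which translation follows the gaze point while rotation and scale are derived from a two-finger touch gesture. The function below is a hypothetical sketch under that assumption, not one of the paper's four techniques:

```python
import math

def rst_update(gaze_xy, touch_prev, touch_curr):
    """One frame of a gaze-plus-multitouch RST update: the object's
    translation target is the current gaze point, while rotation and
    scale deltas come from the change in the two-finger touch vector.

    touch_prev, touch_curr: ((x1, y1), (x2, y2)) finger positions at
    the previous and current frame.
    Returns (translation_xy, rotation_delta_radians, scale_factor).
    """
    def finger_vector(touch):
        (x1, y1), (x2, y2) = touch
        return (x2 - x1, y2 - y1)

    vx0, vy0 = finger_vector(touch_prev)
    vx1, vy1 = finger_vector(touch_curr)
    d0 = math.hypot(vx0, vy0)
    d1 = math.hypot(vx1, vy1)
    scale = d1 / d0 if d0 else 1.0                    # pinch distance ratio
    rotation = math.atan2(vy1, vx1) - math.atan2(vy0, vx0)  # angle change
    return gaze_xy, rotation, scale
```

Keeping rotation and scale on the same two-finger gesture while translation rides on gaze is one way to split the RST degrees of freedom across modalities.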


Eye Tracking Research & Application | 2012

Extending the visual field of a head-mounted eye tracker for pervasive eye-based interaction

Jayson Turner; Andreas Bulling; Hans Gellersen

Pervasive eye-based interaction refers to the vision of eye-based interaction becoming ubiquitously usable in everyday life, e.g. across multiple displays in the environment. While current head-mounted eye trackers work well for interaction with displays at similar distances, the scene camera often fails to cover both remote and close proximity displays, e.g. a public display on a wall and a handheld portable device. In this paper we describe an approach that allows for robust detection and gaze mapping across multiple such displays. Our approach uses an additional scene camera to extend the viewing and gaze mapping area of the eye tracker and automatically switches between both cameras depending on the display in view. Results from a pilot study show that our system achieves a similar gaze estimation accuracy to a single-camera system while at the same time increasing usability.
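The automatic switching the abstract describes could, for instance, be driven by which scene camera currently detects the display's fiducial markers. The function below is a hypothetical stand-in for that decision logic; the camera names and marker-count criterion are assumptions, not the paper's method:

```python
def select_scene_camera(markers_per_camera):
    """Choose the scene camera to use for gaze mapping this frame.

    markers_per_camera: dict mapping a camera name (e.g. "near", "far")
    to the list of display marker ids detected in its current frame.
    Returns the name of the camera that sees the most markers, i.e. the
    one most likely to have the active display in view.
    """
    return max(markers_per_camera, key=lambda cam: len(markers_per_camera[cam]))
```

Gaze coordinates would then be mapped through the calibration (e.g. a homography) associated with the selected camera.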


Annual Symposium on Computer-Human Interaction in Play | 2014

EyePlay: applications for gaze in games

Jayson Turner; Eduardo Velloso; Hans Gellersen; Veronica Sundstedt

What new challenges does the combination of games and eye-tracking present? The EyePlay workshop brings together researchers and industry specialists from the fields of eye-tracking and games to address this question. Eye-tracking has been investigated extensively in a variety of domains in human-computer interaction, but little attention has been given to its application in gaming. As eye-tracking technology is now an affordable commodity, its appeal as a sensing technology for games is set to become the driving force for novel methods of player-computer interaction and games evaluation. This workshop presents a forum for eye-based gaming research, with a focus on identifying the opportunities that eye-tracking brings to games design and research, on plotting the landscape of the work in this area, and on formalising a research agenda for EyePlay as a field. Possible topics include, but are not limited to, novel interaction techniques and game mechanics, usability and evaluation, accessibility, learning, and serious games contexts.


International Conference on Human-Computer Interaction | 2015

An Empirical Investigation of Gaze Selection in Mid-Air Gestural 3D Manipulation

Eduardo Velloso; Jayson Turner; Jason Alexander; Andreas Bulling; Hans Gellersen

In this work, we investigate gaze selection in the context of mid-air hand gestural manipulation of 3D rigid bodies on monoscopic displays. We present the results of a user study with 12 participants in which we compared the performance of Gaze, a Raycasting technique (2D Cursor), and a Virtual Hand technique (3D Cursor) for selecting objects in two 3D mid-air interaction tasks. We also compared selection confirmation times for Gaze selection when selection is followed by manipulation and when it is not. Our results show that gaze selection is faster than, and preferred over, 2D and 3D mid-air-controlled cursors, and is particularly well suited for tasks in which users constantly switch between several objects during manipulation. Further, selection confirmation times are longer when selection is followed by manipulation than when it is not.


User Interface Software and Technology | 2013

Cross-device eye-based interaction

Jayson Turner

Eye-tracking technology is envisaged to become part of our daily life, and as its development progresses it becomes more wearable. Additionally, there is a wealth of digital content around us, either close to us on our personal devices or out of reach on public displays. This work aims to combine gaze with mobile input modalities to enable the transfer of content between public displays and close-proximity personal displays. The work contributes enabling technologies and novel interaction techniques, and poses broader questions that move toward a formalisation of this design space, with the goal of developing guidelines for future cross-device eye-based interaction methods.

Collaboration


Dive into Jayson Turner's collaborations.
