Ken Pfeuffer
Lancaster University
Publications
Featured research published by Ken Pfeuffer.
user interface software and technology | 2013
Ken Pfeuffer; Mélodie Vidal; Jayson Turner; Andreas Bulling; Hans Gellersen
Eye gaze is a compelling interaction modality but requires user calibration before interaction can commence. State-of-the-art procedures require the user to fixate on a succession of calibration markers, a task that is often experienced as difficult and tedious. We present pursuit calibration, a novel approach that, unlike existing methods, is able to detect the user's attention to a calibration target. This is achieved by using moving targets and correlating eye movement with the target trajectory, implicitly exploiting smooth pursuit eye movement. Data for calibration is then only sampled when the user is attending to the target. Because of its ability to detect user attention, pursuit calibration can be performed implicitly, which enables more flexible designs of the calibration task. We demonstrate this in application examples and user studies, and show that pursuit calibration is tolerant to interruption, can blend naturally with applications, and is able to calibrate users without their awareness.
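A minimal sketch of the attention test at the core of this approach, assuming eye and target positions are sampled at the same rate: calibration pairs are kept only while the eye trajectory correlates with the moving target's trajectory. The window length and the 0.8 threshold are illustrative assumptions, not values from the paper.

```python
from statistics import correlation, StatisticsError  # Python 3.10+

def attending(eye_xs, eye_ys, tgt_xs, tgt_ys, threshold=0.8):
    """True if the eye trajectory follows the target on both axes."""
    try:
        return (correlation(eye_xs, tgt_xs) > threshold and
                correlation(eye_ys, tgt_ys) > threshold)
    except StatisticsError:  # constant input (e.g. a fixation) never matches
        return False

def collect_calibration_samples(stream, window=30, threshold=0.8):
    """Keep only eye/target pairs recorded while the user follows the target."""
    ex, ey, tx, ty, accepted = [], [], [], [], []
    for eye, tgt in stream:  # stream yields ((eye_x, eye_y), (tgt_x, tgt_y))
        ex.append(eye[0]); ey.append(eye[1])
        tx.append(tgt[0]); ty.append(tgt[1])
        if len(ex) > window:  # keep a sliding window of recent samples
            for buf in (ex, ey, tx, ty):
                buf.pop(0)
        if len(ex) == window and attending(ex, ey, tx, ty, threshold):
            accepted.append((eye, tgt))  # a usable calibration correspondence
    return accepted
```

Because samples are gated on demonstrated pursuit, interruptions simply yield no data rather than a corrupted calibration, which is what makes the implicit, interruption-tolerant designs described above possible.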
human factors in computing systems | 2013
Mélodie Vidal; Ken Pfeuffer; Andreas Bulling; Hans-Werner Gellersen
Eye-based interaction has commonly been based on estimating gaze direction to locate objects for interaction. We introduce Pursuits, a novel and very different eye tracking method that is instead based on following the trajectory of eye movement and comparing it with the trajectories of objects in the field of view. Because the eyes naturally follow the trajectory of moving objects of interest, our method can detect what the user is looking at by matching eye movement and object movement. We illustrate Pursuits with three applications that demonstrate how the method facilitates natural interaction with moving targets.
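A minimal sketch of the matching idea, assuming synchronized position traces for the eyes and for each on-screen object; the function names and the 0.9 threshold are illustrative assumptions.

```python
from statistics import correlation, StatisticsError

def pursuit_score(eye_trace, obj_trace):
    """Mean Pearson correlation of the x and y components of two traces."""
    ex, ey = zip(*eye_trace)
    ox, oy = zip(*obj_trace)
    try:
        return (correlation(ex, ox) + correlation(ey, oy)) / 2
    except StatisticsError:  # a constant trace has no defined correlation
        return -1.0

def detect_followed_object(eye_trace, objects, threshold=0.9):
    """Return the name of the object the user follows, or None."""
    best_name, best_score = None, threshold
    for name, trace in objects.items():  # objects: {name: [(x, y), ...]}
        score = pursuit_score(eye_trace, trace)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

Note that no gaze-to-screen calibration is needed: only the relative shape of the movement is compared, which is why Pursuits works with uncalibrated eye trackers.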
user interface software and technology | 2015
Ken Pfeuffer; Jason Alexander; Ming Ki Chong; Yanxia Zhang; Hans-Werner Gellersen
Modalities such as pen and touch are associated with direct input but can also be used for indirect input. We propose to combine the two modes for direct-indirect input modulated by gaze. We introduce gaze-shifting as a novel mechanism for switching the input mode based on the alignment of manual input and the user's visual attention. Input in the user's area of attention results in direct manipulation, whereas input offset from the user's gaze is redirected to the visual target. The technique is generic and can be used in the same manner with different input modalities. We show how gaze-shifting enables novel direct-indirect techniques with pen, touch, and combinations of pen and touch input.
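A minimal sketch of the gaze-shifting mode switch, assuming 2D screen coordinates; the 100 px alignment radius is an illustrative assumption, not a value from the paper.

```python
import math

ALIGNMENT_RADIUS = 100  # px within which input counts as "at" the gaze

def route_input(input_pos, gaze_pos):
    """Direct manipulation when the hand acts where the user looks;
    otherwise the input's effect is redirected to the gazed-at target."""
    if math.dist(input_pos, gaze_pos) <= ALIGNMENT_RADIUS:
        return ("direct", input_pos)   # manipulate at the touch/pen point
    return ("indirect", gaze_pos)      # effect applied at the gaze target
```

The same routing function applies regardless of whether the manual input comes from a pen, a finger, or both, which is what makes the mechanism generic across modalities.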
ubiquitous computing | 2017
Yanxia Zhang; Ken Pfeuffer; Ming Ki Chong; Jason Alexander; Andreas Bulling; Hans-Werner Gellersen
Gaze information provides an indication of a user's focus, which complements remote collaboration tasks, as distant users can see their partner's focus. In this paper, we apply gaze to co-located collaboration, where users' gaze locations are presented on the same display to support collaboration between partners. We integrated various types of gaze indicators into the user interface of a collaborative search system, and we conducted two user studies to understand how gaze enhances coordination and communication between co-located users. Our results show that gaze indeed enhances co-located collaboration, but with a trade-off between the visibility of gaze indicators and user distraction. Users acknowledged that seeing gaze indicators eases communication, because it lets them be aware of their partner's interests and attention. However, users can be reluctant to share their gaze information due to trust and privacy concerns, as gaze potentially divulges their interests.
human factors in computing systems | 2016
Ken Pfeuffer; Jason Alexander; Hans Gellersen
Bimanual pen and touch UIs are mainly based on the direct manipulation paradigm. Alternatively, we propose partially-indirect bimanual input, where direct pen input is used with the dominant hand and indirect touch input with the non-dominant hand. As direct and indirect inputs do not overlap, users can interact in the same space without interference. We investigate two indirect-touch techniques combined with direct pen input: the first redirects touches to the user's gaze position, and the second redirects touches to the pen position. In this paper, we present an empirical user study in which we compare both partially-indirect techniques to direct pen and touch input in bimanual pan, zoom, and ink tasks. Our experimental results show that users are comparatively fast with the indirect techniques, but more accurate, as they can dynamically change the zoom target during indirect zoom gestures. Further, our studies reveal that direct and indirect zoom gestures have distinct characteristics regarding spatial use, gestural use, and bimanual parallelism.
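A minimal sketch of the two redirection variants described above, assuming a simple dict-shaped touch event with a hand label; the event shape and field names are illustrative assumptions.

```python
def redirect_touch(touch, gaze_pos, pen_pos, mode="gaze"):
    """Non-dominant-hand touches act at the gaze or pen position, so indirect
    touch never spatially interferes with direct pen input."""
    if touch["hand"] == "dominant":
        return touch  # the pen hand remains direct
    target = gaze_pos if mode == "gaze" else pen_pos
    return {**touch, "pos": target, "indirect": True}

# Example: a non-dominant pinch lands at the gaze point, not under the fingers.
event = {"hand": "non-dominant", "pos": (40, 900), "type": "pinch"}
print(redirect_touch(event, gaze_pos=(512, 300), pen_pos=(600, 320)))
```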
user interface software and technology | 2016
Ken Pfeuffer; Hans Gellersen
We explore how gaze can support touch interaction on tablets. When holding the device, the free thumb is normally limited in reach but offers an opportunity for indirect touch input. Here we propose gaze and touch input, where touches are redirected to the gaze target. This provides whole-screen reachability while using only a single hand for both holding and input. We present a user study comparing this technique to direct touch, showing that users are slightly slower but benefit from one-handed use with less physical effort. To enable interaction with small targets, we introduce CursorShift, a method that uses gaze to give users temporal control over cursors during direct-touch interactions. Taken together, users can employ three techniques on tablets: direct touch, gaze and touch, and cursor input. In three applications, we explore how these techniques can coexist in the same UI, demonstrate how tablet tasks can be performed with thumb-only input of the holding hand, and with it describe novel techniques for gaze-based tablet interaction.
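A minimal sketch of the thumb-to-gaze redirection plus cursor refinement for small targets, assuming 2D screen coordinates; the class and method names are illustrative assumptions, not the paper's implementation.

```python
class GazeTouchTablet:
    def __init__(self):
        self.cursor = None  # (x, y) cursor, shown once a touch lands

    def on_thumb_down(self, thumb_pos, gaze_pos):
        # The touch takes effect at the gaze target, not under the thumb
        # (thumb_pos is deliberately ignored for targeting), so the holding
        # hand can reach the whole screen.
        self.cursor = gaze_pos
        return ("select", gaze_pos)

    def on_thumb_move(self, dx, dy):
        # Relative thumb movement then nudges the cursor onto small targets.
        if self.cursor is None:
            return None
        x, y = self.cursor
        self.cursor = (x + dx, y + dy)
        return ("move_cursor", self.cursor)
```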
symposium on spatial user interaction | 2017
Ken Pfeuffer; Benedikt Mayer; Diako Mardanbegi; Hans Gellersen
Virtual reality affords experimentation with human abilities beyond what's possible in the real world, toward novel forms of interaction. In many interactions, the eyes naturally point at objects of interest while the hands skilfully manipulate in 3D space. We explore a particular combination for virtual reality, the Gaze + Pinch interaction technique. It integrates eye gaze to select targets and indirect freehand gestures to manipulate them. This keeps gesture use intuitive, like direct physical manipulation, but the gesture's effect can be applied to any object the user looks at, whether located near or far. In this paper, we describe novel interaction concepts and an experimental system prototype that bring together interaction technique variants, menu interfaces, and applications into one unified virtual experience. Proof-of-concept application examples, such as 3D manipulation, scene navigation, and image zooming, were developed and informally tested, illustrating a range of advanced interaction capabilities on targets at any distance, without relying on extra controller devices.
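A minimal sketch of the Gaze + Pinch division of labour: gaze selects the target, pinch-and-move manipulates it. The raycast callable and the dict-shaped object are illustrative assumptions, not the paper's system.

```python
class GazePinch:
    def __init__(self, raycast):
        self.raycast = raycast  # gaze ray -> hit object dict, or None
        self.grabbed = None

    def on_pinch_start(self, gaze_ray):
        # Select whatever the eyes point at, near or far.
        self.grabbed = self.raycast(gaze_ray)

    def on_hand_move(self, dx, dy, dz):
        # Indirect freehand manipulation: hand motion applies to the
        # gaze-selected object as if it were held directly.
        if self.grabbed is not None:
            px, py, pz = self.grabbed["pos"]
            self.grabbed["pos"] = (px + dx, py + dy, pz + dz)

    def on_pinch_end(self):
        self.grabbed = None  # release the object
```

The key design choice is that the hand never has to reach the object: selection distance is whatever the gaze ray can hit, while the gesture stays comfortably at the body.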
international conference on human-computer interaction | 2015
Ken Pfeuffer; Jason Alexander; Hans-Werner Gellersen
Direct touch input is employed on many devices, but it is inherently restricted to displays within the user's reach. Gaze input as a mediator can extend touch to remote displays, using gaze for remote selection and touch for local manipulation, but at what cost and benefit? In this paper, we investigate the potential trade-off with four experiments that empirically compare remote Gaze+touch to standard touch. Our experiments investigate dragging, rotation, and scaling tasks. Results indicate that Gaze+touch is, compared to touch, (1) equally fast and more accurate for rotation and scaling, (2) slower and less accurate for dragging, and (3) able to select smaller targets. Our participants confirm this trend and are positive about the relaxed finger placement of Gaze+touch. Our experiments provide detailed performance characteristics to consider in the design of Gaze+touch interaction with remote displays. We further discuss insights into strengths and drawbacks in contrast to direct touch.
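A minimal sketch of the gaze-select, touch-manipulate split for a remote display, assuming objects as dicts and gesture deltas from a local touch surface; all data shapes here are illustrative assumptions.

```python
def apply_remote_gesture(objects, gaze_pos, gesture):
    """objects: list of dicts with 'pos', 'angle', 'scale'.
    gesture: ('drag', (dx, dy)) | ('rotate', da) | ('scale', factor)."""
    # Gaze performs the remote selection: pick the object nearest the gaze.
    target = min(objects, key=lambda o: (o["pos"][0] - gaze_pos[0]) ** 2
                                        + (o["pos"][1] - gaze_pos[1]) ** 2)
    # Local touch performs the manipulation, wherever the fingers rest.
    kind, value = gesture
    if kind == "drag":
        target["pos"] = (target["pos"][0] + value[0],
                         target["pos"][1] + value[1])
    elif kind == "rotate":
        target["angle"] += value
    elif kind == "scale":
        target["scale"] *= value
    return target
```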
human factors in computing systems | 2017
Ken Pfeuffer; Ken Hinckley; Michel Pahud; Bill Buxton
Modern tablets support simultaneous pen and touch input, but it remains unclear how to best leverage this capability for bimanual input when the non-preferred hand holds the tablet. We explore Thumb + Pen interactions that support simultaneous pen and touch interaction, with both hands, in such situations. Our approach engages the thumb of the device-holding hand, such that the thumb interacts with the touch screen in an indirect manner, thereby complementing the direct input provided by the preferred hand. For instance, the thumb can determine how pen actions (articulated with the opposite hand) are interpreted. Alternatively, the pen can point at an object while the thumb manipulates one or more of its parameters through indirect touch. Our techniques integrate, in a novel way, concepts derived from marking menus, spring-loaded modes, indirect input, and multi-touch conventions. Our overall approach takes the form of a set of probes, each representing a meaningfully distinct class of application. They serve as an initial exploration of the design space at a level that will help determine the feasibility of supporting bimanual interaction in such contexts, and the viability of the Thumb + Pen techniques in doing so.
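A minimal sketch of the first pattern above, where the thumb holds a spring-loaded mode that changes how concurrent pen strokes are interpreted; the mode names and class shape are illustrative assumptions.

```python
class ThumbPenModes:
    def __init__(self):
        self.mode = "ink"  # default pen behaviour

    def on_thumb_press(self, mode):
        # Spring-loaded: the mode holds only while the thumb stays down,
        # e.g. "highlight" or "lasso-select".
        self.mode = mode

    def on_thumb_release(self):
        self.mode = "ink"  # spring back to the default

    def on_pen_stroke(self, stroke):
        # The thumb of the holding hand decides the stroke's interpretation.
        return (self.mode, stroke)
```

Because the mode reverts on release, there is no persistent mode state for the user to forget, which is the usual argument for spring-loaded modes.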
mobile and ubiquitous multimedia | 2016
Ken Pfeuffer; Jason Alexander; Hans Gellersen
Gaze can complement touch on surfaces for fast target selection and occlusion-free input. In this work, we look beyond single-user applications of gaze and touch and explore how gaze can be leveraged for collaborative use. We present the design of a two-player shooter game in which targets are gaze-aware and react differently to attention from one player versus shared attention from both players. The game-play, evaluated in a study with 14 users, encourages players to adopt different strategies, switching between individual and shared attention to achieve their collaborative goal.
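A minimal sketch of a gaze-aware target that distinguishes one player's attention from both players' shared attention, assuming 2D screen coordinates; the radius and the reactions named in the comments are illustrative assumptions, not the game's actual rules.

```python
import math

def attention_state(target_pos, gaze_a, gaze_b, radius=80):
    """Classify a target by which players currently look at it."""
    on_a = math.dist(target_pos, gaze_a) <= radius
    on_b = math.dist(target_pos, gaze_b) <= radius
    if on_a and on_b:
        return "shared"      # e.g. the target becomes vulnerable
    if on_a or on_b:
        return "individual"  # e.g. the target reacts to the single onlooker
    return "unseen"          # e.g. the target behaves normally
```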