
Publication


Featured research published by Alex Butler.


ubiquitous computing | 2006

SenseCam: a retrospective memory aid

Steve Hodges; Lyndsay Williams; Emma Berry; Shahram Izadi; James Srinivasan; Alex Butler; Gavin Smyth; Narinder Kapur; Kenneth R. Wood

This paper presents a novel ubiquitous computing device, the SenseCam, a sensor-augmented wearable stills camera. SenseCam is designed to capture a digital record of the wearer's day, by recording a series of images and capturing a log of sensor data. We believe that reviewing this information will help the wearer recollect aspects of earlier experiences that have subsequently been forgotten, and thereby form a powerful retrospective memory aid. In this paper we review existing work on memory aids and conclude that there is scope for an improved device. We then report on the design of SenseCam in some detail for the first time. We explain the details of a first in-depth user study of this device, a 12-month clinical trial with a patient suffering from amnesia. The results of this initial evaluation are extremely promising; periodic review of images of events recorded by SenseCam results in significant recall of those events by the patient, which was previously impossible. We end the paper with a discussion of future work, including the application of SenseCam to a wider audience, including those with neurodegenerative conditions such as Alzheimer's disease.
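The abstract describes recording a series of images alongside a log of sensor data. As a minimal sketch of one plausible capture policy for such a device, the function below flags moments where a logged sensor value changes sharply; the function name, threshold, and readings are illustrative, not the actual SenseCam firmware.

```python
# Hypothetical sketch: decide when a wearable camera might capture a frame
# based on abrupt changes in a logged sensor value (names and values are
# illustrative, not the actual SenseCam capture logic).

def capture_moments(sensor_log, threshold=10.0):
    """Return indices where the reading jumps by more than `threshold`
    relative to the previous sample -- a crude proxy for 'something changed'."""
    moments = []
    for i in range(1, len(sensor_log)):
        if abs(sensor_log[i] - sensor_log[i - 1]) > threshold:
            moments.append(i)
    return moments

# Example: a light-level log with two sudden transitions.
log = [100, 101, 99, 250, 251, 60, 61]
print(capture_moments(log))  # [3, 5]: dark->bright, then bright->dark
```

Pairing each captured image with the sensor log gives the reviewable record the paper describes.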


user interface software and technology | 2012

Digits: freehand 3D interactions anywhere using a wrist-worn gloveless sensor

David Kim; Otmar Hilliges; Shahram Izadi; Alex Butler; Jiawen Chen; Iason Oikonomidis; Patrick Olivier

Digits is a wrist-worn sensor that recovers the full 3D pose of the user's hand. This enables a variety of freehand interactions on the move. The system targets mobile settings, and is specifically designed to be low-power and easily reproducible using only off-the-shelf hardware. The electronics are self-contained on the user's wrist, but optically image the entirety of the user's hand. This data is processed using a new pipeline that robustly samples key parts of the hand, such as the tips and lower regions of each finger. These sparse samples are fed into new kinematic models that leverage the biomechanical constraints of the hand to recover the 3D pose of the user's hand. The proposed system works without the need for full instrumentation of the hand (for example using data gloves), additional sensors in the environment, or depth cameras, which are currently prohibitive for mobile scenarios due to power and form-factor considerations. We demonstrate the utility of Digits for a variety of application scenarios, including 3D spatial interaction with mobile devices, eyes-free interaction on-the-move, and gaming. We conclude with a quantitative and qualitative evaluation of our system, and discussion of strengths, limitations and future work.
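To illustrate the kind of biomechanical constraint such kinematic models exploit: hand-pose literature often approximates the distal joint (DIP) as flexing at roughly 2/3 of the middle joint (PIP), so one measured angle can stand in for two. The planar forward-kinematics sketch below uses that coupling; the segment lengths and function name are hypothetical and not taken from the paper.

```python
import math

# Hypothetical sketch of a kinematic-constraint idea common in hand-pose
# models: infer the DIP joint angle as 2/3 of the PIP angle, then run
# planar forward kinematics over the finger segments. Lengths (in mm)
# are illustrative only.

def fingertip_position(mcp_deg, pip_deg, lengths=(45.0, 25.0, 20.0)):
    """mcp_deg/pip_deg: flexion angles in degrees; DIP inferred from PIP.
    Returns the fingertip (x, y) relative to the knuckle."""
    dip_deg = (2.0 / 3.0) * pip_deg  # biomechanical coupling approximation
    x = y = angle = 0.0
    for seg_len, joint in zip(lengths, (mcp_deg, pip_deg, dip_deg)):
        angle += math.radians(joint)  # each joint rotates the chain further
        x += seg_len * math.cos(angle)
        y += seg_len * math.sin(angle)
    return x, y

# Fully extended finger: tip lies at the summed segment length along x.
print(fingertip_position(0.0, 0.0))  # (90.0, 0.0)
```

With such a coupling, sparse samples (fingertip plus lower finger regions, as the abstract describes) can over-determine the remaining joint angles.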


user interface software and technology | 2008

SideSight: multi-"touch" interaction around small devices

Alex Butler; Shahram Izadi; Steve Hodges

Interacting with mobile devices using touch can lead to fingers occluding valuable screen real estate. For the smallest devices, the idea of using a touch-enabled display is almost wholly impractical. In this paper we investigate sensing user touch around small screens like these. We describe a prototype device with infra-red (IR) proximity sensors embedded along each side, capable of detecting the presence and position of fingers in the adjacent regions. When this device is rested on a flat surface, such as a table or desk, the user can carry out single and multi-touch gestures using the space around the device. This gives a larger input space than would otherwise be possible, which may be used in conjunction with, or instead of, on-display touch input. Following a detailed description of our prototype, we discuss some of the interactions it affords.
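A minimal sketch of the sensing principle described above: a row of IR proximity readings along one device edge is thresholded, and each contiguous above-threshold run is reported as one finger at the run's centre. The readings, threshold, and function name are illustrative, not the prototype's actual calibration.

```python
# Hypothetical sketch: locate fingers along one edge from an array of IR
# proximity readings (higher value = closer reflector). Illustrative only.

def detect_fingers(readings, threshold=50):
    """Return the position (sensor index, possibly fractional) of each
    detected finger, as the centre of each contiguous above-threshold run."""
    fingers, run = [], []
    for i, r in enumerate(readings + [0]):  # sentinel closes a trailing run
        if r > threshold:
            run.append(i)
        elif run:
            fingers.append(sum(run) / len(run))
            run = []
    return fingers

# Two fingers hovering near sensors 1-2 and sensor 5.
print(detect_fingers([10, 80, 90, 12, 8, 95, 11]))  # [1.5, 5.0]
```

Running the same logic independently on each of the device's four edges yields the multi-"touch" input space around the device.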


user interface software and technology | 2007

ThinSight: versatile multi-touch sensing for thin form-factor displays

Steve Hodges; Shahram Izadi; Alex Butler; Alban Rrustemi; Bill Buxton

ThinSight is a novel optical sensing system, fully integrated into a thin form-factor display, capable of detecting multiple fingers placed on or near the display surface. We describe this new hardware in detail, and demonstrate how it can be embedded behind a regular LCD, allowing sensing without degradation of display capability. With our approach, fingertips and hands are clearly identifiable through the display. The approach of optical sensing also opens up the exciting possibility of detecting other physical objects and visual markers through the display, and some initial experiments are described. We also discuss other novel capabilities of our system: interaction at a distance using IR pointing devices, and IR-based communication with other electronic devices through the display. A major advantage of ThinSight over existing camera and projector based optical systems is its compact, thin form-factor, making such systems even more deployable. We therefore envisage using ThinSight to capture rich sensor data through the display which can be processed using computer vision techniques to enable both multi-touch and tangible interaction.
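As an illustration of the "computer vision techniques" step on the kind of low-resolution image a behind-the-display sensor grid produces, the sketch below thresholds the grid and groups adjacent bright cells into blobs (candidate fingertips) via flood fill. The grid values and function name are made up; this does not reproduce the actual ThinSight pipeline.

```python
# Hypothetical sketch: find fingertip candidates in a low-resolution IR
# sensor image by thresholding and 4-connected blob grouping. Illustrative
# values only; not the actual ThinSight processing code.

def find_blobs(grid, threshold=100):
    """Return one (row, col) centroid per 4-connected bright blob."""
    rows, cols = len(grid), len(grid[0])
    seen, blobs = set(), []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and (r, c) not in seen:
                stack, cells = [(r, c)], []
                seen.add((r, c))
                while stack:  # flood-fill one blob
                    cr, cc = stack.pop()
                    cells.append((cr, cc))
                    for nr, nc in ((cr+1, cc), (cr-1, cc), (cr, cc+1), (cr, cc-1)):
                        if (0 <= nr < rows and 0 <= nc < cols
                                and grid[nr][nc] > threshold
                                and (nr, nc) not in seen):
                            seen.add((nr, nc))
                            stack.append((nr, nc))
                cy = sum(p[0] for p in cells) / len(cells)
                cx = sum(p[1] for p in cells) / len(cells)
                blobs.append((cy, cx))
    return blobs

grid = [
    [0,   0,   0,   0,   0],
    [0, 200, 210,   0,   0],
    [0,   0,   0,   0, 180],
    [0,   0,   0,   0,   0],
]
print(find_blobs(grid))  # [(1.0, 1.5), (2.0, 4.0)]
```

Larger blobs can similarly be classified as whole hands or tangible objects rather than fingertips.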


user interface software and technology | 2009

Mouse 2.0: multi-touch meets the mouse

Nicolas Villar; Shahram Izadi; Dan Rosenfeld; Hrvoje Benko; John Helmes; Jonathan Westhues; Steve Hodges; Eyal Ofek; Alex Butler; Xiang Cao; Billy Chen

In this paper we present novel input devices that combine the standard capabilities of a computer mouse with multi-touch sensing. Our goal is to enrich traditional pointer-based desktop interactions with touch and gestures. To chart the design space, we present five different multi-touch mouse implementations. Each explores a different touch sensing strategy, which leads to differing form-factors and hence interactive possibilities. In addition to the detailed description of hardware and software implementations of our prototypes, we discuss the relative strengths, limitations and affordances of these novel input devices as informed by the results of a preliminary user study.


international conference on computer graphics and interactive techniques | 2007

ThinSight: integrated optical multi-touch sensing through thin form-factor displays

Shahram Izadi; Steve Hodges; Alex Butler; Alban Rrustemi; Bill Buxton

ThinSight is a novel optical sensing system, fully integrated into a thin form factor display, capable of detecting multiple objects such as fingertips placed on or near the display surface. We describe this new hardware, and demonstrate how it can be embedded behind a regular LCD, allowing sensing without compromising display quality. Our aim is to capture rich sensor data through the display, which can be processed using computer vision techniques to enable interaction via multi-touch and physical objects. A major advantage of ThinSight over existing camera and projector based optical systems is its compact, low profile form factor making such interaction techniques more practical and deployable in real-world settings.


user interface software and technology | 2011

Vermeer: direct interaction with a 360° viewable 3D display

Alex Butler; Otmar Hilliges; Shahram Izadi; Steve Hodges; David Molyneaux; David Kim; Danny Kong

We present Vermeer, a novel interactive 360° viewable 3D display. Like prior systems in this area, Vermeer provides viewpoint-corrected, stereoscopic 3D graphics to simultaneous users, 360° around the display, without the need for eyewear or other user instrumentation. Our goal is to overcome an issue inherent in these prior systems, which, typically due to moving parts, restrict interactions to outside the display volume. Our system leverages a known optical illusion to demonstrate, for the first time, how users can reach into and directly touch 3D objects inside the display volume. Vermeer is intended to be a new enabling technology for interaction, and we therefore describe our hardware implementation in full, focusing on the challenges of combining this optical configuration with an existing approach for creating a 360° viewable 3D display. Initially we demonstrate direct in-volume interaction by sensing user input with a Kinect camera placed above the display. However, by exploiting the properties of the optical configuration, we also demonstrate novel prototypes for fully integrated input sensing alongside simultaneous display. We conclude by discussing limitations, implications for interaction, and ideas for future work.


Communications of The ACM | 2009

ThinSight: a thin form-factor interactive surface technology

Shahram Izadi; Steve Hodges; Alex Butler; Darren West; Alban Rrustemi; Michael Molloy; William Buxton

ThinSight is a thin form-factor interactive surface technology based on optical sensors embedded inside a regular liquid crystal display (LCD). These augment the display with the ability to sense a variety of objects near the surface, including fingertips and hands, to enable multitouch interaction. Optical sensing also allows other physical items to be detected, allowing interactions using various tangible objects. A major advantage of ThinSight over existing camera and projector-based systems is its compact form-factor, making it easier to deploy in a variety of settings. We describe how the ThinSight hardware is embedded behind a regular LCD, allowing sensing without degradation of display capability, and illustrate the capabilities of our system through a number of proof-of-concept hardware prototypes and applications.


user interface software and technology | 2009

A reconfigurable ferromagnetic input device

Jonathan Hook; Stuart Taylor; Alex Butler; Nicolas Villar; Shahram Izadi

We present a novel hardware device based on ferromagnetic sensing, capable of detecting the presence, position and deformation of any ferrous object placed on or near its surface. These objects can include ball bearings, magnets, iron filings, and soft malleable bladders filled with ferrofluid. Our technology can be used to build reconfigurable input devices -- where the physical form of the input device can be assembled using combinations of such ferrous objects. This allows users to rapidly construct new forms of input device, such as a trackball-style device based on a single large ball bearing, tangible mixers based on a collection of sliders and buttons with ferrous components, and multi-touch malleable surfaces using a ferrofluid bladder. We discuss the implementation of our technology, its strengths and limitations, and potential application scenarios.
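One simple way to turn a grid of field-strength readings into an object position, sketched below, is an intensity-weighted centroid; the sensor layout, readings, and function name are illustrative assumptions, not the paper's actual signal processing.

```python
# Hypothetical sketch: locate a ferrous object over a grid of magnetic
# field sensors with an intensity-weighted centroid. Illustrative only.

def locate_object(field):
    """field: 2D list of sensor magnitudes. Returns the weighted centroid
    as (row, col), or None if there is no signal at all."""
    total = sum(v for row in field for v in row)
    if total == 0:
        return None
    ry = sum(r * v for r, row in enumerate(field) for v in row) / total
    cx = sum(c * v for row in field for c, v in enumerate(row)) / total
    return ry, cx

# A ball bearing resting between sensors (1,1) and (1,2), nearer (1,1).
field = [
    [0,  0,  0, 0],
    [0, 30, 10, 0],
    [0,  0,  0, 0],
]
print(locate_object(field))  # (1.0, 1.25)
```

Tracking this centroid over time would support trackball-style input; deformation (e.g. of a ferrofluid bladder) would instead show up as a change in the spatial spread of the readings.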


ieee international workshop on horizontal interactive human computer systems | 2008

Experiences with building a thin form-factor touch and tangible tabletop

Shahram Izadi; Alex Butler; Steve Hodges; Darren West; Malcolm Hall; Bill Buxton; Michael Molloy

In this paper we describe extensions to our work on ThinSight, necessary to scale the system to larger tabletop displays. The technique integrates optical sensors into existing off-the-shelf LCDs with minimal impact on the physical form of the display. This allows thin form-factor sensing that goes beyond the capabilities of existing multi-touch techniques, such as capacitive or resistive approaches. Specifically, the technique not only senses multiple fingertips, but outlines of whole hands and other passive tangible objects placed on the surface. It can also support sensing and communication with devices that carry embedded computation such as a mobile phone or an active stylus. We explore some of these possibilities in this paper. Scaling up the implementation to a tabletop has been non-trivial, and has resulted in modifications to the LCD architecture beyond our earlier work. We also discuss these in this paper, to allow others to make practical use of ThinSight.
