
Publication


Featured research published by Sean White.


Human Factors in Computing Systems | 2011

Nenya: subtle and eyes-free mobile input with a magnetically-tracked finger ring

Daniel Ashbrook; Patrick Baudisch; Sean White

We present Nenya, a new input device in the shape of a finger ring. Nenya provides an input mechanism that is always available, fast to access, and allows analog input, while remaining socially acceptable by being embodied in commonly worn items. Users make selections by twisting the ring and click by sliding it along the finger. The ring - the size of a regular wedding band - is magnetic, and is tracked by a wrist-worn sensor. Nenya's tiny size, eyes-free usability, and physical form indistinguishable from a regular ring make its use subtle and socially acceptable. We present two user studies (one- and two-handed) in which we studied sighted and eyes-free use, finding that even with no visual feedback users were able to select from eight targets.


User Interface Software and Technology | 2012

Facet: a multi-segment wrist worn system

Kent Lyons; David Nguyen; Daniel Ashbrook; Sean White

We present Facet, a multi-display wrist-worn system consisting of multiple independent touch-sensitive segments joined into a bracelet. Facet automatically determines the pose of the system as a whole and of each segment individually. It further supports multi-segment touch, yielding a rich set of touch input techniques. Our work builds on these two primitives to allow the user to control how applications use segments alone and in coordination. Applications can expand to use more segments, collapse to encompass fewer, and be swapped with other segments. We also explore how the concepts from Facet could apply to other devices in this design space.


User Interface Software and Technology | 2013

uTrack: 3D input using two magnetic sensors

Ke-Yu Chen; Kent Lyons; Sean White; Shwetak N. Patel

While much progress has been made in wearable computing in recent years, input techniques remain a key challenge. In this paper, we introduce uTrack, a technique to convert the thumb and fingers into a 3D input system using magnetic field (MF) sensing. A user wears a pair of magnetometers on the back of their fingers and a permanent magnet affixed to the back of the thumb. By moving the thumb across the fingers, we obtain a continuous input stream that can be used for 3D pointing. Specifically, our novel algorithm calculates the magnet's 3D position and tilt angle directly from the sensor readings. We evaluated uTrack as an input device, showing an average tracking accuracy of 4.84 mm in 3D space - sufficient for subtle interaction. We also demonstrate a real-time prototype and example applications allowing users to interact with the computer using 3D finger input.
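The abstract does not describe the tracking algorithm itself, but position estimation from magnetometer readings of a permanent magnet typically builds on the standard point-dipole field model, which an estimator can then invert numerically. The sketch below is illustrative physics only, not the uTrack algorithm: it evaluates the dipole field at a sensor offset, the "forward model" such a tracker would fit against measured readings.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T·m/A)

def dipole_field(r, m):
    """Magnetic flux density (tesla) at offset r (meters) from a point
    dipole with moment m (A·m^2), via the standard dipole equation:
    B = mu0/(4*pi*|r|^3) * (3 (m·r̂) r̂ - m)."""
    r = np.asarray(r, dtype=float)
    m = np.asarray(m, dtype=float)
    d = np.linalg.norm(r)
    rhat = r / d
    return MU0 / (4 * np.pi * d**3) * (3 * np.dot(m, rhat) * rhat - m)

# Example: field 2 cm along the axis of a small magnet pointing along +z.
# A tracker would search for the (position, orientation) whose predicted
# fields best match the readings at each magnetometer.
B = dipole_field([0.0, 0.0, 0.02], [0.0, 0.0, 0.1])  # B[2] ≈ 2.5 mT
```

Because a single three-axis reading does not determine position and orientation uniquely, using a pair of magnetometers (as the abstract describes) gives the estimator enough constraints to disambiguate.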


Handbook of Augmented Reality | 2011

Visualization Techniques for Augmented Reality

Denis Kalkofen; Christian Sandor; Sean White; Dieter Schmalstieg

Visualizations in real world environments benefit from the visual interaction between real and virtual imagery. However, compared to traditional visualizations, a number of problems have to be solved in order to achieve effective visualizations within Augmented Reality (AR). This chapter provides an overview of techniques to handle the main obstacles in AR visualizations. It discusses spatial integration of virtual objects within real world environments, techniques to rearrange objects within mixed environments, and visualizations which adapt to their environmental context.


User Interface Software and Technology | 2011

RhythmLink: securely pairing I/O-constrained devices by tapping

Felix Xiaozhu Lin; Daniel Ashbrook; Sean White

We present RhythmLink, a system that improves the wireless pairing user experience. Users can link devices such as phones and headsets together by tapping a known rhythm on each device. In contrast to current solutions, RhythmLink does not require user interaction with the host device during the pairing process; and it only requires binary input on the peripheral, making it appropriate for small devices with minimal physical affordances. We describe the challenges in enabling this user experience and our solution, an algorithm that allows two devices to compare imprecisely-entered tap sequences while maintaining the secrecy of those sequences. We also discuss our prototype implementation of RhythmLink and review the results of initial user tests.
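RhythmLink's actual algorithm compares imprecisely entered tap sequences while keeping them secret; the sketch below illustrates only the fuzzy-matching half of that problem, with no secrecy property, and is a generic illustration rather than the authors' method. It compares tempo-normalized inter-tap intervals within a tolerance, so the same rhythm tapped faster or slower on two devices still matches.

```python
def intervals(taps):
    """Inter-tap intervals of a tap-timestamp sequence, normalized by
    total duration so overall tempo does not matter."""
    gaps = [b - a for a, b in zip(taps, taps[1:])]
    total = sum(gaps)
    return [g / total for g in gaps]

def rhythms_match(taps_a, taps_b, tol=0.1):
    """True if two tap sequences share the same rhythm, allowing each
    normalized interval to differ by up to `tol`."""
    ia, ib = intervals(taps_a), intervals(taps_b)
    if len(ia) != len(ib):
        return False
    return all(abs(x - y) <= tol for x, y in zip(ia, ib))

# The same rhythm tapped on two devices at different tempos, with jitter:
phone = [0.00, 0.40, 0.60, 0.80, 1.20]
headset = [0.00, 0.21, 0.31, 0.42, 0.61]
print(rhythms_match(phone, headset))  # → True
```

A direct comparison like this requires one device to learn the other's sequence, which is exactly what the paper's secrecy-preserving comparison avoids.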


Human Factors in Computing Systems | 2013

Exploring the interaction design space for interactive glasses

Andrés Lucero; Kent Lyons; Akos Vetek; Toni Järvenpää; Sean White; Marja Salmimaa

In this paper, we explore the interaction design space for interactive glasses. We discuss general issues with interactive glasses (i.e., optics, technology, social, form factors), and then concentrate on the nature of interaction with glasses and its implications for providing a delightful user experience with NotifEye.


User Interface Software and Technology | 2013

BitWear: a platform for small, connected, interactive devices

Kent Lyons; David Nguyen; Shigeyuki Seki; Sean White; Daniel Ashbrook; Halley Profita

We describe BitWear, a platform for prototyping small, wireless, interactive devices. BitWear incorporates hardware, wireless connectivity, and a cloud component to enable collections of connected devices. We are using this platform to create, explore, and experiment with a multitude of wearable and deployable physical forms and interactions.


International Symposium on Mixed and Augmented Reality | 2011

Comparing spatial understanding between touch-based and AR-style interaction

Jason Wither; Sean White; Ronald Azuma

There are currently two primary ways of viewing location-specific information in-situ on hand-held mobile device screens: using a see-through augmented reality interface and using a touch-based interface with panoramas. The two approaches use fundamentally different interaction metaphors: an AR-style of interacting where the user holds up the device and physically moves it to change views of the world, and a touch-based technique where panorama navigation is independent of the physical world. We have investigated how this difference in interaction technique impacts a user's spatial understanding of the mixed reality world. Our study found that AR-style interaction provided better spatial understanding overall, while touch-based interaction changed the experience to have more similar characteristics to interaction in a separate virtual environment.


Archive | 2011

Dynamic, Abstract Representations of Audio in a Mobile Augmented Reality Conferencing System

Sean White; Steven Feiner

We describe a wearable audio conferencing and information presentation system that represents individual participants and audio elements through dynamic, visual abstractions, presented on a tracked, see-through head-worn display. Our interest is in communication spaces, annotation, and data that are represented by auditory media with synchronistic or synesthetic visualizations. Representations can transition between different spatial modalities as audio elements enter and exit the wearer’s physical presence. In this chapter, we discuss the user interface and infrastructure, SoundSight, which uses the Skype Internet telephony API to support wireless conferencing, and describe our early experience using the system.


International Symposium on Mixed and Augmented Reality | 2011

Enabling large-scale outdoor Mixed Reality and Augmented Reality

Steven Feiner; Thommen Korah; David Joseph Murphy; Vasu Parameswaran; Matei Stroila; Sean White

While there is significant recent progress in technologies supporting augmented reality for small indoor environments, there is still much work to be done for large outdoor environments. This workshop focuses primarily on research that enables high-quality outdoor Mixed Reality (MR) and Augmented Reality (AR) applications. These research topics include, but are not restricted to:

- 3D geo-referenced data (images, point clouds, and models)
- Algorithms for object recognition from large databases of geo-referenced data
- Algorithms for object tracking in outdoor environments
- Multi-cue fusion to achieve improved performance of object detection and tracking
- Novel representation schemes to facilitate large-scale content distribution
- 3D reasoning to support intelligent augmentation
- Novel and improved mobile capabilities for data capture (device sensors), processing, and display
- Applications, experiences, and user interface techniques

The workshop will also showcase existing prototypes of applications enabled by these technologies: mirror worlds, high-fidelity virtual environments, applications of panoramic imagery, and user studies relating to these media types. This workshop aims to bring together academic and industrial researchers and to foster discussion amongst participants on the current state of the art and future directions for technologies that enable large-scale outdoor MR and AR applications. The workshop will start with a session in which position statements and overviews of the state of the art are presented. In the afternoon, we will follow up with discussion sessions and a short closing session.
