Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Desney S. Tan is active.

Publication


Featured research published by Desney S. Tan.


Human Factors in Computing Systems | 2010

Skinput: appropriating the body as an input surface

Chris Harrison; Desney S. Tan; Dan Morris

We present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. In particular, we resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body. We collect these signals using a novel array of sensors worn as an armband. This approach provides an always available, naturally portable, and on-body finger input system. We assess the capabilities, accuracy and limitations of our technique through a two-part, twenty-participant user study. To further illustrate the utility of our approach, we conclude with several proof-of-concept applications we developed.
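To illustrate the kind of pipeline such a system implies, the following is a minimal sketch under assumed feature choices and data shapes, not the authors' implementation: a tap snippet from the armband's sensor channels is reduced to a few amplitude and frequency features and fed to an off-the-shelf classifier.

```python
# Minimal sketch (not the Skinput implementation): classify the location of a
# finger tap on the arm from a short multi-channel vibration snippet captured
# by an armband of sensors. Feature choices and shapes are illustrative.
import numpy as np
from sklearn.svm import SVC

def tap_features(snippet, fs=5500):
    """snippet: (n_channels, n_samples) band-pass-filtered vibration data."""
    feats = []
    for ch in snippet:
        spectrum = np.abs(np.fft.rfft(ch))
        feats.extend([
            np.max(np.abs(ch)),                   # peak amplitude on this channel
            np.sqrt(np.mean(ch ** 2)),            # RMS energy
            np.argmax(spectrum) * fs / len(ch),   # dominant frequency (Hz)
        ])
    return np.array(feats)

def train_tap_classifier(snippets, labels):
    """snippets: labeled tap recordings; labels: tap-location ids (e.g. 0..4)."""
    X = np.vstack([tap_features(s) for s in snippets])
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(X, labels)
    return clf

def classify_tap(clf, snippet):
    return clf.predict(tap_features(snippet)[None, :])[0]
```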


User Interface Software and Technology | 2009

Enabling always-available input with muscle-computer interfaces

T. Scott Saponas; Desney S. Tan; Dan Morris; Ravin Balakrishnan; Jim Turner; James A. Landay

Previous work has demonstrated the viability of applying offline analysis to interpret forearm electromyography (EMG) and classify finger gestures on a physical surface. We extend those results to bring us closer to using muscle-computer interfaces for always-available input in real-world applications. We leverage existing taxonomies of natural human grips to develop a gesture set covering interaction in free space even when hands are busy with other objects. We present a system that classifies these gestures in real-time and we introduce a bi-manual paradigm that enables use in interactive systems. We report experimental results demonstrating four-finger classification accuracies averaging 79% for pinching, 85% while holding a travel mug, and 88% when carrying a weighted bag. We further show generalizability across different arm postures and explore the tradeoffs of providing real-time visual feedback.
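A minimal sketch of such a real-time loop, assuming windowed mean-absolute-value and RMS features and a pre-trained classifier; the acquisition function read_emg_window and all constants are placeholders, not the paper's system.

```python
# Illustrative real-time loop (not the paper's system): classify finger gestures
# from streaming forearm EMG using windowed amplitude features and a pre-trained
# classifier. `read_emg_window` stands in for whatever acquisition API is used.
import numpy as np

WINDOW_MS = 250      # analysis window length (assumed)
N_CHANNELS = 8       # number of EMG electrodes (assumed)

def emg_features(window):
    """window: (N_CHANNELS, n_samples) raw EMG; returns per-channel MAV and RMS."""
    mav = np.mean(np.abs(window), axis=1)
    rms = np.sqrt(np.mean(window ** 2, axis=1))
    return np.concatenate([mav, rms])

def run_realtime(classifier, read_emg_window, on_gesture):
    """Poll the sensor, classify each window, and fire a callback with the label."""
    while True:
        window = read_emg_window(WINDOW_MS)      # blocks until a window is ready
        label = classifier.predict(emg_features(window)[None, :])[0]
        on_gesture(label)                        # e.g. trigger a UI action or feedback
```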


Human Factors in Computing Systems | 2001

Exploring 3D navigation: combining speed-coupled flying with orbiting

Desney S. Tan; George G. Robertson; Mary Czerwinski

We present a task-based taxonomy of navigation techniques for 3D virtual environments, used to categorize existing techniques, drive exploration of the design space, and inspire new techniques. We briefly discuss several new techniques, and describe in detail one new technique, Speed-coupled Flying with Orbiting. This technique couples control of movement speed to camera height and tilt, allowing users to seamlessly transition between local environment views and global overviews. Users can also orbit specific objects for inspection. Results from two comparative user studies suggest that users performed better with Speed-coupled Flying with Orbiting than with alternatives, with performance further enhanced by a large display.
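The core coupling can be sketched as a simple mapping from travel speed to camera pose; the constants below are illustrative, not values from the paper.

```python
# Sketch of the speed-coupling idea only (constants are illustrative): faster
# travel raises the camera and tilts it down toward an overview; slowing down
# returns it to a local, ground-level view.
def camera_pose_for_speed(speed, max_speed=30.0,
                          min_height=2.0, max_height=120.0,
                          min_tilt_deg=0.0, max_tilt_deg=60.0):
    """Map current travel speed to camera height and downward tilt."""
    t = max(0.0, min(1.0, speed / max_speed))   # normalized speed in [0, 1]
    height = min_height + t * (max_height - min_height)
    tilt = min_tilt_deg + t * (max_tilt_deg - min_tilt_deg)
    return height, tilt  # apply with per-frame smoothing to avoid jarring jumps
```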


Human Factors in Computing Systems | 2004

WinCuts: manipulating arbitrary window regions for more effective use of screen space

Desney S. Tan; Brian Meyers; Mary Czerwinski

Each window on our computer desktop provides a view into some information. Although users can currently manipulate multiple windows, we assert that being able to spatially arrange smaller regions of these windows could help users perform certain tasks more efficiently. In this paper, we describe a novel interaction technique that allows users to replicate arbitrary regions of existing windows into independent windows called WinCuts. Each WinCut is a live view of a region of the source window with which users can interact. We also present an extension that allows users to share WinCuts across multiple devices. Next, we classify the set of tasks for which WinCuts may be useful, in both single- and multiple-device scenarios. We present high-level implementation details so that other researchers can replicate this work. Finally, we discuss future work that we will pursue in extending these ideas.
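WinCuts clones live, interactive regions of application windows, which requires windowing-system support. As a rough, read-only analogue (an assumption-laden sketch, not the authors' implementation), one can periodically capture a screen rectangle and show it in its own window using the third-party mss and OpenCV packages.

```python
# Rough analogue only (not the WinCuts implementation): periodically capture a
# rectangle of the screen and show it in its own small window. Assumes the
# third-party `mss` and `opencv-python` packages are installed; unlike WinCuts,
# this view does not forward input back to the source window.
import numpy as np
import cv2
import mss

def show_region_cut(left, top, width, height, fps=15):
    region = {"left": left, "top": top, "width": width, "height": height}
    with mss.mss() as capture:
        while True:
            frame = np.array(capture.grab(region))   # BGRA screenshot of the region
            cv2.imshow("WinCut (read-only view)", frame)
            if cv2.waitKey(int(1000 / fps)) & 0xFF == ord("q"):
                break
    cv2.destroyAllWindows()
```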


IEEE Computer Graphics and Applications | 2005

The large-display user experience

George G. Robertson; Mary Czerwinski; Patrick Baudisch; Brian Meyers; Daniel C. Robbins; Greg Smith; Desney S. Tan

The PC's increasing graphics-processing power is fueling demand for larger and more capable display devices. Several operating systems have supported work with multiple displays for some time. This fact, coupled with graphics-card advancements, has led to an increase in multiple-monitor (multimon) use. Large displays offer users significant benefits but also pose usability challenges. In this article, the authors discuss those challenges along with novel techniques to address them.


Human Factors in Computing Systems | 2002

Women take a wider view

Mary Czerwinski; Desney S. Tan; George G. Robertson

Published reports suggest that males significantly outperform females in navigating virtual environments. A novel navigation technique reported at CHI 2001, when combined with a large display and wide field of view, appeared to reduce that gender bias. That work has been extended with two navigation studies in order to understand the finding under carefully controlled conditions. The first study replicated the finding that a wide field of view coupled with a large display benefits both male and female users and reduces gender bias. The second study suggested that wide fields of view on a large display remained useful to females even in a more densely populated virtual world. Implications for the design of virtual worlds and large displays are discussed. Specifically, women benefit from a wider field of view to achieve virtual-environment navigation performance similar to men's.


Human Factors in Computing Systems | 2008

Demonstrating the feasibility of using forearm electromyography for muscle-computer interfaces

T. Scott Saponas; Desney S. Tan; Dan Morris; Ravin Balakrishnan

We explore the feasibility of muscle-computer interfaces (muCIs): an interaction methodology that directly senses and decodes human muscular activity rather than relying on physical device actuation or user actions that are externally visible or audible. As a first step towards realizing the muCI concept, we conducted an experiment to explore the potential of exploiting muscular sensing and processing technologies for muCIs. We present results demonstrating accurate gesture classification with an off-the-shelf electromyography (EMG) device. Specifically, using 10 sensors worn in a narrow band around the upper forearm, we were able to differentiate the position and pressure of finger presses, as well as classify tapping and lifting gestures across all five fingers. We conclude with a discussion of the implications of our results for future muCI designs.
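In the spirit of the offline analysis described here, the following sketch estimates how well gestures can be separated from labeled EMG windows; the RMS features, classifier, and data shapes are assumptions, not the paper's exact method.

```python
# Offline-analysis sketch (details are assumptions): given labeled EMG windows
# from a 10-sensor forearm band, estimate how separable the finger gestures are.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def evaluate_gesture_separability(windows, labels):
    """windows: (n_examples, 10, n_samples) EMG; labels: gesture id per example."""
    rms = np.sqrt(np.mean(np.asarray(windows) ** 2, axis=2))  # (n_examples, 10)
    X_train, X_test, y_train, y_test = train_test_split(
        rms, labels, test_size=0.25, random_state=0, stratify=labels)
    clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)
    return accuracy_score(y_test, clf.predict(X_test))
```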


ACM Transactions on Computer-Human Interaction | 2006

Physically large displays improve performance on spatial tasks

Desney S. Tan; Darren Gergle; Peter Scupelli; Randy Pausch

Large wall-sized displays are becoming prevalent. Although researchers have articulated qualitative benefits of group work on large displays, little work has been done to quantify the benefits for individual users. In this article we present four experiments comparing the performance of users working on a large projected wall display to that of users working on a standard desktop monitor. In these experiments, we held the visual angle constant by adjusting the viewing distance to each of the displays. Results from the first two experiments suggest that physically large displays, even when viewed at identical visual angles as smaller ones, help users perform better on mental rotation tasks. We show through the experiments how these results may be attributed, at least in part, to large displays immersing users within the problem space and biasing them into using more efficient cognitive strategies. In the latter two experiments, we extend these results, showing the presence of these effects with more complex tasks, such as 3D navigation and mental map formation and memory. Results further show that the effects of physical display size are independent of other factors that may induce immersion, such as interactivity and mental aids within the virtual environments. We conclude with a general discussion of the findings and possibilities for future work.
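The visual-angle control used in these experiments follows directly from the viewing geometry: to keep the angle constant, viewing distance must scale in proportion to physical display width. A worked example with illustrative numbers:

```python
# Worked example of the visual-angle control (numbers are illustrative): to hold
# the visual angle constant, viewing distance scales with physical display width.
import math

def visual_angle_deg(display_width, viewing_distance):
    return math.degrees(2 * math.atan(display_width / (2 * viewing_distance)))

def matched_distance(small_width, small_distance, large_width):
    """Distance at which the large display subtends the same angle as the small one."""
    return small_distance * (large_width / small_width)

# A 0.4 m monitor viewed at 0.7 m and a 2.0 m projected wall viewed at 3.5 m
# both subtend roughly the same visual angle (about 32 degrees).
assert abs(visual_angle_deg(0.4, 0.7)
           - visual_angle_deg(2.0, matched_distance(0.4, 0.7, 2.0))) < 1e-9
```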


Human Factors in Computing Systems | 2012

SoundWave: using the doppler effect to sense gestures

Sidhant Gupta; Dan Morris; Shwetak N. Patel; Desney S. Tan

Gesture is becoming an increasingly popular means of interacting with computers. However, it is still relatively costly to deploy robust gesture recognition sensors in existing mobile platforms. We present SoundWave, a technique that leverages the speaker and microphone already embedded in most commodity devices to sense in-air gestures around the device. To do this, we generate an inaudible tone, which gets frequency-shifted when it reflects off moving objects like the hand. We measure this shift with the microphone to infer various gestures. In this note, we describe the phenomena and detection algorithm, demonstrate a variety of gestures, and present an informal evaluation on the robustness of this approach across different devices and people.
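A simplified sketch of the sensing principle (not the SoundWave pipeline, which is more robust): locate the pilot tone in a microphone frame's spectrum, measure how far the dominant peak has shifted, and convert that shift to reflector velocity via the two-way Doppler relation Δf ≈ 2vf/c. Tone playback and recording are assumed to be handled elsewhere.

```python
# Sketch of the sensing principle only: estimate the velocity of a reflecting
# hand from the frequency shift of an inaudible pilot tone in one mic frame.
import numpy as np

PILOT_HZ = 18_000        # inaudible pilot tone frequency (assumed)
SAMPLE_RATE = 44_100     # microphone sample rate (assumed)
SPEED_OF_SOUND = 343.0   # m/s at room temperature

def doppler_velocity(mic_frame, search_bw_hz=500):
    """Estimate reflector velocity (m/s) from one frame of microphone samples.
    Positive values mean the hand is approaching the device."""
    spectrum = np.abs(np.fft.rfft(mic_frame * np.hanning(len(mic_frame))))
    freqs = np.fft.rfftfreq(len(mic_frame), d=1.0 / SAMPLE_RATE)
    band = (freqs > PILOT_HZ - search_bw_hz) & (freqs < PILOT_HZ + search_bw_hz)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    shift_hz = peak_hz - PILOT_HZ
    return shift_hz * SPEED_OF_SOUND / (2 * PILOT_HZ)
```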


Human Factors in Computing Systems | 2003

With similar visual angles, larger displays improve spatial performance

Desney S. Tan; Darren Gergle; Peter Scupelli; Randy Pausch

Large wall-sized displays are becoming prevalent. Although researchers have articulated qualitative benefits of group work on large displays, little work has been done to quantify the benefits for individual users. We ran two studies comparing the performance of users working on a large projected wall display to that of users working on a standard desktop monitor. In these studies, we held the visual angle constant by adjusting the viewing distance to each of the displays. Results from the first study indicate that although there was no significant difference in performance on a reading comprehension task, users performed about 26% better on a spatial orientation task done on the large display. Results from the second study suggest that the large display affords a greater sense of presence, allowing users to treat the spatial task as an egocentric rather than an exocentric rotation. We discuss future work to extend our findings and formulate design principles for computer interfaces and physical workspaces.

Collaboration


Dive into Desney S. Tan's collaborations.

Top Co-Authors

Muriel Y. Ishikawa

Lawrence Livermore National Laboratory


Roderick A. Hyde

Lawrence Livermore National Laboratory
