Publication


Featured research published by Dan Morris.


Human Factors in Computing Systems | 2010

Skinput: appropriating the body as an input surface

Chris Harrison; Desney S. Tan; Dan Morris

We present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. In particular, we resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body. We collect these signals using a novel array of sensors worn as an armband. This approach provides an always-available, naturally portable, and on-body finger input system. We assess the capabilities, accuracy and limitations of our technique through a two-part, twenty-participant user study. To further illustrate the utility of our approach, we conclude with several proof-of-concept applications we developed.


User Interface Software and Technology | 2009

Enabling always-available input with muscle-computer interfaces

T. Scott Saponas; Desney S. Tan; Dan Morris; Ravin Balakrishnan; Jim Turner; James A. Landay

Previous work has demonstrated the viability of applying offline analysis to interpret forearm electromyography (EMG) and classify finger gestures on a physical surface. We extend those results to bring us closer to using muscle-computer interfaces for always-available input in real-world applications. We leverage existing taxonomies of natural human grips to develop a gesture set covering interaction in free space even when hands are busy with other objects. We present a system that classifies these gestures in real-time and we introduce a bi-manual paradigm that enables use in interactive systems. We report experimental results demonstrating four-finger classification accuracies averaging 79% for pinching, 85% while holding a travel mug, and 88% when carrying a weighted bag. We further show generalizability across different arm postures and explore the tradeoffs of providing real-time visual feedback.


Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2007

Haptic Feedback Enhances Force Skill Learning

Dan Morris; Hong Z. Tan; Federico Barbagli; Timothy Chang; Kenneth Salisbury

This paper explores the use of haptic feedback to teach an abstract motor skill that requires recalling a sequence of forces. Participants are guided along a trajectory and are asked to learn a sequence of one-dimensional forces via three paradigms: haptic training, visual training, or combined visuohaptic training. The extent of learning is measured by accuracy of force recall. We find that recall following visuohaptic training is significantly more accurate than recall following visual or haptic training alone, although haptic training alone is inferior to visual training alone. This suggests that in conjunction with visual feedback, haptic training may be an effective tool for teaching sensorimotor skills that have a force-sensitive component, such as surgery. We also present a dynamic programming paradigm to align and compare spatiotemporal haptic trajectories.
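The trajectory alignment mentioned at the end of the abstract is a dynamic-programming alignment of time series. A minimal sketch of that idea, using classic dynamic time warping over 1-D force samples (function name and toy data are illustrative, not taken from the paper):

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences.

    Classic O(len(a) * len(b)) dynamic program: the cost of aligning
    a[:i] with b[:j] is the local difference plus the cheapest of the
    three predecessor alignments.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    # dp[i][j] = cost of aligning the first i samples of a with the first j of b
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # stretch a
                                  dp[i][j - 1],      # stretch b
                                  dp[i - 1][j - 1])  # step both
    return dp[n][m]

# Two force profiles that differ only in timing align with zero cost:
slow = [0.0, 1.0, 1.0, 2.0, 1.0, 0.0]
fast = [0.0, 1.0, 2.0, 1.0, 0.0]
print(dtw_distance(slow, fast))  # → 0.0
```

Because the warp absorbs timing differences, the returned distance reflects only force-profile mismatch, which is the property that makes such alignment useful for comparing haptic trajectories.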


Human Factors in Computing Systems | 2008

Demonstrating the feasibility of using forearm electromyography for muscle-computer interfaces

T. Scott Saponas; Desney S. Tan; Dan Morris; Ravin Balakrishnan

We explore the feasibility of muscle-computer interfaces (muCIs): an interaction methodology that directly senses and decodes human muscular activity rather than relying on physical device actuation or user actions that are externally visible or audible. As a first step towards realizing the muCI concept, we conducted an experiment to explore the potential of exploiting muscular sensing and processing technologies for muCIs. We present results demonstrating accurate gesture classification with an off-the-shelf electromyography (EMG) device. Specifically, using 10 sensors worn in a narrow band around the upper forearm, we were able to differentiate position and pressure of finger presses, as well as classify tapping and lifting gestures across all five fingers. We conclude with discussion of the implications of our results for future muCI designs.


IEEE Computer Graphics and Applications | 2006

Visuohaptic simulation of bone surgery for training and evaluation

Dan Morris; Christopher Sewell; Federico Barbagli; Kenneth Salisbury; Nikolas H. Blevins; Sabine Girod

Visual and haptic simulation of bone surgery can support and extend current surgical training techniques. The authors present a system for simulating surgeries involving bone manipulation, such as temporal bone surgery and mandibular surgery, and discuss the automatic computation of surgical performance metrics. Experimental results confirm the system's construct validity.


Human Factors in Computing Systems | 2012

SoundWave: using the doppler effect to sense gestures

Sidhant Gupta; Dan Morris; Shwetak N. Patel; Desney S. Tan

Gesture is becoming an increasingly popular means of interacting with computers. However, it is still relatively costly to deploy robust gesture recognition sensors in existing mobile platforms. We present SoundWave, a technique that leverages the speaker and microphone already embedded in most commodity devices to sense in-air gestures around the device. To do this, we generate an inaudible tone, which gets frequency-shifted when it reflects off moving objects like the hand. We measure this shift with the microphone to infer various gestures. In this note, we describe the phenomena and detection algorithm, demonstrate a variety of gestures, and present an informal evaluation on the robustness of this approach across different devices and people.
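The frequency shift described above follows the standard two-way Doppler relation Δf ≈ 2·v·f/c. A minimal sketch of the expected shift (not SoundWave's implementation; the tone frequency and hand velocity are illustrative values):

```python
SPEED_OF_SOUND = 343.0  # m/s, dry air at ~20 °C

def doppler_shift_hz(tone_hz, hand_velocity_mps):
    """Approximate two-way Doppler shift for a tone reflected off a
    moving surface: the reflector both receives and re-emits a shifted
    frequency, so the shift is roughly doubled (valid for v << c)."""
    return 2.0 * hand_velocity_mps * tone_hz / SPEED_OF_SOUND

# An 18 kHz pilot tone and a hand moving toward the device at 0.5 m/s:
shift = doppler_shift_hz(18_000, 0.5)
print(f"{shift:.1f} Hz")  # → 52.5 Hz
```

A shift of tens of hertz around a fixed pilot tone is comfortably resolvable in an audio FFT, which is why a commodity speaker and microphone suffice for this kind of sensing.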


International ACM SIGIR Conference on Research and Development in Information Retrieval | 2007

Investigating the querying and browsing behavior of advanced search engine users

Ryen W. White; Dan Morris

One way to help all users of commercial Web search engines be more successful in their searches is to better understand what those users with greater search expertise are doing, and use this knowledge to benefit everyone. In this paper we study the interaction logs of advanced search engine users (and those not so advanced) to better understand how these user groups search. The results show that there are marked differences in the queries, result clicks, post-query browsing, and search success of users we classify as advanced (based on their use of query operators), relative to those classified as non-advanced. Our findings have implications for how advanced users should be supported during their searches, and how their interactions could be used to help searchers of all experience levels find more relevant information and learn improved searching strategies.


Human Factors in Computing Systems | 2008

MySong: automatic accompaniment generation for vocal melodies

Ian Simon; Dan Morris; Sumit Basu

We introduce MySong, a system that automatically chooses chords to accompany a vocal melody. A user with no musical experience can create a song with instrumental accompaniment just by singing into a microphone, and can experiment with different styles and chord patterns using interactions designed to be intuitive to non-musicians. We describe the implementation of MySong, which trains a Hidden Markov Model using a music database and uses that model to select chords for new melodies. Model parameters are intuitively exposed to the user. We present results from a study demonstrating that chords assigned to melodies using MySong and chords assigned manually by musicians receive similar subjective ratings. We then present results from a second study showing that thirteen users with no background in music theory are able to rapidly create musical accompaniments using MySong, and that these accompaniments are rated positively by evaluators.
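The chord-selection step described above is standard HMM decoding: hidden chord states, observed melody notes, and a most-likely state sequence. A toy Viterbi sketch of that idea (the chord states, probabilities, and melody are invented for illustration, not MySong's learned parameters):

```python
import math

def viterbi(observations, states, start_p, trans_p, emit_p):
    """Most-likely hidden state sequence under an HMM (log-space Viterbi)."""
    # best[s] = (log prob of best path ending in state s, that path)
    best = {s: (math.log(start_p[s]) + math.log(emit_p[s][observations[0]]), [s])
            for s in states}
    for obs in observations[1:]:
        nxt = {}
        for s in states:
            # Pick the best predecessor state for s, then emit obs from s.
            lp, prev = max(
                (best[p][0] + math.log(trans_p[p][s]), p) for p in states)
            nxt[s] = (lp + math.log(emit_p[s][obs]), best[prev][1] + [s])
        best = nxt
    return max(best.values())[1]

# Toy model: two chord states ("C", "G") emitting melody pitch classes.
states = ["C", "G"]
start_p = {"C": 0.6, "G": 0.4}
trans_p = {"C": {"C": 0.7, "G": 0.3}, "G": {"C": 0.4, "G": 0.6}}
emit_p = {"C": {"c": 0.5, "e": 0.4, "d": 0.1},
          "G": {"c": 0.1, "e": 0.2, "d": 0.7}}
melody = ["c", "e", "d", "d"]
print(viterbi(melody, states, start_p, trans_p, emit_p))  # → ['C', 'C', 'G', 'G']
```

In MySong's setting the transition and emission tables are learned from a music database, and the user-facing style controls amount to reweighting those model parameters before decoding.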


Human Factors in Computing Systems | 2012

Humantenna: using the body as an antenna for real-time whole-body interaction

Gabe Cohn; Dan Morris; Shwetak N. Patel; Desney S. Tan

Computer vision and inertial measurement have made it possible for people to interact with computers using whole-body gestures. Although there has been rapid growth in the uses and applications of these systems, their ubiquity has been limited by the high cost of heavily instrumenting either the environment or the user. In this paper, we use the human body as an antenna for sensing whole-body gestures. Such an approach requires no instrumentation to the environment, and only minimal instrumentation to the user, and thus enables truly mobile applications. We show robust gesture recognition with an average accuracy of 93% across 12 whole-body gestures, and promising results for robust location classification within a building. In addition, we demonstrate a real-time interactive system which allows a user to interact with a computer using whole-body gestures.


Human Factors in Computing Systems | 2010

Making muscle-computer interfaces more practical

T. Scott Saponas; Desney S. Tan; Dan Morris; Jim Turner; James A. Landay

Recent work in muscle sensing has demonstrated the potential of human-computer interfaces based on finger gestures sensed from electrodes on the upper forearm. While this approach holds much potential, previous work has given little attention to sensing finger gestures in the context of three important real-world requirements: sensing hardware suitable for mobile and off-desktop environments, electrodes that can be put on quickly without adhesives or gel, and gesture recognition techniques that require no new training or calibration after re-donning a muscle-sensing armband. In this note, we describe our approach to overcoming these challenges, and we demonstrate average classification accuracies as high as 86% for pinching with one of three fingers in a two-session, eight-person experiment.
