
Publication


Featured research published by Troy L. McDaniel.


Human Factors in Computing Systems | 2010

VibroGlove: an assistive technology aid for conveying facial expressions

Sreekar Krishna; Shantanu Bala; Troy L. McDaniel; Stephen McGuire; Sethuraman Panchanathan

In this paper, a novel interface is described for enhancing human-human interpersonal interactions. Specifically, the device is targeted as an assistive aid to deliver the facial expressions of an interaction partner to people who are blind or visually impaired. Vibro-tactors, mounted on the back of a glove, provide a means for conveying haptic emoticons that represent the six basic human emotions and the neutral expression of the user's interaction partner. The detailed design of the haptic interface and haptic icons of expressions are presented, along with a user study involving a subject who is blind, as well as sighted, blindfolded participants. Results reveal the potential for enriching social communication for people with visual disabilities.
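
The emotion-to-icon mapping described above can be pictured as a small lookup table that drives the glove-mounted vibrotactors. The Python sketch below illustrates that structure only; the motor indices, pattern timings, and driver callbacks are all assumptions, not the patterns from the paper.

import time

# Each haptic emoticon is a sequence of (motor_indices, duration_s) steps
# played on vibrotactors mounted on the back of the glove (layout assumed).
HAPTIC_EMOTICONS = {
    "happy":   [([0, 1, 2], 0.2), ([3, 4, 5], 0.2)],  # quick two-step sweep
    "sad":     [([3, 4, 5], 0.5), ([0, 1, 2], 0.5)],  # slow reverse sweep
    "neutral": [([6], 0.3)],                          # single short burst
    # patterns for surprise, anger, fear, and disgust would follow
}

def play_icon(emotion, activate, deactivate):
    """Play the vibrotactile pattern for `emotion` via driver callbacks."""
    for motors, duration in HAPTIC_EMOTICONS[emotion]:
        for m in motors:
            activate(m)
        time.sleep(duration)
        for m in motors:
            deactivate(m)

# Stub drivers for illustration; real hardware would switch PWM outputs.
play_icon("happy",
          activate=lambda m: print("motor", m, "on"),
          deactivate=lambda m: print("motor", m, "off"))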


IEEE International Workshop on Haptic Audio Visual Environments and Games | 2008

Using a haptic belt to convey non-verbal communication cues during social interactions to individuals who are blind

Troy L. McDaniel; Sreekar Krishna; Vineeth Nallure Balasubramanian; Dirk Colbry; Sethuraman Panchanathan

Good social skills are important and provide for a healthy, successful life; however, individuals with visual impairments are at a disadvantage when interacting with sighted peers due to inaccessible non-verbal cues. This paper presents a haptic (vibrotactile) belt to assist individuals who are blind or visually impaired by communicating non-verbal cues during social interactions. We focus on non-verbal communication pertaining to the relative location of the communicators with respect to the user in terms of direction and distance. Results from two experiments show that the haptic belt is effective in using vibration location and duration to communicate the relative direction and distance, respectively, of an individual in the user's visual field.
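
The abstract names two independent encodings: where the belt vibrates conveys direction, and how long it vibrates conveys distance. A minimal sketch of that split, assuming eight evenly spaced motors and illustrative distance bands:

NUM_MOTORS = 8  # motors spaced evenly around the belt (assumed count)

def direction_to_motor(bearing_deg):
    """Map a partner's relative bearing (0 = straight ahead) to a motor index."""
    sector = 360 / NUM_MOTORS
    return int(((bearing_deg % 360) + sector / 2) // sector) % NUM_MOTORS

def distance_to_duration(distance_m):
    """Map partner distance to a vibration duration in seconds (assumed bands)."""
    if distance_m < 1.2:
        return 0.25  # short pulse: partner is close
    if distance_m < 3.6:
        return 0.5
    return 1.0       # long pulse: partner is far

# A partner 30 degrees to the right at 2 m away:
print(direction_to_motor(30), distance_to_duration(2.0))  # -> 1 0.5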


Human Factors in Computing Systems | 2009

Using tactile rhythm to convey interpersonal distances to individuals who are blind

Troy L. McDaniel; Sreekar Krishna; Dirk Colbry; Sethuraman Panchanathan

This paper presents a scheme for using tactile rhythms to convey interpersonal distance to individuals who are blind or visually impaired, with the goal of providing access to non-verbal cues during social interactions. A preliminary experiment revealed that subjects could identify the proposed tactile rhythms and found them intuitive for the given application. Future work aims to improve recognition results and increase the number of interpersonal distances conveyed by incorporating temporal change information into the proposed methodology.
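
One plausible reading of the scheme: each proxemic zone is assigned a distinct on/off pulse pattern, and the belt plays the pattern for whichever zone the partner currently occupies. The rhythms below are invented for illustration; only the approximate zone boundaries (Hall's proxemics) are standard.

# A rhythm is a sequence of (on_s, off_s) pulse pairs played once.
RHYTHMS = {
    "intimate": [(0.1, 0.1)] * 4,  # rapid pulses
    "personal": [(0.2, 0.2)] * 2,  # moderate pulses
    "social":   [(0.5, 0.3)],      # one long pulse
}

def zone_for(distance_m):
    """Classify a distance into a proxemic zone (approximate Hall boundaries)."""
    if distance_m < 0.45:
        return "intimate"
    if distance_m < 1.2:
        return "personal"
    return "social"

print(zone_for(0.9), RHYTHMS[zone_for(0.9)])  # -> personal [(0.2, 0.2), (0.2, 0.2)]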


IEEE International Workshop on Haptic Audio Visual Environments and Games | 2010

MOVeMENT: A framework for systematically mapping vibrotactile stimulations to fundamental body movements

Troy L. McDaniel; Daniel Villanueva; Sreekar Krishna; Sethuraman Panchanathan

Traditional forms of motor learning, i.e., audio-visual instruction and/or guidance through physical contact, are limited in situations such as a noisy or busy classroom, or instruction across a large physical separation. Vibrotactile stimulation has recently emerged as a promising alternative or augmentation to traditional forms of motor training and learning, but has evolved mostly in an ad hoc manner, with trainers and researchers resorting to application-specific vibrotactile-movement mappings. In contrast, this paper proposes a novel framework, MOVeMENT (Mapping Of Vibrations to moveMENT), that provides systematic design guidelines for mapping vibrotactile stimulations to human body movements in motor skill training applications. We present a pilot test to validate the proposed framework. Results of the study are promising, and the subjects found the movement instructions intuitive and easy to recognize.
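
The systematic mapping the framework argues for can be sketched as a table from fundamental movements to stimulation patterns that sweep along the limb toward the intended direction of motion. The body sites and timings below are assumptions for illustration only, not MOVeMENT's actual guidelines.

from dataclasses import dataclass

@dataclass
class Stimulation:
    sites: list[str]  # tactor sites along the limb, in activation order
    pulse_s: float    # per-site pulse duration

# The order of sites encodes the direction of the intended movement.
MOVEMENT_MAP = {
    "extend_elbow": Stimulation(["upper_arm", "elbow", "forearm"], 0.15),
    "flex_elbow":   Stimulation(["forearm", "elbow", "upper_arm"], 0.15),
}

print(MOVEMENT_MAP["extend_elbow"])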


ACM Transactions on Multimedia Computing, Communications, and Applications | 2006

Modeling context in haptic perception, rendering, and visualization

Kanav Kahol; Priyamvada Tripathi; Troy L. McDaniel; Laura Bratton; Sethuraman Panchanathan

Haptic perception refers to the ability of human beings to perceive spatial properties through touch-based sensations. In haptics, contextual clues about the material, shape, size, texture, and weight configurations of an object are perceived by individuals, leading to recognition of the object and its spatial features. In this paper, we present strategies and algorithms to model context in haptic applications that allow users to haptically explore objects in virtual reality/augmented reality environments. Initial results show significant improvement in the accuracy and efficiency of haptic perception in augmented reality environments when compared to conventional approaches that do not model context in haptic rendering.
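
The abstract stays high-level, but one concrete reading of "modeling context" in haptic rendering is to let an object's recognized material select the rendering parameters rather than using fixed global values. A speculative sketch with placeholder parameter values, not the paper's algorithms:

# Placeholder rendering parameters keyed by the object's material context.
MATERIAL_CONTEXT = {
    "wood":  {"stiffness": 800.0,  "friction": 0.4},
    "metal": {"stiffness": 2000.0, "friction": 0.2},
    "cloth": {"stiffness": 50.0,   "friction": 0.7},
}

def render_params(material):
    """Select haptic rendering parameters from the object's material context."""
    return MATERIAL_CONTEXT.get(material, {"stiffness": 500.0, "friction": 0.5})

print(render_params("metal"))  # stiffer, slicker response than cloth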


International Conference on Human-Computer Interaction | 2007

Augmented virtual reality for laparoscopic surgical tool training

Kanav Kahol; Jamieson French; Troy L. McDaniel; Sethuraman Panchanathan; Mark D. Smith

Feedback in surgical simulation has been limited to offline analysis of movement, time taken to complete the simulation, and, in some cases, a virtual playback of completed simulation tasks. In comparison to aircraft simulation, these feedback schemes are very rudimentary. Research in military simulations has shown that real-time feedback significantly improves performance on the task at hand and leads to skill generalization and transfer. However, such systems have not been developed for surgical simulation. The lack of effective feedback systems also increases the workload of senior surgeons, leading to increased costs and decreased overall efficiency. In a pilot study performed with 8 surgical residents, we tested the effect of real-time feedback on movement proficiency.
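
As an illustration of what real-time feedback on movement proficiency might look like in code, the sketch below scores a sliding window of tool-tip positions with a jerk-based smoothness metric and alerts the trainee when it degrades. The metric choice, threshold, and synthetic data are assumptions, not the study's instrumentation.

import numpy as np

def smoothness(positions, dt):
    """Negative mean squared jerk; values closer to zero mean smoother motion."""
    jerk = np.diff(positions, n=3) / dt**3
    return -float(np.mean(jerk ** 2))

def feedback_step(window, dt, threshold):
    """One cycle of the feedback loop over the latest window of samples."""
    return "ok" if smoothness(window, dt) > threshold else "smooth your movement"

# 50 samples of 1-D tool-tip position at 100 Hz (synthetic data for illustration):
rng = np.random.default_rng(0)
window = np.cumsum(rng.normal(0.0, 0.001, 50))
print(feedback_step(window, dt=0.01, threshold=-1.0))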


IEEE Journal of Selected Topics in Signal Processing | 2016

Social Interaction Assistant: A Person-Centered Approach to Enrich Social Interactions for Individuals With Visual Impairments

Sethuraman Panchanathan; Shayok Chakraborty; Troy L. McDaniel

Social interaction is a central component of human experience. The ability to interact with others and communicate effectively within an interactive context is a fundamental necessity for professional success as well as personal fulfillment. Individuals with visual impairment face significant challenges in social communication, which, if unmitigated, may lead to lifelong needs for extensive social and economic support. Unfortunately, today's multimedia technologies largely cater to the needs of the “able” population, resulting in solutions that mostly meet the needs of that community. Individuals with disabilities (such as visual impairment) have largely been absent from the design process, and have to adapt themselves (often unsuccessfully) to available solutions. In this paper, we propose a social interaction assistant for individuals who are blind or visually impaired, incorporating novel contributions in: 1) person recognition through batch mode active learning; 2) reliable multimodal person recognition through the conformal predictions framework; and 3) facial expression recognition through topic models. Moreover, individuals with visual impairments often have specific requirements that necessitate a personalized, adaptive approach to multimedia computing. To address this challenge, our proposed solutions emphasize understanding the individual user's needs, expectations, and adaptations toward designing, developing, and deploying effective multimedia solutions. Our empirical results demonstrate the significant potential of person-centered multimedia solutions to enrich the lives of individuals with disabilities.
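
Of the three contributions, batch mode active learning is the easiest to sketch generically: from a pool of unlabeled face images, query labels for the batch the current model is least certain about. The entropy-only criterion below is a simplification; practical batch mode methods, including the kind the paper builds on, typically also enforce diversity within the batch.

import numpy as np

def select_batch(probs, batch_size):
    """Return indices of the `batch_size` samples with highest predictive entropy.

    probs: array of shape (n_samples, n_classes) with class probabilities.
    """
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    return np.argsort(entropy)[-batch_size:]

# Three unlabeled samples; the second is most uncertain (near-uniform probabilities):
probs = np.array([[0.90, 0.05, 0.05],
                  [0.34, 0.33, 0.33],
                  [0.60, 0.30, 0.10]])
print(select_batch(probs, batch_size=1))  # -> [1]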


Human Factors in Computing Systems | 2010

Heartbeats: a methodology to convey interpersonal distance through touch

Troy L. McDaniel; Daniel Villanueva; Sreekar Krishna; Dirk Colbry; Sethuraman Panchanathan

Individuals who are blind are at a disadvantage when interacting with sighted peers, given that nearly 65% of interaction cues are non-verbal in nature [3]. Previously, we proposed an assistive device in the form of a vibrotactile belt capable of communicating interpersonal positions (direction and distance between users who are blind and the other participants involved in a social interaction). In this paper, we extend our work through the use of novel tactile rhythms to provide access to the non-verbal cue of interpersonal distance, referred to as proxemics in the literature. Experimental results reveal that subjects found the proposed approach intuitive and could accurately recognize the rhythms, and hence the interpersonal distances.
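
The title suggests a natural encoding: a heartbeat-like rhythm whose tempo quickens as the interaction partner approaches. A hedged sketch of that idea follows; the tempo range and distance cap are assumptions, and the paper's actual rhythms may differ.

def heartbeat_interval(distance_m, near_s=0.4, far_s=1.2, max_range_m=3.6):
    """Seconds between beats: short when the partner is close, long when far."""
    d = min(max(distance_m, 0.0), max_range_m) / max_range_m
    return near_s + d * (far_s - near_s)

for d in (0.5, 1.5, 3.0):
    print(f"{d} m -> one heartbeat every {heartbeat_interval(d):.2f} s")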


International Conference on Universal Access in Human-Computer Interaction | 2014

Affective Haptics for Enhancing Access to Social Interactions for Individuals Who are Blind

Troy L. McDaniel; Shantanu Bala; Jacob Rosenthal; Ramin Tadayon; Arash Tadayon; Sethuraman Panchanathan

Non-verbal cues used during social interactions, such as facial expressions, are largely inaccessible to individuals who are blind. This work explores the use of affective haptics for communicating emotions displayed during social interactions. We introduce a novel haptic device, called the Haptic Face Display (HFD), consisting of a two-dimensional array of vibration motors capable of displaying rich spatiotemporal vibrotactile patterns presented through passive or active interaction styles. This work investigates users’ emotional responses to vibrotactile patterns using a passive interaction style in which the display is embedded on the back of an ergonomic chair. Such a technology could enhance social interactions for individuals who are blind, in which emotions of interaction partners, once recognized by a front-end system such as computer vision algorithms, are conveyed through the HFD. We present the results of an experiment exploring the relationship between vibrotactile pattern design and elicited emotional response. Results indicate that pattern shape and duration, among other dimensions, influence emotional response, an important consideration when designing technologies for affective haptics.


Ambient Media and Systems | 2008

Integration of RFID and computer vision for remote object perception for individuals who are blind

Troy L. McDaniel; Kanav Kahol; Daniel Villanueva; Sethuraman Panchanathan

Over the last few years, Radio-Frequency Identification (RFID) technology has gained popularity for use in assistive technology for individuals who are blind. Recently, RFID-based wearable assistive devices have been developed for individuals who are blind to assist with navigation or remote object perception. However, RFID-based assistive technology suffers from two major drawbacks: (1) information overload in environments with many tagged objects, and (2) usability issues in untagged environments. In this paper, we propose a framework for integrating RFID and computer vision in assistive devices for remote object perception to overcome the aforementioned limitations. Computer vision enables content selection to help prevent information overload and provide users with only relevant information found through RFID. Moreover, computer vision can be used to learn a mapping between visual data and object features as acquired through tags, which will enable computer vision to replace RFID in untagged environments.
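
The content-selection idea can be sketched as a filter: RFID supplies every tagged object in range, and the vision system's detections decide which of those reads actually reach the user. The tag format and label-matching test below are stand-ins for the paper's framework.

def select_relevant(rfid_reads, detected_labels):
    """Keep only RFID-read objects that the vision system also detects."""
    return [tag for tag in rfid_reads if tag["label"] in detected_labels]

rfid_reads = [
    {"id": "tag-01", "label": "mug"},
    {"id": "tag-02", "label": "stapler"},
    {"id": "tag-03", "label": "keys"},
]
detected = {"mug", "keys"}  # e.g., from an object detector on a wearable camera
print(select_relevant(rfid_reads, detected))  # keeps the mug and keys only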

Collaboration


Dive into Troy L. McDaniel's collaborations.

Top Co-Authors

Kanav Kahol, Arizona State University
Ramin Tadayon, Arizona State University
Shantanu Bala, Arizona State University
Arash Tadayon, Arizona State University
Bijan Fakhri, Arizona State University