Amy Swanson
Vanderbilt University
Publications
Featured research published by Amy Swanson.
IEEE Transactions on Visualization and Computer Graphics | 2013
Esubalew Bekele; Zhi Zheng; Amy Swanson; Julie Crittendon; Zachary Warren; Nilanjan Sarkar
Autism Spectrum Disorders (ASD) are characterized by atypical patterns of behavior and impairments in social communication. Among the fundamental social impairments in the ASD population are challenges in appropriately recognizing and responding to facial expressions. Traditional intervention approaches often require intensive support and well-trained therapists to address core deficits, and many individuals with ASD have tremendous difficulty accessing such care due to a lack of available trained therapists as well as intervention costs. As a result, emerging technologies such as virtual reality (VR) have the potential to offer useful technology-enabled intervention systems. In this paper, an innovative VR-based facial emotional expression presentation system was developed that allows monitoring of eye gaze and physiological signals related to emotion identification in order to explore new, efficient therapeutic paradigms. A usability study of this new system involving ten adolescents with ASD and ten typically developing adolescents as a control group was performed. The eye tracking and physiological data were analyzed to determine intragroup and intergroup variations of gaze and physiological patterns. Performance data, eye tracking indices, and physiological features indicated that there were differences in the way adolescents with ASD process and recognize emotional faces compared to their typically developing peers. These results will be used in the future for an online adaptive VR-based multimodal social interaction system to improve the emotion recognition abilities of individuals with ASD.
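As a rough illustration of the kind of analysis described above, the sketch below derives simple per-trial gaze and physiological features and aggregates them per group. It is not the authors' pipeline; the face-region bounding box, the skin-conductance feature, and the data layout are all assumptions.

```python
# Minimal sketch (not the authors' implementation): simple gaze and
# physiological features per trial, aggregated for one group.
import numpy as np

def fixation_ratio(gaze_xy, face_box):
    """Fraction of gaze samples falling inside the avatar's face region."""
    x0, y0, x1, y1 = face_box
    x, y = gaze_xy[:, 0], gaze_xy[:, 1]
    inside = (x >= x0) & (x <= x1) & (y >= y0) & (y <= y1)
    return float(inside.mean())

def mean_scl(skin_conductance):
    """Mean skin conductance level over a trial (a crude arousal proxy)."""
    return float(np.mean(skin_conductance))

def group_summary(trials):
    """trials: list of dicts with 'gaze' (N x 2 array), 'scl' array, 'face_box'."""
    ratios = [fixation_ratio(t["gaze"], t["face_box"]) for t in trials]
    scls = [mean_scl(t["scl"]) for t in trials]
    return {"fixation_ratio": float(np.mean(ratios)), "mean_scl": float(np.mean(scls))}
```

Summaries like these could then be contrasted between the ASD and control groups, which is the intragroup/intergroup comparison the abstract refers to.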
Autism | 2014
Esubalew Bekele; Julie Crittendon; Amy Swanson; Nilanjan Sarkar; Zachary Warren
It has been argued that clinical applications of advanced technology may hold promise for addressing impairments associated with autism spectrum disorders. This pilot feasibility study evaluated the application of a novel adaptive robot-mediated system capable of both administering and automatically adjusting joint attention prompts to a small group of preschool children with autism spectrum disorders (n = 6) and a control group (n = 6). Children in both groups spent more time looking at the humanoid robot and were able to achieve a high level of accuracy across trials. However, across groups, children required higher levels of prompting to successfully orient within robot-administered trials. The results highlight both the potential benefits of closed-loop adaptive robotic systems as well as current limitations of existing humanoid-robotic platforms.
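The abstract describes a system that both administers and automatically adjusts joint attention prompts. The sketch below shows one plausible least-to-most prompt escalation loop for a single trial; the prompt levels, the timeout, and the `robot` / `child_oriented` interfaces are hypothetical and not the published implementation.

```python
# Minimal sketch of a closed-loop prompt escalation policy for one trial.
# PROMPT_LEVELS, timeout, and the robot/child_oriented interfaces are assumptions.
import time

PROMPT_LEVELS = ["head_turn", "head_turn_plus_point", "verbal_cue", "target_flash"]

def run_trial(robot, child_oriented, timeout_s=5.0):
    """Escalate prompts until the child orients to the target or prompts run out.
    Returns the prompt level that succeeded, or None."""
    for level in PROMPT_LEVELS:
        robot.prompt(level)                 # deliver the current prompt
        deadline = time.time() + timeout_s
        while time.time() < deadline:
            if child_oriented():            # e.g., gaze or head pose on target
                robot.reinforce()           # contingent reinforcing stimulus
                return level
            time.sleep(0.05)
    return None                             # no orientation at any prompt level
```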
Journal of Autism and Developmental Disorders | 2014
Esubalew Bekele; Julie Crittendon; Zhi Zheng; Amy Swanson; Amy Weitlauf; Zachary Warren; Nilanjan Sarkar
Teenagers with autism spectrum disorder (ASD) and age-matched controls participated in a dynamic facial affect recognition task within a virtual reality (VR) environment. Participants identified the emotion of a facial expression displayed at varied levels of intensity by a computer-generated avatar. The system assessed performance (i.e., accuracy, confidence ratings, response latency, and stimulus discrimination) as well as how participants used their gaze to process facial information using an eye tracker. Participants in both groups were similarly accurate at basic facial affect recognition at varied levels of intensity. Despite similar performance characteristics, ASD participants endorsed lower confidence in their responses and showed substantial variation in gaze patterns in the absence of perceptual discrimination deficits. These results add support to the hypothesis that deficits in emotion and face recognition for individuals with ASD are related to fundamental differences in information processing. We discuss implications of this finding in a VR environment with regard to potential future applications and paradigms targeting not just enhanced performance, but enhanced social information processing within intelligent systems capable of adapting to individual processing differences.
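For readers interested in how such trial-level performance measures might be tabulated, the following sketch aggregates accuracy, response latency, and confidence by expression intensity. The field names are assumptions about the logged data, not the study's actual format.

```python
# Minimal sketch (illustrative only): summarizing trial-level performance
# by expression intensity.
from collections import defaultdict

def summarize(trials):
    """trials: iterable of dicts with 'intensity', 'correct', 'latency_ms', 'confidence'."""
    by_intensity = defaultdict(list)
    for t in trials:
        by_intensity[t["intensity"]].append(t)
    summary = {}
    for intensity, ts in by_intensity.items():
        summary[intensity] = {
            "accuracy": sum(t["correct"] for t in ts) / len(ts),
            "mean_latency_ms": sum(t["latency_ms"] for t in ts) / len(ts),
            "mean_confidence": sum(t["confidence"] for t in ts) / len(ts),
        }
    return summary
```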
Autism | 2014
Amy Swanson; Zachary Warren; Wendy L. Stone; Alison Vehorn; Elizabeth Dohrmann; Quentin Humberd
The increased prevalence of autism spectrum disorder and documented benefits of early intensive intervention have created a need for flexible systems for determining eligibility for autism-specific services. This study evaluated the effectiveness of a training program designed to enhance autism spectrum disorder identification and assessment within community pediatric settings across the state. Twenty-seven pediatric providers participated in regional trainings across a 3.5-year period. Trainings provided clinicians with strategies for conducting relatively brief within-practice interactive assessments following positive autism spectrum disorder screenings. Program evaluation was measured approximately 1.5 years following training through (a) clinician self-reports of practice change and (b) blind diagnostic verification of a subset of children assessed. Pediatric providers participating in the training reported significant changes in screening and consultation practices following training, with a reported 85% increase in diagnostic identification of children with autism spectrum disorder within their own practice setting. In addition, substantial agreement (86%–93%) was found between pediatrician diagnostic judgments and independent, comprehensive blinded diagnostic evaluations. Collaborative training methods that allow autism spectrum disorder identification within broader community pediatric settings may help translate enhanced screening initiatives into more effective and efficient diagnosis and treatment.
International Conference on Robotics and Automation | 2014
Zhi Zheng; Shuvajit Das; Eric M. Young; Amy Swanson; Zachary Warren; Nilanjan Sarkar
Autism Spectrum Disorders (ASD) impact 1 in 88 children in the United States. The cost of ASD intervention is tremendous, with huge individual and societal consequences. In recent years, robotic systems have been introduced with considerable success for ASD intervention because of their potential to engage children with ASD. In this work, we present a novel closed-loop autonomous robotic system for imitation skill learning for ASD intervention. Children with ASD show pronounced impairments in imitation, which have been associated with a host of neurodevelopmental and learning challenges over time. The presented robotic system offers dynamic, adaptive, and autonomous interaction for learning imitation skills, with real-time performance evaluation and feedback. The system was tested in a user study with young children with ASD and a typically developing (TD) control sample. Further, the performance of the system was compared with that of a human therapist in the user study. The results demonstrate that the developed robotic system is well tolerated by the target population, engaged the children with ASD more than a human therapist did, and produced performance that was relatively better than that achieved with a human therapist.
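One way to picture the closed-loop imitation interaction described above is sketched below: the robot demonstrates a gesture, the child's movement is scored against it, and feedback is adapted to the score. The similarity measure, threshold, and `robot` / `tracker` interfaces are illustrative assumptions rather than the published system.

```python
# Minimal sketch of one closed-loop imitation trial; interfaces and
# thresholds are assumptions, not the authors' system.
import numpy as np

def imitation_score(demo_angles, child_angles):
    """Similarity between demonstrated and imitated joint-angle trajectories,
    mapped to (0, 1] via mean absolute error."""
    err = np.mean(np.abs(np.asarray(demo_angles) - np.asarray(child_angles)))
    return float(np.exp(-err))

def run_imitation_trial(robot, tracker, gesture, pass_threshold=0.6):
    demo = robot.demonstrate(gesture)            # robot performs the gesture
    child = tracker.capture(duration_s=5.0)      # record the child's movement
    score = imitation_score(demo, child)
    if score >= pass_threshold:
        robot.reward()                           # positive, contingent feedback
    else:
        robot.demonstrate(gesture, slower=True)  # re-demonstrate with added support
    return score
```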
IEEE International Conference on Rehabilitation Robotics | 2013
Zhi Zheng; Lian Zhang; Esubalew Bekele; Amy Swanson; Julie Crittendon; Zachary Warren; Nilanjan Sarkar
With Centers for Disease Control and Prevention prevalence estimates for autism spectrum disorder (ASD) at 1 in 88 children, identification and effective treatment of ASD are often characterized as a public health emergency. There is an urgent need for more efficacious treatments whose realistic application will yield more substantial impact on the neurodevelopmental trajectories of young children with ASD. Robotic technology appears particularly promising for potential application to ASD intervention. Initial results applying robotic technology to ASD intervention have consistently demonstrated a unique potential to elicit interest and attention in young children with ASD. As such, technologies capable of intelligently harnessing this potential, along with capabilities for detecting and meaningfully responding to young children's attention and behavior, may represent intervention platforms with substantial promise for impacting early symptoms of ASD. Our current work describes the development and application of a novel adaptive robot-mediated interaction technology for facilitating early joint attention skills in children with ASD. The system is composed of a humanoid robot endowed with a prompt decision hierarchy that alters its behavior in concert with reinforcing stimuli within an intervention environment to promote joint attention skills. Results of implementing this system over time with six young children with ASD, including specific analyses of attentional bias and performance enhancement, are presented.
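A prompt decision hierarchy of the kind described above could, for example, adapt the level of support offered on the next trial from the child's recent success history. The sketch below is one such policy under an assumed level ordering and window size; it is not the controller reported in the paper.

```python
# Minimal sketch of an across-trial support-adjustment rule; level ordering
# and window size are assumptions.
def next_start_level(history, n_levels=4, window=3):
    """history: list of (level_used, succeeded) tuples from previous trials.
    Levels run from 0 (least supportive) to n_levels - 1 (most supportive).
    Fade support after consistent success, add support after repeated failure."""
    if not history:
        return 0
    recent = history[-window:]
    level = recent[-1][0]
    if all(ok for _, ok in recent):
        return max(0, level - 1)                 # fade support
    if not any(ok for _, ok in recent):
        return min(n_levels - 1, level + 1)      # add support
    return level                                 # mixed results: hold steady
```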
International Conference on Universal Access in Human-Computer Interaction | 2013
Dayi Bian; Joshua W. Wade; Lian Zhang; Esubalew Bekele; Amy Swanson; Julie Crittendon; Medha Shukla Sarkar; Zachary Warren; Nilanjan Sarkar
Individuals with autism spectrum disorders (ASD) often have difficulty functioning independently and display impairments in important tasks related to adaptive independence, such as driving. The ability to drive is believed to be an important factor in quality of life for individuals with ASD. The presented work describes a novel driving simulator based on a virtual city environment that will be used in the future to impart driving skills to teenagers with ASD as part of an intervention. A physiological data acquisition system, which was used to acquire and process participants' physiological signals, and an eye tracker, which was utilized to detect eye gaze signals, were each integrated into the driving simulator. These physiological and eye gaze indices were recorded and computed to infer the affective states of the participant in real time while he/she was driving. Based on the affective states of the participant together with his/her performance, the driving simulator adaptively changes the difficulty level of the task. This VR-based driving simulator will be capable of manipulating the driving task difficulty in response to the physiological and eye gaze indices recorded during the task. The design of this novel driving simulator system and testing data that validate its functionalities are presented in this paper.
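A minimal sketch of the difficulty-adaptation rule described above is given below, assuming a discrete difficulty scale, a coarse affect label inferred from physiological and eye-gaze features, and a recent-performance score; none of these specifics come from the paper.

```python
# Minimal sketch: step driving difficulty from affect and performance.
# Labels, thresholds, and the difficulty range are assumptions.
def adjust_difficulty(current_level, affect, performance, min_level=1, max_level=5):
    """affect: label inferred from physiological/eye-gaze features
    ('anxious', 'engaged', 'bored'); performance: fraction of recent
    driving sub-tasks completed without error, in [0, 1]."""
    if affect == "anxious" or performance < 0.4:
        new_level = current_level - 1          # ease off when overloaded
    elif affect == "bored" or performance > 0.8:
        new_level = current_level + 1          # raise challenge when under-loaded
    else:
        new_level = current_level              # engaged and performing: hold steady
    return max(min_level, min(max_level, new_level))
```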
KSII Transactions on Internet and Information Systems | 2016
Joshua W. Wade; Lian Zhang; Dayi Bian; Jing Fan; Amy Swanson; Amy Weitlauf; Medha Shukla Sarkar; Zachary Warren; Nilanjan Sarkar
In addition to social and behavioral deficits, individuals with Autism Spectrum Disorder (ASD) often struggle to develop the adaptive skills necessary to achieve independence. Driving intervention for individuals with ASD is a growing area of study, but it is still widely under-researched. We present the development and preliminary assessment of a gaze-contingent adaptive virtual reality driving simulator that uses real-time gaze information to adapt the driving environment, with the aim of providing a more individualized method of driving intervention. We conducted a small pilot study of 20 adolescents with ASD using our system: 10 used the adaptive gaze-contingent version of the system and 10 used a purely performance-based version. Preliminary results suggest that the novel intervention system may be beneficial in teaching driving skills to individuals with ASD.
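As an illustration of what "gaze-contingent" adaptation can mean in practice, the sketch below flags task-relevant regions that received too little gaze over a recent window so the environment can respond, for example by cueing the driver. The region representation and dwell threshold are assumptions, not the study's software.

```python
# Minimal sketch of a gaze-contingent check over screen regions;
# region names and thresholds are assumptions.
def gaze_contingent_update(gaze_samples, regions, dwell_threshold=0.15):
    """gaze_samples: list of (x, y) points over the last window;
    regions: dict mapping region name -> (x0, y0, x1, y1) bounding box.
    Returns the names of relevant regions receiving too little gaze."""
    neglected = []
    for name, (x0, y0, x1, y1) in regions.items():
        hits = sum(1 for x, y in gaze_samples if x0 <= x <= x1 and y0 <= y <= y1)
        if hits / max(1, len(gaze_samples)) < dwell_threshold:
            neglected.append(name)       # e.g., highlight the region or replay a cue
    return neglected
```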
International Conference on Universal Access in Human-Computer Interaction | 2014
Joshua W. Wade; Dayi Bian; Lian Zhang; Amy Swanson; Medha Shukla Sarkar; Zachary Warren; Nilanjan Sarkar
Autism Spectrum Disorder (ASD) is an extremely common and costly neurodevelopmental disorder. While significant research has been devoted to addressing the social communication skill deficits of people with ASD, relatively less attention has been paid to improving their deficits in daily activities such as driving. Only two empirical studies have investigated driving performance in individuals with ASD, both employing proprietary driving simulation software. We designed a novel Virtual Reality (VR) driving simulator so that we could integrate various sensory modules directly into our system as well as define task-oriented protocols that would not otherwise be possible using commercial software. We conducted a small user study with a group of individuals with ASD and a group of typically developing community controls. We found that our system was capable of distinguishing behavioral patterns between the two groups, indicating that it is suitable for use in designing a protocol aimed at improving driving performance.
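The group comparison mentioned above could be run along the following lines, here on a single assumed driving-performance metric using Welch's t-test; this is an illustrative analysis sketch, not the paper's actual analysis.

```python
# Minimal sketch: compare one driving metric (e.g., collisions per session)
# between ASD and TD groups. Metric choice is an assumption.
import numpy as np
from scipy import stats

def compare_groups(asd_scores, td_scores):
    """Return group means and an independent-samples (Welch's) t-test."""
    asd, td = np.asarray(asd_scores, float), np.asarray(td_scores, float)
    t, p = stats.ttest_ind(asd, td, equal_var=False)
    return {"asd_mean": float(asd.mean()), "td_mean": float(td.mean()),
            "t": float(t), "p": float(p)}
```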
IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2016
Zhi Zheng; Eric M. Young; Amy Swanson; Amy Weitlauf; Zachary Warren; Nilanjan Sarkar
Autism spectrum disorder (ASD) impacts 1 in 68 children in the U.S., with tremendous individual and societal costs. Technology-aided intervention, more specifically robotic intervention, has gained momentum in recent years due to the inherent affinity of many children with ASD towards technology. In this paper, we present a novel robot-mediated intervention system for imitation skill learning, which is considered a core deficit area for children with ASD. The Robot-mediated Imitation Skill Training Architecture (RISTA) is designed in such a manner that it can operate either completely autonomously or in coordination with a human therapist, depending on the intervention need. Experimental results are presented from small user studies validating system functionality, assessing user tolerance, and documenting subject performance. Preliminary results show that this novel robotic system draws more attention from children with ASD and teaches gestures more effectively compared to a human therapist. While no broad generalized conclusions can be made about the effectiveness of RISTA based on our small user studies, initial results are encouraging and justify further exploration in the future.
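The autonomous-versus-therapist coordination described for RISTA might be pictured as a simple switch over who supplies the next action. The sketch below is a guess at such a switch; the mode names and interfaces are hypothetical and not drawn from the paper.

```python
# Minimal sketch of a mode switch between an autonomous policy and
# therapist-in-the-loop control; names and interfaces are assumptions.
import queue

def next_action(mode, autonomous_policy, therapist_queue, observation):
    """mode: 'autonomous' or 'therapist'; autonomous_policy: callable mapping an
    observation to an action; therapist_queue: queue.Queue of therapist commands."""
    if mode == "therapist" and not therapist_queue.empty():
        return therapist_queue.get()             # defer to the human therapist
    return autonomous_policy(observation)        # otherwise act autonomously

# Example: fully autonomous operation with an empty therapist queue.
# action = next_action("autonomous", lambda obs: "demonstrate_wave", queue.Queue(), {})
```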