Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Peter Washington is active.

Publication


Featured research published by Peter Washington.


Human Factors in Computing Systems | 2016

A Wearable Social Interaction Aid for Children with Autism

Peter Washington; Catalin Voss; Nick Haber; Serena Tanaka; Jena Daniels; Carl Feinstein; Terry Winograd; Dennis P. Wall

Over 1 million children under the age of 17 in the US have been identified with Autism Spectrum Disorder (ASD). These children struggle to recognize facial expressions, make eye contact, and engage in social interactions. Gaining these skills requires intensive behavioral interventions that are often expensive, difficult to access, and inconsistently administered. We have developed a system to automate facial expression recognition that runs on wearable glasses and delivers real-time social cues, with the goal of creating a behavioral aid for children with ASD that maximizes behavioral feedback while minimizing distractions to the child. This paper describes the design of our system and interface decisions resulting from initial observations gathered during multiple preliminary trials.
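One way to balance feedback against distraction in a real-time pipeline like this is to temporally smooth the per-frame classifier output before surfacing a cue. The sketch below uses a sliding-window majority vote; this is an illustrative technique, not necessarily the authors' actual implementation:

```python
from collections import Counter, deque

def smooth_predictions(frame_labels, window=5):
    """Majority-vote smoothing over a sliding window of per-frame
    emotion labels, so a single misclassified frame does not flash
    a spurious social cue to the wearer."""
    buf = deque(maxlen=window)
    smoothed = []
    for label in frame_labels:
        buf.append(label)
        smoothed.append(Counter(buf).most_common(1)[0][0])
    return smoothed

# One noisy frame ("angry") amid "happy" frames is outvoted by its
# neighbors, so the wearer never sees a spurious cue.
frames = ["happy", "happy", "angry", "happy", "happy"]
print(smooth_predictions(frames))  # → all "happy"
```

Larger windows suppress more noise at the cost of added latency between an expression and its cue, which matters for a real-time social aid.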


Ubiquitous Computing | 2016

Superpower glass: delivering unobtrusive real-time social cues in wearable systems

Catalin Voss; Peter Washington; Nick Haber; Aaron Kline; Jena Daniels; Azar Fazel; Titas De; Beth McCarthy; Carl Feinstein; Terry Winograd; Dennis P. Wall

We have developed a system for automatic facial expression recognition, which runs on Google Glass and delivers real-time social cues to the wearer. We evaluate the system as a behavioral aid for children with Autism Spectrum Disorder (ASD), who can greatly benefit from real-time non-invasive emotional cues and are more sensitive to sensory input than neurotypically developing children. In addition, we present a mobile application that enables users of the wearable aid to review their videos along with auto-curated emotional information on the video playback bar. This integrates our learning aid into the context of behavioral therapy. Expanding on our previous work describing in-lab trials, this paper presents our system and application-level design decisions in depth as well as the interface learnings gathered during the use of the system by multiple children with ASD in an at-home iterative trial.


Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies | 2017

SuperpowerGlass: A Wearable Aid for the At-Home Therapy of Children with Autism

Peter Washington; Catalin Voss; Aaron Kline; Nick Haber; Jena Daniels; Azar Fazel; Titas De; Carl Feinstein; Terry Winograd; Dennis P. Wall

We have developed a system for automatic facial expression recognition running on Google Glass, delivering real-time social cues to children with Autism Spectrum Disorder (ASD). The system includes multiple mechanisms to engage children and their parents, who administer this technology within the home. We completed an at-home design trial with 14 families that used the learning aid over a 3-month period. We found that children with ASD generally respond well to wearing the system at home and opt for the most expressive feedback choice. We further evaluated app usage, facial engagement, and model accuracy. We found that the device can act as a powerful training aid when used periodically in the home, that interactive video content from wearable therapy sessions should be augmented with sufficient context about the content to produce long-term engagement, and that the design of wearable systems for children with ASD should be heavily dependent on the functioning level of the child. We contribute general design implications for developing wearable aids used by children with ASD and other behavioral disorders as well as their parents during at-home parent-administered therapy sessions.


npj Digital Medicine | 2018

Exploratory study examining the at-home feasibility of a wearable tool for social-affective learning in children with autism

Jena Daniels; Jessey Schwartz; Catalin Voss; Nick Haber; Azar Fazel; Aaron Kline; Peter Washington; Carl Feinstein; Terry Winograd; Dennis P. Wall

Although standard behavioral interventions for autism spectrum disorder (ASD) are effective therapies for social deficits, they face criticism for being time-intensive and overdependent on specialists. An earlier starting age of therapy is a strong predictor of later success, but waitlists for therapies can be 18 months long. To address these complications, we developed Superpower Glass, a machine-learning-assisted software system that runs on Google Glass and an Android smartphone, designed for use during social interactions. This pilot exploratory study examines our prototype tool’s potential for social-affective learning for children with autism. We sent our tool home with 14 families and assessed changes from intake to conclusion through the Social Responsiveness Scale (SRS-2), a facial affect recognition task (EGG), and qualitative parent reports. A repeated-measures one-way ANOVA demonstrated a decrease in SRS-2 total scores by an average of 7.14 points (F(1,13) = 33.20, p < .001; higher scores indicate greater ASD severity). EGG scores also increased by an average of 9.55 correct responses (F(1,10) = 11.89, p < .01). Parents reported increased eye contact and greater social acuity. This feasibility study supports using mobile technologies for potential therapeutic purposes.
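With only two timepoints (intake and conclusion), a repeated-measures one-way ANOVA is equivalent to a paired t-test on the per-subject differences, with F(1, n−1) = t². A stdlib-only sketch of that computation on made-up scores (illustrative numbers, not the trial's data):

```python
import math

def paired_f(pre, post):
    """F statistic of a two-condition repeated-measures one-way ANOVA,
    computed via its paired-t equivalence: F(1, n-1) = t^2, where
    t = mean(d) / (sd(d) / sqrt(n)) on the per-subject differences d."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    t = mean_d / math.sqrt(var_d / n)
    return t * t

# Hypothetical per-child scores at intake and conclusion (not the study's data):
pre = [10, 12, 14]
post = [12, 16, 20]
print(paired_f(pre, post))  # F(1, 2) ≈ 12.0
```

The published F values could thus be reproduced from the raw intake/conclusion score pairs alone; a dedicated stats package (e.g. statsmodels' AnovaRM) generalizes this to more than two timepoints.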


Applied Clinical Informatics | 2018

Feasibility Testing of a Wearable Behavioral Aid for Social Learning in Children with Autism

Jena Daniels; Nick Haber; Catalin Voss; Jessey Schwartz; Serena Tamura; Azar Fazel; Aaron Kline; Peter Washington; Jennifer Phillips; Terry Winograd; Carl Feinstein; Dennis P. Wall

BACKGROUND: Recent advances in computer vision and wearable technology have created an opportunity to introduce mobile therapy systems for autism spectrum disorders (ASD) that can respond to the increasing demand for therapeutic interventions; however, feasibility questions must be answered first.
OBJECTIVE: We studied the feasibility of a prototype therapeutic tool for children with ASD using Google Glass, examining whether children with ASD would wear such a device, if providing the emotion classification will improve emotion recognition, and how emotion recognition differs between ASD participants and neurotypical controls (NC).
METHODS: We ran a controlled laboratory experiment with 43 children: 23 with ASD and 20 NC. Children identified static facial images on a computer screen with one of 7 emotions in 3 successive batches: the first with no information about emotion provided to the child, the second with the correct classification from the Glass labeling the emotion, and the third again without emotion information. We then trained a logistic regression classifier on the emotion confusion matrices generated by the two information-free batches to predict ASD versus NC.
RESULTS: All 43 children were comfortable wearing the Glass. ASD and NC participants who completed the computer task with Glass providing audible emotion labeling (n = 33) showed increased accuracies in emotion labeling, and the logistic regression classifier achieved an accuracy of 72.7%. Further analysis suggests that the ability to recognize surprise, fear, and neutrality may distinguish ASD cases from NC.
CONCLUSION: This feasibility study supports the utility of a wearable device for social affective learning in ASD children and demonstrates subtle differences in how ASD and NC children perform on an emotion recognition task.
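The diagnostic classifier described above can be sketched in miniature: a plain logistic regression fit on confusion-matrix features. The stdlib-only sketch below uses a toy two-feature stand-in for the flattened 7×7 confusion matrices (hypothetical numbers, not the study's data or code):

```python
import math

def train_logreg(X, y, lr=0.5, epochs=500):
    """Plain per-sample gradient-descent logistic regression: each row of
    X is a flattened confusion matrix (one per child), and y holds the
    diagnostic labels (1 = ASD, 0 = NC)."""
    n_feat = len(X[0])
    w = [0.0] * n_feat
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid
            err = p - yi                      # gradient of log loss wrt z
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if z > 0 else 0

# Toy 2-feature stand-in for the 49-dim flattened confusion matrices,
# e.g. normalized confusion rates for "fear" and "surprise":
X = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
y = [1, 1, 0, 0]
w, b = train_logreg(X, y)
print([predict(w, b, xi) for xi in X])  # → [1, 1, 0, 0]
```

With only 43 participants and 49 features, a real analysis would need regularization and cross-validation to avoid overfitting; this sketch shows only the model family.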


Human Factors in Computing Systems | 2018

Prototyping Biotic Games and Interactive Experiments with JavaScript

Peter Washington; Karina Samuel-Gama; Shirish Goyal; Ashwin Ramaswami; Ingmar H. Riedel-Kruse

Life-science research is often driven by advancements in biotechnology. In this demonstration, we explore technology that supports real-time interaction with living matter in the cloud. To enable scientists to perform more interactive experiments, we have developed a JavaScript API and corresponding online IDE which can be used to program interactive computer applications allowing the user to remotely interact with swarms of living single-celled microorganisms in real time. The API interfaces with several remote microscopes which provide a magnified view of a microfluidic chip housing the microorganisms. We hope this work can be a start towards bringing techniques from HCI into bioengineering and biotechnology development.


bioRxiv | 2017

Bioty: A cloud-based development toolkit for programming experiments and interactive applications with living cells

Peter Washington; Karina Samuel-Gama; Shirish Goyal; Ingmar H. Riedel-Kruse

Recent advancements in life-science instrumentation and automation enable entirely new modes of human interaction with microbiological processes and corresponding applications for science and education through biology cloud labs. A critical barrier for remote life-science experimentation is the absence of suitable abstractions and interfaces for programming living matter. To this end we conceptualize a programming paradigm that provides stimulus control functions and sensor control functions for realtime manipulation of biological (physical) matter. Additionally, a simulation mode facilitates higher user throughput, program debugging, and biophysical modeling. To evaluate this paradigm, we implemented a JavaScript-based web toolkit, ‘Bioty’, that supports realtime interaction with swarms of phototactic Euglena cells hosted on a cloud lab. Studies with remote users demonstrate that individuals with little to no biology knowledge and intermediate programming knowledge were able to successfully create and use scientific applications and games. This work informs the design of programming environments for controlling living matter in general and lowers the access barriers to biology experimentation for professional and citizen scientists, learners, and the lay public.

Significance Statement: Biology cloud labs are an emerging approach to lower access barriers to life-science experimentation. However, suitable programming approaches and user interfaces are lacking, especially ones that enable interaction with the living matter itself, not just the control of equipment. Here we present and implement a corresponding programming paradigm for realtime interactive applications with remotely housed biological systems, which is accessible and useful for scientists, programmers, and lay people alike. Our user studies show that scientists and non-scientists are able to rapidly develop a variety of applications, such as interactive biophysics experiments and games. This paradigm has the potential to make first-hand experiences with biology accessible to all of society and to accelerate the rate of scientific discovery.

Human-Biology Interaction (HBI) is an emerging subdomain of HCI enabling humans to have real-time interactions with living microbiological systems. The creation of new HBI applications is currently challenging and time consuming, even for experts, as it requires knowledge of biotechnology, hardware design, and software development in addition to continuous access to reliable and responsive biological materials. We therefore developed Bioty, a JavaScript-based web toolkit enabling rapid prototyping of versatile HBI applications utilizing swarms of motile, phototactic cells hosted on a remote cloud lab, where virtual sandbox modes mitigate physical resource limitations. We evaluated and demonstrated the utility of Bioty through user studies with both HBI novices and experts, who used Bioty to create applications such as biotic video games, scientific experiments, and biological art. Our findings inform the design of development toolkits for HBI and swarm programming.
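The stimulus-control/sensor-control paradigm and its simulation mode can be sketched as follows. This is a hypothetical Python analogue of the ideas, not Bioty's actual JavaScript API; every name here (SimulatedEuglenaLab, set_light, get_positions) is illustrative, and the assumption that cells swim directly away from the light is a deliberate simplification of phototaxis:

```python
import random

class SimulatedEuglenaLab:
    """Minimal stand-in for a biology cloud lab session: a stimulus
    control function steers phototactic cells, and a sensor control
    function reads back their positions from the (simulated) microscope."""

    def __init__(self, n_cells=50, seed=0):
        rng = random.Random(seed)
        # Cells scattered across a 100x100 field of view.
        self.cells = [[rng.uniform(0, 100), rng.uniform(0, 100)]
                      for _ in range(n_cells)]
        self.light = None  # edge the light shines from: "left"/"right"/None

    def set_light(self, direction):          # stimulus control function
        self.light = direction

    def get_positions(self):                 # sensor control function
        return [tuple(c) for c in self.cells]

    def step(self, dt=1.0, speed=2.0):
        # Simplified negative phototaxis: cells swim away from the light,
        # clamped to the field of view.
        dx = {"left": speed, "right": -speed}.get(self.light, 0.0)
        for c in self.cells:
            c[0] = min(100.0, max(0.0, c[0] + dx * dt))

lab = SimulatedEuglenaLab()
lab.set_light("left")        # shine light from the left edge
for _ in range(10):
    lab.step()
xs = [x for x, _ in lab.get_positions()]
print(sum(xs) / len(xs))     # mean x has drifted toward the right wall
```

In the real system the same two roles are played by physical light stimuli and live microscope imagery on the remote cloud lab; the point of a simulation mode is that programs like the loop above can be written and debugged without consuming physical lab time.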


Journal of the American Academy of Child and Adolescent Psychiatry | 2017

5.13 Design and Efficacy of a Wearable Device for Social Affective Learning in Children With Autism

Jena Daniels; Jessey Schwartz; Nick Haber; Catalin Voss; Aaron Kline; Azar Fazel; Peter Washington; Titas De; Carl Feinstein; Terry Winograd; Dennis P. Wall


Human Factors in Computing Systems | 2017

Human Perception of Swarm Robot Motion

Griffin Dietz; Jane L. E; Peter Washington; Lawrence H. Kim; Sean Follmer


IEEE International Conference on Healthcare Informatics | 2018

A Gamified Mobile System for Crowdsourcing Video for Autism Research

Haik Kalantarian; Peter Washington; Jessey Schwartz; Jena Daniels; Nick Haber; Dennis P. Wall

Collaboration


Dive into Peter Washington's collaboration.

Top Co-Authors
