Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Zackory M. Erickson is active.

Publication


Featured research published by Zackory M. Erickson.


International Conference on Robotics and Automation | 2016

Multimodal execution monitoring for anomaly detection during robot manipulation

Daehyung Park; Zackory M. Erickson; Tapomayukh Bhattacharjee; Charles C. Kemp

Online detection of anomalous execution can be valuable for robot manipulation, enabling robots to operate more safely, determine when a behavior is inappropriate, and otherwise exhibit more common sense. By using multiple complementary sensory modalities, robots could potentially detect a wider variety of anomalies, such as anomalous contact or a loud utterance by a human. However, task variability and the potential for false positives make online anomaly detection challenging, especially for long-duration manipulation behaviors. In this paper, we provide evidence for the value of multimodal execution monitoring and the use of a detection threshold that varies based on the progress of execution. Using a data-driven approach, we train an execution monitor that runs in parallel to a manipulation behavior. Like previous methods for anomaly detection, our method trains a hidden Markov model (HMM) using multimodal observations from non-anomalous executions. In contrast to prior work, our system also uses a detection threshold that changes based on the execution progress. We evaluated our approach with haptic, visual, auditory, and kinematic sensing during a variety of manipulation tasks performed by a PR2 robot. The tasks included pushing doors closed, operating switches, and assisting able-bodied participants with eating yogurt. In our evaluations, our anomaly detection method performed substantially better with multimodal monitoring than single modality monitoring. It also resulted in more desirable ROC curves when compared with other detection threshold methods from the literature, obtaining higher true positive rates for comparable false positive rates.
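The core idea above (train a likelihood model on non-anomalous executions, then flag an execution when its likelihood drops below a threshold that varies with execution progress) can be sketched compactly. This is a minimal illustration, not the paper's method: it substitutes a per-progress-bin diagonal-Gaussian observation model for the HMM, and the bin count and threshold multiplier `c` are arbitrary choices for the sketch.

```python
import numpy as np

def fit_monitor(train_execs, n_bins=10, c=3.0):
    """Fit a simple execution monitor from non-anomalous executions.

    Each execution is a (T, D) array of multimodal features (e.g. haptic,
    auditory, kinematic). For each execution-progress bin we fit a
    diagonal-Gaussian observation model, then set that bin's threshold to
    mean_loglik - c * std_loglik of the training log-likelihoods, so the
    detection threshold varies with execution progress."""
    def bin_of(t, T):
        return min(int(t / T * n_bins), n_bins - 1)

    # Pool observations by progress bin and fit per-bin Gaussian statistics.
    obs_bins = [[] for _ in range(n_bins)]
    for ex in train_execs:
        for t, obs in enumerate(ex):
            obs_bins[bin_of(t, len(ex))].append(obs)
    models = [(np.mean(b, axis=0), np.std(b, axis=0) + 1e-6) for b in obs_bins]

    def loglik(obs, b):
        mu, sig = models[b]
        return -0.5 * np.sum(((obs - mu) / sig) ** 2 + np.log(2 * np.pi * sig ** 2))

    # Progress-varying threshold: statistics of training log-likelihoods per bin.
    ll_bins = [[] for _ in range(n_bins)]
    for ex in train_execs:
        for t, obs in enumerate(ex):
            b = bin_of(t, len(ex))
            ll_bins[b].append(loglik(obs, b))
    thresholds = [np.mean(ll) - c * np.std(ll) for ll in ll_bins]

    def monitor(execution):
        """Return the first time step flagged as anomalous, or None."""
        for t, obs in enumerate(execution):
            b = bin_of(t, len(execution))
            if loglik(obs, b) < thresholds[b]:
                return t
        return None

    return monitor
```

Raising `c` trades true positives for fewer false positives, which is exactly the ROC trade-off the abstract evaluates; letting the threshold depend on the progress bin is what distinguishes this from a single fixed threshold.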


International Conference on Robotics and Automation | 2017

What does the person feel? Learning to infer applied forces during robot-assisted dressing

Zackory M. Erickson; Alexander Clegg; Wenhao Yu; Greg Turk; C. Karen Liu; Charles C. Kemp

During robot-assisted dressing, a robot manipulates a garment in contact with a person's body. Inferring the forces applied to the person's body by the garment might enable a robot to provide more effective assistance and give the robot insight into what the person feels. However, complex mechanics govern the relationship between the robot's end effector and these forces. Using a physics-based simulation and data-driven methods, we demonstrate the feasibility of inferring forces across a person's body using only end effector measurements. Specifically, we present a long short-term memory (LSTM) network that at each time step takes a 9-dimensional input vector of force, torque, and velocity measurements from the robot's end effector and outputs a force map consisting of hundreds of inferred force magnitudes across the person's body. We trained and evaluated LSTMs on two tasks: pulling a hospital gown onto an arm and pulling shorts onto a leg. For both tasks, the LSTMs produced force maps that were similar to ground truth when visualized as heat maps across the limbs. We also evaluated their performance in terms of root-mean-square error. Their performance degraded when the end effector velocity was increased outside the training range, but generalized well to limb rotations. Overall, our results suggest that robots could learn to infer the forces people feel during robot-assisted dressing, although the extent to which this will generalize to the real world remains an open question.
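The input/output structure described above (a 9-D end-effector measurement per time step in, hundreds of force magnitudes out) can be sketched at the shape level with a plain LSTM cell. This is an illustrative, untrained forward pass in NumPy, not the paper's trained model: the hidden size, the number of force-map points `n_points`, and the random weights are all placeholder assumptions.

```python
import numpy as np

def lstm_force_map(seq, n_points=192, hidden=64, seed=0):
    """Shape-level sketch of the force-map LSTM: each time step consumes a
    9-D vector of end-effector force, torque, and velocity measurements and
    emits n_points inferred force magnitudes across the limb.

    Weights are random (untrained); this only demonstrates the recurrence
    and the per-step sequence-to-force-map mapping."""
    rng = np.random.default_rng(seed)
    D = seq.shape[1]  # 9 input features per time step
    # One stacked weight matrix for the four LSTM gates (input, forget, cell, output).
    W = rng.standard_normal((4 * hidden, D + hidden)) * 0.1
    b = np.zeros(4 * hidden)
    Wo = rng.standard_normal((n_points, hidden)) * 0.1  # readout to force map

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    h = np.zeros(hidden)  # hidden state
    c = np.zeros(hidden)  # cell state
    maps = []
    for x in seq:
        z = W @ np.concatenate([x, h]) + b
        i, f, g, o = np.split(z, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
        # ReLU readout: force magnitudes are non-negative.
        maps.append(np.maximum(Wo @ h, 0.0))
    return np.asarray(maps)  # (T, n_points): one force map per time step
```

Because the recurrence carries state across time steps, the predicted force map at each step can depend on the history of end-effector measurements, not just the current one, which is what motivates an LSTM over a feedforward regressor here.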


arXiv: Robotics | 2016

Towards Assistive Feeding with a General-Purpose Mobile Manipulator

Daehyung Park; You Keun Kim; Zackory M. Erickson; Charles C. Kemp


Intelligent Robots and Systems | 2017

Learning to navigate cloth using haptics

Alexander Clegg; Wenhao Yu; Zackory M. Erickson; Jie Tan; C. Karen Liu; Greg Turk


arXiv: Robotics | 2017

Semi-Supervised Haptic Material Recognition for Robots using Generative Adversarial Networks

Zackory M. Erickson; Sonia Chernova; Charles C. Kemp


International Conference on Robotics and Automation | 2018

Tracking Human Pose During Robot-Assisted Dressing Using Single-Axis Capacitive Proximity Sensing

Zackory M. Erickson; Maggie Collier; Ariel Kapusta; Charles C. Kemp


International Conference on Robotics and Automation | 2018

Deep Haptic Model Predictive Control for Robot-Assisted Dressing

Zackory M. Erickson; Henry M. Clever; Greg Turk; C. Karen Liu; Charles C. Kemp


arXiv: Robotics | 2018

3D Human Pose Estimation on a Configurable Bed from a Pressure Image

Henry M. Clever; Ariel Kapusta; Daehyung Park; Zackory M. Erickson; Yash Chitalia; Charles C. Kemp


arXiv: Robotics | 2018

Estimating 3D Human Pose on a Configurable Bed from a Single Pressure Image

Henry M. Clever; Ariel Kapusta; Daehyung Park; Zackory M. Erickson; Yash Chitalia; Charles C. Kemp


arXiv: Robotics | 2018

Classification of Household Materials via Spectroscopy

Zackory M. Erickson; Nathan Luskey; Sonia Chernova; Charles C. Kemp

Collaboration


Dive into Zackory M. Erickson's collaborations.

Top Co-Authors

Charles C. Kemp, Georgia Institute of Technology
Daehyung Park, Georgia Institute of Technology
Ariel Kapusta, Georgia Institute of Technology
C. Karen Liu, Georgia Institute of Technology
Greg Turk, Georgia Institute of Technology
Henry M. Clever, Georgia Institute of Technology
Alexander Clegg, Georgia Institute of Technology
Sonia Chernova, Georgia Institute of Technology
Wenhao Yu, Georgia Institute of Technology
Yash Chitalia, Georgia Institute of Technology