Publication


Featured research published by Odest Chadwicke Jenkins.


Autonomous Robots | 2002

Automated Derivation of Primitives for Movement Classification

Ajo Fod; Maja J. Matarić; Odest Chadwicke Jenkins

We describe a new method for representing human movement compactly, in terms of a linear superimposition of simpler movements termed primitives. This method is a part of a larger research project aimed at modeling motor control and imitation using the notion of perceptuo-motor primitives, a basis set of coupled perceptual and motor routines. In our model, the perceptual system is biased by the set of motor behaviors the agent can execute. Thus, an agent can automatically classify observed movements into its executable repertoire. In this paper, we describe a method for automatically deriving a set of primitives directly from human movement data. We used movement data gathered from a psychophysical experiment on human imitation to derive the primitives. The data were first filtered, then segmented, and principal component analysis was applied to the segments. The eigenvectors corresponding to a few of the highest eigenvalues provide us with a basis set of primitives. These are used, through superposition and sequencing, to reconstruct the training movements as well as novel ones. The validation of the method was performed on a humanoid simulation with physical dynamics. The effectiveness of the motion reconstruction was measured through an error metric. We also explored and evaluated a technique of clustering in the space of primitives for generating controllers for executing frequently used movements.
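
A minimal sketch of the derivation pipeline described above, not the paper's implementation: cut the filtered motion stream into segments, run PCA on the segments, and keep the eigenvectors with the largest eigenvalues as a primitive basis. The fixed segment length, the number of primitives, and the (time, joint-angle) data layout are assumptions for illustration; fixed-length windows stand in for the paper's segmentation step.

```python
import numpy as np

def derive_primitives(motion, segment_len=30, n_primitives=5):
    """Derive movement primitives via PCA (simplified sketch).

    motion: (T, D) array of joint angles over time (assumed layout).
    Returns an (n_primitives, segment_len * D) basis.
    """
    # Cut the stream into fixed-length segments and flatten each one.
    n_seg = motion.shape[0] // segment_len
    segments = motion[:n_seg * segment_len].reshape(n_seg, -1)

    # PCA: eigen-decompose the covariance of the segment vectors.
    centered = segments - segments.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))

    # Keep the eigenvectors with the largest eigenvalues as primitives.
    order = np.argsort(eigvals)[::-1][:n_primitives]
    return eigvecs[:, order].T

def reconstruct(segment, primitives):
    """Approximate a segment as a linear superposition of the primitives."""
    weights = primitives @ segment.reshape(-1)   # project onto the basis
    return (weights @ primitives).reshape(segment.shape)
```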


International Conference on Machine Learning | 2004

A spatio-temporal extension to Isomap nonlinear dimension reduction

Odest Chadwicke Jenkins; Maja J. Matarić

We present an extension of Isomap nonlinear dimension reduction (Tenenbaum et al., 2000) for data with both spatial and temporal relationships. Our method, ST-Isomap, augments the existing Isomap framework to consider temporal relationships in local neighborhoods that can be propagated globally via a shortest-path mechanism. Two instantiations of ST-Isomap are presented for sequentially continuous and segmented data. Results from applying ST-Isomap to real-world data collected from human motion performance and humanoid robot teleoperation are also presented.
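
A rough sketch of the core ST-Isomap idea under simplifying assumptions: build the usual Isomap k-nearest-neighbor graph, then shorten the edges between temporally adjacent samples so that temporal adjacency propagates globally through the shortest-path step. The neighborhood size and the boost factor are illustrative values, not the paper's.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph

def st_isomap_distances(X, k=10, temporal_boost=0.1):
    """Geodesic distances that favor temporal neighbors (ST-Isomap-style).

    X: (T, D) sequentially ordered feature vectors.
    temporal_boost: factor < 1 shrinking edges between consecutive frames,
        a stand-in for the paper's temporal-neighbor rules.
    """
    # Spatial k-NN graph, as in standard Isomap.
    graph = kneighbors_graph(X, n_neighbors=k, mode="distance").toarray()

    # Strengthen (shorten) edges between temporally adjacent samples.
    for t in range(len(X) - 1):
        d = np.linalg.norm(X[t] - X[t + 1]) * temporal_boost
        graph[t, t + 1] = graph[t + 1, t] = d

    # Propagate the adjusted local distances globally; classical MDS on
    # the result would give the low-dimensional embedding.
    return shortest_path(graph, method="D", directed=False)
```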


Intelligent Robots and Systems | 2002

Deriving action and behavior primitives from human motion data

Odest Chadwicke Jenkins; Maja J. Matarić

We address the problem of creating basis behaviors for modularizing humanoid robot control and representing human activity. These behaviors, called perceptual-motor primitives, serve as a substrate for linking a system's perception of human activities and its ability to perform those activities. We present a data-driven method for deriving perceptual-motor action and behavior primitives from human motion capture data. In order to find these primitives, we employ a spatio-temporal non-linear dimension reduction technique on a set of motion segments. From this transformation, motions representing the same action can be clustered and generalized. Further dimension reduction iterations are applied to derive extended-duration behaviors.
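
As a toy illustration of the clustering step mentioned above, the sketch below groups motion segments by their coordinates in an embedded space (assumed to come from a spatio-temporal dimension reduction such as ST-Isomap); segments that land close together become candidates for the same action primitive. The clustering method and the number of actions are illustrative choices, not the paper's.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

def group_segments(embedded_segments, n_actions=8):
    """Cluster embedded motion segments into candidate action primitives.

    embedded_segments: (N, d) low-dimensional coordinates of N segments.
    Returns a dict mapping cluster id -> indices of member segments.
    """
    labels = AgglomerativeClustering(n_clusters=n_actions).fit_predict(
        embedded_segments)
    return {c: np.flatnonzero(labels == c) for c in range(n_actions)}
```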


Computer Vision and Pattern Recognition | 2008

Physical simulation for probabilistic motion tracking

Marek Vondrak; Leonid Sigal; Odest Chadwicke Jenkins

Human motion tracking is an important problem in computer vision. Most prior approaches have concentrated on efficient inference algorithms and prior motion models; however, few can explicitly account for physical plausibility of recovered motion. The primary purpose of this work is to enforce physical plausibility in the tracking of a single articulated human subject. Towards this end, we propose a full-body 3D physical simulation-based prior that explicitly incorporates motion control and dynamics into the Bayesian filtering framework. We consider the human's motion to be generated by a “control loop”. In this control loop, Newtonian physics approximates the rigid-body motion dynamics of the human and the environment through the application and integration of forces. Collisions generate interaction forces to prevent physically impossible hypotheses. This allows us to properly model human motion dynamics, ground contact and environment interactions. For efficient inference in the resulting high-dimensional state space, we introduce an exemplar-based control strategy to reduce the effective search space. As a result we are able to recover the physically-plausible kinematic and dynamic state of the body from monocular and multi-view imagery. We show, both quantitatively and qualitatively, that our approach performs favorably with respect to standard Bayesian filtering methods.
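
A schematic sketch of the simulation-in-the-loop filtering the abstract describes, with the physics step, control policy, and image likelihood passed in as placeholders (they stand in for the paper's full-body simulator, exemplar-based controller, and observation model):

```python
import numpy as np

def filter_step(particles, observation, control_policy, step_physics,
                likelihood, dt=1.0 / 30.0):
    """One step of Bayesian filtering with a physics-based motion prior.

    particles: list of (pose, velocity) hypotheses.
    control_policy: placeholder choosing joint torques for a hypothesis.
    step_physics: placeholder rigid-body simulator that integrates forces
        and resolves contacts, returning the next (pose, velocity).
    likelihood: placeholder image likelihood p(observation | pose).
    """
    propagated, weights = [], []
    for pose, vel in particles:
        torques = control_policy(pose, vel)              # "control loop"
        new_pose, new_vel = step_physics(pose, vel, torques, dt)
        propagated.append((new_pose, new_vel))
        weights.append(likelihood(observation, new_pose))

    # Resample hypotheses in proportion to the image likelihood.
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    idx = np.random.choice(len(propagated), size=len(propagated), p=w)
    return [propagated[i] for i in idx]
```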


Adaptive Agents and Multi-Agent Systems | 2003

Automated derivation of behavior vocabularies for autonomous humanoid motion

Odest Chadwicke Jenkins; Maja J. Matarić

In this paper we address the problem of automatically deriving vocabularies of motion modules from human motion data, taking advantage of the underlying spatio-temporal structure in motion. We approach this problem with a data-driven methodology for modularizing a motion stream (or time-series of human motion) into a vocabulary of parameterized primitive motion modules and a set of meta-level behaviors characterizing extended combinations of the primitives. Central to this methodology is the discovery of spatio-temporal structure in a motion stream. We estimate this structure by extending an existing nonlinear dimension reduction technique, Isomap, to handle motion data with spatial and temporal dependencies. The motion vocabularies derived by our methodology provide a substrate of autonomous behavior and can be used in a variety of applications. We demonstrate the utility of derived vocabularies for the application of synthesizing new humanoid motion that is structurally similar to the original demonstrated motion.
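
One illustrative way to realize a parameterized primitive, under the assumption that its exemplar trajectories have been time-aligned, is to synthesize new motion as a convex blend of those exemplars; the simple averaging below is a stand-in for the paper's interpolation mechanism, not its actual algorithm.

```python
import numpy as np

def synthesize_from_primitive(exemplars, weights):
    """Blend exemplar trajectories of one primitive into a new motion.

    exemplars: list of (T, D) arrays, assumed time-aligned to a common T.
    weights: one blending weight per exemplar (normalized below).
    """
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    stacked = np.stack(exemplars)            # (K, T, D)
    return np.tensordot(w, stacked, axes=1)  # weighted-average trajectory
```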


IEEE-RAS International Conference on Humanoid Robots | 2004

Motion capture from inertial sensing for untethered humanoid teleoperation

Nathan Miller; Odest Chadwicke Jenkins; Marcelo Kallmann; Maja J. Matarić

We describe the design of a modular system for untethered real-time kinematic motion capture using sensors with inertial measuring units (IMUs). Our system is comprised of a set of small and lightweight sensors. Each sensor provides its own global orientation (3 degrees of freedom) and is physically and computationally independent, requiring only external communication. Orientation information from the sensors is communicated wirelessly to a host computer for processing. We present results of the real-time usage of our untethered motion capture system for teleoperating the NASA Robonaut. We also discuss potential applications for untethered motion capture with respect to humanoid robotics.
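
As a hedged illustration of how per-sensor global orientations can drive a kinematic model, the sketch below composes segment orientations along a simple chain to place successive joints. The chain layout, segment rest axis, and quaternion convention are assumptions for illustration, not the system's actual software.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def chain_positions(segment_quats, segment_lengths, root=np.zeros(3)):
    """Place the joints of a kinematic chain from per-segment orientations.

    segment_quats: list of (x, y, z, w) quaternions, one per body segment,
        as an IMU suit might report them (global frame assumed).
    segment_lengths: length of each segment in meters.
    """
    positions = [root]
    for quat, length in zip(segment_quats, segment_lengths):
        # Rotate the segment's rest-pose axis by its measured orientation
        # and accumulate to obtain the next joint position.
        offset = R.from_quat(quat).apply([0.0, 0.0, length])
        positions.append(positions[-1] + offset)
    return np.asarray(positions)
```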


Computer Vision and Pattern Recognition | 2003

Markerless kinematic model and motion capture from volume sequences

Chi-Wei Chun; Odest Chadwicke Jenkins; Maja J. Matarić

An approach for model-free markerless motion capture of articulated kinematic structures is presented. This approach is centered on our method for generating underlying nonlinear axes (or a skeleton curve) from the volume of an arbitrary rigid-body model. We describe the use of skeleton curves for deriving a kinematic model and motion (in the form of joint angles over time) from a captured volume sequence. Our motion capture method uses a skeleton curve, found in each frame of a volume sequence, to automatically determine kinematic postures. These postures are then aligned to determine a common kinematic model for the volume sequence. The derived kinematic model is then reapplied to each frame in the volume sequence to find the motion suited to this model. We demonstrate our method for several types of motion from synthetically generated volume sequences with arbitrary kinematic topology and human volume sequences captured from a set of multiple calibrated cameras.
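
The recovered motion takes the form of joint angles over time; as a tiny, hedged example of that representation, the helper below computes a single joint angle from three successive points sampled along a skeleton curve (the point naming is illustrative, not the paper's notation).

```python
import numpy as np

def joint_angle(parent, joint, child):
    """Angle at `joint` formed by the segments toward `parent` and `child`."""
    u = parent - joint
    v = child - joint
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))
```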


Human-Robot Interaction | 2009

The Oz of Wizard: simulating the human for interaction research

Aaron Steinfeld; Odest Chadwicke Jenkins; Brian Scassellati

The Wizard of Oz experiment method has a long tradition of acceptance and use within the field of human-robot interaction. The community has traditionally downplayed the importance of interaction evaluations run with the inverse model: the human simulated to evaluate robot behavior, or “Oz of Wizard”. We argue that such studies play an important role in the field of human-robot interaction. We differentiate between methodologically rigorous human modeling and placeholder simulations using simplified human models. Guidelines are proposed for when Oz of Wizard results should be considered acceptable. This paper also describes a framework for describing the various permutations of Wizard and Oz states.


Intelligent Robots and Systems | 2010

Incremental learning of subtasks from unsegmented demonstration

Daniel H. Grollman; Odest Chadwicke Jenkins

We propose to incrementally learn the segmentation of a demonstrated task into subtasks and the individual subtask policies themselves simultaneously. Previous robot learning from demonstration techniques have either learned the individual subtasks in isolation, combined known subtasks, or used knowledge of the overall task structure to perform segmentation. Our infinite mixture of experts approach instead automatically infers an appropriate partitioning (number of subtasks and assignment of data points to each one) directly from the data. We illustrate the applicability of our technique by learning a suitable set of subtasks from the demonstration of a finite-state machine robot soccer goal scorer.
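
As a loose illustration of the infinite-mixture idea, the sketch below assigns each incoming demonstration point to a subtask with a Chinese-restaurant-process-style rule: join an existing subtask in proportion to its size and fit, or open a new one. The scoring function is a placeholder; the paper's actual model and inference procedure are more involved.

```python
import numpy as np

def assign_subtask(point, subtasks, fit_score, alpha=1.0):
    """Incrementally assign one (state, action) point to a subtask.

    subtasks: list of lists of points already assigned to each subtask.
    fit_score: placeholder returning a nonnegative score for how well
        `point` fits a subtask's existing data (stands in for the
        expert's predictive likelihood).
    alpha: concentration parameter; larger values open new subtasks sooner.
    """
    scores = [len(s) * fit_score(point, s) for s in subtasks]
    scores.append(alpha)                     # weight for a brand-new subtask
    p = np.asarray(scores, dtype=float)
    p /= p.sum()
    choice = int(np.random.choice(len(p), p=p))
    if choice == len(subtasks):              # open a new subtask
        subtasks.append([point])
    else:
        subtasks[choice].append(point)
    return choice
```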


International Symposium on Robotics | 2017

Rosbridge: ROS for Non-ROS Users

Christopher Crick; Graylin Jay; Sarah Osentoski; Benjamin Pitzer; Odest Chadwicke Jenkins

We present rosbridge, a middleware abstraction layer which provides robotics technology with a standard, minimalist applications development framework accessible to applications programmers who are not themselves roboticists. Rosbridge provides simple, socket-based programmatic access to robot interfaces and algorithms provided (for now) by ROS, the open-source “Robot Operating System”, the current state of the art in robot middleware. In particular, it facilitates the use of web technologies such as Javascript for the purpose of broadening the use and usefulness of robotic technology. We demonstrate potential applications in interface design, education, human-robot interaction and remote laboratory environments.
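
As a hedged usage sketch, the snippet below exchanges rosbridge's JSON messages over a websocket from a plain Python program, with no ROS client library installed. It assumes the later websocket transport and protocol fields (the version described in the paper used plain TCP sockets), the third-party websocket-client package, and a rosbridge server on the default localhost:9090.

```python
import json
from websocket import create_connection  # websocket-client package (assumed)

ws = create_connection("ws://localhost:9090")   # rosbridge server (assumed)

# Subscribe to a topic via rosbridge's JSON protocol.
ws.send(json.dumps({"op": "subscribe", "topic": "/odom",
                    "type": "nav_msgs/Odometry"}))

# Advertise and publish a velocity command as plain JSON.
ws.send(json.dumps({"op": "advertise", "topic": "/cmd_vel",
                    "type": "geometry_msgs/Twist"}))
ws.send(json.dumps({"op": "publish", "topic": "/cmd_vel",
                    "msg": {"linear": {"x": 0.2, "y": 0.0, "z": 0.0},
                            "angular": {"x": 0.0, "y": 0.0, "z": 0.0}}}))

print(ws.recv())   # first incoming message, delivered as JSON text
ws.close()
```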

Collaboration


Dive into Odest Chadwicke Jenkins's collaborations.

Top Co-Authors

Maja J. Matarić

University of Southern California

Reed W. Hoyt

United States Army Research Institute of Environmental Medicine
