Publication


Featured research published by Jo-Anne Ting.


Intelligent Robots and Systems | 2007

A Kalman filter for robust outlier detection

Jo-Anne Ting; Evangelos A. Theodorou; Stefan Schaal

In this paper, we introduce a modified Kalman filter that can perform robust, real-time outlier detection in the observations, without the need for manual parameter tuning by the user. Robotic systems that rely on high-quality sensory data can be sensitive to data containing outliers. Since the standard Kalman filter is not robust to outliers, other variations of the Kalman filter have been proposed to overcome this issue, but these methods may require manual parameter tuning, use of heuristics, or complicated parameter estimation. Our Kalman filter uses a weighted least-squares-like approach by introducing weights for each data sample. A data sample with a smaller weight has a weaker contribution when estimating the current time step's state. We learn the weights and system dynamics using a variational Expectation-Maximization framework. We evaluate our Kalman filter algorithm on data from a robotic dog.
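The weighting idea in the abstract can be illustrated with a minimal sketch: a small weight inflates that sample's effective measurement noise, shrinking the Kalman gain so outliers barely move the state estimate. The 1-D system, the Student-t style weighting rule, and all parameter values below are illustrative assumptions standing in for the variational weights the paper actually learns.

```python
import numpy as np

def robust_kalman_1d(zs, A=1.0, C=1.0, Q=0.01, R=0.1, nu=3.0):
    """1-D Kalman filter with a per-sample weight w in (0, 1].

    A small weight inflates that sample's effective measurement noise to
    R / w, shrinking the Kalman gain so outliers barely move the estimate.
    The Student-t style weighting rule is a simplified stand-in for the
    variational weights learned in the paper.
    """
    x, P = 0.0, 1.0
    estimates = []
    for z in zs:
        # Predict step.
        x = A * x
        P = A * P * A + Q
        # Weight from the squared normalized innovation: near 1 for
        # inliers, tiny for large residuals.
        r2 = (z - C * x) ** 2 / (C * P * C + R)
        w = min((nu + 1.0) / (nu + r2), 1.0)
        # Update step with inflated noise for low-weight samples.
        S = C * P * C + R / w
        K = P * C / S
        x = x + K * (z - C * x)
        P = (1.0 - K * C) * P
        estimates.append(x)
    return np.array(estimates)
```

Feeding a constant signal with a single large spike through this filter, the spike's weight collapses toward zero and the estimate stays near the true value.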


Neural Networks | 2008

Variational Bayesian least squares: An application to brain-machine interface data

Jo-Anne Ting; Aaron D'Souza; Kenji Yamamoto; Toshinori Yoshioka; Donna S. Hoffman; Shinji Kakei; Lauren E. Sergio; John F. Kalaska; Mitsuo Kawato; Peter L. Strick; Stefan Schaal

An increasing number of projects in neuroscience require statistical analysis of high-dimensional data, as, for instance, in the prediction of behavior from neural firing or in the operation of artificial devices from brain recordings in brain-machine interfaces. Although prevalent, classical linear analysis techniques are often numerically fragile in high dimensions due to irrelevant, redundant, and noisy information. We developed a robust Bayesian linear regression algorithm that automatically detects relevant features and excludes irrelevant ones, all in a computationally efficient manner. In comparison with standard linear methods, the new Bayesian method regularizes against overfitting, is computationally efficient (unlike previously proposed variational linear regression methods, it is suitable for data sets with large numbers of samples and a very high number of input dimensions), and is easy to use, thus demonstrating its potential as a drop-in replacement for other linear regression techniques. We evaluate our technique on synthetic data sets and on several neurophysiological data sets. For these neurophysiological data sets, we address the question of whether EMG data collected from arm movements of monkeys can be faithfully reconstructed from neural activity in motor cortices. Results demonstrate the success of our newly developed method in comparison with other approaches in the literature and, from the neurophysiological point of view, confirm recent findings on the organization of the motor cortex. Finally, an incremental, real-time version of our algorithm demonstrates the suitability of our approach for real-time interfaces between brains and machines.
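The automatic detection of relevant features described above can be sketched with automatic relevance determination (ARD): each input dimension receives its own prior precision, and dimensions that do not help predict the output see their precision grow until their weight is pruned. The fixed-point updates below are MacKay-style evidence maximization, a simplified stand-in for the paper's variational updates; the clipping constants are illustrative.

```python
import numpy as np

def ard_regression(X, y, n_iter=100, tol=1e-6):
    """Automatic relevance determination (ARD) for linear regression.

    Each input dimension d has its own prior precision alpha[d]; dimensions
    that do not help predict y see alpha[d] grow until their weight is
    effectively pruned to zero.  Simplified MacKay-style fixed-point
    updates, not the exact variational Bayesian least squares algorithm.
    """
    n, d = X.shape
    alpha = np.ones(d)   # per-dimension prior precisions
    beta = 1.0           # output-noise precision
    mu = np.zeros(d)
    for _ in range(n_iter):
        # Gaussian posterior over weights given current hyperparameters.
        A = np.diag(alpha) + beta * X.T @ X
        Sigma = np.linalg.inv(A)
        mu = beta * Sigma @ X.T @ y
        # Effective number of well-determined parameters per dimension.
        gamma = 1.0 - alpha * np.diag(Sigma)
        alpha_prev = alpha
        alpha = np.clip(gamma / (mu ** 2 + 1e-12), 1e-10, 1e10)
        resid = y - X @ mu
        beta = (n - gamma.sum()) / (resid @ resid + 1e-12)
        if np.max(np.abs(alpha - alpha_prev)) < tol:
            break
    return mu, alpha
```

On data where only some inputs carry signal, the irrelevant dimensions end up with very large precisions and near-zero weights, while the relevant weights stay close to their true values.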


European Conference on Machine Learning | 2007

Learning an Outlier-Robust Kalman Filter

Jo-Anne Ting; Evangelos A. Theodorou; Stefan Schaal

We introduce a modified Kalman filter that performs robust, real-time outlier detection without the need for manual parameter tuning by the user. Systems that rely on high-quality sensory data (for instance, robotic systems) can be sensitive to data containing outliers. The standard Kalman filter is not robust to outliers, and other variations of the Kalman filter have been proposed to overcome this issue. However, these methods may require manual parameter tuning, use of heuristics, or complicated parameter estimation procedures. Our Kalman filter uses a weighted least-squares-like approach by introducing weights for each data sample. A data sample with a smaller weight has a weaker contribution when estimating the current time step's state. Using an incremental variational Expectation-Maximization framework, we learn the weights and system dynamics. We evaluate our Kalman filter algorithm on data from a robotic dog.


Robotics: Science and Systems | 2006

A Bayesian Approach to Nonlinear Parameter Identification for Rigid Body Dynamics

Jo-Anne Ting; Michael Mistry; Jan Peters; Stefan Schaal; Jun Nakanishi

For robots of increasing complexity such as humanoid robots, conventional identification of rigid body dynamics models based on CAD data and actuator models becomes difficult and inaccurate due to the large number of additional nonlinear effects in these systems, e.g., stemming from stiff wires, hydraulic hoses, protective shells, skin, etc. Data-driven parameter estimation offers an alternative model identification method, but it is often burdened by various other problems, such as significant noise in all measured or inferred variables of the robot. The danger of physically inconsistent results also exists due to unmodeled nonlinearities or insufficiently rich data. In this paper, we address all these problems by developing a Bayesian parameter identification method that can automatically detect noise in both input and output data for the regression algorithm that performs system identification. A post-processing step ensures physically consistent rigid body parameters by nonlinearly projecting the result of the Bayesian estimation onto constraints given by positive definite inertia matrices and the parallel axis theorem. We demonstrate on synthetic and actual robot data that our technique performs parameter identification with 5 to 20% higher accuracy than traditional methods. Due to the resulting physically consistent parameters, our algorithm enables us to apply advanced control methods that algebraically require physical consistency on robotic platforms.
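One ingredient of the post-processing step, forcing an estimated inertia matrix back into the set of symmetric positive definite matrices, can be sketched as an eigenvalue clip. This is only a sketch of that one constraint; the paper's projection also enforces the parallel axis theorem and operates on the full parameter vector.

```python
import numpy as np

def project_to_spd(M, eps=1e-6):
    """Project an estimated inertia matrix onto the symmetric positive
    definite cone: symmetrize, then clip eigenvalues from below.

    A sketch of one ingredient of the post-processing; the paper's full
    projection also enforces the parallel axis theorem across all links.
    """
    S = 0.5 * (M + M.T)                  # enforce symmetry
    vals, vecs = np.linalg.eigh(S)
    vals = np.maximum(vals, eps)         # remove negative/zero eigenvalues
    return vecs @ np.diag(vals) @ vecs.T
```

An asymmetric, indefinite estimate (as raw regression can produce) comes back symmetric with strictly positive eigenvalues, i.e., a physically usable inertia matrix.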


International Conference on Machine Learning | 2006

Bayesian regression with input noise for high dimensional data

Jo-Anne Ting; Aaron D'Souza; Stefan Schaal

This paper examines high-dimensional regression with noise-contaminated input and output data. Goals of such learning problems include optimal prediction with noiseless query points and optimal system identification. As a first step, we focus on linear regression methods, since these can be easily cast into nonlinear learning problems with locally weighted learning approaches. Standard linear regression algorithms generate biased regression estimates if input noise is present and suffer numerically when the data contains redundant and irrelevant dimensions. Inspired by Factor Analysis Regression, we develop a variational Bayesian algorithm that is robust to ill-conditioned data, automatically detects relevant features, and identifies input and output noise, all in a computationally efficient way. We demonstrate the effectiveness of our techniques on synthetic data and on a system identification task for a rigid body dynamics model of a robotic vision head. Our algorithm performs 10 to 70% better than previously suggested methods.
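The bias that input noise induces in standard linear regression, which motivates modeling it explicitly, is easy to demonstrate: ordinary least squares on noisy inputs attenuates the slope toward zero, and correcting with the input-noise variance (here assumed known purely for illustration) removes the attenuation. A toy method-of-moments sketch, not the paper's variational algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
x_true = rng.normal(size=n)                       # noiseless inputs
x_obs = x_true + rng.normal(scale=0.5, size=n)    # observed inputs, noise var 0.25
y = 3.0 * x_true + rng.normal(scale=0.1, size=n)  # outputs, true slope 3

# Ordinary least squares on the noisy inputs: slope attenuated toward
# zero, roughly by the factor var(x) / (var(x) + var(noise)) = 1 / 1.25.
b_ols = (x_obs @ y) / (x_obs @ x_obs)

# Subtracting the (here assumed known) input-noise variance from the
# input second moment removes the attenuation; this is why modeling
# input noise matters for unbiased estimates.
b_corrected = (x_obs @ y) / (x_obs @ x_obs - n * 0.25)
```

With these settings the naive slope lands near 2.4 instead of 3, while the corrected estimate recovers the true value.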


Encyclopedia of Machine Learning and Data Mining | 2010

Locally Weighted Regression for Control

Jo-Anne Ting; Franziska Meier; Sethu Vijayakumar; Stefan Schaal

Learning control refers to the process of acquiring a control strategy for a particular control system and a particular task by trial and error. It is usually distinguished from adaptive control [1] in that the learning system is permitted to fail during the process of learning, resembling how humans and animals acquire new movement strategies. In contrast, adaptive control emphasizes single trial convergence without failure, fulfilling stringent performance constraints, e.g., as needed in life-critical systems like airplanes and industrial robots.
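The technique named in the entry's title can be sketched generically: locally weighted regression fits a weighted least-squares model around each query point, with weights from a kernel centered on the query. This is a minimal one-dimensional sketch, not any specific control algorithm surveyed in the entry; the Gaussian kernel and bandwidth value are illustrative assumptions.

```python
import numpy as np

def lwr_predict(X, y, x_query, bandwidth=0.25):
    """Locally weighted linear regression at a single query point.

    Fit a weighted least-squares line whose weights come from a Gaussian
    kernel centered on the query; a generic LWR sketch, not a specific
    algorithm from the encyclopedia entry.
    """
    w = np.exp(-0.5 * ((X - x_query) / bandwidth) ** 2)  # kernel weights
    Xa = np.column_stack([X, np.ones_like(X)])           # local linear model
    WXa = Xa * w[:, None]                                # weighted design matrix
    slope, intercept = np.linalg.solve(Xa.T @ WXa, WXa.T @ y)
    return slope * x_query + intercept
```

Because each prediction refits only a local model, LWR approximates smooth nonlinear functions well from linear building blocks, which is what makes linear analyses (like those in the papers above) reusable for nonlinear control.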


International Conference on Robotics and Automation | 2007

Automatic Outlier Detection: A Bayesian Approach

Jo-Anne Ting; Aaron D'Souza; Stefan Schaal


Neural Information Processing Systems | 2005

Predicting EMG Data from M1 Neurons with Variational Bayesian Least Squares

Jo-Anne Ting; Aaron D'Souza; Kenji Yamamoto; Toshinori Yoshioka; Donna S. Hoffman; Shinji Kakei; Lauren E. Sergio; John F. Kalaska; Mitsuo Kawato


Neural Information Processing Systems | 2008

Bayesian Kernel Shaping for Learning Control

Jo-Anne Ting; Mrinal Kalakrishnan; Sethu Vijayakumar; Stefan Schaal


International Conference on Robotics and Automation | 2008

A Bayesian approach to empirical local linearization for robotics

Jo-Anne Ting; Aaron D'Souza; Sethu Vijayakumar; Stefan Schaal

Collaboration


Dive into Jo-Anne Ting's collaboration.

Top Co-Authors

Evangelos A. Theodorou

Georgia Institute of Technology

Mrinal Kalakrishnan

University of Southern California
