Publication


Featured research published by Tianshi Chen.


Automatica | 2012

On the estimation of transfer functions, regularizations and Gaussian processes-Revisited

Tianshi Chen; Henrik Ohlsson; Lennart Ljung

Intrigued by some recent results on impulse response estimation by kernel and nonparametric techniques, we revisit the old problem of transfer function estimation from input-output measurements. We formulate a classical regularization approach, focused on finite impulse response (FIR) models, and find that regularization is necessary to cope with the high variance problem. This basic, regularized least squares approach is then a focal point for interpreting other techniques, like Bayesian inference and Gaussian process regression. The main issue is how to determine a suitable regularization matrix (Bayesian prior or kernel). Several regularization matrices are provided and numerically evaluated on a data bank of test systems and data sets. Our findings based on the data bank are as follows. The classical regularization approach with carefully chosen regularization matrices shows slightly better accuracy and clearly better robustness in estimating the impulse response than the standard approach, the prediction error method/maximum likelihood (PEM/ML) approach. If the goal is to estimate a model of given order as well as possible, a low order model is often better estimated by the PEM/ML approach, and a higher order model is often better estimated by model reduction on a high order regularized FIR model estimated with careful regularization. Moreover, an optimal regularization matrix that minimizes the mean square error matrix is derived and studied. The importance of this result is that it gives the theoretical upper bound on the accuracy achievable with this classical regularization approach.
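
As a concrete illustration of this regularized least squares setup, below is a minimal sketch of kernel-regularized FIR estimation. The TC kernel, the noise variance sigma2, the FIR order n, and the helper names (tc_kernel, regularized_fir) are illustrative assumptions, not code from the paper.

```python
import numpy as np

def tc_kernel(n, c=1.0, lam=0.8):
    """Example regularization matrix: TC (tuned/correlated) kernel with entries
    c * lam**max(i, j), i, j = 1..n; c >= 0 and 0 <= lam < 1 are hyper-parameters."""
    idx = np.arange(1, n + 1)
    return c * lam ** np.maximum.outer(idx, idx)

def regularized_fir(u, y, n=50, sigma2=0.1, c=1.0, lam=0.8):
    """Regularized least squares estimate of an order-n FIR model,
    theta_hat = (Phi^T Phi + sigma2 * P^{-1})^{-1} Phi^T y, with P the kernel matrix."""
    u, y = np.asarray(u, float), np.asarray(y, float)
    N = len(y)
    # Regression matrix: row t holds u[t-1], ..., u[t-n] (zeros before the data start)
    Phi = np.zeros((N, n))
    for k in range(n):
        Phi[k + 1:, k] = u[:N - k - 1]
    P = tc_kernel(n, c, lam)
    A = Phi.T @ Phi + sigma2 * np.linalg.inv(P)  # forming P^{-1} is acceptable for small n
    return np.linalg.solve(A, Phi.T @ y)
```

Choosing P well (the "suitable regularization matrix" above) is exactly the kernel and hyper-parameter selection question the paper studies.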


Automatica | 2014

Kernel methods in system identification, machine learning and function estimation: A survey

Gianluigi Pillonetto; Francesco Dinuzzo; Tianshi Chen; Giuseppe De Nicolao; Lennart Ljung

Most of the currently used techniques for linear system identification are based on classical estimation paradigms coming from mathematical statistics. In particular, maximum likelihood and prediction error methods represent the mainstream approaches to identification of linear dynamic systems, with a long history of theoretical and algorithmic contributions. Parallel to this, in the machine learning community alternative techniques have been developed. Until recently, there has been little contact between these two worlds. The first aim of this survey is to make accessible to the control community the key mathematical tools and concepts as well as the computational aspects underpinning these learning techniques. In particular, we focus on kernel-based regularization and its connections with reproducing kernel Hilbert spaces and Bayesian estimation of Gaussian processes. The second aim is to demonstrate that learning techniques tailored to the specific features of dynamic systems may outperform conventional parametric approaches for identification of stable linear systems.
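
The connection between kernel-based regularization and Bayesian estimation of Gaussian processes that the survey builds on can be summarized by a standard identity; the notation below (impulse response $g$, regression matrix $\Phi$, kernel matrix $P$, noise variance $\sigma^2$) is generic rather than the survey's own.

```latex
\hat{g}
  = \arg\min_{g} \; \lVert Y - \Phi g \rVert^{2} + \sigma^{2}\, g^{\top} P^{-1} g
  = P \Phi^{\top}\!\left( \Phi P \Phi^{\top} + \sigma^{2} I \right)^{-1} Y ,
```

which is also the posterior mean of $g$ under the Gaussian process prior $g \sim \mathcal{N}(0, P)$ with white Gaussian noise of variance $\sigma^{2}$, and the minimizer of the corresponding regularized functional in the reproducing kernel Hilbert space induced by the kernel.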


IEEE Transactions on Automatic Control | 2014

System Identification Via Sparse Multiple Kernel-Based Regularization Using Sequential Convex Optimization Techniques

Tianshi Chen; Martin S. Andersen; Lennart Ljung; Alessandro Chiuso; Gianluigi Pillonetto

Model estimation and structure detection with short data records are two issues that receive increasing interest in system identification. In this paper, a multiple kernel-based regularization method is proposed to handle these issues. Multiple kernels are conic combinations of fixed kernels suitable for impulse response estimation, and they equip the kernel-based regularization method with three features. First, multiple kernels can better capture complicated dynamics than single kernels. Second, the estimation of their weights by maximizing the marginal likelihood favors sparse optimal weights, which enables this method to tackle various structure detection problems, e.g., sparse dynamic network identification and segmentation of linear systems. Third, the marginal likelihood maximization problem is a difference of convex programming problem. It is thus possible to find a locally optimal solution efficiently by using a majorization minimization algorithm and an interior point method, where the cost of a single interior-point iteration grows linearly in the number of fixed kernels. Monte Carlo simulations show that the locally optimal solutions lead to good performance for randomly generated starting points.
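
A minimal sketch of the multiple kernel idea follows: the regularization matrix is parameterized as a conic combination P(alpha) = sum_i alpha_i * P_i of fixed kernels, and the nonnegative weights alpha are estimated by maximizing the marginal likelihood. The function names and the use of a generic L-BFGS-B solver are illustrative assumptions; the paper instead exploits the difference-of-convex structure via a majorization minimization scheme and an interior point method.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_marglik(alpha, Phi, y, kernels, sigma2):
    """Negative log marginal likelihood of y ~ N(0, Phi P(alpha) Phi^T + sigma2 * I),
    where P(alpha) = sum_i alpha_i * P_i is a conic combination of fixed kernels."""
    P = sum(a * K for a, K in zip(alpha, kernels))
    S = Phi @ P @ Phi.T + sigma2 * np.eye(len(y))
    _, logdet = np.linalg.slogdet(S)
    return 0.5 * (logdet + y @ np.linalg.solve(S, y))

def estimate_kernel_weights(Phi, y, kernels, sigma2):
    """Estimate alpha >= 0 by marginal likelihood maximization
    (generic bounded solver, used here only for illustration)."""
    alpha0 = np.ones(len(kernels))
    res = minimize(neg_log_marglik, alpha0, args=(Phi, y, kernels, sigma2),
                   bounds=[(0.0, None)] * len(kernels), method="L-BFGS-B")
    return res.x  # sparsity in the weights indicates which fixed kernels are active
```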


Automatica | 2013

Implementation of algorithms for tuning parameters in regularized least squares problems in system identification

Tianshi Chen; Lennart Ljung

There has recently been a trend to study linear system identification with high order finite impulse response (FIR) models using the regularized least-squares approach. One key step in this approach is to solve the hyper-parameter estimation problem, which is usually nonconvex. Our goal here is to investigate implementations of algorithms for solving the hyper-parameter estimation problem that can deal with both large data sets and possibly ill-conditioned computations. In particular, a QR factorization based, matrix-inversion-free algorithm is proposed to evaluate the cost function in an efficient and accurate way. It is also shown that the gradient and Hessian of the cost function can be computed based on the same QR factorization. Finally, the proposed algorithm and ideas are verified by Monte-Carlo simulations on a large data-bank of test systems and data sets.
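
The flavor of a matrix-inversion-free evaluation can be sketched as follows: with the kernel factored as P = L L^T, one thin QR factorization of a stacked matrix yields both the quadratic term and the log-determinant of the marginal likelihood cost. This is a simplified illustration built on standard identities; the variable names, the exact form of the cost, and the handling of derivatives differ from the paper's implementation.

```python
import numpy as np

def marglik_cost_qr(Phi, y, L, sigma2):
    """Evaluate F = y^T (Phi P Phi^T + sigma2 I)^{-1} y + log det(Phi P Phi^T + sigma2 I),
    with P = L L^T, without forming or inverting the N x N covariance matrix."""
    y = np.asarray(y, float).ravel()
    N, n = Phi.shape
    sigma = np.sqrt(sigma2)
    # Stack [Phi L, y] on top of [sigma I, 0] and keep only the triangular QR factor
    top = np.hstack([Phi @ L, y.reshape(-1, 1)])
    bottom = np.hstack([sigma * np.eye(n), np.zeros((n, 1))])
    R = np.linalg.qr(np.vstack([top, bottom]), mode="r")
    R1 = R[:n, :n]   # upper-triangular n x n block
    r3 = R[n, n]     # scalar lower-right entry
    quad = r3 ** 2 / sigma2
    logdet = 2.0 * np.sum(np.log(np.abs(np.diag(R1)))) + (N - n) * np.log(sigma2)
    return quad + logdet
```

The same factorization can, as the abstract notes, be reused for the gradient and Hessian, which is what makes the approach attractive for large data sets.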


Conference on Decision and Control | 2011

Kernel selection in linear system identification part II: A classical perspective

Tianshi Chen; Henrik Ohlsson; Graham C. Goodwin; Lennart Ljung

In this companion paper, the choice of kernels for estimating the impulse response of linear stable systems is considered from a classical, “frequentist”, point of view. The kernel determines the regularization matrix in a regularized least squares estimate of an FIR model. The quality is assessed from a mean square error (MSE) perspective, and measures and algorithms for optimizing the MSE are discussed. The ideas are tested on the same data bank as used in Part I of the companion papers. The resulting findings and conclusions in the two papers are very similar despite the different perspectives.
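
In generic notation (true impulse response $\theta_0$, regression matrix $\Phi$, regularization matrix $P$, noise variance $\sigma^2$), the MSE matrix of the regularized least squares estimate on which these measures are built reads, under standard assumptions,

```latex
\mathrm{MSE}(P)
  = \left(\Phi^{\top}\Phi + \sigma^{2} P^{-1}\right)^{-1}
    \left(\sigma^{2}\,\Phi^{\top}\Phi + \sigma^{4}\, P^{-1}\theta_{0}\theta_{0}^{\top}P^{-1}\right)
    \left(\Phi^{\top}\Phi + \sigma^{2} P^{-1}\right)^{-1},
```

and it is minimized, in the matrix sense, by the unrealizable choice $P = \theta_{0}\theta_{0}^{\top}$; practical kernels can be viewed as tractable surrogates for this ideal regularization.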


IFAC Proceedings Volumes | 2011

On the Estimation of Transfer Functions, Regularizations and Gaussian Processes - Revisited

Tianshi Chen; Henrik Ohlsson; Lennart Ljung

Intrigued by some recent results on impulse response estimation by kernel and nonparametric techniques, we revisit the old problem of transfer function estimation from input-output measurements. We ...


IEEE Transactions on Automatic Control | 2009

Global Robust Output Regulation by State Feedback for Strict Feedforward Systems

Tianshi Chen; Jie Huang

This note studies the global robust output regulation problem by state feedback for strict feedforward systems. By utilizing the general framework for tackling the output regulation problem, the problem is converted into a global robust stabilization problem for a class of feedforward systems subject to both time-varying static and dynamic uncertainties. The stabilization problem is then solved by a small gain based, bottom-up recursive design procedure.


IEEE Transactions on Automatic Control | 2008

Disturbance Attenuation of Feedforward Systems With Dynamic Uncertainty

Tianshi Chen; Jie Huang

This paper studies the disturbance attenuation problem for a class of nonlinear systems in feedforward form that is subject to both dynamic uncertainty and disturbance. When the disturbance vanishes, the equilibrium of the closed-loop system is globally asymptotically stable. Two versions of the small gain theorem with restrictions are employed to establish, respectively, the global attractiveness and the local stability of the closed-loop system at the equilibrium.


Automatica | 2010

A small gain approach to global stabilization of nonlinear feedforward systems with input unmodeled dynamics

Tianshi Chen; Jie Huang

In this paper, we study the global robust stabilization problem of strict feedforward systems subject to input unmodeled dynamics. We present a recursive design method for a nested saturation controller that globally stabilizes the closed-loop system in the presence of input unmodeled dynamics. One of the difficulties of the problem is that the Jacobian linearization of the system at the origin may not be stabilizable. We overcome this difficulty by employing a special version of the small gain theorem to establish local stability of the closed-loop system, and an asymptotic small gain theorem to establish its global convergence property. An example is given to show that a redesign of the controller is required to guarantee global robust asymptotic stability in the presence of the input unmodeled dynamics.
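
Schematically, nested saturation controllers of the kind constructed here take the form below; the gains $k_i$, the saturation functions $\sigma_i$, and the state coordinates are generic placeholders rather than the paper's specific design.

```latex
u = -\,\sigma_{n}\!\Big( k_{n} x_{n} + \sigma_{n-1}\big( k_{n-1} x_{n-1} + \cdots + \sigma_{1}( k_{1} x_{1} ) \big) \Big),
```

where each $\sigma_i$ is a bounded saturation function; the recursive design chooses the gains and saturation levels level by level so that the required small gain conditions hold at each step, also in the presence of the input unmodeled dynamics.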


Automatica | 2016

Maximum entropy properties of discrete-time first-order stable spline kernel

Tianshi Chen; Tohid Ardeshiri; Francesca P. Carli; Alessandro Chiuso; Lennart Ljung; Gianluigi Pillonetto

The first order stable spline (SS-1) kernel (also known as the tuned-correlated (TC) kernel) is used extensively in regularized system identification, where the impulse response is modeled as a zero-mean Gaussian process whose covariance function is given by a well designed and tuned kernel. In this paper, we discuss the maximum entropy properties of this kernel. In particular, we formulate the exact maximum entropy problem solved by the SS-1 kernel without Gaussian and uniform sampling assumptions. Under a general sampling assumption, we also derive the special structure of the SS-1 kernel (e.g., its tridiagonal inverse and its factorization have closed form expressions), which also gives it a maximum entropy covariance completion interpretation.
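
For reference, the discrete-time SS-1/TC kernel discussed here is commonly written as follows ($c$ and $\lambda$ being its hyper-parameters; this is the standard form used in this literature):

```latex
k(i, j) = c\, \lambda^{\max(i, j)}, \qquad c \ge 0, \quad 0 \le \lambda < 1, \quad i, j = 1, \dots, n,
```

whose inverse is tridiagonal with closed form entries, the structural property exploited in the paper's maximum entropy and covariance completion arguments.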

Collaboration


Dive into Tianshi Chen's collaborations.

Top Co-Authors

Jie Huang
The Chinese University of Hong Kong

Henrik Ohlsson
University of California

Hong Chen
Harbin Institute of Technology

Run Pei
Harbin Institute of Technology

Zhiyuan Liu
Harbin Institute of Technology