Hua-Liang Wei
University of Sheffield
Publications
Featured research published by Hua-Liang Wei.
IEEE Transactions on Neural Networks | 2005
Stephen A. Billings; Hua-Liang Wei
A new class of wavelet networks (WNs) is proposed for nonlinear system identification. In the new networks, the model structure for a high-dimensional system is chosen to be a superimposition of a number of functions with fewer variables. By expanding each function using truncated wavelet decompositions, the multivariate nonlinear networks can be converted into linear-in-the-parameter regressions, which can be solved using least-squares type methods. An efficient model term selection approach based upon a forward orthogonal least squares (OLS) algorithm and the error reduction ratio (ERR) is applied to solve the linear-in-the-parameters problem in the present study. The main advantage of the new WN is that it exploits the attractive features of multiscale wavelet decompositions and the capability of traditional neural networks. By adopting the analysis of variance (ANOVA) expansion, WNs can now handle nonlinear identification problems in high dimensions.
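The forward OLS/ERR term-selection step described in this abstract can be illustrated with a minimal NumPy sketch (an independent toy illustration, not the authors' implementation; the candidate matrix and signals below are made up):

```python
import numpy as np

def forward_ols_err(P, y, n_terms):
    """Greedy forward selection of columns of P by Error Reduction Ratio (ERR).

    At each step every unselected candidate is orthogonalized against the
    already-selected (orthogonalized) regressors, and the candidate explaining
    the largest fraction of the output energy is added to the model.
    """
    yty = y @ y
    selected, Q = [], []          # chosen column indices, orthogonalized regressors
    for _ in range(n_terms):
        best_j, best_err, best_w = None, -1.0, None
        for j in range(P.shape[1]):
            if j in selected:
                continue
            w = P[:, j].astype(float)
            for q in Q:                          # Gram-Schmidt against chosen terms
                w = w - (q @ P[:, j]) / (q @ q) * q
            if w @ w < 1e-12:                    # numerically dependent candidate
                continue
            err = (w @ y) ** 2 / ((w @ w) * yty) # error reduction ratio
            if err > best_err:
                best_j, best_err, best_w = j, err, w
        selected.append(best_j)
        Q.append(best_w)
    return selected

# Toy problem: the output depends on only two of six candidate terms.
rng = np.random.default_rng(0)
P = rng.standard_normal((200, 6))
y = 3.0 * P[:, 1] - 2.0 * P[:, 4] + 0.01 * rng.standard_normal(200)
selected = forward_ols_err(P, y, 2)
print(selected)
```

With a strong signal-to-noise ratio the two true terms are recovered; in practice the loop is stopped by a model-size criterion rather than a fixed `n_terms`.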
IEEE Transactions on Pattern Analysis and Machine Intelligence | 2007
Hua-Liang Wei; Stephen A. Billings
A new unsupervised forward orthogonal search (FOS) algorithm is introduced for feature selection and ranking. In the new algorithm, features are selected in a stepwise way, one at a time, by estimating the capability of each specified candidate feature subset to represent the overall features in the measurement space. A squared correlation function is employed as the criterion to measure the dependency between features and this makes the new algorithm easy to implement. The forward orthogonalization strategy, which combines good effectiveness with high efficiency, enables the new algorithm to produce efficient feature subsets with a clear physical interpretation.
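A toy sketch of this forward orthogonal search idea follows (an assumed, simplified reading of the algorithm, not the authors' code): each candidate is scored by summed squared correlation with all current residual features, then the chosen direction is deflated out.

```python
import numpy as np

def fos_feature_select(X, k):
    """Unsupervised forward orthogonal search: pick k features, each chosen
    for how well it represents ALL features (squared-correlation score),
    after orthogonalizing against the features already selected."""
    R = X - X.mean(axis=0)               # residual feature matrix, deflated each step
    m = R.shape[1]
    selected = []
    for _ in range(k):
        norms = (R * R).sum(axis=0)
        best_j, best_score = None, -1.0
        for j in range(m):
            if j in selected or norms[j] < 1e-12:
                continue
            w = R[:, j]
            # sum of squared correlations between w and every residual feature
            score = np.sum((w @ R) ** 2 / ((w @ w) * np.maximum(norms, 1e-12)))
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        w = R[:, best_j]
        R = R - np.outer(w, (w @ R) / (w @ w))   # deflate the chosen direction
    return selected

# Toy data: columns 0-3 share one latent factor, column 4 is independent.
rng = np.random.default_rng(1)
f = rng.standard_normal(300)
X = np.column_stack([f,
                     f + 0.01 * rng.standard_normal(300),
                     -f + 0.01 * rng.standard_normal(300),
                     2.0 * f + 0.01 * rng.standard_normal(300),
                     rng.standard_normal(300)])
order = fos_feature_select(X, 2)
print(order)
```

The first pick comes from the correlated group, since any of its members represents the other three almost perfectly.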
International Journal of Control | 2004
Hua-Liang Wei; S.A. Billings; J. Liu
The purpose of variable selection is to pre-select a subset consisting of the significant variables, or to eliminate the redundant variables from all the candidate variables of a system under study, prior to model term detection. It is required that the selected significant variables alone should sufficiently represent the system. Generally, not all the model terms, which are produced by combining different variables, make an equal contribution to the system output; terms which make little contribution can be omitted. A parsimonious representation, which contains only the significant terms, can often be obtained without loss of representational accuracy by eliminating the redundant terms. Based on these observations, a new variable and term selection algorithm is proposed in this paper. The term detection algorithm can be applied to the general class of non-linear modelling problems which can be expressed in a linear-in-the-parameters form. The variable selection procedure is based on locally linear and cross-bilinear models, which are used together with the forward orthogonal least squares (OLS) and error reduction ratio (ERR) approach to determine the significant terms and to pre-select the important variables for both time series and input–output systems. Several numerical examples are provided to illustrate the applicability and effectiveness of the new approach.
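The candidate terms such a procedure works over are monomials formed by combining lagged variables. A small sketch of building that dictionary (illustrative only; the variable names are made up):

```python
from itertools import combinations_with_replacement

def candidate_terms(variables, max_degree):
    """All monomials up to max_degree in the given lagged variables.
    Degree-2 terms include cross-bilinear products such as y(k-1)*u(k-1)."""
    terms = []
    for d in range(1, max_degree + 1):
        terms.extend(combinations_with_replacement(variables, d))
    return [" * ".join(t) for t in terms]

lags = ["y(k-1)", "y(k-2)", "u(k-1)", "u(k-2)"]
terms = candidate_terms(lags, 2)
print(len(terms))   # 4 linear terms + 10 quadratic terms
```

The dictionary grows combinatorially with the number of variables and the degree, which is exactly why pre-selecting significant variables before term detection pays off.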
Neural Networks | 2007
Stephen A. Billings; Hua-Liang Wei; M. A. Balikhin
A novel modelling framework is proposed for constructing parsimonious and flexible multiscale radial basis function (RBF) networks. Unlike a conventional single-scale RBF network, where all the basis functions have a common kernel width, the new network structure adopts multiscale Gaussian functions as the bases, where each selected centre has multiple kernel widths, to provide more flexible representations with better generalization properties for general nonlinear dynamical systems. As a direct extension of the traditional single-scale Gaussian networks, the new multiscale network is easy to implement and is quick to learn using standard learning algorithms. A k-means clustering algorithm and an improved orthogonal least squares (OLS) algorithm are used to determine the unknown parameters in the network model, including the centres and widths of the basis functions and the weights between the basis functions. It is demonstrated that the new network can lead to a parsimonious model with much better generalization properties compared with traditional single-width RBF networks.
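The core idea, several kernel widths per centre, is easy to sketch (a toy illustration with fixed centres and batch least squares standing in for the paper's k-means plus OLS procedure):

```python
import numpy as np

def multiscale_rbf_design(x, centres, widths):
    """Design matrix with one Gaussian column per (centre, width) pair,
    so each centre carries several kernel widths, plus a bias column."""
    cols = [np.exp(-(x - c) ** 2 / (2.0 * s ** 2))
            for c in centres for s in widths]
    return np.column_stack([np.ones_like(x)] + cols)

# Fit a smooth nonlinear map with multiscale Gaussians by least squares.
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + 0.5 * np.sin(3.0 * x)
centres = np.linspace(0.0, 2.0 * np.pi, 12)
widths = [0.3, 1.0, 3.0]                     # several scales per centre
Phi = multiscale_rbf_design(x, centres, widths)
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
max_err = np.max(np.abs(Phi @ theta - y))
print(max_err)
```

The narrow widths capture the fast `sin(3x)` component while the wide ones model the slow trend; a single shared width would have to compromise between the two.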
International Journal of Modelling, Identification and Control | 2008
Hua-Liang Wei; Stephen A. Billings
Model structure selection plays a key role in non-linear system identification. The first step in non-linear system identification is to determine which model terms should be included in the model. Once significant model terms have been determined, a model selection criterion can then be applied to select a suitable model subset. The well-known Orthogonal Least Squares (OLS) type algorithms are among the most efficient and commonly used techniques for model structure selection. However, it has been observed that OLS type algorithms may occasionally select incorrect model terms or yield a redundant model subset in the presence of particular noise structures or input signals. A very efficient Integrated Forward Orthogonal Search (IFOS) algorithm, which is assisted by the squared correlation and mutual information, and which incorporates a Generalised Cross-Validation (GCV) criterion and hypothesis tests, is introduced to overcome these limitations in model structure selection.
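Mutual information is used here because it detects dependencies that squared correlation misses. A minimal histogram-based MI estimate illustrates the point (one of many possible estimators; the paper's exact estimator is not specified in this abstract):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of I(X;Y) in nats:
    sum over cells of p(x,y) * log(p(x,y) / (p(x) p(y)))."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(3)
x = rng.standard_normal(5000)
related = x ** 2 + 0.1 * rng.standard_normal(5000)   # depends on x, but
unrelated = rng.standard_normal(5000)                 # corr(x, x**2) is near zero
mi_related = mutual_information(x, related)
mi_unrelated = mutual_information(x, unrelated)
print(mi_related, mi_unrelated)
```

The `x**2` term is almost uncorrelated with `x` for symmetric data, so a squared-correlation criterion alone would rank it poorly; mutual information flags the dependence clearly.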
International Journal of Systems Science | 2005
Stephen A. Billings; Hua-Liang Wei
A new hybrid model structure combining polynomial models with multiresolution wavelet decompositions is introduced for nonlinear system identification. Polynomial models play an important role in approximation theory and have been extensively used in linear and nonlinear system identification. Wavelet decompositions, in which the basis functions have the property of localization in both time and frequency, outperform many other approximation schemes and offer a flexible solution for approximating arbitrary functions. Although wavelet representations can approximate even severe nonlinearities in a given signal very well, the advantage of these representations can be lost when wavelets are used to capture linear or low-order nonlinear behaviour in a signal. In order to sufficiently utilize the global property of polynomials and the local property of wavelet representations simultaneously, in this study polynomial models and wavelet decompositions are combined together in a parallel structure to represent nonlinear input–output systems. As a special form of the NARMAX model, this hybrid model structure will be referred to as the WAvelet-NARMAX model, or simply WANARMAX. Generally, such a WANARMAX representation for an input–output system might involve a large number of basis functions and therefore a great number of model terms. Experience reveals that only a small number of these model terms are significant to the system output. A new fast orthogonal least-squares algorithm, called the matching pursuit orthogonal least squares (MPOLS) algorithm, is also introduced in this study to determine which terms should be included in the final model.
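The parallel polynomial-plus-wavelet dictionary can be sketched on a static curve-fitting toy (illustrative only; the Mexican-hat wavelet, scales, and batch least-squares fit are assumptions standing in for the paper's full NARMAX setting and MPOLS selection):

```python
import numpy as np

def mexican_hat(t):
    """Mexican-hat (Ricker) wavelet, a common real-valued wavelet choice."""
    return (1.0 - t ** 2) * np.exp(-t ** 2 / 2.0)

def hybrid_dictionary(x, poly_degree, scales, n_shifts):
    """Parallel dictionary: global polynomial terms plus local wavelet terms."""
    cols = [x ** d for d in range(poly_degree + 1)]            # polynomial part
    for a in scales:
        for b in np.linspace(x.min(), x.max(), n_shifts):
            cols.append(mexican_hat((x - b) / a))              # wavelet part
    return np.column_stack(cols)

# A signal with a smooth global trend plus a sharp local feature:
x = np.linspace(-4.0, 4.0, 400)
y = 0.5 * x - 0.1 * x ** 2 + 2.0 * np.exp(-8.0 * (x - 1.0) ** 2)
Phi = hybrid_dictionary(x, poly_degree=3, scales=[0.25, 0.5, 1.0], n_shifts=33)
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
max_err = np.max(np.abs(Phi @ theta - y))
print(max_err)
```

The polynomial columns absorb the global trend cheaply, leaving the wavelets to resolve the localized bump; a pure polynomial model of the same degree could not represent the bump at all.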
International Journal of Control | 2008
Stephen A. Billings; Hua-Liang Wei
A new adaptive orthogonal search (AOS) algorithm is proposed for model subset selection and non-linear system identification. Model structure detection is a key step in any system identification problem. This consists of selecting significant model terms from a redundant dictionary of candidate model terms, and determining the model complexity (model length or model size). The final objective is to produce a parsimonious model that can well capture the inherent dynamics of the underlying system. In the new AOS algorithm, a modified generalized cross-validation criterion, called the adjustable prediction error sum of squares (APRESS), is introduced and incorporated into a forward orthogonal search procedure. The main advantage of the new AOS algorithm is that the mechanism is simple and the implementation is direct and easy, and more importantly it can produce efficient model subsets for most non-linear identification problems.
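A complexity-penalized criterion of this family can be sketched as follows. The exact APRESS definition should be taken from the paper; the form below is a common GCV-style penalized mean squared error with an adjustable penalty weight, assumed here for illustration:

```python
def apress(rss, n_samples, n_terms, alpha=1.0):
    """GCV-style penalized mean squared error with adjustable weight alpha
    (assumed form, for illustration): with alpha = 1 it reduces to the
    generalized cross-validation expression; larger alpha penalizes model
    size more heavily, favouring shorter models."""
    return (rss / n_samples) / (1.0 - alpha * n_terms / n_samples) ** 2

# Adding terms shrinks the residual sum of squares, but the complexity
# penalty eventually dominates, giving a minimum at a parsimonious size:
rss_by_size = {1: 40.0, 2: 12.0, 3: 10.5, 4: 10.4, 5: 10.39}
scores = {n: apress(r, 100, n, alpha=5.0) for n, r in rss_by_size.items()}
best = min(scores, key=scores.get)
print(best)
```

In a forward orthogonal search, the criterion is evaluated after each added term and the search stops once the score starts rising.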
IEEE Transactions on Neural Networks | 2009
Hua-Liang Wei; Stephen A. Billings; Yifan Zhao; Lingzhong Guo
In this brief, by combining an efficient wavelet representation with a coupled map lattice model, a new family of adaptive wavelet neural networks, called lattice dynamical wavelet neural networks (LDWNNs), is introduced for spatio-temporal system identification. A new orthogonal projection pursuit (OPP) method, coupled with a particle swarm optimization (PSO) algorithm, is proposed for augmenting the proposed network. A novel two-stage hybrid training scheme is developed for constructing a parsimonious network model. In the first stage, by applying the OPP algorithm, significant wavelet neurons are adaptively and successively recruited into the network, where adjustable parameters of the associated wavelet neurons are optimized using a particle swarm optimizer. The resultant network model, obtained in the first stage, however, may be redundant. In the second stage, an orthogonal least squares algorithm is then applied to refine and improve the initially trained network by removing redundant wavelet neurons from the network. An example for a real spatio-temporal system identification problem is presented to demonstrate the performance of the proposed new modeling framework.
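Particle swarm optimization, used above to tune the adjustable wavelet-neuron parameters, is easy to sketch generically. Below is a minimal standard PSO (inertia plus cognitive and social terms; not the paper's specific variant) minimizing a toy objective that stands in for a neuron's fitting error:

```python
import numpy as np

def pso(f, bounds, n_particles=20, iters=100, seed=0):
    """Minimal particle swarm optimizer: each particle tracks its personal
    best, the swarm shares a global best, and velocities blend inertia,
    cognitive (personal) and social (global) attraction."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    x = rng.uniform(lo, hi, (n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest = x.copy()
    pval = np.apply_along_axis(f, 1, x)
    g = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1 = rng.random(x.shape)
        r2 = rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()].copy()
    return g, float(pval.min())

# Toy objective standing in for a wavelet neuron's fitting error:
g, best = pso(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
              bounds=[(-5.0, 5.0), (-5.0, 5.0)])
print(g, best)
```

PSO needs no gradients, which is why it suits the non-convex, non-smooth error surfaces that arise when fitting wavelet neuron centres and scales.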
IEEE Transactions on Control Systems and Technology | 2011
Yang Li; Hua-Liang Wei; Stephen A. Billings
This brief introduces a new parametric modelling and identification method for linear time-varying systems using a block least mean square (LMS) approach where the time-varying parameters are approximated using multi-wavelet basis functions. This approach can be applied to track rapidly or even sharply varying processes and is developed by combining wavelet approximation theory with a block LMS algorithm. Numerical examples are provided to show the effectiveness of the proposed method for dealing with severely nonstationary processes. Application of the proposed approach to a real mechanical system indicates better tracking capability of the multi-wavelet basis function algorithm compared with the normalized least squares or recursive least squares routines.
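The key trick, expanding each time-varying parameter over basis functions with constant weights, turns the time-varying problem into a time-invariant regression. A minimal sketch (using a polynomial basis and batch least squares for brevity; the paper uses multi-wavelet bases with a block LMS update):

```python
import numpy as np

# Time-varying coefficient a(t), expanded over basis functions B_j(t) with
# constant weights w_j, so that y(t) = sum_j w_j * B_j(t) * x(t) + e(t):
rng = np.random.default_rng(4)
N = 500
tau = np.arange(N) / N
a_true = 0.2 + 0.5 * tau - 0.4 * tau ** 2        # lies in the basis span below
x = rng.standard_normal(N)                        # measured regressor
y = a_true * x + 0.01 * rng.standard_normal(N)    # observations

B = np.column_stack([np.ones(N), tau, tau ** 2])  # basis functions B_j(t)
Z = B * x[:, None]                                # expanded regressors B_j(t)*x(t)
w, *_ = np.linalg.lstsq(Z, y, rcond=None)         # constant weights w_j
a_hat = B @ w                                     # reconstructed a(t)
print(np.max(np.abs(a_hat - a_true)))
```

Because the unknowns are now the constant weights `w_j`, any standard estimator applies; the paper's block LMS performs this estimation recursively, which is what enables tracking of rapidly or sharply varying parameters online.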
IEEE Transactions on Neural Networks | 2007
Stephen A. Billings; Hua-Liang Wei
A sparse representation, with satisfactory approximation accuracy, is usually desirable in any nonlinear system identification and signal processing problem. A new forward orthogonal regression algorithm, aided by mutual information, is proposed for sparse model selection and parameter estimation. The new algorithm can be used to construct parsimonious linear-in-the-parameters models.