
Publication


Featured research published by Weili Guo.


Neural Computation | 2015

Natural gradient learning algorithms for rbf networks

Junsheng Zhao; Haikun Wei; Chi Zhang; Weiling Li; Weili Guo; Kanjian Zhang

Radial basis function (RBF) networks are among the most widely used models for function approximation and classification. The learning process of RBF networks exhibits strange behaviors, such as slow learning and the existence of plateaus. The natural gradient learning method can overcome these disadvantages effectively: it accelerates the learning dynamics and avoids plateaus. In this letter, we assume that the probability density function (pdf) of the input and the activation function are Gaussian. First, we introduce natural gradient learning to RBF networks and give explicit forms of the Fisher information matrix and its inverse. Second, since the Fisher information matrix and its inverse are difficult to calculate when the number of hidden units and the input dimension are large, we introduce an adaptive method into the natural gradient learning algorithms. Finally, we give an explicit form of the adaptive natural gradient learning algorithm and compare it to the conventional gradient descent method. Simulations show that the proposed adaptive natural gradient method, which avoids plateaus effectively, performs well when RBF networks are used for nonlinear function approximation.
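The core update described in the abstract, theta ← theta − eta · F⁻¹ · grad with F the Fisher information matrix, can be illustrated with a toy sketch. This is not the paper's exact algorithm: here only the output weights of a one-dimensional RBF network are adapted, the centers and widths are fixed, and the empirical Fisher of the linear output layer is used.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_forward(x, centers, widths, weights):
    # phi_i(x) = exp(-(x - c_i)^2 / (2 w_i^2)); output = sum_i weights_i * phi_i(x)
    phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * widths ** 2))
    return phi @ weights, phi

# toy 1-D data from a nonlinear target
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + 0.05 * rng.standard_normal(200)

centers = np.linspace(-3, 3, 8)   # fixed centers
widths = np.ones(8)               # fixed widths
weights = 0.1 * rng.standard_normal(8)

eta, eps = 0.5, 1e-6
for step in range(500):
    pred, phi = rbf_forward(x, centers, widths, weights)
    err = pred - y
    grad = phi.T @ err / len(x)         # ordinary gradient w.r.t. output weights
    fisher = phi.T @ phi / len(x)       # empirical Fisher for the linear output layer
    nat_grad = np.linalg.solve(fisher + eps * np.eye(8), grad)
    weights -= eta * nat_grad           # natural gradient update

mse = np.mean((rbf_forward(x, centers, widths, weights)[0] - y) ** 2)
```

For this quadratic-in-weights case the natural gradient step coincides with a damped Newton step, which is exactly why it sidesteps the slow plateaus of plain gradient descent.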


Neurocomputing | 2015

Theoretical and numerical analysis of learning dynamics near singularity in multilayer perceptrons

Weili Guo; Haikun Wei; Junsheng Zhao; Kanjian Zhang

The multilayer perceptron is one of the most widely used neural networks in applications; however, its learning often becomes very slow due to singularities in the parameter space. In this paper, we analyze the learning dynamics near singularities in multilayer perceptrons using traditional methods. We obtain explicit expressions for the averaged learning equations, which play a significant role in the theoretical and numerical analysis. After obtaining the best approximation on the overlap singularity, we analyze the stability of the overlap singularity. We then carry out a numerical analysis of the singular regions: real averaged dynamics near the singularities are obtained and compared with the theoretical learning trajectories. In the simulations we analyze the averaged, batch-mode, and online learning dynamics, respectively.
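The degeneracy underlying the overlap singularity can be checked numerically. The following sketch is a hypothetical minimal setup, not the paper's experiment: for a two-hidden-unit tanh MLP f(x) = v1·tanh(w1·x) + v2·tanh(w2·x), the Jacobian columns become linearly dependent when w1 = w2, so the Fisher information matrix loses rank there.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(500)

def jacobian(params, x):
    # derivatives of f = v1*tanh(w1*x) + v2*tanh(w2*x) w.r.t. (w1, w2, v1, v2)
    w1, w2, v1, v2 = params
    s1, s2 = np.tanh(w1 * x), np.tanh(w2 * x)
    d1, d2 = 1 - s1 ** 2, 1 - s2 ** 2
    return np.stack([v1 * x * d1, v2 * x * d2, s1, s2], axis=1)

# generic point: the empirical Fisher J^T J / N has full rank
J = jacobian((0.7, -1.2, 0.5, 0.9), x)
F_generic = J.T @ J / len(x)

# overlap singularity w1 == w2: df/dw1 and df/dw2 are proportional
# and df/dv1 == df/dv2, so the Fisher information matrix degenerates
J = jacobian((0.7, 0.7, 0.5, 0.9), x)
F_overlap = J.T @ J / len(x)
```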


Neural Computing and Applications | 2014

Averaged learning equations of error-function-based multilayer perceptrons

Weili Guo; Haikun Wei; Junsheng Zhao; Kanjian Zhang

Multilayer perceptrons (MLPs) exhibit strange learning behaviors caused by singularities in the parameter space. A detailed theoretical or numerical analysis of MLPs is difficult due to the non-integrability of the traditional log-sigmoid activation function, which makes it hard to obtain the averaged learning equations (ALEs). In this paper, the error function is suggested as the activation function of the MLPs. By solving the explicit expressions of two important expectations, we obtain the averaged learning equations, which enable further analysis of the learning dynamics in MLPs. The simulation results also indicate that the ALEs play a significant role in investigating the singular behaviors of MLPs.
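The reason the error function makes the averaged learning equations tractable is that Gaussian expectations of erf have closed forms, for example E[erf(a + bZ)] = erf(a / sqrt(1 + 2b^2)) for Z ~ N(0, 1). The following Monte Carlo sanity check of that identity is an illustrative sketch, not taken from the paper.

```python
from math import erf, sqrt
import numpy as np

rng = np.random.default_rng(2)
a, b = 0.8, 1.3
z = rng.standard_normal(400_000)

# Monte Carlo estimate of E[erf(a + b Z)], Z ~ N(0, 1)
mc = float(np.mean(np.vectorize(erf)(a + b * z)))

# closed form: E[erf(a + b Z)] = erf(a / sqrt(1 + 2 b^2))
closed = erf(a / sqrt(1 + 2 * b ** 2))
```

Because such expectations stay in closed form, every term of the averaged learning equations can be written down analytically, which is what blocks the analysis for the log-sigmoid activation.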


Neurocomputing | 2014

Singularities in the identification of dynamic systems

Junsheng Zhao; Haikun Wei; Weili Guo; Kanjian Zhang

As is well known, the parameter spaces of hierarchical systems such as multilayer perceptrons contain singularities, and the plateau phenomenon is ubiquitous in the learning process: in the singular regions, the Fisher information matrix degenerates and the loss function remains almost unchanged once the parameters enter them. A natural question is whether singularities and the plateau phenomenon also arise in the parameter identification of linear and ordinary nonlinear systems. In this paper, we show that in the parameter identification of some nonlinear systems, the Fisher information matrix degenerates, singularities exist, and the plateau phenomenon appears in the learning curves. A simulation example is provided to demonstrate the theoretical analysis in Section 3.
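A minimal example of such a singularity in identification (a hypothetical sketch, not the paper's system): in the overparameterized model y_hat = theta1 * theta2 * x, only the product theta1 * theta2 is identifiable, so the Fisher information matrix is singular everywhere and the loss is constant along the curve theta1 * theta2 = const, which is exactly a plateau.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(1000)
y = 0.6 * x + 0.05 * rng.standard_normal(1000)

def loss(theta1, theta2):
    # squared-error loss of the overparameterized model y_hat = theta1*theta2*x
    return np.mean((theta1 * theta2 * x - y) ** 2)

def fisher(theta1, theta2):
    # Jacobian of y_hat w.r.t. (theta1, theta2): the two columns are
    # always proportional, so the Fisher matrix has rank 1 everywhere
    J = np.stack([theta2 * x, theta1 * x], axis=1)
    return J.T @ J / len(x)

F = fisher(1.5, 0.4)
eigs = np.linalg.eigvalsh(F)

# moving along theta1*theta2 = const (0.6 in both cases) leaves the loss unchanged
plateau_gap = abs(loss(1.5, 0.4) - loss(3.0, 0.2))
```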


International Conference on Intelligent Science and Big Data Engineering | 2017

Probabilistic Hypergraph Optimization for Salient Object Detection

Jinxia Zhang; Shixiong Fang; Krista A. Ehinger; Weili Guo; Wankou Yang; Haikun Wei

In recent years, many graph-based methods have been introduced to detect saliency. These methods represent image regions and their similarity as vertices and edges in a graph. However, since they only represent pairwise relations between vertices, they give an incomplete representation of the relationships between regions. In this work, we propose a hypergraph-based optimization framework for salient object detection that captures not only pairwise but also higher-order relations among two or more vertices. In this framework, besides the relations among vertices, both foreground and background queries are explicitly exploited to uniformly highlight the salient objects and suppress the background. Furthermore, a probabilistic hypergraph is constructed based on local spatial correlation, global spatial correlation, and color correlation to represent the relations among vertices from different views. Extensive experiments demonstrate the effectiveness of the proposed method.
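The probabilistic-hypergraph machinery can be sketched with the standard normalized hypergraph Laplacian of Zhou et al.; this is an assumption for illustration, since the paper's exact optimization is not shown here. A soft incidence matrix H stores the probability that each region (vertex) belongs to each hyperedge.

```python
import numpy as np

# toy probabilistic hypergraph: 5 vertices (regions), 3 hyperedges
# H[v, e] in [0, 1] is the probability that vertex v belongs to hyperedge e
H = np.array([
    [1.0, 0.2, 0.0],
    [0.8, 0.0, 0.1],
    [0.5, 0.9, 0.0],
    [0.0, 0.7, 1.0],
    [0.0, 0.0, 0.9],
])
w = np.array([1.0, 0.5, 2.0])          # hyperedge weights

dv = H @ w                              # vertex degrees
de = H.sum(axis=0)                      # hyperedge degrees
Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))

# normalized hypergraph Laplacian (Zhou et al.):
# L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}
Theta = Dv_inv_sqrt @ H @ np.diag(w) @ np.diag(1.0 / de) @ H.T @ Dv_inv_sqrt
L = np.eye(5) - Theta
eigs = np.linalg.eigvalsh(L)
```

Minimizing f^T L f plus query-fitting terms then propagates foreground/background query scores smoothly over regions that share hyperedges, which is the general shape of the optimization the abstract describes.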


Neurocomputing | 2017

Stability analysis of opposite singularity in multilayer perceptrons

Weili Guo; Junsheng Zhao; Jinxia Zhang; Haikun Wei; Aiguo Song; Kanjian Zhang

For bipolar-activation-function multilayer perceptrons (MLPs), opposite singularities exist in the parameter space. The Fisher information matrix degenerates on an opposite singularity, which causes strange learning behaviors. Since stability is fundamental to analyzing the properties of the opposite singularity, this paper concerns the stability analysis of opposite singularities in MLPs. The analytical form of the best approximation on the opposite singularity is obtained first; then the concrete expression of the Hessian matrix can be derived. By analyzing the eigenvalues of the Hessian matrix on the opposite singularity, its stability is investigated. Finally, two experiments are conducted to verify the obtained results.


Chinese Control Conference | 2016

Mutual information based feature selection for multivariate time series forecasting

Tianhong Liu; Haikun Wei; Kanjian Zhang; Weili Guo
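The idea in the title can be sketched generically (a hypothetical illustration, not the paper's method): rank candidate input series by their estimated mutual information with the forecasting target, here with a simple histogram estimator. All variable names are invented for the example.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    # histogram-based estimate of I(X; Y) in nats
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(4)
n = 5000
relevant = rng.standard_normal(n)               # informative candidate series
noise = rng.standard_normal(n)                  # irrelevant candidate series
target = np.sin(relevant) + 0.1 * rng.standard_normal(n)

mi_relevant = mutual_information(relevant, target)
mi_noise = mutual_information(noise, target)
```

Unlike correlation, mutual information also scores nonlinear dependencies such as the sinusoidal one above, which is its usual motivation for time-series feature selection.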


Journal of Machine Learning Research | 2018

Numerical analysis near singularities in RBF networks

Weili Guo; Haikun Wei; Yew-Soon Ong; Jaime Rubio Hervas; Junsheng Zhao; Hai Wang; Kanjian Zhang


IEEE Transactions on Systems, Man, and Cybernetics | 2018

Fisher Information Matrix of Unipolar Activation Function-Based Multilayer Perceptrons

Weili Guo; Yew-Soon Ong; Yingjiang Zhou; Jaime Rubio Hervas; Aiguo Song; Haikun Wei


IEEE Access | 2018

Influence Area of Overlap Singularity in Multilayer Perceptrons

Weili Guo; Yuan Yang; Yingjiang Zhou; Yushun Tan; Haikun Wei; Aiguo Song; Guochen Pang

Collaboration


Dive into Weili Guo's collaborations.

Top Co-Authors

Yew-Soon Ong

Nanyang Technological University
