Publications


Featured research published by A.K. Qin.


IEEE Transactions on Evolutionary Computation | 2009

Differential Evolution Algorithm With Strategy Adaptation for Global Numerical Optimization

A.K. Qin; V. L. Huang; Ponnuthurai N. Suganthan

Differential evolution (DE) is an efficient and powerful population-based stochastic search technique for solving optimization problems over continuous spaces, which has been widely applied in many scientific and engineering fields. However, the success of DE in solving a specific problem crucially depends on appropriately choosing trial vector generation strategies and their associated control parameter values. Employing a trial-and-error scheme to search for the most suitable strategy and its associated parameter settings incurs high computational costs. Moreover, at different stages of evolution, different strategies coupled with different parameter settings may be required in order to achieve the best performance. In this paper, we propose a self-adaptive DE (SaDE) algorithm, in which both trial vector generation strategies and their associated control parameter values are gradually self-adapted by learning from their previous experience in generating promising solutions. Consequently, a more suitable generation strategy along with its parameter settings can be determined adaptively to match different phases of the search process. The performance of the SaDE algorithm is extensively evaluated (codes are available from P. N. Suganthan) on a suite of 26 bound-constrained numerical optimization problems, and it compares favorably with the conventional DE and several state-of-the-art parameter-adaptive DE variants.
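As a rough illustration of the adaptation mechanism described in this abstract, the sketch below (assuming NumPy, with a toy sphere function and constants of my own choosing) lets two DE mutation strategies compete and re-estimates their selection probabilities from success/failure counts, while F and CR are resampled for every trial vector. It is a minimal sketch of the idea, not the authors' released SaDE code.

```python
import numpy as np

rng = np.random.default_rng(0)

def de_rand_1(pop, fit, i, F):
    # DE/rand/1 mutation: v = x_r1 + F * (x_r2 - x_r3)
    r1, r2, r3 = rng.choice([j for j in range(len(pop)) if j != i], 3, replace=False)
    return pop[r1] + F * (pop[r2] - pop[r3])

def de_current_to_best_1(pop, fit, i, F):
    # DE/current-to-best/1 mutation
    best = pop[np.argmin(fit)]
    r1, r2 = rng.choice([j for j in range(len(pop)) if j != i], 2, replace=False)
    return pop[i] + F * (best - pop[i]) + F * (pop[r1] - pop[r2])

def sphere(x):
    return float(np.sum(x ** 2))

dim, NP, gens = 10, 30, 200
pop = rng.uniform(-5.0, 5.0, (NP, dim))
fit = np.array([sphere(x) for x in pop])
strategies = [de_rand_1, de_current_to_best_1]
success, failure = np.ones(2), np.ones(2)      # success/failure memory

for g in range(gens):
    probs = success / (success + failure)      # strategy probabilities from memory
    probs /= probs.sum()
    for i in range(NP):
        k = rng.choice(2, p=probs)                          # pick a strategy
        F = rng.normal(0.5, 0.3)                            # F sampled per trial vector
        CR = float(np.clip(rng.normal(0.5, 0.1), 0.0, 1.0))
        v = strategies[k](pop, fit, i, F)
        mask = rng.random(dim) < CR                         # binomial crossover
        mask[rng.integers(dim)] = True
        u = np.where(mask, v, pop[i])
        fu = sphere(u)
        if fu < fit[i]:                                     # greedy selection; record success
            pop[i], fit[i] = u, fu
            success[k] += 1
        else:
            failure[k] += 1

print("best fitness after adaptation:", fit.min())
```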


Congress on Evolutionary Computation | 2005

Self-adaptive differential evolution algorithm for numerical optimization

A.K. Qin; Ponnuthurai N. Suganthan

In this paper, we propose a novel self-adaptive differential evolution algorithm (SaDE), where the choice of learning strategy and the two control parameters F and CR do not need to be pre-specified. During evolution, the suitable learning strategy and parameter settings are gradually self-adapted according to the learning experience. The performance of SaDE is reported on the set of 25 benchmark functions provided by the CEC2005 special session on real-parameter optimization.


Pattern Recognition | 2005

Rapid and brief communication: Evolutionary extreme learning machine

Qin-Yu Zhu; A.K. Qin; Ponnuthurai N. Suganthan; Guang-Bin Huang

Extreme learning machine (ELM) [G.-B. Huang, Q.-Y. Zhu, C.-K. Siew, Extreme learning machine: a new learning scheme of feedforward neural networks, in: Proceedings of the International Joint Conference on Neural Networks (IJCNN2004), Budapest, Hungary, 25-29 July 2004], a novel learning algorithm much faster than traditional gradient-based learning algorithms, was proposed recently for single-hidden-layer feedforward neural networks (SLFNs). However, ELM may need a higher number of hidden neurons due to the random determination of the input weights and hidden biases. In this paper, a hybrid learning algorithm is proposed which uses the differential evolution algorithm to select the input weights and the Moore-Penrose (MP) generalized inverse to analytically determine the output weights. Experimental results show that this approach is able to achieve good generalization performance with much more compact networks.
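A compact way to picture the hybrid scheme is the sketch below: each DE individual encodes candidate input weights and hidden biases, and its fitness is computed after solving for the output weights analytically with the Moore-Penrose pseudoinverse. This is a minimal illustration under assumed toy data and names (elm_fitness, a tanh hidden layer, plain DE/rand/1/bin), not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fitness(theta, X, y, n_hidden):
    d = X.shape[1]
    W = theta[: d * n_hidden].reshape(d, n_hidden)    # candidate input weights
    b = theta[d * n_hidden:]                           # candidate hidden biases
    H = np.tanh(X @ W + b)                             # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y                       # MP generalized inverse for output weights
    return float(np.mean((H @ beta - y) ** 2))         # training error as fitness

# toy regression data (illustrative only)
X = rng.uniform(-1, 1, (100, 3))
y = np.sin(X.sum(axis=1))

n_hidden = 5
dim = 3 * n_hidden + n_hidden                          # input weights + biases per individual
NP, F, CR = 20, 0.5, 0.9
pop = rng.uniform(-1, 1, (NP, dim))
fit = np.array([elm_fitness(p, X, y, n_hidden) for p in pop])

for g in range(100):                                   # plain DE/rand/1/bin loop
    for i in range(NP):
        r1, r2, r3 = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        v = pop[r1] + F * (pop[r2] - pop[r3])
        mask = rng.random(dim) < CR
        mask[rng.integers(dim)] = True
        u = np.where(mask, v, pop[i])
        fu = elm_fitness(u, X, y, n_hidden)
        if fu < fit[i]:
            pop[i], fit[i] = u, fu

print("best training error:", fit.min())
```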


IEEE International Conference on Evolutionary Computation | 2006

Self-adaptive Differential Evolution Algorithm for Constrained Real-Parameter Optimization

V. L. Huang; A.K. Qin; Ponnuthurai N. Suganthan

In this paper, we propose an extension of the Self-adaptive Differential Evolution algorithm (SaDE) to solve optimization problems with constraints. In comparison with the original SaDE algorithm, the replacement criterion is modified to handle constraints. The performance of the proposed method is reported on the set of 24 benchmark problems provided by the CEC2006 special session on constrained real-parameter optimization.
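The modified replacement step can be pictured with a feasibility-based comparison of the kind commonly used when DE is applied to constrained problems; the exact criterion in the paper may differ, so the helpers below (violation, trial_replaces_target) are illustrative assumptions only.

```python
def violation(g_values):
    """Sum of constraint violations for inequality constraints g_i(x) <= 0."""
    return sum(max(0.0, g) for g in g_values)

def trial_replaces_target(f_trial, g_trial, f_target, g_target):
    v_trial, v_target = violation(g_trial), violation(g_target)
    if v_trial == 0.0 and v_target == 0.0:      # both feasible: compare objective values
        return f_trial <= f_target
    if v_trial == 0.0 or v_target == 0.0:       # exactly one feasible: prefer it
        return v_trial == 0.0
    return v_trial <= v_target                   # both infeasible: smaller violation wins

# Example: the trial is infeasible but violates less than the target, so it replaces it.
print(trial_replaces_target(1.2, [0.1], 0.8, [0.5]))   # True
```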


Neural Networks | 2004

Robust growing neural gas algorithm with application in cluster analysis

A.K. Qin; Ponnuthurai N. Suganthan

We propose a novel robust clustering algorithm within the Growing Neural Gas (GNG) framework, called the Robust Growing Neural Gas (RGNG) network. The Matlab codes are available from . By incorporating several robust strategies, such as an outlier-resistant scheme, adaptive modulation of learning rates, and a cluster repulsion method, into the traditional GNG framework, the proposed RGNG network possesses better robustness properties. The RGNG is insensitive to initialization, input sequence ordering, and the presence of outliers. Furthermore, the RGNG network can automatically determine the optimal number of clusters by seeking the extreme value of the Minimum Description Length (MDL) measure during the network growing process. The resulting center positions of the optimal number of clusters, represented by prototype vectors, are close to the actual ones irrespective of the existence of outliers. Topology relationships among these prototypes can also be established. Experimental results have shown the superior performance of our proposed method over the original GNG incorporating the MDL method, called GNG-M, in static data clustering tasks on both artificial and UCI data sets.


Pattern Recognition | 2005

Linear dimensionality reduction using relevance weighted LDA

E. K. Tang; Ponnuthurai N. Suganthan; Xin Yao; A.K. Qin

The linear discriminant analysis (LDA) is one of the most traditional linear dimensionality reduction methods. This paper incorporates the inter-class relationships as relevance weights into the estimation of the overall within-class scatter matrix in order to improve the performance of the basic LDA method and some of its improved variants. We demonstrate that in some specific situations the standard multi-class LDA almost totally fails to find a discriminative subspace if the proposed relevance weights are not incorporated. In order to estimate the relevance weights of individual within-class scatter matrices, we propose several methods, one of which employs evolution strategies.
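The weighting idea can be summarized in a short sketch: per-class scatter matrices are scaled by relevance weights before forming the within-class scatter used in the LDA eigenproblem. The weights w passed in below are placeholders, since the paper's contribution is precisely how to estimate them (one variant uses evolution strategies); the rest is a standard LDA computation in NumPy.

```python
import numpy as np

def relevance_weighted_lda(X, y, w, n_components):
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c, wc in zip(classes, w):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += wc * (Xc - mc).T @ (Xc - mc)        # relevance-weighted within-class scatter
        diff = (mc - mean_all)[:, None]
        Sb += len(Xc) * (diff @ diff.T)           # between-class scatter
    # leading eigenvectors of Sw^{-1} Sb span the discriminative subspace
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-eigvals.real)
    return eigvecs.real[:, order[:n_components]]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 1.0, (30, 4)) for m in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], 30)
W = relevance_weighted_lda(X, y, w=[1.0, 0.5, 1.0], n_components=2)
print(W.shape)   # (4, 2) projection matrix
```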


Congress on Evolutionary Computation | 2007

Multi-objective optimization based on self-adaptive differential evolution algorithm

V. L. Huang; A.K. Qin; Ponnuthurai N. Suganthan; Mehmet Fatih Tasgetiren

In this paper, our recently developed Self-adaptive Differential Evolution algorithm (SaDE) is extended to solve numerical optimization problems with multiple conflicting objectives. The performance of the proposed MOSaDE algorithm is evaluated on a suite of 19 benchmark problems provided for the CEC2007 special session (http://www.ntu.edu.sg/home/epnsugan/) on Performance Assessment of Multi-Objective Optimization Algorithms.
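When DE is extended to multiple conflicting objectives, the scalar better-or-worse comparison used in selection is typically replaced by a Pareto-dominance comparison; the snippet below shows only that comparison as a point of reference. A complete multi-objective DE adds further machinery (e.g., keeping nondominated solutions and preserving diversity) that is not sketched here.

```python
import numpy as np

def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimization)."""
    f_a, f_b = np.asarray(f_a), np.asarray(f_b)
    return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))

print(dominates([1.0, 2.0], [1.5, 2.0]))   # True: no worse in all objectives, better in one
print(dominates([1.0, 3.0], [1.5, 2.0]))   # False: the two vectors are incomparable
```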


International Conference on Pattern Recognition | 2004

Kernel neural gas algorithms with application to cluster analysis

A.K. Qin; Ponnuthurai N. Suganthan

We present a kernel neural gas (KNG) algorithm that generalizes the original neural gas (NG) algorithm into a higher-dimensional feature space. The proposed KNG algorithm can successfully tackle nonlinearly structured datasets. Compared with several existing kernel clustering algorithms, the KNG is insensitive to initializations, due to the employment of the sequential learning strategy and the neighborhood cooperation scheme. Further, a distortion-sensitive KNG (DSKNG) algorithm is proposed to tackle the imbalanced clustering problem. Experimental results show that our KNG algorithm can successfully deal with nonlinearly structured datasets and multi-modal datasets, while the imbalanced clusters are detected by the DSKNG.
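The kernel trick underlying such prototype methods can be shown in a few lines: if a prototype is an implicit combination of mapped inputs, its squared distance to phi(x) can be evaluated using the kernel alone, without ever forming feature-space vectors. The RBF kernel and uniform coefficients below are illustrative assumptions, not the KNG update rule itself.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    # Gaussian (RBF) kernel k(a, b) = exp(-gamma * ||a - b||^2)
    return np.exp(-gamma * np.sum((a - b) ** 2))

def feature_space_dist2(x, X, alpha, gamma=1.0):
    """||phi(x) - sum_j alpha_j phi(X_j)||^2 computed via the kernel trick."""
    k_xx = rbf(x, x, gamma)
    k_xX = np.array([rbf(x, xj, gamma) for xj in X])
    K = np.array([[rbf(xi, xj, gamma) for xj in X] for xi in X])
    return k_xx - 2.0 * alpha @ k_xX + alpha @ K @ alpha

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 2))
alpha = np.full(10, 0.1)        # prototype = uniform combination of the mapped points
print(feature_space_dist2(X[0], X, alpha))
```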


International Conference on Pattern Recognition | 2004

A novel kernel prototype-based learning algorithm

A.K. Qin; Ponnuthurai N. Suganthan

We propose a novel kernel prototype-based learning algorithm, called the kernel generalized learning vector quantization (KGLVQ) algorithm, which can significantly improve the classification performance of the original generalized learning vector quantization algorithm in complex pattern classification tasks. In addition, the KGLVQ can also serve as a good general kernel learning framework for further investigation.


Pattern Recognition | 2005

Enhanced neural gas network for prototype-based clustering

A.K. Qin; Ponnuthurai N. Suganthan

In practical cluster analysis tasks, an efficient clustering algorithm should be less sensitive to parameter configurations and tolerate the existence of outliers. Based on the neural gas (NG) network framework, we propose an efficient prototype-based clustering (PBC) algorithm called the enhanced neural gas (ENG) network. Several problems associated with traditional PBC algorithms and the original NG algorithm, such as sensitivity to initialization, sensitivity to input sequence ordering, and the adverse influence of outliers, can be effectively tackled in our new scheme. In addition, our new algorithm can establish the topology relationships among the prototypes, and all topology-wise badly located prototypes can be relocated to represent more meaningful regions. Experimental results on synthetic and UCI datasets show that our algorithm possesses superior performance in comparison to several PBC algorithms and their improved variants, such as hard c-means, fuzzy c-means, NG, fuzzy possibilistic c-means, credibilistic fuzzy c-means, hard/fuzzy robust clustering and alternative hard/fuzzy c-means, in static data clustering tasks with a fixed number of prototypes.
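For context, the rank-based update of the underlying NG framework that ENG builds on looks as follows; this is the plain NG step (with illustrative constants), not the enhanced scheme proposed in the paper.

```python
import numpy as np

def ng_step(prototypes, x, eps=0.1, lam=2.0):
    # every prototype moves toward the input, weighted by a decaying function of its distance rank
    dists = np.linalg.norm(prototypes - x, axis=1)
    ranks = np.argsort(np.argsort(dists))            # 0 = closest prototype
    h = np.exp(-ranks / lam)                         # neighborhood weighting
    return prototypes + eps * h[:, None] * (x - prototypes)

rng = np.random.default_rng(0)
protos = rng.uniform(-1, 1, (5, 2))
for x in rng.normal(size=(200, 2)):                  # one pass over a toy data stream
    protos = ng_step(protos, x)
print(protos)
```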

Collaboration


Dive into A.K. Qin's collaborations.

Top Co-Authors

Ponnuthurai N. Suganthan
Nanyang Technological University

Marco Loog
Delft University of Technology

V. L. Huang
Nanyang Technological University

S. Baskar
Thiagarajar College of Engineering

C. H. Tay
Nanyang Technological University

E. K. Tang
Nanyang Technological University

Guang-Bin Huang
Nanyang Technological University

H. S. Pa
Nanyang Technological University

J. J. Liang
Nanyang Technological University