Wei-Jie Chen
Zhejiang University of Technology
Publications
Featured research published by Wei-Jie Chen.
Knowledge Based Systems | 2013
Yuan-Hai Shao; Zhen Wang; Wei-Jie Chen; Nai-Yang Deng
For the recently proposed projection twin support vector machine (PTSVM) [1], we propose a simpler and theoretically more sound variant, called the projection twin support vector machine with a regularization term (RPTSVM for short). Note that only empirical risk minimization appears in the primal problems of PTSVM, which can lead to singularity problems. Our RPTSVM reformulates the primal problems by adding a maximum-margin regularization term, so the singularity is avoided and the regularized risk principle is implemented. In addition, the nonlinear classification ignored in PTSVM is also handled by RPTSVM. Further, a successive overrelaxation technique and a genetic algorithm are introduced to solve the optimization problems and to perform parameter selection, respectively. Computational comparisons of RPTSVM against the original PTSVM, TWSVM, and MVSVM indicate that RPTSVM obtains better generalization than the others.
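The practical effect of the added regularization term can be sketched with a generic ridge-style linear solve (an illustration of how a regularizer removes singularity, not the authors' exact RPTSVM system):

```python
import numpy as np

def regularized_solve(A, b, c=1e-3):
    """Solve (A^T A + c I) x = A^T b.

    Without the ridge term c*I, A^T A can be singular (as in the
    empirical-risk-only PTSVM primal); adding it keeps the matrix
    positive definite, so the solve always succeeds.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + c * np.eye(n), A.T @ b)

# A rank-deficient A: A^T A is singular, yet the regularized system is not.
A = np.array([[1.0, 2.0], [2.0, 4.0]])   # rank 1
b = np.array([1.0, 2.0])
x = regularized_solve(A, b)
```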
Pattern Recognition | 2014
Yuan-Hai Shao; Wei-Jie Chen; Jing-Jing Zhang; Zhen Wang; Nai-Yang Deng
In this paper, we propose an efficient weighted Lagrangian twin support vector machine (WLTSVM) for imbalanced data classification, based on using different training points to construct the two proximal hyperplanes. The main contributions of WLTSVM are: (1) a graph-based under-sampling strategy is introduced to preserve the proximity information and is robust to outliers; (2) weight biases are embedded in the Lagrangian TWSVM formulations, which overcomes the bias phenomenon of the original TWSVM on imbalanced data; (3) the convergence of the training procedure of the Lagrangian functions is proven; and (4) the method is tested and compared with other TWSVMs on synthetic and real datasets to show its feasibility and efficiency for imbalanced data classification.
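A graph-based under-sampling step of this flavor can be sketched as follows; the scoring rule here (mean k-nearest-neighbour distance within the majority class) is an illustrative stand-in, not the exact WLTSVM strategy:

```python
import numpy as np

def graph_undersample(X_maj, k=3, keep_ratio=0.5):
    """Keep the densest fraction of the majority class.

    Points in dense regions of the kNN graph carry the proximity
    information; isolated points (large mean kNN distance) are likely
    outliers and are dropped first.
    """
    D = np.linalg.norm(X_maj[:, None, :] - X_maj[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)            # ignore self-distances
    knn_dist = np.sort(D, axis=1)[:, :k].mean(axis=1)
    n_keep = max(1, int(keep_ratio * len(X_maj)))
    return X_maj[np.argsort(knn_dist)[:n_keep]]

# Dense cluster plus one far-away outlier; the outlier is discarded.
X = np.vstack([np.arange(8).reshape(-1, 1) * 0.1, [[50.0]]])
kept = graph_undersample(X, k=2, keep_ratio=0.5)
```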
International Journal of Machine Learning and Cybernetics | 2014
Wei-Jie Chen; Yuan-Hai Shao; Ning Hong
Laplacian twin support vector machine (Lap-TSVM) is a state-of-the-art nonparallel-plane semi-supervised classifier. It exploits the geometric information embedded in unlabeled data to boost its generalization ability. However, Lap-TSVM may suffer a heavy computational burden during training, since it needs to solve two quadratic programming problems (QPPs) involving a matrix inversion operation. To enhance the performance of Lap-TSVM, this paper presents a new formulation, termed Lap-STSVM. Rather than solving two QPPs in the dual space, we first convert the primal constrained QPPs of Lap-TSVM into unconstrained minimization problems (UMPs). A smoothing technique is then introduced to make these UMPs twice differentiable. Finally, a fast Newton-Armijo algorithm is designed to solve the UMPs in Lap-STSVM. Experimental evaluation on both artificial and real-world datasets demonstrates the benefits of the proposed approach.
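The smoothing step typically replaces the non-differentiable plus function (x)+ = max(x, 0) with a twice-differentiable surrogate, which is what makes a Newton-Armijo method applicable. A standard choice (used in smooth SVMs; the Lap-STSVM details may differ) is:

```python
import numpy as np

def smooth_plus(x, alpha=5.0):
    """Smooth approximation of the plus function (x)+ = max(x, 0):

        p(x, a) = x + (1/a) * log(1 + exp(-a * x))

    p is twice differentiable everywhere and converges to (x)+ as the
    smoothing parameter a grows, so the unconstrained objective built
    from it can be minimized with Newton-type methods.
    """
    return x + np.log1p(np.exp(-alpha * x)) / alpha
```

For large |x| the surrogate is indistinguishable from max(x, 0), and near x = 0 it smooths out the kink.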
Knowledge Based Systems | 2012
Yuan-Hai Shao; Nai-Yang Deng; Zhi-Min Yang; Wei-Jie Chen; Zhen Wang
In many cases, the output of a classifier should be a calibrated posterior probability to enable post-processing. However, twin support vector machines (TWSVM) do not provide such probabilities. In this paper, we propose a TWSVM probability model, called PTWSVM, to estimate the posterior probability. Note that our model is quite different from the SVM probability model because of the different mechanisms of TWSVM and SVM. For TWSVM, we first define a new continuous ranking output by comparing the distances between a sample and the two non-parallel hyperplanes, and then map this ranking output to a probability by training the parameters of an additional sigmoid function. PTWSVM has been tested on both artificial datasets and several data-mining-style datasets, and the numerical experiments indicate that it yields good results.
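The two-stage idea (a distance-based ranking output fed through a fitted sigmoid, in the spirit of Platt scaling) can be sketched as below; the exact ranking definition and fitting procedure in PTWSVM may differ:

```python
import numpy as np

def ranking_output(x, w1, b1, w2, b2):
    """Illustrative continuous ranking output: the difference of the
    point-to-plane distances to the two non-parallel hyperplanes.
    Large positive values mean x lies much closer to plane 1."""
    d1 = abs(x @ w1 + b1) / np.linalg.norm(w1)
    d2 = abs(x @ w2 + b2) / np.linalg.norm(w2)
    return d2 - d1

def sigmoid_prob(f, A=-1.0, B=0.0):
    """Platt-style sigmoid map from ranking output f to P(y=+1 | x).
    In practice A and B are fitted on held-out training outputs."""
    return 1.0 / (1.0 + np.exp(A * f + B))

# A point near plane 1 (x-axis) and far from plane 2 (the line y = 2).
f = ranking_output(np.array([0.1, 5.0]),
                   np.array([1.0, 0.0]), 0.0,
                   np.array([0.0, 1.0]), -2.0)
p = sigmoid_prob(f)
```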
IEEE Signal Processing Letters | 2013
Yuan-Hai Shao; Nai-Yang Deng; Wei-Jie Chen; Zhen Wang
In this letter, we propose an improved version of the generalized eigenvalue proximal support vector machine (GEPSVM), called IGEPSVM for short. The main improvements are: (1) the generalized eigenvalue decomposition is replaced by a standard eigenvalue decomposition, resulting in simpler optimization problems without possible singularity; (2) an extra meaningful parameter is introduced, resulting in stronger generalization ability. Experimental results on both artificial and benchmark datasets show that IGEPSVM is superior to GEPSVM in both computation time and classification accuracy.
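The replacement can be sketched as follows: instead of the generalized eigenproblem G z = λ H z (which requires H to be well-conditioned), minimize a weighted difference z^T (G − ν H) z over unit vectors z, a standard symmetric eigenproblem. This is an illustrative form with an assumed trade-off parameter ν, not necessarily the exact IGEPSVM objective:

```python
import numpy as np

def igepsvm_direction(G, H, nu=1.0):
    """Minimize z^T (G - nu*H) z subject to ||z|| = 1 by taking the
    eigenvector of the smallest eigenvalue of the symmetric matrix
    G - nu*H. No matrix inversion of H is needed, so singularity of H
    is never an issue."""
    M = G - nu * H
    vals, vecs = np.linalg.eigh((M + M.T) / 2)  # symmetrize for safety
    return vecs[:, 0]

v = igepsvm_direction(np.diag([1.0, 3.0]), np.diag([0.0, 1.0]))
```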
Applied Intelligence | 2013
Yuan-Hai Shao; Zhen Wang; Wei-Jie Chen; Nai-Yang Deng
In this paper, we propose a novel least squares twin parametric-margin support vector machine (TPMSVM) for binary classification, called LSTPMSVM for short. LSTPMSVM solves two modified primal problems of TPMSVM instead of the two dual problems usually solved. The solution of the two modified primal problems reduces to solving just two systems of linear equations, as opposed to the two quadratic programming problems plus two systems of linear equations in TPMSVM, which leads to an extremely simple and fast algorithm. Classification with a nonlinear kernel using the reduced-kernel technique also reduces to systems of linear equations, so LSTPMSVM is able to handle large datasets accurately without any external optimizers. Further, a particle swarm optimization (PSO) algorithm is introduced for parameter selection. Our experiments on synthetic as well as several benchmark datasets indicate that LSTPMSVM has classification accuracy comparable to that of TPMSVM but with remarkably less computation time.
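How a least-squares twin formulation collapses to a linear system can be sketched with the generic least-squares twin SVM plane (the LSTPMSVM parametric-margin system itself has a different form):

```python
import numpy as np

def ls_plane(A, B, c=1.0):
    """Fit one proximal plane w.x + b = 0 close to class A and pushed to
    w.x + b = -1 on class B by minimizing
        ||[A 1] z||^2 + c * ||[B 1] z + 1||^2,  z = (w, b).
    Setting the gradient to zero gives a single linear system -- no QP
    solver is needed."""
    Aa = np.hstack([A, np.ones((A.shape[0], 1))])
    Bb = np.hstack([B, np.ones((B.shape[0], 1))])
    M = Aa.T @ Aa + c * (Bb.T @ Bb)
    rhs = -c * Bb.T @ np.ones(B.shape[0])
    z = np.linalg.solve(M + 1e-8 * np.eye(M.shape[0]), rhs)  # tiny ridge for safety
    return z[:-1], z[-1]

# Class A on the x-axis, class B on the line y = 2.
w, b = ls_plane(np.array([[0.0, 0.0], [1.0, 0.0]]),
                np.array([[0.0, 2.0], [1.0, 2.0]]))
```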
Knowledge Based Systems | 2015
Yuan-Hai Shao; Wei-Jie Chen; Zhen Wang; Chun-Na Li; Nai-Yang Deng
In this paper, we formulate a twin-type support vector machine for large-scale classification, called the weighted linear loss twin support vector machine (WLTSVM). By introducing the weighted linear loss, WLTSVM only needs to solve simple linear equations with low computational cost while maintaining generalization ability, so it is able to handle large-scale problems efficiently without any external optimizers. Experimental results on several benchmark datasets indicate that, compared with TWSVM, WLTSVM has comparable classification accuracy with less computation time.
Pattern Recognition | 2016
Wei-Jie Chen; Yuan-Hai Shao; Chun-Na Li; Nai-Yang Deng
The multi-label learning paradigm, which deals with data associated with multiple potential labels, has attracted a great deal of attention in the machine intelligence community. In this paper, we propose a novel multi-label twin support vector machine (MLTSVM) for multi-label classification. MLTSVM determines multiple nonparallel hyperplanes to capture the multi-label information embedded in the data, a useful extension of the twin support vector machine (TWSVM) to multi-label classification. To speed up the training procedure, an efficient successive overrelaxation (SOR) algorithm is developed for solving the quadratic programming problems (QPPs) involved in MLTSVM. Extensive experimental results on both synthetic and real-world multi-label datasets confirm the feasibility and effectiveness of the proposed MLTSVM.
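An SOR solver for the kind of box-constrained dual QPP that arises in twin SVMs can be sketched as below; the generic form min 0.5 a^T Q a − e^T a subject to 0 ≤ a ≤ C stands in for MLTSVM's actual matrices:

```python
import numpy as np

def sor_qp(Q, e, C, omega=1.0, iters=200):
    """Successive overrelaxation for
        min 0.5 * a^T Q a - e^T a   s.t.   0 <= a <= C.
    Each sweep updates one coordinate at a time with a clipped
    Gauss-Seidel/SOR step; convergence holds for 0 < omega < 2 when Q
    is positive definite."""
    a = np.zeros(len(e))
    for _ in range(iters):
        for i in range(len(a)):
            grad_i = Q[i] @ a - e[i]                    # partial derivative
            a[i] = np.clip(a[i] - omega * grad_i / Q[i, i], 0.0, C)
    return a

# With Q = I the unconstrained optimum is a = e, reached in one sweep.
a = sor_qp(np.eye(2), np.array([1.0, 1.0]), C=10.0)
```

Because each update touches a single coordinate, memory stays linear in the number of variables, which is why SOR scales to large QPPs.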
Procedia Computer Science | 2013
Yuan-Hai Shao; Wei-Jie Chen; Wen-Biao Huang; Zhi-Min Yang; Nai-Yang Deng
In this paper, we propose a decision tree twin support vector machine (DTTSVM) for multi-class classification. DTTSVM involves two main steps: (1) a binary tree is constructed based on the best separating principle, which maximizes the distance between classes; (2) at each node of the tree, a binary TWSVM decision model is built. By using the decision tree model, DTTSVM effectively overcomes the possible ambiguity that occurs in multi-TWSVM and MBSVM. Preliminary experimental results indicate that the proposed method produces simple decision trees that generalize well compared with multi-TWSVM and MBSVM.
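One node of such a tree construction can be sketched as below; the grouping rule here (peel off the class whose centroid is farthest from the mean of the remaining centroids) is an assumed illustration of a "best separating" split, not necessarily the exact DTTSVM rule:

```python
import numpy as np

def split_classes(centroids):
    """One tree-node split: move the class whose centroid is farthest
    from the mean of the other centroids to the left branch, keeping
    the rest on the right. `centroids` maps class label -> mean vector."""
    labels = list(centroids)
    best, best_gap = None, -np.inf
    for c in labels:
        rest = np.mean([centroids[o] for o in labels if o != c], axis=0)
        gap = np.linalg.norm(centroids[c] - rest)
        if gap > best_gap:
            best, best_gap = c, gap
    return [best], [o for o in labels if o != best]

# Class 2 is far from classes 0 and 1, so it is separated first.
left, right = split_classes({0: np.array([0.0, 0.0]),
                             1: np.array([0.5, 0.0]),
                             2: np.array([10.0, 0.0])})
```

Recursing on each branch, and training a binary TWSVM at every internal node, yields the full tree classifier.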
Applied Intelligence | 2014
Wei-Jie Chen; Yuan-Hai Shao; Deng-Ke Xu; Yong-Feng Fu
Recently, semi-supervised learning (SSL) has attracted a great deal of attention in the machine learning community. Under SSL, large amounts of unlabeled data are used to assist the learning procedure in constructing a more reasonable classifier. In this paper, we propose a novel manifold proximal support vector machine (MPSVM) for semi-supervised classification. By introducing discriminant information into manifold regularization (MR), MPSVM not only uses MR terms to capture as much geometric information as possible from inside the data, but also utilizes the maximum-distance criterion to characterize the discrepancy between classes, leading to the solution of a pair of eigenvalue problems. In addition, an efficient particle swarm optimization (PSO)-based model selection approach is suggested for MPSVM. Experimental results on several artificial as well as real-world datasets demonstrate that MPSVM obtains significantly better performance than supervised GEPSVM, and achieves comparable or better performance than LapSVM and LapTSVM, with better learning efficiency.
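PSO-based model selection treats each particle as a candidate hyper-parameter vector and flies the swarm toward the settings with the lowest validation error. A minimal generic sketch (standard inertia/acceleration constants assumed; the MPSVM paper's PSO variant and settings may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(f, bounds, n_particles=20, iters=100):
    """Minimal particle swarm optimization over a box.
    f      : objective (e.g. cross-validation error of the classifier)
    bounds : (lo, hi) arrays giving the search box per dimension
    Velocities are pulled toward each particle's personal best and the
    swarm's global best; positions are clipped to the box."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, len(lo)))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g

# Toy objective standing in for a cross-validation error surface.
best = pso_minimize(lambda p: np.sum((p - 1.0) ** 2),
                    (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
```

In practice the objective would train and validate the classifier at each candidate setting, so the particle count and iteration budget directly trade accuracy against selection time.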