Publication


Featured research published by Yitian Xu.


IEEE Transactions on Neural Networks and Learning Systems | 2017

A Novel Twin Support-Vector Machine With Pinball Loss

Yitian Xu; Zhiji Yang; Xianli Pan

Twin support-vector machine (TSVM), which generates two nonparallel hyperplanes by solving a pair of smaller-sized quadratic programming problems (QPPs) instead of a single larger-sized QPP, works faster than the standard SVM, especially for large-scale data sets. However, the traditional TSVM adopts the hinge loss, which makes it sensitive to noise and unstable under resampling. To enhance the performance of TSVM, we present a novel TSVM with the pinball loss (Pin-TSVM), which deals with the quantile distance and is less sensitive to noise points. We further investigate its properties, including noise insensitivity, between-class distance maximization, and within-class scatter minimization. In addition, we theoretically compare our Pin-TSVM with the twin parametric-margin SVM and the SVM with the pinball loss. Numerical experiments on a synthetic data set and 14 benchmark data sets with different noises demonstrate the feasibility and validity of the proposed method.
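The pinball loss referred to in this abstract is the standard quantile loss. A minimal illustrative sketch of one common parameterization (the parameter name `tau` is ours, not from the paper):

```python
import numpy as np

def pinball_loss(u, tau=0.5):
    """Pinball (quantile) loss: tau * u for u >= 0, (tau - 1) * u otherwise.
    Unlike the hinge loss, it also penalizes points according to quantile
    distance, which is what makes it less sensitive to noise around the
    decision boundary."""
    u = np.asarray(u, dtype=float)
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)
```

At `tau = 0.5` the loss is symmetric; other values of `tau` penalize the two sides of zero unequally.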


Knowledge-Based Systems | 2016

A maximum margin and minimum volume hyper-spheres machine with pinball loss for imbalanced data classification

Yitian Xu; Zhiji Yang; Yuqun Zhang; Xianli Pan; Laisheng Wang

The twin hyper-sphere support vector machine (THSVM) classifies two classes of samples via two hyper-spheres instead of a pair of nonparallel hyper-planes as in the conventional twin support vector machine (TSVM). Moreover, THSVM avoids the matrix inverse operation when solving its two dual quadratic programming problems (QPPs). However, it cannot yield a desirable result when dealing with imbalanced data classification. To improve the generalization performance, we propose a maximum margin and minimum volume hyper-spheres machine with pinball loss (Pin-M3HM) for imbalanced data classification in this paper. The basic idea is to construct two hyper-spheres with different centers and radii in sequential order: the first contains as many majority-class examples as possible, the second covers as many minority-class examples as possible, and the margin between the two hyper-spheres is kept as large as possible. In addition, the pinball loss function is introduced to mitigate noise disturbance. Experimental results on 24 imbalanced datasets from the UCI and KEEL repositories, as well as a real spectral dataset of Chinese grape wines, indicate that our proposed Pin-M3HM yields good generalization performance for imbalanced data classification.


IEEE Transactions on Neural Networks and Learning Systems | 2018

Safe Screening Rules for Accelerating Twin Support Vector Machine Classification

Xianli Pan; Zhiji Yang; Yitian Xu; Laisheng Wang

The twin support vector machine (TSVM) is widely used in classification problems, but it is not efficient enough for large-scale data sets. Furthermore, to find the optimal parameters, the exhaustive grid search method is applied to TSVM, which is very time-consuming, especially for multiparameter models. Although many techniques have been presented to address these problems, all of them affect the performance of TSVM to some extent. In this paper, we propose a safe screening rule (SSR) for the linear TSVM and a modified SSR (MSSR) for the nonlinear TSVM, which involves multiple parameters. The SSR and MSSR can delete most training samples and thus reduce the scale of TSVM before solving it. Sequential versions of SSR and MSSR are further introduced to substantially accelerate the whole parameter-tuning process. One important advantage of SSR and MSSR is that they are safe, i.e., we obtain the same solution as the original problem when utilizing them. Experiments on eight real-world data sets and an imbalanced data set with different imbalance ratios demonstrate the efficiency and safety of SSR and MSSR.


Neurocomputing | 2016

Laplacian twin parametric-margin support vector machine for semi-supervised classification

Zhiji Yang; Yitian Xu

As an extension of the twin support vector machine (TSVM), the twin parametric-margin support vector machine (TPMSVM) learns faster than the parametric-margin ν-support vector machine (par-ν-SVM) and is suitable for many cases, especially when the data has a heteroscedastic error structure. This algorithm needs the labels of all training samples, making it a supervised method. However, it is sometimes difficult to obtain a label for each data point. To handle this case effectively, we propose a Laplacian twin parametric-margin support vector machine (LTPMSVM) for semi-supervised classification, which exploits the geometric information of the marginal distribution embedded in unlabeled data to construct a more reasonable classifier. Additionally, the LTPMSVM has helpful properties that shed light on the theoretical interpretation of the parameters which control the bounds on the proportions of support vectors and boundary errors. Experimental results on artificial datasets verify the properties of its parameters. Furthermore, results on the ABCDETC and twelve benchmark datasets indicate that our proposed LTPMSVM yields good generalization performance with computing time comparable to the Laplacian twin support vector machine (LTSVM).


Neurocomputing | 2016

K-nearest neighbor-based weighted multi-class twin support vector machine

Yitian Xu

Twin-KSVC, a novel multi-class classification algorithm, aims at finding two nonparallel hyper-planes for the two focused classes of samples by solving a pair of smaller-sized quadratic programming problems (QPPs), which makes its learning speed faster than that of other multi-class classification algorithms. However, the local information of samples is ignored, so each sample receives the same weight when constructing the separating hyper-planes. In fact, samples have different influences on the separating hyper-planes. Inspired by this, we propose a K-nearest neighbor (KNN)-based weighted multi-class twin support vector machine (KWMTSVM) in this paper. A weight matrix W is employed in the objective function to exploit the intra-class local information. Meanwhile, weight vectors f and h are introduced into the constraints to exploit the inter-class information. When a component f_j = 0 or h_k = 0, the j-th or k-th constraint is redundant; removing these redundant constraints can effectively improve the computational speed of the classifier. Experimental results on eleven benchmark datasets and the ABCD dataset demonstrate the validity of our proposed algorithm.
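The intra-class weight matrix W described above can be illustrated with a generic KNN-graph construction. This is a sketch of the general technique, not the paper's exact formulation; the function name and 0/1 weighting scheme are our illustrative choices:

```python
import numpy as np

def knn_weight_matrix(X, k=3):
    """Symmetric 0/1 KNN adjacency matrix: W[i, j] = 1 if x_j is among the
    k nearest neighbors of x_i (or vice versa), else 0. Such a matrix
    captures the local neighborhood structure within a class."""
    n = len(X)
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # exclude self-neighbors
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(d2[i])[:k]] = 1.0
    return np.maximum(W, W.T)             # symmetrize
```

Samples with many nonzero entries in their row of W sit in dense regions and, in weighted formulations of this kind, contribute more to the separating hyper-planes.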


Knowledge-Based Systems | 2017

A safe sample screening rule for Universum support vector machines

Jiang Zhao; Yitian Xu

The Universum support vector machine (U-SVM), owing to its accuracy improvements, has been extended and applied in many fields. Universum data encode related prior knowledge but do not belong to any class of interest. With Universum, the number of training samples and the computational complexity clearly increase. Inspired by the sparsity of SVMs, a safe sample screening rule (SSSR) for U-SVM is proposed in this paper. Our SSSR eliminates not only labelled samples but also Universum samples before the training process, so the computational cost is dramatically reduced. Moreover, the same solution as the original problem can be obtained by utilizing our SSSR; that is, the training process is guaranteed to be accelerated safely. Besides, we extend our rule to the Universum twin support vector machine (U-TSVM), and the SSSR for U-TSVM is also discussed in this paper. To the best of our knowledge, SSSR is the only existing safe screening method for U-SVMs. Numerical experiments on seventeen benchmark datasets, the ABCDETC dataset, and a Chinese wine dataset demonstrate that our SSSR dramatically reduces the computational cost without sacrificing the optimality of the final solution.


Neurocomputing | 2018

A safe screening based framework for support vector regression

Xianli Pan; Xinying Pang; Hongmei Wang; Yitian Xu

Support vector regression (SVR) is popular and efficient for regression problems. However, it is time-consuming to train, especially for large datasets. Inspired by the sparse solutions of SVR, a safe screening based framework for SVR (SVR-SBF), covering both linear and nonlinear cases, is proposed in this paper to improve its training speed. The SBF has two steps: first, the constant solutions of SVR along the regularization path of the parameter C are deleted before training; second, a safe screening rule via variational inequalities (SSR-VI) is embedded into the grid search method to further discard inactive solutions of SVR. This SBF efficiently accelerates the training of SVR without affecting its solutions. Compared to existing safe rules, our SVR-SBF can identify more inactive solutions by finding constant solutions beforehand. In addition, our SBF is further extended to more situations and models: a modified SSR-VI is proposed to suit other parameter selection methods, and models including variants of SVR and the classical SVM are analyzed. Experiments on both synthetic and real datasets demonstrate the superiority of SVR-SBF.


Neural Computing and Applications | 2017

Asymmetric ν-twin support vector regression

Yitian Xu; Xiaoyan Li; Xianli Pan; Zhiji Yang

Twin support vector regression (TSVR) aims at finding 𝜖-insensitive up- and down-bound functions for the training points by solving a pair of smaller-sized quadratic programming problems (QPPs) rather than a single large one as in the conventional SVR, so TSVR is faster than SVR in theory. However, TSVR gives equal emphasis to points above the up-bound and below the down-bound, so both have the same influence on the regression function. In fact, points in different positions have different effects on the regressor. Inspired by this, we propose an asymmetric ν-twin support vector regression based on the pinball loss function (Asy-ν-TSVR). The new algorithm can effectively control the fitting error by tuning the parameters ν and p, thereby enhancing the generalization ability. Moreover, we study the distribution of samples and give upper bounds for samples located in different positions. Numerical experiments on one artificial dataset, eleven benchmark datasets, and a real wheat dataset demonstrate the validity of our proposed algorithm.
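The asymmetry the abstract describes, where residuals on the two sides of a bound are penalized unequally, can be sketched with a generic asymmetric pinball-style loss. This is an illustration of the idea only; the weighting scheme and parameter name `p` here are our assumptions, not the paper's exact loss:

```python
import numpy as np

def asymmetric_pinball(r, p=0.3):
    """Penalize residuals r above the bound by a factor p and residuals
    below it by (1 - p), so points on the two sides of the bound have
    unequal influence on the fitted regressor."""
    r = np.asarray(r, dtype=float)
    return np.where(r >= 0, p * r, -(1.0 - p) * r)
```

With `p = 0.5` both sides are treated equally, recovering a symmetric loss; moving `p` away from 0.5 shifts the fitted bound toward the more heavily penalized side.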


International Journal of Machine Learning and Cybernetics | 2017

KNN-based maximum margin and minimum volume hyper-sphere machine for imbalanced data classification

Yitian Xu; Yuqun Zhang; Jiang Zhao; Zhiji Yang; Xianli Pan

Imbalanced data classification often arises in real-world applications. In this paper, a novel k-nearest neighbor (KNN)-based maximum margin and minimum volume hyper-sphere machine (KNN-M3VHM) is presented for imbalanced data classification. The basic idea is to construct two hyper-spheres with different centers and radii, the first containing the majority-class examples and the second covering the minority-class examples. When constructing the first hyper-sphere, we remove some redundant majority samples using a KNN-based strategy to balance the two classes. Meanwhile, we maximize the margin between the two hyper-spheres and minimize their volumes, which results in a tight boundary around each class. Similar to the twin hyper-sphere support vector machine (THSVM), KNN-M3VHM solves two related SVM-type problems and avoids the matrix inverse operation when solving the convex optimization problems. KNN-M3VHM considers not only the within-class information but also the between-class margin, and thus achieves better performance than other state-of-the-art algorithms. Experimental results on twenty-five datasets validate the significant advantages of our proposed algorithm.
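The two-hyper-sphere decision idea underlying models of this family can be illustrated with a simple signed-distance rule: assign a point to the class whose sphere boundary it lies closest to (or deepest inside). This is a generic sketch under our own naming, not the paper's exact decision function:

```python
import numpy as np

def hypersphere_decision(x, c_maj, r_maj, c_min, r_min):
    """Assign x to the class whose hyper-sphere (center c, radius r) it is
    closest to, measured by the signed distance ||x - c|| - r (negative
    inside the sphere, positive outside)."""
    d_maj = np.linalg.norm(np.asarray(x) - np.asarray(c_maj)) - r_maj
    d_min = np.linalg.norm(np.asarray(x) - np.asarray(c_min)) - r_min
    return "majority" if d_maj < d_min else "minority"
```

Keeping both spheres tight (minimum volume) while pushing them apart (maximum margin) is what keeps this signed-distance comparison discriminative for the minority class.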


Knowledge-Based Systems | 2018

Scaling KNN multi-class twin support vector machine via safe instance reduction

Xinying Pang; Chang Xu; Yitian Xu

The k-nearest neighbor-based weighted multi-class twin support vector machine (KMTSVM) is an effective algorithm for multi-class optimization problems. The superiority of KMTSVM is that it adopts a "1-versus-1-versus-rest" structure and takes the distribution information of all the instances into consideration. However, it costs much time to handle large-scale problems. Motivated by the sparse solution of KMTSVM, in this paper we propose a safe instance reduction rule to improve its computational efficiency, termed SIR-KMTSVM. The SIR-KMTSVM can delete a majority of redundant instances both for the focused classes and the remaining classes, so the training can be accelerated greatly. Our instance reduction rule is safe in the sense that the reduced problem derives an optimal solution identical to that of the original one. More importantly, we analyze how different numbers of nearest neighbors affect the acceleration achieved by SIR-KMTSVM. Besides, a fast algorithm, DCDM, is introduced to handle relatively large-scale problems more efficiently. Sequential versions of SIR-KMTSVM are further introduced to substantially accelerate the whole training process. Experimental results on an artificial dataset, seventeen benchmark datasets, and a real dataset confirm the effectiveness of our proposed algorithm.

Collaboration


Dive into Yitian Xu's collaborations.

Top Co-Authors

Xianli Pan, China Agricultural University
Zhiji Yang, China Agricultural University
Laisheng Wang, China Agricultural University
Hongmei Wang, China Agricultural University
Jiang Zhao, China Agricultural University
Xinying Pang, China Agricultural University
Yuqun Zhang, China Agricultural University
Chang Xu, Beijing Information Science
Xiaoyan Li, University of Science and Technology Beijing
Ye Song, China Agricultural University