Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Yingjie Tian is active.

Publication


Featured researches published by Yingjie Tian.


Pattern Recognition | 2013

Robust twin support vector machine for pattern classification

Zhiquan Qi; Yingjie Tian; Yong Shi

In this paper, we propose a new robust twin support vector machine (R-TWSVM) based on second-order cone programming formulations for classification, which can deal efficiently with data containing measurement noise. Preliminary experiments confirm the robustness of the proposed method and its superiority to the traditional robust SVM in both computation time and classification accuracy. Remarkably, since our dual problems involve only inner products of the inputs, the kernel trick can be applied directly for nonlinear cases. At the same time, we do not need to compute extra matrix inverses, which distinguishes our approach from existing TWSVMs. In addition, we show that TWSVMs are a special case of our robust model, and by degenerating R-TWSVM we derive a new dual form of TWSVM that overcomes the existing shortcomings of TWSVM.
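The shared ingredient of the twin-SVM family is the decision rule: a point is assigned to the class whose hyperplane lies nearer. A minimal sketch, with hypothetical hyperplane parameters standing in for the solutions of the paper's second-order cone programs:

```python
import numpy as np

def twin_svm_predict(X, w1, b1, w2, b2):
    """Assign each row of X to the class whose hyperplane is nearer."""
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)  # distance to plane of class +1
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)  # distance to plane of class -1
    return np.where(d1 <= d2, 1, -1)

# Toy usage with made-up hyperplanes (not fitted by any solver):
X = np.array([[0.0, 1.0], [3.0, 0.0]])
w1, b1 = np.array([1.0, 0.0]), 0.0    # plane x = 0, near class +1
w2, b2 = np.array([1.0, 0.0]), -3.0   # plane x = 3, near class -1
print(twin_svm_predict(X, w1, b1, w2, b2))  # -> [ 1 -1]
```

Fitting the two planes is where the methods differ; R-TWSVM obtains them from robust cone programs, but the nearest-plane rule above is common to the family.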


Neural Networks | 2012

Laplacian twin support vector machine for semi-supervised classification

Zhiquan Qi; Yingjie Tian; Yong Shi

Semi-supervised learning has attracted a great deal of attention in machine learning and data mining. In this paper, we propose a novel Laplacian Twin Support Vector Machine (Lap-TSVM) for semi-supervised classification, which exploits the geometry of the marginal distribution embedded in unlabeled data to construct a more reasonable classifier, and which serves as a useful extension of TSVM. Furthermore, with appropriate parameter choices, Lap-TSVM degenerates to either TSVM or TBSVM. Experiments on synthetic and real data sets show that the Lap-TSVM classifier, built from two nonparallel hyperplanes, is superior to Lap-SVM and TSVM in both classification accuracy and computation time.
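The manifold-regularization ingredient that Laplacian methods rely on is the graph Laplacian L = D - W of a neighborhood graph built over labeled and unlabeled points together. A sketch with illustrative parameters (the paper's own graph construction may differ):

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))           # labeled + unlabeled inputs together

# k-nearest-neighbour adjacency, symmetrised so the graph is undirected
W = kneighbors_graph(X, n_neighbors=5, mode='connectivity')
W = (0.5 * (W + W.T)).toarray()

L = np.diag(W.sum(axis=1)) - W         # graph Laplacian L = D - W
print(L.shape)                         # -> (30, 30)
```

The quadratic form f.T @ L @ f then penalizes classifiers whose outputs vary sharply between neighboring points, which is how the unlabeled geometry enters the objective.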


IEEE Transactions on Systems, Man, and Cybernetics | 2014

Nonparallel Support Vector Machines for Pattern Classification

Yingjie Tian; Zhiquan Qi; Xuchan Ju; Yong Shi; Xiaohui Liu

We propose a novel nonparallel classifier, called the nonparallel support vector machine (NPSVM), for binary classification. NPSVM differs fundamentally from existing nonparallel classifiers, such as the generalized eigenvalue proximal support vector machine (GEPSVM) and the twin support vector machine (TWSVM), and has several advantages: (1) its two primal problems implement the structural risk minimization principle; (2) the dual problems of these two primal problems share the advantages of standard SVMs, so the kernel trick can be applied directly, whereas existing TWSVMs must construct two additional primal problems for nonlinear cases based on approximate kernel-generated surfaces, and their nonlinear problems do not degenerate to the linear case even when the linear kernel is used; (3) the dual problems have the same elegant formulation as those of standard SVMs and can be solved efficiently by the sequential minimal optimization (SMO) algorithm, whereas existing GEPSVM and TWSVMs are not suitable for large-scale problems; (4) NPSVM has the inherent sparseness of standard SVMs; (5) existing TWSVMs are special cases of NPSVM when its parameters are appropriately chosen. Experimental results on many datasets show the effectiveness of our method in both sparseness and classification accuracy, further confirming these conclusions. In some sense, NPSVM is a new starting point for nonparallel classifiers.


Archive | 2012

Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions

Naiyang Deng; Yingjie Tian; Chunhua Zhang

Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions presents an accessible treatment of the two main components of support vector machines (SVMs): classification problems and regression problems. The book emphasizes the close connection between optimization theory and SVMs, since optimization is one of the pillars on which SVMs are built. The authors share insights from many of their research achievements. They give a precise interpretation of statistical learning theory for C-support vector classification. They also discuss regularized twin SVMs for binary classification problems, SVMs for multi-class classification based on ordinal regression, SVMs for semi-supervised problems, and SVMs for problems with perturbations. To improve readability, concepts, methods, and results are introduced graphically and with clear explanations. For important concepts and algorithms, such as the Crammer-Singer SVM for multi-class classification problems, the text provides geometric interpretations not depicted in the current literature. Enabling a sound understanding of SVMs, this book gives beginners as well as more experienced researchers and engineers the tools to solve real-world problems using SVMs.


Knowledge Based Systems | 2013

Structural twin support vector machine for classification

Zhiquan Qi; Yingjie Tian; Yong Shi

It has been shown that the structural information of data may contain useful prior domain knowledge for training a classifier, and how to exploit this information to build a good classifier has recently become a research focus. Existing structural large-margin methods commonly fold all within-class structural information into a single model. In practice, these methods do not balance the structural relationships both within and between classes, so this prior information is not exploited sufficiently. In this paper, we design a new Structural Twin Support Vector Machine (S-TWSVM). Unlike existing structure-based methods, S-TWSVM uses two hyperplanes to decide the category of new data; each model considers only one class's structural information while staying close to that class and far from the other. This lets S-TWSVM fully exploit the prior knowledge to directly improve the algorithm's generalization capacity. All experiments show that our proposed method is consistently superior to state-of-the-art structure-based algorithms in both computation time and classification accuracy.


Neural Networks | 2012

Twin support vector machine with Universum data

Zhiquan Qi; Yingjie Tian; Yong Shi

The Universum, defined as samples not belonging to either class of the classification problem of interest, has been shown to be helpful in supervised learning. In this work, we design a new Twin Support Vector Machine with Universum (U-TSVM), which can exploit Universum data to improve the classification performance of TSVM. Unlike U-SVM, in U-TSVM the Universum data are located in a nonparallel insensitive loss tube via two hinge loss functions, which exploits the prior knowledge embedded in the Universum data more flexibly. Empirical experiments demonstrate that U-TSVM directly improves the classification accuracy of standard TSVM trained on the labeled data alone, and is superior to U-SVM in most cases.
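One common way to obtain Universum data, when no natural "neither class" samples exist, is to average random pairs of examples drawn from opposite classes. This construction is not specific to this paper; the sketch below uses synthetic Gaussian classes purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X_pos = rng.normal(loc=+2.0, size=(50, 2))   # class +1 samples
X_neg = rng.normal(loc=-2.0, size=(50, 2))   # class -1 samples

# Universum: midpoints of random cross-class pairs, belonging to neither class
idx = rng.integers(0, 50, size=30)
jdx = rng.integers(0, 50, size=30)
X_universum = (X_pos[idx] + X_neg[jdx]) / 2.0
print(X_universum.shape)  # -> (30, 2)
```

In U-TSVM these points would then enter the training problem through the insensitive-tube constraints described in the abstract, encouraging both hyperplanes to pass near the Universum.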


Technological and Economic Development of Economy | 2012

Recent advances on support vector machines research

Yingjie Tian; Yong Shi; Xiaohui Liu

Support vector machines (SVMs), with their roots in statistical learning theory (SLT) and optimization methods, have become powerful tools for problem solving in machine learning. SVMs reduce most machine learning problems to optimization problems, and optimization lies at the heart of SVMs. Many SVM algorithms involve solving not only convex problems, such as linear programming, quadratic programming, second-order cone programming, and semi-definite programming, but also non-convex and more general optimization problems, such as integer programming, semi-infinite programming, and bi-level programming. The purpose of this paper is to understand SVMs from the optimization point of view and to review several representative optimization models in SVMs, together with their applications in economics, in order to promote research interest in both optimization-based SVM theory and economics applications. The paper starts by summarizing and explaining the nature of SVMs, then proceeds to discuss optimization...


Expert Systems With Applications | 2011

Credit card churn forecasting by logistic regression and decision tree

Guangli Nie; Wei Rowe; Lingling Zhang; Yingjie Tian; Yong Shi

In this paper, two data mining algorithms are applied to build a churn prediction model using credit card data collected from a real Chinese bank. The contributions of four variable categories are examined: customer information, card information, risk information, and transaction activity information. The paper analyzes the process of handling variables when data is obtained from a database rather than a survey. Instead of feeding all 135 variables into the model directly, it selects variables based not only on correlation but also on economic sense. Beyond the accuracy of the analytic results, the paper designs a misclassification cost measure that takes the two types of error and their economic consequences into account, which is more suitable for evaluating a credit card churn prediction model. The algorithms used in this study, logistic regression and decision trees, are mature and powerful classification algorithms. The test results show that logistic regression performs slightly better than the decision tree.
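The head-to-head comparison can be sketched in a few lines; this uses synthetic data rather than the bank's proprietary dataset, and plain accuracy rather than the paper's cost-sensitive measure:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# Stand-in for the churn data: 1000 customers, 20 numeric features
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
dt = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

print("logistic regression accuracy:", lr.score(X_te, y_te))
print("decision tree accuracy:      ", dt.score(X_te, y_te))
```

To mirror the paper's evaluation, one would replace `score` with a cost matrix that weights false positives and false negatives by their economic impact.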


Archive | 2011

Optimization Based Data Mining: Theory and Applications

Yong Shi; Yingjie Tian; Gang Kou; Yi Peng; Jianping Li

Optimization techniques have been widely adopted to implement various data mining algorithms. In addition to the well-known Support Vector Machines (SVMs), which are based on quadratic programming, different versions of Multiple Criteria Programming (MCP) have been used extensively in data separation. Since optimization-based data mining methods differ from statistics, decision tree induction, and neural networks, their theoretical foundations have attracted many researchers interested in data mining algorithm development. Optimization Based Data Mining: Theory and Applications focuses mainly on MCP and SVMs, especially their recent theoretical progress and real-life applications in various fields, including finance, web services, bioinformatics, and petroleum engineering, which have drawn the interest of practitioners looking for new methods to improve data mining results for knowledge discovery. Most of the material in this book comes directly from the research and application activities that the authors' research group has conducted over the last ten years. Aimed at practitioners and graduate students with a fundamental knowledge of data mining, it demonstrates the basic concepts and foundations of using optimization techniques to address data mining problems.


Neurocomputing | 2010

Kernel subclass convex hull sample selection method for SVM on face recognition

Xiaofei Zhou; Wenhan Jiang; Yingjie Tian; Yong Shi

The Support Vector Machine (SVM) is an effective classifier, but a major shortcoming is the heavy computation it requires for large-scale learning tasks. Sample selection is a feasible strategy to overcome this problem. To reduce the number of training samples without sacrificing recognition accuracy, this paper presents a novel sample selection approach named Kernel Subclass Convex Hull (KSCH), which tries to select the boundary samples of each class's convex hull. The idea derives from the geometric interpretation of SVM: geometrically, training an SVM can be converted into computing the nearest points between two convex hulls, so each class's convex hull effectively determines the SVM's separating plane. Since the convex hull of a set is determined solely by its boundary samples, training an SVM on the boundary samples of each class is equivalent to training it on all samples. Based on this idea, KSCH iteratively selects boundary samples of each class's convex hull in the high-dimensional space induced by the kernel trick. The convex hull of the chosen set is called a subclass convex hull; as the chosen set grows, each subclass convex hull rapidly approximates the corresponding class convex hull, so the selected samples efficiently represent the original training set and support SVM classification. Experimental results on the MIT-CBCL and UMIST face databases show that KSCH selects fewer, higher-quality samples while maintaining the recognition accuracy of SVM.
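The core idea can be illustrated in input space (the paper works in the kernel-induced feature space, where an exact hull computation is not available and an iterative selection is used instead): keep only the vertices of each class's convex hull as candidate training samples.

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_vertices(X):
    """Return the boundary samples (convex-hull vertices) of a 2-D point set."""
    return X[ConvexHull(X).vertices]

rng = np.random.default_rng(0)
X_class = rng.normal(size=(200, 2))   # one class's training samples
selected = hull_vertices(X_class)
print(len(selected), "of", len(X_class), "samples kept")
```

For a Gaussian cloud in 2-D, only a small fraction of points lie on the hull, so the reduction is substantial; in high dimensions the fraction grows, which is one reason the kernelized, iterative variant matters.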

Collaboration


Dive into Yingjie Tian's collaboration.

Top Co-Authors

Yong Shi
Chinese Academy of Sciences

Zhiquan Qi
Chinese Academy of Sciences

Gang Kou
University of Electronic Science and Technology of China

Jianping Li
Chinese Academy of Sciences

Yi Peng
University of Electronic Science and Technology of China

Dalian Liu
Beijing Jiaotong University

Dewei Li
Chinese Academy of Sciences

Peng Zhang
Chinese Academy of Sciences

Jingjing Tang
Chinese Academy of Sciences

Xuchan Ju
Chinese Academy of Sciences