Lingfeng Niu
Chinese Academy of Sciences
Publications
Featured research published by Lingfeng Niu.
web intelligence | 2012
Lingfeng Niu; Yong Shi; Jianmin Wu
In the process of human learning, teachers always play an important role. However, most existing machine learning methods seldom consider the role of a teacher. Recently, Vapnik introduced an advanced learning paradigm called Learning Using Privileged Information (LUPI) to include elements of human teaching in machine learning. Through theoretical analysis and numerical experiments, the superiority of LUPI over the classical learning paradigm has been preliminarily demonstrated. In this paper, building on existing work on LUPI, we introduce privileged information into the modeling of the L1 support vector machine (SVM). Compared with the existing research on LUPI with the L2 SVM, the new method spends less time tuning model parameters and additionally performs feature selection during training. Experiments on the digit recognition problem validate the effectiveness of our method.
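As a sketch of the base formulation this work extends (the plain L1 SVM, without the privileged-information terms, which require the SVM+ machinery from the LUPI literature), the L1 penalty can be cast as a linear program by splitting the weights into positive and negative parts; the toy data and function name below are illustrative, not from the paper:

```python
import numpy as np
from scipy.optimize import linprog

def l1_svm(X, y, C=1.0):
    """Train a linear L1-regularized SVM by solving the LP
    min sum(w+ + w-) + C*sum(xi)
    s.t. y_i((w+ - w-).x_i + b) >= 1 - xi_i,  w+, w-, xi >= 0.
    """
    n, d = X.shape
    # Variable order: [w+ (d), w- (d), b (1), xi (n)]
    c = np.concatenate([np.ones(2 * d), [0.0], C * np.ones(n)])
    Yx = y[:, None] * X                                # rows y_i * x_i
    A_ub = np.hstack([-Yx, Yx, -y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    bounds = [(0, None)] * (2 * d) + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w = res.x[:d] - res.x[d:2 * d]
    b = res.x[2 * d]
    return w, b

# Toy demo: feature 1 separates the classes, feature 2 is noise.
X = np.array([[2.0, 0.1], [3.0, -0.2], [-2.0, 0.3], [-3.0, -0.1]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = l1_svm(X, y)
# The L1 penalty drives the weight on the noise feature to zero
# (w is approximately [0.5, 0]), which is why training the L1 SVM
# performs feature selection as a side effect.
```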
international conference on conceptual structures | 2013
Lingfeng Niu; Xi Zhao; Yong Shi
Optimization is an important tool in computational finance and business intelligence. The multiple criteria mathematical program (MCMP), which concerns mathematical optimization problems with more than one objective function to be optimized simultaneously, is one way of applying optimization techniques. Due to the existence of multiple objectives, MCMPs are usually difficult to optimize: for a nontrivial MCMP, no single solution optimizes all objectives at the same time. In practice, many methods convert the original MCMP into a single-objective program and solve the resulting scalarized optimization problem. If the scalarization parameters, which measure the trade-offs between the conflicting objectives, are not chosen carefully, the converted single-objective optimization problem may not be solvable. Therefore, to ensure an MCMP can always be solved successfully, heuristic search and expert knowledge for setting the scalarization parameters are usually necessary, which is not an easy task and limits the applications of MCMP to some extent. In this paper, we take the multiple criteria linear program (MCLP) for binary classification as an example and discuss how to modify the formulation of MCLP directly to guarantee solvability. In detail, we propose adding a quadratic regularization term to the converted single-objective linear program. The new regularized formulation not only overcomes some defects of the original scalarized problem in modeling, but can also be shown in theory to always admit finite optimal solutions. To test the performance of the proposed method, we compare our algorithm with several state-of-the-art algorithms for binary classification on several different kinds of datasets. Preliminary experimental results demonstrate the effectiveness of our regularization method.
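A toy illustration of the failure mode and the fix (not the paper's MCLP itself; the objective vector and regularization weight below are my own stand-ins): a scalarized LP with badly chosen trade-off weights is unbounded, while adding a quadratic term lam*||x||^2 makes the objective strictly convex, so a finite optimum always exists.

```python
import numpy as np
from scipy.optimize import linprog, minimize

# Scalarized problem: min c.x subject to x >= 0 (toy stand-in for a
# scalarized MCLP). With these trade-off weights it is unbounded below.
c = np.array([-1.0, -2.0])
lp = linprog(c, bounds=[(0, None)] * 2, method="highs")
# lp.status == 3 means scipy reports the LP as unbounded.

# Adding quadratic regularization lam*||x||^2 makes the objective
# strictly convex; the minimizer is finite: x* = -c / (2*lam) = [1, 2].
lam = 0.5
qp = minimize(lambda x: c @ x + lam * x @ x,
              x0=np.zeros(2),
              bounds=[(0, None)] * 2)
```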
international conference on conceptual structures | 2012
Lingfeng Niu; Jianmin Wu; Yong Shi
Neural Computing and Applications | 2018
Fan Meng; Zhiquan Qi; Yingjie Tian; Lingfeng Niu
international conference on conceptual structures | 2017
Huadong Wang; Jianyu Miao; Seyed Mojtaba Hosseini Bamakan; Lingfeng Niu; Yong Shi
intelligent data analysis | 2015
Xi Zhao; Yong Shi; Lingfeng Niu
Proceedings of the 2014 IEEE/WIC/ACM International Joint Conferences on Web Intelligence (WI) and Intelligent Agent Technologies (IAT) | 2014
Lingfeng Niu; Xi Zhao; Yong Shi
web intelligence | 2016
Jianyu Miao; Yong Shi; Lingfeng Niu
Procedia Computer Science | 2013
Lingfeng Niu; Xi Zhao
IEEE Transactions on Neural Networks | 2018
Yong Shi; Jianyu Miao; Zhengyu Wang; Peng Zhang; Lingfeng Niu
Feature selection aims to select a subset of features from high-dimensional data according to a predefined selecting criterion. Sparse learning has been proven to be a powerful technique in feature selection, and the sparse regularizer, as a key component of sparse learning, has been studied for several years. Although convex regularizers have been used in many works, there are cases where nonconvex regularizers outperform convex ones. To make the process of selecting relevant features more effective, we propose a novel nonconvex sparse metric on matrices as the sparsity regularization in this paper. The new nonconvex regularizer can be written as the difference of the \ell_{2,1} norm and the Frobenius (\ell_{2,2}) norm, and is therefore named the \ell_{2,1-2} norm. To find the solution of the resulting nonconvex formulation, we design an iterative algorithm in the framework of the ConCave-Convex Procedure (CCCP) and prove its strong global convergence. An adapted alternating direction method of multipliers is embedded to solve the sequence of convex subproblems in CCCP efficiently. Using the scaled cluster indicators of data points as pseudolabels, we also apply the \ell_{2,1-2}
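The \ell_{2,1-2} metric itself is simple to compute; a minimal numpy sketch (the matrix below is my own example, not from the paper):

```python
import numpy as np

def l21_norm(W):
    """Sum of the l2 norms of the rows of W (the l_{2,1} norm)."""
    return np.linalg.norm(W, axis=1).sum()

def l21_minus_2(W):
    """The nonconvex l_{2,1-2} metric: ||W||_{2,1} - ||W||_F.
    It is nonnegative and equals zero iff W has at most one nonzero
    row, so minimizing it pushes whole rows (features) to zero."""
    return l21_norm(W) - np.linalg.norm(W)  # norm of a matrix defaults to Frobenius

W = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [1.0, 0.0]])
# Row norms are 5 + 0 + 1 = 6; Frobenius norm is sqrt(26) ~ 5.099,
# so the metric is about 0.901 for this two-nonzero-row matrix.
print(l21_minus_2(W))
```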