IEEE Transactions on Cybernetics | 2019

Multiview Learning With Generalized Eigenvalue Proximal Support Vector Machines


Abstract


Generalized eigenvalue proximal support vector machines (GEPSVMs) are a simple and effective binary classification method in which each hyperplane is closest to one of the two classes and as far as possible from the other. They solve a pair of generalized eigenvalue problems to obtain two nonparallel hyperplanes. Multiview learning exploits multiple feature sets of the same data to improve learning performance. In this paper, we propose multiview GEPSVMs (MvGSVMs), which effectively combine two views by introducing a multiview co-regularization term that maximizes the consensus between the distinct views, and transform the resulting complicated optimization problem into a simple generalized eigenvalue problem. We also propose multiview improved GEPSVMs (MvIGSVMs), which replace the ratio used in MvGSVMs with a difference to measure how far the two classes lie from each hyperplane, leading to a simpler eigenvalue problem. Linear MvGSVMs and MvIGSVMs are extended to the nonlinear case via the kernel trick. Experimental results on multiple data sets demonstrate the effectiveness of the proposed approaches.
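To make the eigenvalue step concrete, below is a minimal single-view sketch of the classical GEPSVM formulation that the abstract builds on: each proximal hyperplane is the eigenvector attaining the smallest generalized eigenvalue of a pair of scatter-like matrices formed from the two classes. The function name gepsvm_planes, the Tikhonov constant delta, and the toy data are illustrative assumptions, not the paper's multiview method.

```python
import numpy as np
from scipy.linalg import eigh


def gepsvm_planes(A, B, delta=1e-3):
    """Classical (single-view) GEPSVM sketch: solve a pair of generalized
    eigenvalue problems to obtain two nonparallel proximal hyperplanes."""
    n = A.shape[1]
    # Augment with a bias column so z = [w; b] parameterizes the plane w'x + b = 0.
    Ea = np.hstack([A, np.ones((A.shape[0], 1))])
    Eb = np.hstack([B, np.ones((B.shape[0], 1))])
    G = Ea.T @ Ea + delta * np.eye(n + 1)  # class-1 scatter, Tikhonov-regularized
    H = Eb.T @ Eb + delta * np.eye(n + 1)  # class-2 scatter, Tikhonov-regularized

    def closest_plane(P, Q):
        # Minimize z'Pz / z'Qz: the eigenvector of P z = lambda Q z with the
        # smallest eigenvalue (eigh returns eigenvalues in ascending order).
        _, vecs = eigh(P, Q)
        z = vecs[:, 0]
        return z[:-1], z[-1]  # (w, b)

    # First plane is closest to class A and far from B; second swaps the roles.
    return closest_plane(G, H), closest_plane(H, G)


# Toy usage: two Gaussian blobs; a point is assigned to the class whose plane is nearer.
rng = np.random.default_rng(0)
A = rng.normal(loc=[1.0, 1.0], scale=0.3, size=(50, 2))
B = rng.normal(loc=[-1.0, -1.0], scale=0.3, size=(50, 2))
(w1, b1), (w2, b2) = gepsvm_planes(A, B)
x = np.array([0.9, 1.1])
d1 = abs(w1 @ x + b1) / np.linalg.norm(w1)
d2 = abs(w2 @ x + b2) / np.linalg.norm(w2)
print("predicted class:", 1 if d1 < d2 else 2)
```

In the difference-based (improved) variant that the abstract mentions, the ratio objective is replaced by a difference of the two quadratic terms, so the analogous step reduces to an ordinary symmetric eigenvalue problem; the trade-off weight in that difference is a tunable parameter and is not shown in this sketch.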

Volume 49
Pages 688-697
DOI 10.1109/TCYB.2017.2786719
Language English
Journal IEEE Transactions on Cybernetics
