IEEE Transactions on Neural Networks and Learning Systems | 2021

Can Boosted Randomness Mimic Learning Algorithms of Geometric Nature? Example of a Simple Algorithm That Converges in Probability to Hard-Margin SVM

Abstract


In light of the general question posed in the title, we present a very simple randomized learning algorithm, based on boosting, that can be seen as a nonstationary Markov random process. Surprisingly, the decision hyperplanes produced by this algorithm converge in probability to the exact hard-margin solutions of support vector machines (SVMs). This fact is curious because the hard-margin hyperplane is not a statistical solution but a purely geometric one: it is driven by margin maximization and depends strictly on the particular locations of a few data points lying in the contact region between the two classes, namely the support vectors. The proposed algorithm detects support vectors probabilistically, without being aware of their geometric definition. We prove the main convergence theorem together with several auxiliary lemmas. The analysis sheds new light on the relation between boosting and SVMs, and also on the nature of SVM solutions, since these can now be regarded equivalently as limits of certain random trajectories. In the experimental part, the correctness of the proposed algorithm is verified against well-known SVM solvers (libsvm, liblinear) and against general-purpose optimization packages (cvxopt for Python and Wolfram Mathematica).
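
As an illustration of the baselines named above, the following minimal sketch (not the paper's boosting algorithm) recovers the exact hard-margin hyperplane by solving the standard SVM dual quadratic program with cvxopt. The data X, the labels y, and the 1e-6 threshold for flagging support vectors are assumptions made for this example only.

```python
# Minimal sketch: hard-margin SVM via its dual QP, solved with cvxopt.
# This is one of the reference solvers mentioned in the abstract,
# not the randomized boosting algorithm proposed in the paper.
import numpy as np
from cvxopt import matrix, solvers

def hard_margin_svm(X, y):
    """Return (w, b) of the maximum-margin hyperplane for separable data.

    X : (n, d) array of samples; y : (n,) array of labels in {-1, +1}.
    Dual: max sum(alpha) - 1/2 alpha' P alpha, s.t. alpha >= 0, y'alpha = 0.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n = X.shape[0]
    K = X @ X.T                          # Gram matrix of the samples
    P = matrix(np.outer(y, y) * K)       # quadratic term of the dual
    q = matrix(-np.ones(n))              # linear term (maximize sum of alphas)
    G = matrix(-np.eye(n))               # -alpha_i <= 0, i.e., alpha_i >= 0
    h = matrix(np.zeros(n))
    A = matrix(y.reshape(1, -1))         # equality constraint: sum_i alpha_i y_i = 0
    b = matrix(0.0)
    solvers.options['show_progress'] = False
    alpha = np.ravel(solvers.qp(P, q, G, h, A, b)['x'])
    sv = alpha > 1e-6                    # support vectors carry nonzero alphas
    w = ((alpha * y)[:, None] * X).sum(axis=0)
    bias = np.mean(y[sv] - X[sv] @ w)    # bias recovered from the support vectors
    return w, bias

# Toy usage on a linearly separable two-class set (hypothetical data):
X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0], [2.0, 0.5], [3.0, 1.0]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0])
w, b = hard_margin_svm(X, y)
```

A hyperplane obtained this way serves as the geometric ground truth against which the limits of the random trajectories described in the paper can be compared.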

Volume: 32
Pages: 3798–3818
DOI: 10.1109/TNNLS.2021.3059653
Language: English
Journal: IEEE Transactions on Neural Networks and Learning Systems

Full Text