Archive | 2019

Robust Learning Based on The Information Theoretic Learning

 

Abstract


Robust learning plays an important role in many fields such as computer vision, machine learning, pattern recognition, image processing, and signal processing. In particular, subspace learning and sparse learning (SLSL) are widely used tools for extracting features from highly redundant raw data. However, most existing SLSL algorithms build their learning models on a second-order statistical measurement, e.g. the mean square error, which is sensitive to heavy noise and to outliers/occlusions in the data. This thesis devises advanced algorithms for robust SLSL, addressing both dimensionality reduction of higher-order tensor data and the learning of sparse coefficients in the presence of outliers/occlusions.

In this thesis, we concentrate on formulating new mathematical models for SLSL that handle both outlier samples and sample outliers, based on information theoretic learning (ITL). For robust subspace learning, we develop two algorithms built on ITL-based metrics. We first exploit the ability of the maximum correntropy criterion (MCC) to suppress outlier information, solving two-dimensional singular value decomposition in the presence of outlier samples and sample outliers, and then extend the framework to higher-order tensor decomposition. A half-quadratic iterative optimization method is proposed to solve the resulting objective function. The proposed algorithm achieves better performance in face image reconstruction and classification. However, the MCC-based loss function uses a second-order measurement to constrain the representation error, which is not always the best choice. To address this, we propose the generalized correntropy criterion, which gives more flexibility in controlling the reconstruction error. Experimental results on face image reconstruction, classification, and clustering show the advantages of the proposed algorithm.
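To make the correntropy and half-quadratic ideas above concrete, the following is a minimal sketch, not the thesis's algorithm: it uses a simple linear least-squares setting rather than 2D-SVD or tensor decomposition, and the function names, the kernel bandwidth `sigma`, and the shape parameters of the generalized loss are all illustrative assumptions. It shows how an MCC-style loss bounds the penalty on large residuals and how half-quadratic reweighting turns the robust problem into iterated weighted least squares.

```python
import numpy as np

def correntropy_loss(e, sigma=1.0):
    """MCC-style loss: 1 minus the Gaussian kernel of the residual.
    Unlike the squared error, it saturates for large |e|."""
    return 1.0 - np.exp(-e**2 / (2.0 * sigma**2))

def generalized_correntropy_loss(e, alpha=2.0, beta=1.0):
    """Generalized correntropy loss with shape parameter alpha;
    alpha = 2 recovers the Gaussian-kernel (MCC) case."""
    return 1.0 - np.exp(-np.abs(e / beta) ** alpha)

def half_quadratic_weights(e, sigma=1.0):
    """Half-quadratic auxiliary weights: large residuals (likely
    outliers) receive exponentially small weight."""
    return np.exp(-e**2 / (2.0 * sigma**2))

def robust_lsq(A, b, sigma=1.0, iters=20):
    """Iteratively reweighted least squares driven by the MCC
    half-quadratic weights (an illustrative sketch only)."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # ordinary LS start
    for _ in range(iters):
        w = half_quadratic_weights(b - A @ x, sigma)
        sw = np.sqrt(w)
        # each half-quadratic step is a plain weighted LS problem
        x = np.linalg.lstsq(sw[:, None] * A, sw * b, rcond=None)[0]
    return x
```

A single gross outlier shifts the ordinary least-squares start, but its weight collapses to nearly zero after the first reweighting, so the iteration converges to the fit supported by the clean samples.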
For robust sparse learning, we develop two robust orthogonal matching pursuit algorithms that learn robust sparse features based on ITL. Since correntropy and generalized correntropy both rely on second-order statistics in the kernel space, they can still be influenced by outliers; we therefore propose a non-second-order statistical measurement in the kernel space for robust sparse learning within orthogonal matching pursuit. In addition, in the original matching pursuit algorithm the correlation between two vectors is measured by the inner product, which is not a robust function and magnifies the effect of outliers. Based on this observation, a new ITL-based correlation function is proposed to make the atom selection procedure more accurate.
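The atom-selection idea above can be sketched as follows. This is an illustration under stated assumptions, not the thesis's method: the correntropy score used here (mean Gaussian kernel of elementwise differences between a single-atom approximation and the residual) is one plausible ITL-based replacement for the inner product, and the exact correlation function in the thesis may differ.

```python
import numpy as np

def correntropy(a, b, sigma=1.0):
    """Correntropy similarity between vectors a and b: the mean
    Gaussian kernel of their elementwise differences. It is bounded
    in (0, 1], so a few grossly corrupted entries cannot dominate
    the score the way they can in a plain inner product."""
    return float(np.mean(np.exp(-(a - b) ** 2 / (2.0 * sigma ** 2))))

def robust_omp(D, y, k, sigma=1.0):
    """Orthogonal matching pursuit with a correntropy-based atom
    selection score (illustrative sketch). D has unit-norm columns,
    y is the signal, k is the target sparsity."""
    support, r = [], y.copy()
    for _ in range(k):
        best_j, best_score = -1, -np.inf
        for j in range(D.shape[1]):
            if j in support:
                continue
            c = D[:, j] @ r                       # best single-atom coefficient
            score = correntropy(c * D[:, j], r, sigma)
            if score > best_score:
                best_j, best_score = j, score
        support.append(best_j)
        # standard OMP step: least-squares refit on the chosen support
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        r = y - D[:, support] @ coef
    return coef, support
```

On clean data this behaves like ordinary OMP; the difference appears when entries of the residual are corrupted, since the bounded kernel caps each entry's influence on the selection score.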

DOI 10.25904/1912/3265
Language English
