IEEE Signal Processing Letters | 2019
Grassmann Manifold Optimization for Fast $L_1$-Norm Principal Component Analysis
Abstract
In this letter, we propose a fast Grassmann manifold optimization method for $L_1$-norm based principal component analysis (GM-$L_1$-PCA). Our approach is a two-step iterative cost-minimization and manifold-retraction technique that efficiently finds all principal components simultaneously. We perform a complexity analysis and show that GM-$L_1$-PCA achieves a significant reduction in processing time while obtaining comparable or better results than current state-of-the-art $L_1$-PCA methods. We further demonstrate the advantage of GM-$L_1$-PCA over $L_2$-PCA on a dataset of facial imagery corrupted with outlying data points. Our experiments show that GM-$L_1$-PCA is computationally more efficient and produces results with lower reprojection error than previous methods. Furthermore, the processing time of our approach is relatively independent of dataset size, making it well suited to the big-data problems commonly encountered today.
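To make the two-step idea concrete, the following is a minimal sketch of Grassmann manifold ascent for $L_1$-PCA: a Euclidean gradient of the $L_1$ dispersion $\|XQ\|_1$ is projected onto the tangent space at the current orthonormal basis $Q$, and a QR decomposition retracts the updated point back onto the manifold. The function name, step size, and iteration count are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def gm_l1_pca(X, K, step=0.01, iters=200, seed=0):
    """Illustrative sketch (not the paper's exact method): gradient
    ascent of f(Q) = ||X Q||_1 over D-by-K orthonormal Q, using
    tangent-space projection and QR retraction on the manifold."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    # Random orthonormal starting point
    Q, _ = np.linalg.qr(rng.standard_normal((D, K)))
    for _ in range(iters):
        G = X.T @ np.sign(X @ Q)             # Euclidean (sub)gradient of ||XQ||_1
        G_tan = G - Q @ (Q.T @ G)            # project onto tangent space at Q
        Q, _ = np.linalg.qr(Q + step * G_tan)  # retract back onto the manifold
    return Q
```

Note that all $K$ components are updated jointly in each iteration, rather than extracted one at a time as in greedy $L_1$-PCA schemes.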