IEEE Transactions on Neural Networks and Learning Systems | 2021

Generalized Learning Riemannian Space Quantization: A Case Study on Riemannian Manifold of SPD Matrices

Abstract


Learning vector quantization (LVQ) is a simple and efficient classification method that enjoys great popularity. However, in many classification scenarios, such as electroencephalogram (EEG) classification, the input features are symmetric positive-definite (SPD) matrices that live on a curved manifold rather than vectors in flat Euclidean space. In this article, we propose a new classification method, within the LVQ framework, for data points that live on curved Riemannian manifolds. The proposed method replaces the Euclidean distance in generalized LVQ (GLVQ) with a distance defined by the appropriate Riemannian metric. We instantiate the method for the Riemannian manifold of SPD matrices equipped with the Riemannian natural metric. Empirical investigations on synthetic data and real-world motor imagery EEG data demonstrate that the proposed generalized learning Riemannian space quantization significantly outperforms Euclidean GLVQ, generalized relevance LVQ (GRLVQ), and generalized matrix LVQ (GMLVQ). The proposed method is also competitive with state-of-the-art methods on EEG classification of motor imagery tasks.
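The key ingredient the abstract describes is replacing the Euclidean distance with a Riemannian one on SPD matrices. A minimal sketch of one common choice, the affine-invariant Riemannian distance (often referred to as the natural metric on SPD matrices), is shown below; the function name and the use of plain NumPy eigendecompositions are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def spd_airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices A and B:
        d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F
    computed via symmetric eigendecompositions (illustrative sketch)."""
    # A^{-1/2} from the eigendecomposition of the SPD matrix A
    w, V = np.linalg.eigh(A)
    A_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    # The whitened matrix A^{-1/2} B A^{-1/2} is again SPD, so its matrix
    # logarithm reduces to taking logs of its (positive) eigenvalues
    M = A_inv_sqrt @ B @ A_inv_sqrt
    mw = np.linalg.eigvalsh(M)
    return float(np.sqrt(np.sum(np.log(mw) ** 2)))
```

In a GLVQ-style classifier, this distance would stand in for the squared Euclidean distance when comparing an SPD input (e.g., an EEG spatial covariance matrix) to the prototypes; the distance is symmetric and vanishes exactly when the two matrices coincide.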

Volume 32
Pages 281-292
DOI 10.1109/TNNLS.2020.2978514
Language English
Journal IEEE Transactions on Neural Networks and Learning Systems

Full Text