Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining | 2019

Disambiguation Enabled Linear Discriminant Analysis for Partial Label Dimensionality Reduction

Abstract


Partial label learning is an emerging weakly-supervised learning framework where each training example is associated with multiple candidate labels, among which only one is valid. Dimensionality reduction serves as an effective way to improve the generalization ability of a learning system, yet the task of partial label dimensionality reduction is challenging due to the unknown ground-truth labeling information. In this paper, the first attempt towards partial label dimensionality reduction is investigated by endowing the popular linear discriminant analysis (LDA) technique with the ability to deal with partial label training examples. Specifically, a novel learning procedure named DELIN is proposed, which alternates between LDA dimensionality reduction and candidate label disambiguation based on estimated labeling confidences over candidate labels. On one hand, the projection matrix of LDA is optimized by utilizing disambiguation-guided labeling confidences. On the other hand, the labeling confidences are disambiguated by resorting to kNN aggregation in the LDA-induced feature space. Extensive experiments on synthetic as well as real-world partial label data sets clearly validate the effectiveness of DELIN in improving the generalization ability of state-of-the-art partial label learning algorithms.
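
The abstract describes an alternating scheme: confidence-weighted LDA to obtain a projection matrix, followed by kNN aggregation in the projected space to update the labeling confidences over each candidate set. The sketch below is only an illustration of that loop under assumed details (uniform confidence initialization, a generalized-eigenvector LDA solution, and simple neighbour-vote aggregation restricted to candidate labels); it is not the paper's exact optimization, and all function and variable names are hypothetical.

```python
import numpy as np

def delin_sketch(X, candidate_masks, d=10, k=5, n_iters=10):
    """Illustrative sketch of the alternating procedure described in the
    abstract; the update rules here are assumptions, not the paper's own.

    X               : (n, D) feature matrix
    candidate_masks : (n, q) binary matrix, 1 where a label is a candidate
    d               : target dimensionality of the LDA projection
    k               : number of neighbours used for confidence aggregation
    """
    n, D = X.shape
    q = candidate_masks.shape[1]

    # Initialise labeling confidences uniformly over each candidate set.
    F = candidate_masks / candidate_masks.sum(axis=1, keepdims=True)

    for _ in range(n_iters):
        # --- Step 1: confidence-weighted LDA ---------------------------
        mu = X.mean(axis=0)
        Sb = np.zeros((D, D))          # between-class scatter
        Sw = np.zeros((D, D))          # within-class scatter
        for c in range(q):
            w = F[:, c]                # soft memberships of class c
            n_c = w.sum()
            if n_c < 1e-12:
                continue
            mu_c = (w[:, None] * X).sum(axis=0) / n_c
            diff = mu_c - mu
            Sb += n_c * np.outer(diff, diff)
            Xc = X - mu_c
            Sw += (w[:, None] * Xc).T @ Xc
        # Projection matrix from the leading generalized eigenvectors.
        evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
        order = np.argsort(-evals.real)[:d]
        P = evecs[:, order].real       # (D, d)

        # --- Step 2: kNN disambiguation in the projected space ---------
        Z = X @ P
        dist = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=2)
        np.fill_diagonal(dist, np.inf)
        nn = np.argsort(dist, axis=1)[:, :k]
        # Aggregate neighbours' confidences, restricted to candidate labels.
        F_new = F[nn].sum(axis=1) * candidate_masks
        F = F_new / np.clip(F_new.sum(axis=1, keepdims=True), 1e-12, None)

    return P, F
```

In this reading, the projection and the confidences reinforce each other: a better projection makes nearest neighbours more label-consistent, and sharper confidences in turn yield more reliable scatter matrices for the next LDA step.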

DOI 10.1145/3292500.3330901
Language English
Journal Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining
