IEEE Access | 2019

Multi-Label Metric Transfer Learning Jointly Considering Instance Space and Label Space Distribution Divergence


Abstract


Multi-label learning deals with problems in which each instance is associated with a set of labels. Most multi-label learning algorithms ignore potential distribution differences between the training domain and the test domain in both the instance space and the label space, as well as the intrinsic geometric information of the label space. These restrictive assumptions limit the ability of existing multi-label learning algorithms to classify across domains. To address this problem, we propose a novel distribution-adaptation-based method, multi-label metric transfer learning (MLMTL), which relaxes these two assumptions and handles more general multi-label learning tasks effectively. In particular, MLMTL extends the maximum mean discrepancy (MMD) criterion to multi-label classification by learning and adjusting weights for the multi-labeled training instances. In this way, MLMTL bridges the instance-distribution and label-distribution divergence between the training and test datasets. In addition, based on the balanced multi-label training data, we exploit the intrinsic geometric information of the label space by encoding it into a distance metric learning framework. Extensive experiments on five benchmark datasets show that the proposed approach significantly outperforms state-of-the-art multi-label learning algorithms.
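
The abstract describes re-weighting the training instances so that the maximum mean discrepancy (MMD) between the training and test distributions is reduced. The following minimal sketch illustrates that general idea (a kernel-mean-matching-style weighting with an RBF kernel); the kernel choice, bandwidth, weight bounds, and solver are assumptions made here for illustration and do not reproduce the MLMTL formulation or its label-space metric learning component.

```python
# Hedged sketch: learn nonnegative instance weights that minimize a (biased)
# MMD^2 estimate between the weighted source sample and the target sample.
# Illustrative only; not the authors' MLMTL optimization.
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(A, B, gamma=1.0):
    """K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def mmd_weights(Xs, Xt, gamma=1.0):
    """Weights w >= 0 (mean 1) for source instances Xs so that the weighted
    source mean embedding moves toward the target mean embedding of Xt."""
    n = Xs.shape[0]
    Kss = rbf_kernel(Xs, Xs, gamma)          # source-source kernel matrix
    kst = rbf_kernel(Xs, Xt, gamma).mean(1)  # mean kernel to the target sample

    def mmd2(w):
        # w^T Kss w / n^2 - 2 w^T kst / n  (the target-target term is constant)
        return w @ Kss @ w / n**2 - 2.0 * w @ kst / n

    res = minimize(mmd2, np.ones(n), method="SLSQP",
                   bounds=[(0.0, 10.0)] * n,
                   constraints={"type": "eq", "fun": lambda w: w.sum() - n})
    return res.x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Xs = rng.normal(0.0, 1.0, (80, 5))   # source (training) instances
    Xt = rng.normal(0.5, 1.0, (60, 5))   # shifted target (test) instances
    w = mmd_weights(Xs, Xt)
    print(w[:5])  # source points resembling the target receive larger weights
```

In a multi-label transfer setting such as the one the abstract sketches, weights of this kind would then be used to balance the training data before learning a classifier or, as in MLMTL, a distance metric over the label space.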

Volume 7
Pages 10362-10373
DOI 10.1109/ACCESS.2018.2889572
Language English
Journal IEEE Access
