
Strict Subspace and Label-Space Structure for Domain Adaptation


Abstract


Domain adaptation is one of the most important problems in transfer learning: it aims to adapt a classifier or model trained on a source domain for use in a target domain, where the two domains may be different but related. Intuitively, a good feature representation across domains is crucial. In this paper, we propose a novel feature representation approach for unsupervised domain adaptation, namely Strict Subspace and Label-Space Structure for Domain Adaptation (SSLS). SSLS learns two feature representations that project the source domain and the target domain into two different subspaces in which marginal and conditional distribution shift can be reduced effectively. Specifically, we use a Laplacian graph to keep corresponding points close both in the projected subspaces and in the label space, which guarantees the strictness of the subspace structure and the quality of the pseudo labels. Extensive experiments verify that our method outperforms several state-of-the-art methods on three real-world cross-domain visual recognition tasks: Office+Caltech, USPS+MNIST, and PIE.
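The abstract describes two ingredients: reducing distribution shift between the two projected subspaces and tying corresponding points together with a Laplacian graph. Below is a minimal sketch of that general recipe, assuming a simple linear MMD term for the marginal shift and a standard graph-Laplacian smoothness penalty; all function names, the affinity matrix W, and the concrete objective are illustrative assumptions, not the authors' actual formulation.

```python
# Hedged sketch of the recipe the abstract outlines: project each domain with its
# own subspace, penalize distribution shift between the projections (linear MMD),
# and add a graph-Laplacian term that pulls corresponding projected points together.
import numpy as np

def linear_mmd(Zs, Zt):
    """Squared distance between the means of the projected source/target features."""
    return np.sum((Zs.mean(axis=0) - Zt.mean(axis=0)) ** 2)

def laplacian_penalty(Z, W):
    """Graph-smoothness term tr(Z^T L Z) with L = D - W built from affinities W."""
    L = np.diag(W.sum(axis=1)) - W
    return np.trace(Z.T @ L @ Z)

def objective(Xs, Xt, As, At, W, lam=1.0):
    """Toy objective: distribution shift between subspaces + Laplacian regularizer.

    Xs, Xt : source/target features (ns x d, nt x d)
    As, At : the two learned projections (d x k), one per domain
    W      : affinity matrix over the stacked projected points ((ns+nt) x (ns+nt))
    lam    : trade-off weight for the graph term (illustrative)
    """
    Zs, Zt = Xs @ As, Xt @ At          # project each domain with its own subspace
    Z = np.vstack([Zs, Zt])            # stack projected points for the graph term
    return linear_mmd(Zs, Zt) + lam * laplacian_penalty(Z, W)
```

In the paper, such an objective would be minimized over the projections (and, for the conditional term, over pseudo labels of the target data); the sketch only shows how the two penalties the abstract mentions could be evaluated for fixed projections.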

Pages 301-313
DOI 10.1007/978-3-030-29551-6_26
Language English
