IEEE Access | 2021

Enhancing Metric-Based Few-Shot Classification With Weighted Large Margin Nearest Center Loss


Abstract


Metric-learning-based methods, which learn a deep embedding space from an extremely large number of episodes, have been successfully applied to few-shot classification problems. In this paper, we propose adopting a large margin nearest center (LMNC) loss during episodic training to enhance metric-learning-based few-shot classification methods. The loss functions commonly used in episodic training (such as cross-entropy and mean squared error) strive toward the strict goal that differently labeled examples in the embedding space be separated by an infinite distance. However, the learned embedding space cannot guarantee that this goal is achieved for every episode sampled from a large number of classes. Instead of an infinite distance, the LMNC loss requires only that differently labeled examples be separated by a large margin, which relaxes the strict constraint of the traditional loss functions and makes it easier to learn a discriminative embedding space. Moreover, considering the multilevel similarity between classes, we relax the constraint of a fixed large margin and extend the LMNC loss to a weighted LMNC (WLMNC) loss, which effectively exploits interclass information to achieve a more separable embedding space with adaptive interclass margins. Experiments on state-of-the-art benchmarks demonstrate that adopting the LMNC and WLMNC losses substantially improves the embedding learning performance and classification accuracy of metric-based few-shot classification methods in various few-shot scenarios. In particular, the LMNC and WLMNC losses yield gains of 1.86% and 2.46%, respectively, for the prototypical network on miniImageNet in the 5-way 1-shot scenario.
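
As a rough illustration only, the following PyTorch-style sketch shows how an LMNC-style episodic loss and its weighted variant might be computed from class prototypes. The function names, signatures, squared-Euclidean distances, and the pull/push decomposition are assumptions modeled on the classic large margin nearest neighbor formulation, not the authors' implementation.

import torch
import torch.nn.functional as F

def lmnc_loss(queries, labels, prototypes, margin=1.0, pull_weight=1.0):
    # Sketch of an LMNC-style episodic loss (hypothetical, not the paper's code).
    # queries:    (Q, D) embedded query examples
    # labels:     (Q,)   class indices into `prototypes`
    # prototypes: (C, D) class centers, e.g., per-class means of the support set
    dists = torch.cdist(queries, prototypes).pow(2)        # (Q, C) squared distances
    q_idx = torch.arange(queries.size(0), device=queries.device)
    d_true = dists[q_idx, labels]                          # distance to the true center
    pull = d_true.mean()                                   # draw queries toward their own center
    # Penalize any impostor center that is not at least `margin` farther away
    # than the true center (a finite margin rather than an infinite separation).
    own = F.one_hot(labels, prototypes.size(0)).bool()
    hinge = (margin + d_true.unsqueeze(1) - dists).clamp(min=0)
    push = hinge.masked_fill(own, 0.0).sum(dim=1).mean()
    return pull_weight * pull + push

def wlmnc_loss(queries, labels, prototypes, margin_matrix, pull_weight=1.0):
    # WLMNC-style variant (also a sketch): the fixed scalar margin is replaced
    # by an adaptive per-class-pair margin looked up in `margin_matrix` (C, C).
    dists = torch.cdist(queries, prototypes).pow(2)
    q_idx = torch.arange(queries.size(0), device=queries.device)
    d_true = dists[q_idx, labels]
    pull = d_true.mean()
    own = F.one_hot(labels, prototypes.size(0)).bool()
    margins = margin_matrix[labels]                        # (Q, C) margins per impostor class
    hinge = (margins + d_true.unsqueeze(1) - dists).clamp(min=0)
    push = hinge.masked_fill(own, 0.0).sum(dim=1).mean()
    return pull_weight * pull + push

In an episodic setup, `prototypes` would typically be the per-class means of the embedded support examples, as in the prototypical network, and a loss of this form would replace or complement the usual episodic training objective.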

Volume 9
Pages 90805-90815
DOI 10.1109/ACCESS.2021.3091704
Language English
Journal IEEE Access
