Proceedings of the 29th ACM International Conference on Multimedia | 2021
Relationship-Preserving Knowledge Distillation for Zero-Shot Sketch Based Image Retrieval
Abstract
Zero-shot sketch-based image retrieval (ZS-SBIR) is challenging due to the modal gap between the distributions of sketches and images and the inconsistency of label spaces during training and testing. Previous methods mitigate the modal gap by projecting sketches and images into a joint embedding space. Most of them also bridge seen and unseen classes by leveraging semantic embeddings, e.g., word vectors and hierarchical similarities. In this paper, we propose Relationship-Preserving Knowledge Distillation (RPKD) to learn generalizable embeddings from the perspective of knowledge distillation, bypassing the use of semantic embeddings. In particular, we first distill instance-level knowledge to preserve inter-class relationships without semantic similarities, which require extra effort to collect. We also reconcile the contrastive relationships among instances across different embedding spaces, which is complementary to instance-level relationships. Furthermore, embedding-induced supervision, which measures the similarities of an instance to partial class embedding centers from the teacher, is developed to align the student's classification confidences. Extensive experiments conducted on three benchmark ZS-SBIR datasets, i.e., Sketchy, TU-Berlin, and QuickDraw, demonstrate the superiority of our proposed RPKD approach compared with state-of-the-art methods.
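To make the core idea concrete, the following is a minimal sketch of relationship-preserving distillation: the student is penalized when its pairwise similarity structure over a batch diverges from the teacher's. This is an illustrative stand-in, not the paper's actual loss; the function names and the Frobenius-norm objective are assumptions for exposition.

```python
import numpy as np

def cosine_relation_matrix(embeddings):
    """Pairwise cosine similarities among a batch of embeddings."""
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return normed @ normed.T

def relation_distill_loss(student_emb, teacher_emb):
    """Hypothetical relationship-preserving loss: mean squared difference
    between the student's and teacher's batch relation matrices."""
    r_student = cosine_relation_matrix(student_emb)
    r_teacher = cosine_relation_matrix(teacher_emb)
    return float(np.mean((r_student - r_teacher) ** 2))

# Example usage with random embeddings standing in for sketch/image features.
rng = np.random.default_rng(0)
teacher_batch = rng.normal(size=(4, 8))
student_batch = rng.normal(size=(4, 8))
loss = relation_distill_loss(student_batch, teacher_batch)
```

The loss vanishes only when the student reproduces the teacher's inter-instance relationships, independently of any external semantic embeddings, which mirrors the motivation stated in the abstract.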