IEEE Transactions on Neural Networks and Learning Systems | 2021

Distilling Ordinal Relation and Dark Knowledge for Facial Age Estimation


Abstract


In this article, we propose a knowledge distillation approach with two teachers for facial age estimation. Because of the nonstationary patterns of the facial-aging process, the relative order of age labels provides more reliable information than exact age values for facial age estimation. Thus, the first teacher is a novel ranking method that captures the ordinal relation among age labels. Specifically, it formulates ordinal relation learning as the task of recovering original ordered sequences from shuffled ones. The second teacher adopts the same model as the student and treats facial age estimation as a multiclass classification task. The proposed method leverages the intermediate representations learned by the first teacher and the softened outputs of the second teacher as supervisory signals to improve the training procedure and final performance of a compact student. Hence, the proposed knowledge distillation approach distills the ordinal knowledge of the ranking model and the dark knowledge of the multiclass classification model into a compact student, which facilitates deploying facial age estimation on platforms with limited memory and computation resources, such as mobile and embedded devices. Extensive experiments on several widely used age estimation data sets demonstrate the superior performance of the proposed method over existing state-of-the-art methods.
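To make the two-teacher distillation scheme concrete, the sketch below combines the three supervisory signals named in the abstract: a hard-label loss, a dark knowledge term on the classification teacher's temperature-softened outputs, and a feature-matching term on the ranking teacher's intermediate representations. This is an illustrative sketch in PyTorch-style Python, not the authors' exact formulation; the temperature T, the weights alpha and beta, and the projection layer proj are assumptions introduced here for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, cls_teacher_logits, labels,
                      student_feat, rank_teacher_feat, proj,
                      T=4.0, alpha=0.5, beta=0.5):
    # Hard-label cross entropy on the student's own age-class predictions.
    ce = F.cross_entropy(student_logits, labels)
    # Dark knowledge: KL divergence between the temperature-softened
    # distributions of the student and the classification teacher,
    # scaled by T^2 as in standard knowledge distillation.
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(cls_teacher_logits / T, dim=1),
                  reduction="batchmean") * (T * T)
    # Ordinal knowledge: match the student's intermediate features to the
    # ranking teacher's representation through a learned projection layer.
    hint = F.mse_loss(proj(student_feat), rank_teacher_feat)
    return ce + alpha * kd + beta * hint

# Example usage with hypothetical dimensions (ages 0-100 as 101 classes):
proj = nn.Linear(128, 256)  # student feature dim -> ranking teacher feature dim
student_logits = torch.randn(8, 101)
cls_teacher_logits = torch.randn(8, 101)
labels = torch.randint(0, 101, (8,))
s_feat, t_feat = torch.randn(8, 128), torch.randn(8, 256)
loss = distillation_loss(student_logits, cls_teacher_logits, labels,
                         s_feat, t_feat, proj)

In practice, the teachers' outputs would be computed with gradients disabled, and only the student and the projection layer would be updated.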

Volume 32
Pages 3108-3121
DOI 10.1109/TNNLS.2020.3009523
Language English
Journal IEEE Transactions on Neural Networks and Learning Systems
