IEEE Access | 2019

A Gradually Distilled CNN for SAR Target Recognition


Abstract


Convolutional neural networks (CNNs) have been widely used in synthetic aperture radar (SAR) target recognition. Traditional CNNs suffer from expensive computation and high memory consumption, impeding their deployment in real-time recognition systems on SAR sensors, which have limited memory and computing power. In this paper, a micro CNN (MCNN) for real-time SAR recognition systems is proposed. The proposed MCNN has only two layers and is compressed from an 18-layer deep convolutional neural network (DCNN) by a novel knowledge distillation algorithm called gradual distillation. MCNN is a ternary network: all of its weights are −1, 0, or 1. Following the student-teacher paradigm, the DCNN is the teacher network and MCNN is its student network. Gradual distillation gives MCNN a better learning route than traditional knowledge distillation. Experiments on the MSTAR dataset show that the proposed MCNN obtains a recognition rate almost identical to that of the DCNN. Compared with the DCNN, however, the memory footprint of the proposed MCNN is 177 times smaller and its computational cost is 12.8 times lower, which means that the proposed MCNN obtains comparable performance with a much smaller network.
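The abstract states that every MCNN weight is constrained to −1, 0, or 1 but does not give the quantization rule. As a minimal illustrative sketch (not the paper's method), a common thresholding scheme from the ternary-weight-network literature zeroes small-magnitude weights and keeps only the sign of the rest; the `delta_factor` heuristic below is an assumption:

```python
import numpy as np

def ternarize(w, delta_factor=0.7):
    """Quantize a float weight array to {-1, 0, +1}.

    Weights whose magnitude falls below a threshold (here a fraction of
    the mean absolute weight, a common heuristic) become 0; the rest
    are replaced by their sign. The exact scheme used by MCNN is not
    specified in the abstract.
    """
    delta = delta_factor * np.mean(np.abs(w))
    q = np.zeros_like(w)
    q[w > delta] = 1.0
    q[w < -delta] = -1.0
    return q

w = np.array([0.8, -0.05, -0.9, 0.02, 0.4])
print(ternarize(w))  # -> [ 1.  0. -1.  0.  1.]
```

Storing each weight in 2 bits instead of 32, plus replacing multiplications by sign flips, is what enables the large memory and compute savings reported for MCNN.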

Volume 7
Pages 42190-42200
DOI 10.1109/ACCESS.2019.2906564
Language English
Journal IEEE Access
