Neural networks: the official journal of the International Neural Network Society | 2019

Regularization of Deep Neural Networks with Spectral Dropout


Abstract


The big breakthrough on the ImageNet challenge in 2012 was partially due to the Dropout technique used to avoid overfitting. Here, we introduce a new approach called Spectral Dropout to improve the generalization ability of deep neural networks. We cast the proposed approach in the form of regular Convolutional Neural Network (CNN) weight layers using a decorrelation transform with fixed basis functions. Our spectral dropout method prevents overfitting by eliminating weak and noisy Fourier-domain coefficients of the neural network activations, leading to remarkably better results than current regularization methods. Furthermore, the proposed approach is very efficient due to the fixed basis functions used for the spectral transformation. In particular, compared to Dropout and Drop-Connect, our method significantly speeds up network convergence during training (roughly ×2) and achieves considerably higher neuron pruning rates (an increase of ∼30%). We demonstrate that spectral dropout can also be used in conjunction with other regularization approaches, resulting in additional performance gains.
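The abstract only outlines the mechanism; the paper's actual formulation embeds the fixed-basis transform as CNN weight layers. As a minimal, hypothetical NumPy sketch of the core idea only (the function name spectral_dropout and the relative magnitude threshold tau are illustrative assumptions, not the paper's notation or chosen values):

    import numpy as np

    def spectral_dropout(activations, tau=0.1, training=True):
        """Illustrative sketch: map activations to the Fourier domain,
        zero out weak coefficients, and map back. `tau` is a hypothetical
        relative magnitude threshold, not a value from the paper."""
        if not training:
            return activations
        coeffs = np.fft.fft(activations)                    # fixed-basis (Fourier) transform
        mask = np.abs(coeffs) >= tau * np.abs(coeffs).max() # keep only strong coefficients
        return np.real(np.fft.ifft(coeffs * mask))          # drop weak ones, invert transform

    # Usage: weak spectral components of a noisy activation vector are suppressed
    x = np.random.randn(64)
    y = spectral_dropout(x, tau=0.1)

Note that, unlike standard Dropout's random masking, the mask here is data-dependent (driven by coefficient magnitude), which is consistent with the abstract's description of eliminating weak and noisy Fourier-domain coefficients; the sketch is disabled at inference time for simplicity.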

Volume 110
Pages 82-90
DOI 10.1016/j.neunet.2018.09.009
Language English
Journal Neural networks: the official journal of the International Neural Network Society
