Abstract
The big breakthrough on the ImageNet challenge in 2012 was partially due to the ‘Dropout’ technique used to avoid overfitting. Here, we introduce a new approach called ‘Spectral Dropout’ to improve the generalization ability of deep neural networks. We cast the proposed approach in the form of regular Convolutional Neural Network (CNN) weight layers using a decorrelation transform with fixed basis functions. Our spectral dropout method prevents overfitting by eliminating weak and ‘noisy’ Fourier domain coefficients of the neural network activations, leading to remarkably better results than current regularization methods. Furthermore, the proposed approach is highly efficient owing to the fixed basis functions used for the spectral transformation. In particular, compared to Dropout and Drop-Connect, our method significantly speeds up the network convergence rate during the training process (roughly ), with considerably higher neuron pruning rates (an increase of 30%). We demonstrate that spectral dropout can also be used in conjunction with other regularization approaches, resulting in additional performance gains.
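To make the idea in the abstract concrete, the following is a minimal sketch of spectral dropout on a single activation map, not the authors' reference implementation: it assumes a 2D DCT as the fixed decorrelation basis and a simple magnitude quantile as the pruning rule, and `keep_ratio` is a hypothetical hyper-parameter introduced only for illustration.

```python
# Sketch of the spectral-dropout idea: transform activations with a fixed
# decorrelation basis (assumed here: 2D DCT), zero out weak "noisy"
# coefficients, then transform back. Basis choice and thresholding rule are
# assumptions, not the paper's exact procedure.
import numpy as np
from scipy.fft import dctn, idctn

def spectral_dropout(activations, keep_ratio=0.7):
    """Suppress weak frequency coefficients of a 2D activation map.

    activations: 2D numpy array (one feature map).
    keep_ratio : fraction of strongest coefficients to keep (hypothetical
                 hyper-parameter, not taken from the paper).
    """
    coeffs = dctn(activations, norm='ortho')       # fixed-basis transform
    magnitudes = np.abs(coeffs)
    # Threshold chosen so roughly `keep_ratio` of the coefficients survive.
    threshold = np.quantile(magnitudes, 1.0 - keep_ratio)
    mask = magnitudes >= threshold                 # drop weak/noisy coefficients
    return idctn(coeffs * mask, norm='ortho')      # back to the spatial domain

# Example: regularize a random 8x8 activation map.
x = np.random.randn(8, 8)
y = spectral_dropout(x, keep_ratio=0.7)
```

Because the basis functions are fixed (no learned transform), the forward and inverse transforms add little cost, which is consistent with the efficiency claim in the abstract.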
| Original language | English |
| --- | --- |
| Pages (from-to) | 82-90 |
| Number of pages | 9 |
| Journal | Neural Networks |
| Volume | 110 |
| DOIs | |
| Publication status | Published - Feb 2019 |