Deterministic dropout for deep neural networks using composite random forest
B. Santra, D.P. Mukherjee
Published in Pattern Recognition Letters (Elsevier B.V.)
2020
Volume: 131
Pages: 205–212
Abstract
Dropout prevents overfitting in deep neural networks. The typical dropout strategy involves random termination of connections irrespective of their importance. Such termination can block the propagation of class-discriminative information across the network, so dropout may lead to inferior performance. We propose a deterministic dropout in which only unimportant connections are dropped, ensuring the propagation of class-discriminative information. We identify the unimportant connections using a novel composite random forest integrated into the network, and we prove that better generalization is achieved by terminating these unimportant connections. The proposed algorithm is useful for preventing overfitting on noisy datasets, and it is equally effective on datasets with smaller numbers of training examples. Experiments on several benchmark datasets show up to an 8% improvement in classification accuracy. © 2020 Elsevier B.V.
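To make the masking step concrete, here is a minimal sketch of importance-based (deterministic) dropout, assuming per-unit importance scores are already available. The function name, the NumPy implementation, and the placeholder scores are all illustrative assumptions; the paper derives importance from its composite random forest integrated into the network, which is not reproduced here.

```python
import numpy as np

def deterministic_dropout(activations, importance, drop_rate=0.5):
    """Zero out the least-important units instead of random ones.

    activations: (batch, units) array of layer outputs.
    importance:  (units,) scores; a hypothetical stand-in for the
                 paper's composite-random-forest importance measure.
    """
    n_units = activations.shape[1]
    n_drop = int(drop_rate * n_units)
    # Indices of the n_drop least-important units.
    drop_idx = np.argsort(importance)[:n_drop]
    mask = np.ones(n_units)
    mask[drop_idx] = 0.0
    # Rescale kept units to preserve the expected activation magnitude
    # (mirrors inverted dropout; an assumption, not stated in the abstract).
    keep_prob = 1.0 - drop_rate
    return activations * mask / keep_prob

# Toy usage with random placeholder scores.
rng = np.random.default_rng(0)
acts = rng.normal(size=(4, 8))
scores = rng.random(8)
out = deterministic_dropout(acts, scores, drop_rate=0.5)
```

Unlike standard dropout, the mask here is a deterministic function of the importance scores, so class-discriminative units are never terminated.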
About the journal
Journal: Pattern Recognition Letters
Publisher: Elsevier B.V.
ISSN: 0167-8655