Incorporating rotational invariance in convolutional neural network architecture
H. Kandi, A. Jain, S. Velluva Chathoth, G.R.K.S. Subrahmanyam
Published in Pattern Analysis and Applications (Springer London)
Year: 2019
Volume: 22
Issue: 3
Pages: 935-948
Abstract
Convolutional neural networks (CNNs) are deep learning architectures capable of learning complex sets of nonlinear features that effectively represent the structure of the input to the network. Existing CNN architectures are invariant to small distortions, translations, and scaling, but are sensitive to rotations. In this paper, unlike approaches that augment the training set with samples at different orientations, we propose a new architecture in which the addition of a rotation-invariant map yields a severalfold improvement in the network's rotational invariance. We also propose an improved architecture in which rotational invariance is achieved by rotationally varying the convolutional maps. We show that the proposed methods provide better invariance to rotations than conventional training of a CNN architecture (where the network is trained without considering different orientations of the training samples). The methods achieve rotation-independent classification through a few modifications to conventional CNNs, but add no trainable parameters to the network, thus keeping the number of free parameters/weights constant. We demonstrate the performance of the proposed rotation-invariant architectures on handwritten-digit and texture data sets. © 2018, Springer-Verlag London Ltd., part of Springer Nature.
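As a rough illustration of the parameter-free idea the abstract describes, the sketch below pools convolution responses over rotated copies of a single shared kernel. This is a hedged approximation only: the layer name RotInvConv2d, the restriction to 90-degree rotations (which torch.rot90 performs exactly, without interpolation), and the max-pooling over orientations are assumptions made for illustration, not the authors' published architecture.

# Minimal sketch: orientation pooling over rotated copies of one shared
# kernel. The layer name, the 90-degree rotation set, and max-pooling over
# orientations are illustrative assumptions, not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RotInvConv2d(nn.Module):
    """Convolution whose output is pooled over rotated copies of a single
    shared kernel, so no extra trainable parameters are introduced."""

    def __init__(self, in_channels, out_channels, kernel_size, padding=0):
        super().__init__()
        # One learnable kernel bank, shared across all orientations.
        self.weight = nn.Parameter(
            torch.randn(out_channels, in_channels, kernel_size, kernel_size)
        )
        self.padding = padding

    def forward(self, x):
        responses = []
        for k in range(4):  # 0, 90, 180, 270 degrees (exact rotations)
            w = torch.rot90(self.weight, k, dims=(2, 3))
            responses.append(F.conv2d(x, w, padding=self.padding))
        # Max over orientations makes the response (approximately)
        # invariant to rotating the input by these angles.
        return torch.stack(responses, dim=0).max(dim=0).values

# Usage: a drop-in replacement for nn.Conv2d in a small CNN.
layer = RotInvConv2d(in_channels=1, out_channels=8, kernel_size=5, padding=2)
out = layer(torch.randn(2, 1, 28, 28))  # e.g., MNIST-sized digit images
print(out.shape)  # torch.Size([2, 8, 28, 28])

Because every orientation reuses the same weight tensor, this layer has exactly as many trainable parameters as a plain convolution, consistent with the abstract's constraint of keeping the number of free parameters/weights constant.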
About the journal
Journal: Pattern Analysis and Applications
Publisher: Springer London
ISSN: 1433-7541