Generalized zero-shot learning via over-complete distribution
Published in: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR)
Year: 2020
Pages: 13297–13305
Abstract
A well-trained and generalized deep neural network (DNN) should be robust to both seen and unseen classes. However, the performance of most existing supervised DNN algorithms degrades on classes that are unseen in the training set. To learn a discriminative classifier that performs well in Zero-Shot Learning (ZSL) settings, we propose to generate an Over-Complete Distribution (OCD) for both seen and unseen classes using a Conditional Variational Autoencoder (CVAE). To enforce separability between classes and reduce within-class scatter, we apply an Online Batch Triplet Loss (OBTL) and a Center Loss (CL) to the generated OCD. The effectiveness of the framework is evaluated under both Zero-Shot Learning and Generalized Zero-Shot Learning protocols on three publicly available benchmark databases: SUN, CUB, and AWA2. The results show that generating over-complete distributions and forcing the classifier to learn a transform function from overlapping to non-overlapping distributions improves performance on both seen and unseen classes. © 2020 IEEE
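The sketch below (not the authors' released code) illustrates the pipeline the abstract describes: a CVAE decodes a class-attribute vector into visual features, an over-complete set is drawn by sampling latents with inflated variance so synthesized features deliberately spread toward neighboring classes, and a batch-mined triplet loss plus a center loss shape the generated distribution. All layer sizes, the variance scale, and the loss weight are illustrative assumptions.

```python
# Minimal sketch of OCD generation with a CVAE, plus triplet + center
# losses on the synthesized features. Dimensions below are assumed
# (AWA2-like: 2048-d visual features, 85-d class attributes).
import torch
import torch.nn as nn
import torch.nn.functional as F

FEAT_DIM, ATTR_DIM, LATENT_DIM = 2048, 85, 64  # assumed sizes

class CVAE(nn.Module):
    """Conditional VAE: encodes a visual feature conditioned on the
    class-attribute vector; decodes (z, attribute) back to a feature."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Linear(FEAT_DIM + ATTR_DIM, 512), nn.ReLU(),
            nn.Linear(512, 2 * LATENT_DIM))            # -> (mu, logvar)
        self.dec = nn.Sequential(
            nn.Linear(LATENT_DIM + ATTR_DIM, 512), nn.ReLU(),
            nn.Linear(512, FEAT_DIM))

    def forward(self, x, a):
        mu, logvar = self.enc(torch.cat([x, a], dim=1)).chunk(2, dim=1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        return self.dec(torch.cat([z, a], dim=1)), mu, logvar

def sample_ocd(cvae, attrs, n_per_class=50, var_scale=2.0):
    """Draw an over-complete distribution: for each class attribute,
    sample many latents with inflated variance (var_scale is an assumed
    hyperparameter) so generated features overlap neighboring classes."""
    feats, labels = [], []
    for c, a in enumerate(attrs):
        z = torch.randn(n_per_class, LATENT_DIM) * var_scale
        a_rep = a.unsqueeze(0).expand(n_per_class, -1)
        feats.append(cvae.dec(torch.cat([z, a_rep], dim=1)))
        labels.append(torch.full((n_per_class,), c, dtype=torch.long))
    return torch.cat(feats), torch.cat(labels)

def triplet_and_center_loss(feats, labels, centers, margin=1.0):
    """Hard-mined triplet loss within the batch (pushes classes apart)
    plus a center loss (pulls features toward their class centers),
    standing in for the abstract's OBTL + CL."""
    d = torch..cdist(feats, feats) if False else torch.cdist(feats, feats)
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    hardest_pos = (d * same.float()).max(dim=1).values  # farthest positive
    hardest_neg = d.masked_fill(same, float('inf')).min(dim=1).values
    triplet = F.relu(hardest_pos - hardest_neg + margin).mean()
    center = ((feats - centers[labels]) ** 2).sum(dim=1).mean()
    return triplet + 0.1 * center                       # 0.1: assumed weight
```

A downstream classifier would then be trained on these synthesized features for both seen and unseen classes, which is how the framework turns overlapping generated distributions into a decision boundary that separates them.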
About the journal
Journal: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Publisher: IEEE Computer Society
ISSN: 1063-6919