Extending Siamese Neural Networks

Siamese neural networks have been applied successfully to one-shot image classification. In this work, we extend Siamese neural networks to few-shot and zero-shot learning. For few-shot learning, we augment the conventional dual-encoder architecture to contain $(k+1)$ encoder branches. For zero-shot learning, we propose Semantic Asymmetric Siamese Networks (SemASN), an asymmetric architecture with two distinct, non-weight-sharing encoders operating on different data modalities (here, images and semantic descriptions). While relaxing the dual and symmetric properties of the Siamese architecture, we keep the contrastive training and inference procedure unchanged across all $k$-shot setups. We apply our proposed model to image classification on the Omniglot and AWA2 benchmark datasets. Despite the model’s versatility, its classification performance remains competitive with state-of-the-art approaches specialized in few-, one- and zero-shot learning.
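
Since the paper is not yet published, the following is only a minimal sketch of the asymmetric zero-shot idea described above, not the authors' implementation: two distinct, non-weight-sharing encoders (a CNN for images and an MLP for semantic class descriptions, e.g. attribute vectors as in AWA2) map into a shared embedding space, and a query image is scored against candidate class descriptions. The PyTorch framework, all layer sizes, and the L1-based similarity head are assumptions for illustration.

    # Hypothetical sketch of an asymmetric two-encoder (SemASN-style) network.
    # Layer sizes, input resolution, and the similarity head are illustrative only.
    import torch
    import torch.nn as nn


    class ImageEncoder(nn.Module):
        """CNN branch for RGB images."""
        def __init__(self, embed_dim=256):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.fc = nn.Linear(128, embed_dim)

        def forward(self, x):
            return self.fc(self.features(x).flatten(1))


    class SemanticEncoder(nn.Module):
        """MLP branch for semantic class descriptions; shares no weights with the image branch."""
        def __init__(self, attr_dim=85, embed_dim=256):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(attr_dim, 512), nn.ReLU(),
                nn.Linear(512, embed_dim),
            )

        def forward(self, a):
            return self.net(a)


    class AsymmetricSiameseSketch(nn.Module):
        """Two asymmetric encoders plus a learned similarity on |e_img - e_sem|."""
        def __init__(self, attr_dim=85, embed_dim=256):
            super().__init__()
            self.image_encoder = ImageEncoder(embed_dim)
            self.semantic_encoder = SemanticEncoder(attr_dim, embed_dim)
            self.similarity = nn.Linear(embed_dim, 1)

        def forward(self, images, attributes):
            e_img = self.image_encoder(images)
            e_sem = self.semantic_encoder(attributes)
            return torch.sigmoid(self.similarity(torch.abs(e_img - e_sem)))


    if __name__ == "__main__":
        # Zero-shot inference: score one query image against the descriptions
        # of 10 unseen classes and predict the best-matching class.
        model = AsymmetricSiameseSketch(attr_dim=85)
        query = torch.randn(1, 3, 64, 64)
        class_attrs = torch.randn(10, 85)
        scores = model(query.expand(10, -1, -1, -1), class_attrs)
        predicted_class = scores.squeeze(1).argmax().item()
        print(predicted_class)

A corresponding few-shot variant would add further encoder branches for the $k$ support examples; its exact form is left to the publication.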

More details to follow upon publication of the work.