Meta-Learning Class Structures for Improved Few-Shot Classification
Learning from few samples is challenging because training a typical deep neural network requires large amounts of data. Recent state-of-the-art methods in few-shot classification use prototype-based classifiers, which compute class prototypes from the few available samples as approximations to the true class centers and adopt the nearest-neighbor rule for classification. However, class prototypes learned from few samples without any constraints tend to be biased, which becomes a bottleneck in few-shot classification. In this work, we propose to regularize the learning of class prototypes under the guidance of meta-learned class structures, which are the distributions of class centers along certain directions. Concretely, we use class structures through a prototype correction network (PCN). The PCN regularizes the learning of class prototypes by moving sample embeddings, the vector representations of the samples, to match the class structures along those directions, so that the sample embeddings lie close to their class centers. Hence, the class prototypes obtained as the per-class average of the sample embeddings are also close to their class centers. We meta-train the PCN on a large number of tasks to capture class structures that generalize to unseen classes. Experimental analysis of classification accuracy demonstrates the effectiveness of our method. Our method is compatible with all prototype-based methods and achieves the best performance among state-of-the-art approaches on two widely used few-shot classification benchmark datasets.
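The prototype-based classification rule the abstract builds on can be sketched as follows: prototypes are per-class means of support-set embeddings, and queries are assigned to the nearest prototype. This is a minimal illustration of that baseline rule only; function names, the toy 2-D embeddings, and the use of Euclidean distance are illustrative assumptions, and the proposed PCN correction is not reproduced here.

```python
import numpy as np

def prototypes(support_emb, support_labels, n_classes):
    # Class prototype = per-class mean of the support embeddings,
    # an approximation to the true class center from few samples.
    return np.stack([support_emb[support_labels == c].mean(axis=0)
                     for c in range(n_classes)])

def classify(query_emb, protos):
    # Nearest-neighbor rule: assign each query to the closest prototype
    # (Euclidean distance assumed for illustration).
    dists = np.linalg.norm(query_emb[:, None, :] - protos[None, :, :], axis=-1)
    return dists.argmin(axis=1)

# Toy 2-way, 2-shot episode with hypothetical 2-D embeddings.
support = np.array([[0.0, 0.0], [0.2, 0.0],    # class 0
                    [5.0, 5.0], [5.2, 5.0]])   # class 1
labels = np.array([0, 0, 1, 1])
protos = prototypes(support, labels, 2)
queries = np.array([[0.1, 0.1], [5.1, 4.9]])
print(classify(queries, protos))  # -> [0 1]
```

In the proposed method, the sample embeddings would additionally be corrected by the PCN before the per-class averaging, pulling them toward their class centers and thereby debiasing the prototypes.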
- Hongning Wang, Committee Chair, (CS/SEAS/UVA)
- Aidong Zhang, Advisor, (CS/SEAS/UVA)
- Yangfeng Ji (CS/SEAS/UVA)
- Jundong Li (CS, ECE/SEAS, SDS/UVA)