- Title
Overcomplete-to-sparse representation learning for few-shot class-incremental learning.
- Authors
Fu, Mengying; Liu, Binghao; Ma, Tianren; Ye, Qixiang
- Abstract
Few-shot class-incremental learning (FSCIL) aims to continually learn new semantics given only a few training samples of new classes. Because training examples are too few to build good representations from, FSCIL must generalize learned semantics from old to new classes while reducing representation aliasing between them (old-class ‘forgetting’). This motivates us to develop overcomplete-to-sparse representation learning (O2SRL), which addresses the ‘new class generalization’ and ‘old class forgetting’ problems systematically by regularizing both feature completeness and sparsity. Specifically, O2SRL consists of a spatial excitation module (SEM) and a channel purification module (CPM). SEM drives the model to learn overcomplete and generic features, which not only represent all classes well but also aid generalization to new classes. CPM regularizes the sparsity and uniqueness of features, reducing semantic aliasing between classes and alleviating the forgetting of old classes. The two modules reinforce each other to configure unique and robust representations for both old and new classes. Experiments show that O2SRL improves the state of the art in FSCIL by significant margins on public datasets including CUB200, CIFAR100, and mini-ImageNet. O2SRL’s effectiveness is also validated under the general few-shot learning setting.
- Publication
Multimedia Systems, 2024, Vol 30, Issue 2, p1
- ISSN
0942-4962
- Publication type
Article
- DOI
10.1007/s00530-024-01294-z
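
The abstract's two-stage idea (spatially excite features so they stay overcomplete, then sparsify channels so each class keeps a unique signature) can be sketched roughly as follows. This is a minimal illustrative sketch only: the function names, the sigmoid spatial gate, and the soft-threshold channel rule are assumptions for illustration, not the paper's actual SEM/CPM internals.

```python
import numpy as np

def spatial_excitation(feat):
    """SEM-style gating (sketch): excite spatial locations via a sigmoid
    map so the representation stays overcomplete rather than peaky.
    feat: array of shape (C, H, W)."""
    gate = 1.0 / (1.0 + np.exp(-feat.mean(axis=0)))   # (H, W) in (0, 1)
    return feat * gate[None, :, :]

def channel_purification(feat, thresh=0.1):
    """CPM-style sparsification (sketch): soft-threshold per-channel
    energy so only a small, distinctive set of channels stays active."""
    energy = np.abs(feat).mean(axis=(1, 2))           # per-channel energy
    keep = np.maximum(energy - thresh, 0.0) / (energy + 1e-8)
    return feat * keep[:, None, None]

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4, 4))                        # toy (C, H, W) feature map
y = channel_purification(spatial_excitation(x))
```

Both steps only rescale activations, so the output keeps the input shape and never grows in magnitude; in a real model the gate and threshold would be learned, not fixed heuristics as here.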