- Title
Multistage feature fusion knowledge distillation.
- Authors
Li, Gang; Wang, Kun; Lv, Pengfei; He, Pan; Zhou, Zheng; Xu, Chuanyun
- Abstract
Generally, the recognition performance of lightweight models is lower than that of large models. Knowledge distillation, in which a teacher model teaches a student model, can further enhance the recognition accuracy of lightweight models. In this paper, we approach knowledge distillation from the perspective of intermediate feature-level distillation. We combine a cross-stage feature fusion symmetric framework, an attention mechanism that enhances the fused features, and a contrastive loss between teacher and student features at the same stage to implement a multistage feature fusion knowledge distillation method. This approach addresses the significant differences between the intermediate feature distributions of teacher and student models, which otherwise make it difficult for the student to learn implicit knowledge, and thereby improves the recognition accuracy of the student model. Compared with existing knowledge distillation methods, our method performs at a superior level. On the CIFAR-100 dataset, it boosts the recognition accuracy of ResNet20 from 69.06% to 71.34%, and on the Tiny ImageNet dataset, it increases the recognition accuracy of ResNet18 from 66.54% to 68.03%, demonstrating the effectiveness and generalizability of our approach. The overall distillation structure and feature extraction methods still leave room for further optimization, which warrants future research and exploration.
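The abstract names two generic ingredients that feature-level distillation methods typically combine: alignment of intermediate features between teacher and student at the same stage, and a contrastive loss that pulls matched teacher/student features together. The record does not specify the paper's actual loss functions, fusion framework, or attention module, so the following is only an illustrative sketch of commonly used forms of these terms (softened-logit KL distillation, feature MSE, and an InfoNCE-style contrastive loss); all function names here are hypothetical.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-softened softmax; higher T flattens the distribution.
    z = z / T
    e = np.exp(z - z.max())
    return e / e.sum()

def kd_loss(teacher_logits, student_logits, T=4.0):
    # Classic logit-level distillation: KL(teacher || student) on
    # temperature-softened distributions, scaled by T^2 (Hinton-style).
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T)

def feature_mse(f_t, f_s):
    # Intermediate feature alignment at one matching stage
    # (assumes features already projected to a common shape).
    return float(np.mean((f_t - f_s) ** 2))

def contrastive_loss(f_t, f_s, temp=0.1):
    # InfoNCE-style loss: for each sample, the teacher feature of the
    # same sample (diagonal) is the positive, all others are negatives.
    # f_t, f_s: (batch, dim) feature matrices.
    f_t = f_t / np.linalg.norm(f_t, axis=1, keepdims=True)
    f_s = f_s / np.linalg.norm(f_s, axis=1, keepdims=True)
    sim = f_s @ f_t.T / temp                      # pairwise similarities
    log_z = np.log(np.exp(sim).sum(axis=1))       # log partition per row
    return float(np.mean(log_z - np.diag(sim)))   # -log p(positive)
```

A multistage method in this spirit would sum `feature_mse` and `contrastive_loss` over several teacher/student stages and add `kd_loss` on the final logits; the relative weights, the fusion of cross-stage features, and the attention enhancement are specific to the paper and not reconstructed here.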
- Subjects
DISTILLATION; FEATURE extraction; IMPLICIT learning; TEACHING models
- Publication
Scientific Reports, 2024, Vol 14, Issue 1, p1
- ISSN
2045-2322
- Publication type
Article
- DOI
10.1038/s41598-024-64041-4