Self-supervised Multi-task Distillation for Few-shot Classification

Abstract
Few-shot classification has attracted significant attention owing to its effectiveness in classifying unseen classes from only a few annotated images. Although previous works achieve encouraging classification performance, they rely heavily on one-hot labels during meta-learning, which may result in supervision collapse and limited generalization. To address these challenges, a few-shot classification method based on self-supervised multi-task distillation (SMD) is proposed to mitigate the nuisance arising from one-hot labels. Specifically, SMD formulates multiple auxiliary tasks, namely a self-supervised classification task and a self-distilled classification task, to enhance the cross-entropy classification in a multi-task learning manner. These auxiliary tasks do not rely on one-hot labels during meta-learning, which effectively improves the generalization performance of the model. Finally, extensive experimental results on two benchmark datasets, i.e., CIFAR-FS and FC-100, demonstrate the superiority and effectiveness of SMD.
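The abstract describes combining a supervised cross-entropy objective with label-free auxiliary losses. As a minimal illustrative sketch (not the paper's actual implementation), the overall objective can be written as a weighted sum of cross-entropy, a self-supervised head (here assumed to be 4-way rotation prediction, a common choice), and a self-distillation term that matches the student's softened logits to a teacher's. The loss weights `w_ssl`, `w_kd` and temperature `tau` are hypothetical hyperparameters, not values from the paper.

```python
import torch
import torch.nn.functional as F

def smd_style_loss(cls_logits, labels,
                   rot_logits, rot_labels,
                   student_logits, teacher_logits,
                   w_ssl=1.0, w_kd=1.0, tau=4.0):
    """Sketch of a multi-task objective in the spirit of SMD.

    cls_logits/labels: supervised classification branch (one-hot labels).
    rot_logits/rot_labels: self-supervised branch (e.g. rotation prediction);
        rot_labels come from the pretext transform, not human annotation.
    student_logits/teacher_logits: self-distillation branch; the teacher's
        soft targets replace one-hot supervision.
    """
    # Standard supervised cross-entropy loss.
    ce = F.cross_entropy(cls_logits, labels)
    # Self-supervised auxiliary loss (labels are generated, not annotated).
    ssl = F.cross_entropy(rot_logits, rot_labels)
    # Self-distillation: KL divergence between temperature-softened
    # student and teacher distributions, scaled by tau^2 as is conventional.
    kd = F.kl_div(
        F.log_softmax(student_logits / tau, dim=1),
        F.softmax(teacher_logits / tau, dim=1),
        reduction="batchmean",
    ) * tau * tau
    return ce + w_ssl * ssl + w_kd * kd
```

Because the self-supervised and distillation terms derive their targets from the data and a teacher model rather than from one-hot labels, they provide the extra supervision signal the abstract credits with improving generalization.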
| Original language | English |
|---|---|
| Title of host publication | Proceedings - 2023 IEEE 29th International Conference on Parallel and Distributed Systems, ICPADS 2023 |
| Publisher | IEEE Computer Society |
| Pages | 363-369 |
| Number of pages | 7 |
| ISBN (Electronic) | 9798350330717 |
| DOIs | |
| Publication status | Published - 2023 |
| Externally published | Yes |
| Event | 29th IEEE International Conference on Parallel and Distributed Systems, ICPADS 2023 - Ocean Flower Island, Hainan, China Duration: 17 Dec 2023 → 21 Dec 2023 |
Publication series
| Name | Proceedings of the International Conference on Parallel and Distributed Systems - ICPADS |
|---|---|
| ISSN (Print) | 1521-9097 |
Conference
| Conference | 29th IEEE International Conference on Parallel and Distributed Systems, ICPADS 2023 |
|---|---|
| Country/Territory | China |
| City | Ocean Flower Island, Hainan |
| Period | 17/12/23 → 21/12/23 |
Bibliographical note
Publisher Copyright: © 2023 IEEE.
Keywords
- few-shot classification
- multi-task learning
- self-distillation
- self-supervision