Self-supervised Multi-task Distillation for Few-shot Classification

Enze Ji, Shi Chen*, Tiandong Ji, Jing Li, Zhikui Chen

*Corresponding author for this work

Research output: Chapter in Book/Conference Proceeding/Report › Conference Paper published in a book › peer-review

Abstract

Few-shot classification has gained significant attention owing to its effectiveness in classifying unseen classes from only a few annotated images. Although previous works achieve encouraging classification performance, they rely heavily on one-hot labels during meta-learning, which can result in supervision collapse and limited generalization. To address these challenges, few-shot classification based on self-supervised multi-task distillation (SMD) is proposed to mitigate the problems arising from one-hot labels. Specifically, SMD formulates multiple auxiliary tasks, including a self-supervised classification task and a self-distilled classification task, to enhance the cross-entropy classification in a multi-task learning manner. These auxiliary tasks do not rely on one-hot labels during meta-learning and can thus effectively improve the generalization performance of the model. Finally, extensive experimental results on two benchmark datasets, i.e., CIFAR-FS and FC-100, demonstrate the superiority and effectiveness of SMD.
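The abstract describes combining a supervised cross-entropy objective with a self-supervised auxiliary task and a self-distillation task in a multi-task loss. A minimal NumPy sketch of such a combined objective is shown below; the rotation-prediction auxiliary task, the loss weights `w_ssl`/`w_kd`, and the distillation temperature `T` are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cross_entropy(logits, labels):
    # mean negative log-likelihood of the true classes
    p = softmax(logits)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def kl_distill(teacher_logits, student_logits, T=4.0):
    # temperature-scaled KL divergence, as in standard knowledge distillation
    pt = softmax(teacher_logits / T)
    ps = softmax(student_logits / T)
    return np.mean(np.sum(pt * (np.log(pt + 1e-12) - np.log(ps + 1e-12)), axis=-1)) * T * T

def smd_loss(cls_logits, cls_labels, ssl_logits, ssl_labels,
             teacher_logits, w_ssl=0.5, w_kd=0.5):
    # main supervised cross-entropy, plus a self-supervised classification
    # task (e.g. rotation prediction, assumed here) and a self-distillation
    # term against a teacher's soft predictions; weights are hypothetical
    l_ce = cross_entropy(cls_logits, cls_labels)
    l_ssl = cross_entropy(ssl_logits, ssl_labels)
    l_kd = kl_distill(teacher_logits, cls_logits)
    return l_ce + w_ssl * l_ssl + w_kd * l_kd
```

The self-supervised and distillation terms need no one-hot episode labels: the pseudo-labels (`ssl_labels`) come from the pretext transformation itself, and the distillation target is the teacher's soft output.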

Original language: English
Title of host publication: Proceedings - 2023 IEEE 29th International Conference on Parallel and Distributed Systems, ICPADS 2023
Publisher: IEEE Computer Society
Pages: 363-369
Number of pages: 7
ISBN (Electronic): 9798350330717
DOIs
Publication status: Published - 2023
Externally published: Yes
Event: 29th IEEE International Conference on Parallel and Distributed Systems, ICPADS 2023 - Ocean Flower Island, Hainan, China
Duration: 17 Dec 2023 - 21 Dec 2023

Publication series

Name: Proceedings of the International Conference on Parallel and Distributed Systems - ICPADS
ISSN (Print): 1521-9097

Conference

Conference: 29th IEEE International Conference on Parallel and Distributed Systems, ICPADS 2023
Country/Territory: China
City: Ocean Flower Island, Hainan
Period: 17/12/23 - 21/12/23

Bibliographical note

Publisher Copyright:
© 2023 IEEE.

Keywords

  • few-shot classification
  • multi-task learning
  • self-distillation
  • self-supervision
