Boundary-Aware Transformers for Skin Lesion Segmentation

Jiacheng Wang, Lan Wei, Liansheng Wang*, Qichao Zhou, Lei Zhu, Jing Qin

*Corresponding author for this work

Research output: Chapter in Book/Conference Proceeding/Report › Conference Paper published in a book › peer-review

Abstract

Skin lesion segmentation from dermoscopy images is of great importance for improving the quantitative analysis of skin cancer. However, the automatic segmentation of melanoma is a very challenging task owing to the large variation of melanoma and the ambiguous boundaries of lesion areas. While convolutional neural networks (CNNs) have achieved remarkable progress in this task, most existing solutions are still incapable of effectively capturing global dependencies to counteract the inductive bias caused by limited receptive fields. Recently, transformers have been proposed as a promising tool for global context modeling by employing a powerful global attention mechanism, but one of their main shortcomings when applied to segmentation tasks is that they cannot effectively extract sufficient local details to tackle ambiguous boundaries. We propose a novel boundary-aware transformer (BAT) to comprehensively address the challenges of automatic skin lesion segmentation. Specifically, we integrate a new boundary-wise attention gate (BAG) into transformers so that the whole network not only effectively models global long-range dependencies via transformers but also, simultaneously, captures more local details by making full use of boundary-wise prior knowledge. In particular, the auxiliary supervision of BAG helps the transformer learn the position embedding, as it provides rich spatial information. We conducted extensive experiments to evaluate the proposed BAT; the results corroborate its effectiveness, with BAT consistently outperforming state-of-the-art methods on two well-known datasets (code is available at https://github.com/jcwang123/BA-Transformer ).
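The boundary-wise attention-gating idea described in the abstract can be sketched as follows: a predicted boundary map re-weights the tokens that a global self-attention layer attends to, so boundary regions receive extra emphasis. This is a minimal, hypothetical NumPy illustration of that gating idea; the function name, shapes, and shared projections are assumptions for brevity, not the authors' actual BAG implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def boundary_gated_attention(tokens, boundary_logits):
    """Toy self-attention whose attention weights are gated by a predicted
    boundary map (illustrative only, not the paper's layer).

    tokens:          (N, C) token features for N spatial positions
    boundary_logits: (N,)   per-position boundary scores (higher = boundary)
    """
    n, c = tokens.shape
    gate = 1.0 / (1.0 + np.exp(-boundary_logits))   # sigmoid -> (0, 1)
    q = k = v = tokens                              # shared projections for brevity
    attn = softmax(q @ k.T / np.sqrt(c), axis=-1)   # (N, N) global attention
    attn = attn * gate[None, :]                     # emphasize boundary tokens
    attn = attn / attn.sum(axis=-1, keepdims=True)  # renormalize each row
    return attn @ v                                 # (N, C) gated context

rng = np.random.default_rng(0)
out = boundary_gated_attention(rng.normal(size=(16, 8)), rng.normal(size=16))
print(out.shape)  # (16, 8)
```

In the paper's design, the boundary prior additionally serves as auxiliary supervision; here it only modulates attention, which is the simplest way to show how boundary knowledge can sharpen a transformer's otherwise purely global attention.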

Original language: English
Title of host publication: Medical Image Computing and Computer Assisted Intervention – MICCAI 2021 - 24th International Conference, Proceedings
Editors: Marleen de Bruijne, Philippe C. Cattin, Stéphane Cotin, Nicolas Padoy, Stefanie Speidel, Yefeng Zheng, Caroline Essert
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 206-216
Number of pages: 11
ISBN (Print): 9783030871925
DOIs
Publication status: Published - 2021
Externally published: Yes
Event: 24th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2021 - Virtual, Online
Duration: 27 Sept 2021 – 1 Oct 2021

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12901 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 24th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2021
City: Virtual, Online
Period: 27/09/21 – 1/10/21

Bibliographical note

Publisher Copyright:
© 2021, Springer Nature Switzerland AG.

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 3 - Good Health and Well-being

Keywords

  • Deep learning
  • Medical image segmentation
  • Transformer
