- Title
MAMC-Net: an effective deep learning framework for whole-slide image tumor segmentation.
- Authors
Zeng, Li; Tang, Hongzhong; Wang, Wei; Xie, Mingjian; Ai, Zhaoyang; Chen, Lei; Wu, Yongjun
- Abstract
Segmenting histopathological images automatically is an important task in computer-aided pathology analysis. However, segmenting and analyzing digitized histopathology images is challenging due to the large size of whole-slide images (WSIs) and the diversity and complexity of their features. In this paper, we propose a multi-resolution attention and multi-scale convolution network (MAMC-Net) for automatic tumor segmentation of WSIs. First, MAMC-Net introduces a multi-resolution attention module that takes multi-resolution images as pyramid inputs to generate wider-range feature information and richer details. Specifically, we employ an attention mechanism at each level to capture discriminative features relevant to the segmentation task. Furthermore, a multi-scale convolution module is designed for multi-scale feature representation by aggregating intact semantic information from the deep layers of the encoder with high-resolution details from the final layer of the decoder. To further refine the segmentation results, we adopt a fully connected Conditional Random Field (CRF) to stitch the overlapping prediction maps, avoiding discontinuities and inconsistencies at cancer boundaries. Finally, we demonstrate the effectiveness of our framework on open-source datasets, including the CAMELYON17 (breast cancer metastases) and BOT (gastric cancer) datasets. The experimental results show that the proposed MAMC-Net outperforms other state-of-the-art methods, achieving a Dice coefficient (DSC) of 0.929, an IoU of 0.867, and a recall of 0.933 on the breast cancer dataset, and a DSC of 0.89, an IoU of 0.802, and a recall of 0.903 on the gastric cancer dataset.
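The multi-resolution pyramid-input idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual architecture: the downsampling scheme, the per-level descriptor, and the softmax weighting below are all illustrative assumptions standing in for the learned attention module.

```python
# Hypothetical sketch of pyramid inputs with a simple attention weighting.
# The helper names (downsample, pyramid_attention) and the attention
# formula are illustrative, not taken from the MAMC-Net paper.
import numpy as np

def downsample(img: np.ndarray, factor: int) -> np.ndarray:
    """Average-pool a 2-D patch by an integer factor."""
    h, w = img.shape
    h2, w2 = h // factor, w // factor
    cropped = img[:h2 * factor, :w2 * factor]
    return cropped.reshape(h2, factor, w2, factor).mean(axis=(1, 3))

def pyramid_attention(patch: np.ndarray, factors=(1, 2, 4)):
    """Build a multi-resolution pyramid and softmax-weight the levels."""
    levels = [downsample(patch, f) for f in factors]
    scores = np.array([lvl.mean() for lvl in levels])   # crude level descriptor
    weights = np.exp(scores) / np.exp(scores).sum()     # softmax over levels
    return levels, weights

patch = np.arange(64, dtype=float).reshape(8, 8)
levels, weights = pyramid_attention(patch)
print([lvl.shape for lvl in levels])  # [(8, 8), (4, 4), (2, 2)]
```

In the actual network the per-level weights would be produced by learned attention layers rather than a fixed softmax over mean activations; the sketch only shows how one patch yields several resolution levels that are then scored and combined.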
- Publication
Multimedia Tools & Applications, 2023, Vol 82, Issue 25, p39349
- ISSN
1380-7501
- Publication type
Article
- DOI
10.1007/s11042-023-15065-x