Triple-attention interaction network for breast tumor classification based on multi-modality images.

Pattern Recognition (2023)

Abstract
Breast cancer can be diagnosed using medical imaging, and classification performance can be improved by multi-modality image fusion. However, existing fusion algorithms fail to consider the importance of modality interactions and cannot fully utilize multi-modality information. Attention mechanisms can effectively explore and combine multi-modality information. Thus, we propose a novel triple-attention interaction network for breast tumor classification based on diffusion-weighted imaging (DWI) and apparent diffusion coefficient (ADC) images. A triple inter-modality interaction mechanism is proposed to fully fuse the multi-modality information. Three inter-modality interactions are performed through the developed inter-modality relation module, channel interaction module, and multi-level attention fusion module to explore the correlated, complementary, and discriminative information, respectively. Additionally, we introduce a novel dual parallel-attention module that combines spatial and channel attention to improve the discriminative ability of single-modality features. Using these mechanisms, the proposed algorithm can fully mine and exploit useful multi-modality information to improve classification performance. Experimental results demonstrate that our algorithm outperforms other multi-modality fusion algorithms, and extensive ablation studies verify its advantages. The area under the receiver operating characteristic curve, accuracy, specificity, and sensitivity were 90.5%, 89.0%, 85.6%, and 92.4%, respectively. © 2023 Elsevier Ltd. All rights reserved.
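The abstract describes the dual parallel-attention module only at a high level (spatial and channel attention combined on single-modality features). The following PyTorch sketch is a minimal illustration of that general idea, not the authors' implementation: the SE-style channel gate, the 7x7 spatial gate, the reduction ratio, and the summation of the two attended feature maps are all assumptions made for illustration.

```python
# Minimal sketch of a "dual parallel-attention" block: channel attention and
# spatial attention are applied to the same feature map in parallel and the
# two re-weighted results are summed. Design details are illustrative only.
import torch
import torch.nn as nn


class DualParallelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel branch: squeeze-and-excitation style gating (assumed form).
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial branch: gate built from channel-pooled statistics (assumed form).
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Channel attention: re-weight each feature channel.
        x_channel = x * self.channel_gate(x)
        # Spatial attention: re-weight each location using mean- and
        # max-pooled channel descriptors.
        pooled = torch.cat(
            [x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1
        )
        x_spatial = x * self.spatial_gate(pooled)
        # Parallel combination of the two attended feature maps.
        return x_channel + x_spatial


if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)  # e.g., features from a DWI or ADC branch
    out = DualParallelAttention(64)(feats)
    print(out.shape)  # torch.Size([2, 64, 32, 32])
```

Such a block would typically be inserted into each single-modality branch before the inter-modality interaction modules; how the paper actually places and parameterizes it is specified in the full text, not in this sketch.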
Keywords
Breast tumor classification, Multi-modality fusion, Triple inter-modality interaction, Intra-modality interaction