Barely-supervised Brain Tumor Segmentation via Employing Segment Anything Model

IEEE Transactions on Circuits and Systems for Video Technology (2024)

Abstract
This work explores barely-supervised brain tumor segmentation, where minimal supervision, i.e., fewer than ten labeled samples, is available. Current methods often neglect two key problems in barely-supervised segmentation: i) the insufficient labeled data may not offer networks enough information to accurately segment tumor areas across various cases; ii) networks might overfit to the relations among the multiple modalities of the limited labeled data, thus depending overly on certain modalities while overlooking other valuable modalities during segmentation. To tackle these two problems, we propose a barely-supervised training framework, called BarelySAM. BarelySAM first employs the Segment Anything Model (SAM) during training to generate pseudo labels for unlabeled data. In this manner, the pre-trained knowledge in SAM can be exploited to compensate for the limited knowledge in the labeled data, boosting network training and thus improving performance. For the overfitting problem, Multi-modality Dependency Minimization (MDM) is designed in BarelySAM to construct various partial modality combinations from full-modal samples, thus forcing networks to exploit each modality effectively. Experimental results on two benchmark datasets validate the effectiveness of the integrated SAM and the designed MDM module. In particular, our method attains an 89.92% Dice score for whole tumor segmentation on BRATS2020 with just 6 (2%) labeled samples, only 1.09% lower than the performance of a fully supervised approach. Moreover, experiments on barely-supervised multi-modal brain tumor segmentation also validate that our method is inherently robust against missing modalities.
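The abstract does not detail how MDM builds partial modality combinations; a common way to realize the idea of randomly dropping modalities so a network cannot over-rely on any fixed subset is modality masking. The sketch below is a minimal, hypothetical illustration of that general technique (function and variable names are ours, not the paper's implementation), assuming a 4-modality MRI batch such as T1, T1ce, T2, and FLAIR:

```python
import numpy as np

def sample_partial_combination(x, rng, min_keep=1):
    """Zero out a random subset of modality channels so that, over
    training, the network sees many partial modality combinations
    and cannot depend on any single fixed modality.

    x: array of shape (batch, n_modalities, H, W)
    Returns the masked batch and the indices of the kept modalities.
    """
    n_mod = x.shape[1]
    # Choose how many modalities survive (at least `min_keep`, up to all).
    n_keep = int(rng.integers(min_keep, n_mod + 1))
    # Choose which modalities survive, without replacement.
    keep = rng.choice(n_mod, size=n_keep, replace=False)
    mask = np.zeros(n_mod, dtype=x.dtype)
    mask[keep] = 1.0
    # Broadcast the per-modality mask over batch and spatial dimensions.
    return x * mask[None, :, None, None], sorted(keep.tolist())

# Example: a dummy 4-modality batch of 8x8 slices.
rng = np.random.default_rng(0)
batch = np.ones((2, 4, 8, 8), dtype=np.float32)
partial, kept = sample_partial_combination(batch, rng)
```

Applying this mask to each full-modal training sample yields the varied partial combinations the abstract describes; the kept-index list can also be used at test time to simulate missing-modality conditions.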