Multi-source fusion network for remote sensing image segmentation with hierarchical transformer

IGARSS 2023 - 2023 IEEE International Geoscience and Remote Sensing Symposium (2023)

Abstract
Recently, due to the limitations of a single sensor, it has become difficult to further improve the performance of land cover classification. Traditional image segmentation methods cannot process optical remote sensing images effectively, especially when the optical sensor is affected by complex weather conditions. In contrast, as an active radar, synthetic aperture radar (SAR) is not restricted by weather conditions thanks to the penetrability of its electromagnetic radiation. Multi-sensor data fusion therefore offers great potential for land cover classification. In this paper, a new fusion network called SegFusion is proposed to improve the performance of land cover classification. SegFusion has two main components: a hierarchical Transformer encoder and a Swin-Fusion (SW-Fusion) module. First, the hierarchical Transformer encoder extracts multilevel features from the optical and SAR images. By integrating features from different layers, we obtain a powerful representation that combines both low-resolution fine-grained features and high-resolution coarse-grained features. Second, the SW-Fusion module fuses the features of the optical and SAR data. In SW-Fusion, we use a modified Swin Transformer [1] block with a multi-head cross-attention mechanism to exchange information between features from different sources.
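
The sketch below illustrates, in PyTorch, how a multi-head cross-attention exchange between optical and SAR feature maps of the kind described above could be wired up. It is not the authors' SW-Fusion code: the module name `CrossAttentionFusion`, the `embed_dim`/`num_heads` values, and the final concatenation-plus-projection merge are illustrative assumptions, and plain (non-windowed) cross-attention is used here instead of the window-partitioned attention of a modified Swin Transformer block, to keep the example short.

```python
# Minimal sketch of bidirectional multi-head cross-attention fusion
# between optical and SAR features. Hypothetical names and shapes;
# not the paper's SW-Fusion implementation.
import torch
import torch.nn as nn


class CrossAttentionFusion(nn.Module):
    """Exchange information between optical and SAR token sequences via
    multi-head cross-attention, then merge the two enriched streams."""

    def __init__(self, embed_dim: int = 256, num_heads: int = 8):
        super().__init__()
        # Optical queries attend to SAR keys/values, and vice versa.
        self.opt_to_sar = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.sar_to_opt = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm_opt = nn.LayerNorm(embed_dim)
        self.norm_sar = nn.LayerNorm(embed_dim)
        self.merge = nn.Linear(2 * embed_dim, embed_dim)

    def forward(self, opt_feat: torch.Tensor, sar_feat: torch.Tensor) -> torch.Tensor:
        # opt_feat, sar_feat: (batch, tokens, embed_dim), e.g. flattened H*W patches.
        opt_q = self.norm_opt(opt_feat)
        sar_q = self.norm_sar(sar_feat)
        # Cross-attention in both directions, with residual connections.
        opt_attn, _ = self.opt_to_sar(query=opt_q, key=sar_q, value=sar_q)
        sar_attn, _ = self.sar_to_opt(query=sar_q, key=opt_q, value=opt_q)
        opt_out = opt_feat + opt_attn
        sar_out = sar_feat + sar_attn
        # Concatenate the two streams and project back to embed_dim.
        return self.merge(torch.cat([opt_out, sar_out], dim=-1))


if __name__ == "__main__":
    fusion = CrossAttentionFusion(embed_dim=256, num_heads=8)
    opt = torch.randn(2, 64 * 64, 256)  # optical features, 64x64 patch tokens
    sar = torch.randn(2, 64 * 64, 256)  # SAR features at the same resolution
    print(fusion(opt, sar).shape)       # torch.Size([2, 4096, 256])
```

In a multi-level setting like the hierarchical encoder described in the abstract, one such fusion block would typically be applied per encoder stage before the fused features are passed to the segmentation decoder.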
Keywords
Land Cover Classification, Synthetic Aperture Radar (SAR), Optical Image, Data Fusion