MARU-Net: Multiscale Attention Gated Residual U-Net With Contrastive Loss for SAR-Optical Image Matching

IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. (2023)

Abstract
Accurate synthetic aperture radar (SAR)-optical matching is essential for combining the complementary information from the two sensors. The main challenge, however, lies in overcoming the heterogeneous imaging characteristics of the two sensor types. In this article, we propose an end-to-end machine learning pipeline inspired by recent advances in image segmentation. We develop a siamese multiscale attention-gated residual U-Net for feature extraction from satellite images. The siamese architecture shares weights between branches and transforms the heterogeneous images into a homogeneous feature space. The fast Fourier transform is used to compute the cross-correlation between the feature maps and produce a similarity map. A contrastive loss is introduced to guide training and maximize the discriminability of the learned features. Experimental results on a benchmark dataset show that the proposed method achieves superior matching accuracy and precision compared to other state-of-the-art methods.
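The core matching step described above, computing a similarity map as the cross-correlation of two feature maps via the fast Fourier transform, can be sketched as follows. This is a minimal single-channel NumPy illustration, not the paper's implementation; the function name, padding scheme, and toy data are assumptions for illustration.

```python
import numpy as np

def fft_cross_correlation(template_feat, search_feat):
    """Cross-correlate a small template feature map against a larger
    search feature map via the FFT (correlation theorem).
    Hypothetical sketch: both inputs are 2-D single-channel arrays;
    returns a (circular) similarity map the size of the search map."""
    H, W = search_feat.shape
    th, tw = template_feat.shape
    # Zero-pad the template to the search-map size.
    t = np.zeros((H, W), dtype=np.float64)
    t[:th, :tw] = template_feat
    # Correlation theorem: corr(f, g) = IFFT(FFT(f) * conj(FFT(g))).
    sim = np.fft.ifft2(np.fft.fft2(search_feat) * np.conj(np.fft.fft2(t))).real
    return sim

# Toy check: the peak of the similarity map recovers the offset at
# which the template pattern was planted inside the search map.
rng = np.random.default_rng(0)
template = rng.standard_normal((8, 8))
search = rng.standard_normal((32, 32)) * 0.1
search[10:18, 5:13] += template  # plant the template at offset (10, 5)
sim = fft_cross_correlation(template, search)
peak = np.unravel_index(np.argmax(sim), sim.shape)
print(peak)  # peak location of the similarity map
```

In the paper's pipeline the two inputs would be the deep feature maps produced by the siamese network branches rather than raw pixels; using the FFT makes this dense correlation O(HW log HW) instead of the O(HW·thtw) cost of sliding-window matching.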
Keywords
Feature extraction,Optical sensors,Optical imaging,Adaptive optics,Optical network units,Computer architecture,Convolution,Deep learning,optical,synthetic aperture radar (SAR),template matching