Material-Guided Multiview Fusion Network for Hyperspectral Object Tracking.

IEEE Trans. Geosci. Remote Sens. (2024)

Abstract
Hyperspectral videos (HSVs) have greater potential for object tracking than color videos, thanks to their material identification ability. Nevertheless, previous works have not fully exploited the benefits of material information, resulting in limited representation ability and tracking accuracy. To address this issue, this article introduces a material-guided multiview fusion network (MMF-Net) for improved tracking. Specifically, we combine false-color information, hyperspectral information, and material information obtained by hyperspectral unmixing to provide a rich multiview representation of the object. Cross-material attention (CMA) is employed to capture the interaction among materials, enabling the network to focus on the materials most relevant to the target. Furthermore, leveraging the discriminative ability of the material view, a novel material-guided multiview fusion module is proposed to capture both intraview and cross-view long-range spatial dependencies for effective feature aggregation. Thanks to the enhanced representation ability of each view and the integration of the complementary advantages of all views, our network is better able to suppress tracking drift in challenging scenes and achieve accurate object localization. Extensive experiments show that our tracker achieves state-of-the-art tracking performance. The source code will be available at https://github.com/hscv/MMF-Net.
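To make the cross-material attention idea concrete, below is a minimal PyTorch sketch. It assumes a simplified design in which each material abundance map produced by hyperspectral unmixing is summarized as a token, material-to-material interaction is modeled with multihead attention, and the resulting weights re-emphasize the most target-relevant materials. The class name, projection, pooling, and weighting scheme are illustrative assumptions, not the authors' released implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn


class CrossMaterialAttention(nn.Module):
    """Illustrative sketch of cross-material attention over abundance maps."""

    def __init__(self, embed_dim: int = 64, num_heads: int = 4):
        super().__init__()
        # Project each single-channel abundance map to an embedding.
        self.proj = nn.Conv2d(1, embed_dim, kernel_size=1)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, abundances: torch.Tensor) -> torch.Tensor:
        # abundances: (B, M, H, W) material abundance maps from unmixing
        B, M, H, W = abundances.shape
        # One token per material: project, globally pool, flatten -> (B, M, embed_dim)
        tokens = torch.stack(
            [self.pool(self.proj(abundances[:, m:m + 1])).flatten(1) for m in range(M)],
            dim=1,
        )
        # Self-attention across materials captures their interactions.
        mixed, _ = self.attn(tokens, tokens, tokens)
        # Turn the attended tokens into per-material weights and re-weight the view.
        weights = torch.softmax(mixed.mean(dim=-1), dim=1)  # (B, M)
        return abundances * weights.view(B, M, 1, 1)


# Usage on dummy data: 5 material abundance maps of size 32x32
cma = CrossMaterialAttention()
out = cma(torch.rand(2, 5, 32, 32))  # -> (2, 5, 32, 32), materials re-weighted
```

The material view re-weighted this way could then be fused with the false-color and hyperspectral views; the paper's material-guided multiview fusion module additionally models intraview and cross-view long-range spatial dependencies, which this sketch does not cover.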
Keywords
Hyperspectral object tracking, hyperspectral unmixing, multihead attention, multiview fusion