An Extensible Framework for Open Heterogeneous Collaborative Perception
arXiv (2024)
Abstract
Collaborative perception aims to mitigate the limitations of single-agent
perception, such as occlusions, by facilitating data exchange among multiple
agents. However, most current works consider a homogeneous scenario where all
agents use identical sensors and perception models. In reality, heterogeneous
agent types may continually emerge and inevitably face a domain gap when
collaborating with existing agents. In this paper, we introduce a new open
heterogeneous problem: how to accommodate continually emerging new
heterogeneous agent types into collaborative perception, while ensuring high
perception performance and low integration cost? To address this problem, we
propose HEterogeneous ALliance (HEAL), a novel extensible collaborative
perception framework. HEAL first establishes a unified feature space with
initial agents via a novel multi-scale foreground-aware Pyramid Fusion network.
When heterogeneous new agents emerge with previously unseen modalities or
models, we align them to the established unified space with an innovative
backward alignment. This step only involves individual training on the new
agent type, thus presenting extremely low training costs and high
extensibility. To enrich agents' data heterogeneity, we bring OPV2V-H, a new
large-scale dataset with more diverse sensor types. Extensive experiments on
OPV2V-H and DAIR-V2X datasets show that HEAL surpasses SOTA methods in
performance while reducing the training parameters by 91.5
new agent types. We further implement a comprehensive codebase at:
https://github.com/yifanlu0227/HEAL
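The core idea of backward alignment, as the abstract describes it, is that only the new agent type is trained, so that its features land in the unified space already established by the initial agents. The following is a minimal conceptual sketch of that idea, not HEAL's actual implementation: a fixed "unified" feature space stands in for the frozen Pyramid Fusion output, and a least-squares linear map stands in for the individually trained encoder of the new agent. All shapes and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "unified" feature space established by the initial agents
# (a stand-in for the output of the collaboratively trained fusion network).
unified_feats = rng.normal(size=(256, 64))  # 256 samples, 64-dim unified space

# A new heterogeneous agent produces features in its own 96-dim space,
# related to the unified space by some unknown transform plus noise.
mixing = rng.normal(size=(64, 96))
new_agent_feats = unified_feats @ mixing + 0.01 * rng.normal(size=(256, 96))

# Backward alignment (conceptual): optimize ONLY the new agent's adapter so
# its features match the frozen unified space. Here a closed-form
# least-squares fit replaces gradient training of the new encoder.
W, *_ = np.linalg.lstsq(new_agent_feats, unified_feats, rcond=None)
aligned = new_agent_feats @ W

alignment_error = np.mean((aligned - unified_feats) ** 2)
print(f"mean alignment error: {alignment_error:.6f}")
```

Because the unified space is never touched, integrating the new agent type costs only this single-agent fit; nothing belonging to the existing alliance is retrained, which is the source of the low integration cost the abstract claims.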