IRANet: Identity-relevance aware representation for cloth-changing person re-identification

Image and Vision Computing (2022)

Abstract
Existing person re-identification methods mainly focus on searching for the target person across disjoint camera views over a short period of time. In this setting, these methods rely on the assumption that query and gallery images of the same person show the same clothing. To tackle the challenge of clothing changes over a long duration, this paper proposes an identity-relevance aware neural network (IRANet) for cloth-changing person re-identification. Specifically, a human head detection module is designed to localize the human head part with the help of human parsing estimation. The detected head part contains abundant identity information, including facial features and head shape. Raw person images and the detected head areas are then each transformed into feature representations by a feed-forward network. The features learned from raw person images capture more global context, while the features learned from head areas capture more identity-relevant attributes. Finally, a head-guided attention module guides the global features learned from raw person images to focus more on the identity-relevant head areas. The proposed method achieves mAP accuracy of 25.4% on the Celeb-reID-light dataset, 19.0% on the Celeb-reID dataset, and 53.0% (cloth-changing setting) on the PRCC dataset, which shows the superiority of our approach for the cloth-changing person re-identification task. (c) 2021 Elsevier B.V. All rights reserved.
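The head-guided attention step described above can be sketched roughly as follows. This is a minimal illustration, not the paper's exact formulation: the function name, tensor shapes, and the choice of cosine similarity followed by a softmax over spatial locations are all assumptions made for clarity.

```python
import numpy as np

def head_guided_attention(global_feat, head_feat):
    """Hypothetical sketch of head-guided attention.

    global_feat: (N, C) array, N flattened spatial locations of the
                 global feature map from the raw person image.
    head_feat:   (C,) descriptor pooled from the detected head area.
    Returns per-location attention weights and the attended feature.
    """
    # L2-normalize so the dot product becomes cosine similarity
    g = global_feat / (np.linalg.norm(global_feat, axis=1, keepdims=True) + 1e-8)
    h = head_feat / (np.linalg.norm(head_feat) + 1e-8)

    # similarity of every spatial location to the head descriptor
    sim = g @ h                                    # (N,)

    # softmax turns similarities into attention weights over locations
    attn = np.exp(sim - sim.max())
    attn = attn / attn.sum()                       # (N,), sums to 1

    # reweight the global features toward identity-relevant (head-like) areas
    attended = (attn[:, None] * global_feat).sum(axis=0)   # (C,)
    return attn, attended
```

In the actual IRANet, this reweighting would be learned end-to-end inside the network rather than computed with a fixed similarity as here.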
Keywords

Cloth-changing person re-identification, Identity-relevance, Feature representation