Attention-Oriented Deep Multi-Task Hash Learning

Electronics (2023)

Abstract
Hashing is widely used in large-scale image retrieval because it is an efficient approach to approximate nearest neighbor search. It compresses complex high-dimensional feature arrays into compact binary codes while preserving the semantic properties of the original samples. However, most existing hashing methods fix the hash code length before training the model. When changing task requirements call for a different code length, these methods must be retrained, which inevitably increases computing time; moreover, a single fixed-length hash code cannot fully reflect semantic relevance. To address these issues, we propose an attention-oriented deep multi-task hash learning (ADMTH) method, in which multiple hash codes of varying lengths are learned simultaneously. Compared with existing methods, ADMTH is one of the first attempts to apply multi-task learning theory to the deep hashing framework to generate and explore multi-length hash codes. It also embeds an attention mechanism in the backbone network to extract further discriminative information. Experiments on two commonly used, publicly available large-scale datasets demonstrate its effectiveness: the proposed method substantially improves retrieval efficiency while preserving image characterization quality.
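To make the hashing idea concrete, here is a minimal sketch of binary hashing with multi-length codes and Hamming-distance retrieval. This is not the paper's ADMTH model: the codes come from random sign projections (classic LSH) rather than a learned attention network, and the three code lengths use independent projections instead of shared multi-task learning. All names and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy database: 1000 "image features" of dimension 128, standing in for
# the deep embeddings an attention-augmented backbone would produce.
features = rng.standard_normal((1000, 128))

def hash_codes(x, projection):
    """Binarize features via the sign of random projections (LSH sketch)."""
    return (x @ projection > 0).astype(np.uint8)

# Multiple code lengths from the same features, loosely mirroring the idea
# of producing hash codes of varying lengths for different task budgets.
projections = {bits: rng.standard_normal((128, bits)) for bits in (16, 32, 64)}
codes = {bits: hash_codes(features, P) for bits, P in projections.items()}

def hamming_search(query, db_codes, projection, k=5):
    """Return indices of the k database items nearest in Hamming distance."""
    q = hash_codes(query[None, :], projection)[0]
    dists = (db_codes != q).sum(axis=1)  # per-item Hamming distance
    return np.argsort(dists)[:k]

# A lightly perturbed copy of item 0 should retrieve item 0 first.
query = features[0] + 0.01 * rng.standard_normal(128)
top = hamming_search(query, codes[64], projections[64])
```

Shorter codes trade retrieval precision for smaller storage and faster comparison, which is why supporting several code lengths from one model is attractive.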
Keywords
deep hashing, attention, multi-task learning, deep learning, image retrieval