Entropy-Optimized Deep Weighted Product Quantization for Image Retrieval

IEEE Transactions on Image Processing (2024)

Abstract
Hashing and quantization have achieved great success in large-scale image retrieval by benefiting from deep learning. Recently, deep product quantization methods have attracted wide attention. However, the representation capability of codewords still needs improvement. Moreover, since the number of codewords in the codebook is chosen empirically, the representation capability of codewords is usually imbalanced, which leads to redundant or insufficient codewords and reduces retrieval performance. Therefore, in this paper, we propose a novel deep product quantization method, named Entropy Optimized deep Weighted Product Quantization (EOWPQ), which encodes samples as weighted combinations of codewords in a new, flexible manner and also balances the codeword assignment, improving the representation capability of codewords while keeping it balanced. Specifically, we encode each sample as a linear weighted sum of codewords rather than a single codeword, as is done traditionally. Meanwhile, we establish a linear relationship between the weighted codewords and the semantic labels, which effectively preserves the semantic information carried by the codewords. Moreover, to balance the codeword assignment, i.e., to avoid a few codewords representing most samples while others represent very few, we maximize the entropy of the coding probability distribution and obtain the optimal coding distribution of samples using optimal transport theory, which achieves the optimal assignment of codewords and balances their representation capability. Experimental results on three benchmark datasets show that EOWPQ achieves better retrieval performance, and also demonstrate the improved representation capability of codewords and the balanced codeword assignment.
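The abstract describes two core ideas: encoding each sample as a linear weighted sum of codewords (soft assignment) and balancing codeword usage by maximizing the entropy of the coding probability distribution via optimal transport. The sketch below illustrates these two ideas in NumPy under stated assumptions; the function names, the temperature parameter, and the Sinkhorn-style alternating normalization are illustrative choices and are not taken from the paper's actual implementation.

```python
import numpy as np

def soft_weighted_quantize(x, codebook, temperature=1.0):
    """Encode each sample as a linear weighted sum of codewords
    (soft assignment) instead of snapping it to a single codeword.

    x:        (n, d) sub-vectors for one subspace
    codebook: (k, d) codewords for that subspace
    Returns the (n, k) assignment weights and the (n, d) reconstructions.
    """
    # Negative squared distances act as assignment logits.
    d2 = ((x[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    logits = -d2 / temperature
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)       # coding probability distribution
    return w, w @ codebook                  # weighted-codeword reconstruction

def sinkhorn_balance(w, n_iters=50):
    """Rebalance the soft assignments so that every codeword represents
    roughly the same total mass of samples (uniform column marginals),
    via Sinkhorn-style iterations as in entropy-regularized optimal transport.
    """
    p = np.asarray(w, dtype=np.float64)
    n, k = p.shape
    for _ in range(n_iters):
        p /= p.sum(axis=0, keepdims=True)   # normalize codeword usage ...
        p *= n / k                          # ... to n/k samples per codeword
        p /= p.sum(axis=1, keepdims=True)   # keep each sample's weights summing to 1
    return p

# Toy usage: 256 sub-vectors and 16 codewords in an 8-dimensional subspace.
rng = np.random.default_rng(0)
x = rng.normal(size=(256, 8))
codebook = rng.normal(size=(16, 8))
w, recon = soft_weighted_quantize(x, codebook, temperature=0.5)
w_balanced = sinkhorn_balance(w)
print("codeword usage before:", w.sum(0).round(1))
print("codeword usage after: ", w_balanced.sum(0).round(1))
```

Before balancing, the per-codeword usage can be highly skewed; after the alternating normalization, each codeword's total assignment mass is close to n/k, which is the balanced-assignment behavior the abstract attributes to entropy maximization via optimal transport.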
Keywords
Product quantization, deep learning, image retrieval