Progressive Similarity Preservation Learning for Deep Scalable Product Quantization

IEEE TRANSACTIONS ON MULTIMEDIA (2024)

Abstract
Product quantization is an effective strategy for compact feature learning in image retrieval, generating compact quantization codes of different lengths for varying scenarios. However, existing deep quantization methods obtain quantization codes of different lengths by training a separate model for each code length, which incurs a large training time cost and reduces deployment flexibility. To this end, we propose a new deep scalable Progressive Similarity Preservation Product Quantization (PSPPQ) framework, which trains quantized features of different code lengths simultaneously and imposes no additional cost during inference. By progressively approximating the ground-truth similarity of image pairs, we directly optimize the similarity ranking, which improves retrieval accuracy and generates sequential quantization codes more efficiently. In addition, by combining the advantages of classification loss and hinge loss, we design a semantic ArcFace loss to optimize our network architecture. Experiments on three datasets demonstrate the effectiveness of the proposed method with variable code lengths for scalable image retrieval.
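For readers unfamiliar with the underlying technique, the sketch below illustrates classical (non-deep) product quantization: a vector is split into subvectors, each subvector is replaced by the index of its nearest codebook centroid, and the resulting index list is the compact code. This is only a toy illustration with hand-picked centroids, not the paper's PSPPQ method, which learns codebooks jointly with deep features and supports multiple code lengths in one model.

```python
# Toy sketch of classical product quantization (PQ), for intuition only.
# NOT the paper's PSPPQ method: codebooks here are fixed hand-picked values,
# whereas the paper learns them end-to-end with deep features.
from math import dist

def pq_encode(vec, codebooks):
    """Split `vec` into len(codebooks) equal subvectors and store, for each
    subvector, the index of its nearest centroid. The code length grows with
    the number of sub-codebooks, which is the quantity that 'scalable' code
    lengths vary."""
    m = len(codebooks)
    d = len(vec) // m
    code = []
    for i, book in enumerate(codebooks):
        sub = vec[i * d:(i + 1) * d]
        code.append(min(range(len(book)), key=lambda k: dist(sub, book[k])))
    return code

def pq_decode(code, codebooks):
    """Reconstruct an approximate vector by concatenating the chosen centroids."""
    out = []
    for idx, book in zip(code, codebooks):
        out.extend(book[idx])
    return out

# Example: a 4-dim vector, 2 sub-codebooks of 2 centroids each.
codebooks = [
    [(0.0, 0.0), (1.0, 1.0)],  # centroids for the first two dimensions
    [(0.0, 1.0), (1.0, 0.0)],  # centroids for the last two dimensions
]
code = pq_encode([0.9, 1.1, 0.1, 0.8], codebooks)  # -> [1, 0]
approx = pq_decode(code, codebooks)                # -> [1.0, 1.0, 0.0, 1.0]
```

A deep quantization model replaces the raw vector with a learned feature and trains the codebooks so that quantized distances preserve semantic similarity; PSPPQ additionally makes the codes sequential, so truncating them yields valid shorter codes.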
Key words
Codes, quantization (signal), feature extraction, training, semantics, visualization, costs, image retrieval, product quantization, scalable code length