Supervised Hashing Using Graph Cuts and Boosted Decision Trees

IEEE Transactions on Pattern Analysis and Machine Intelligence (2015)

Abstract
To build large-scale query-by-example image retrieval systems, embedding image features into a binary Hamming space provides great benefits. Supervised hashing aims to map the original features to compact binary codes that preserve label-based similarity in the binary Hamming space. Most existing approaches apply a single form of hash function, together with an optimization process that is deeply coupled to this specific form. This tight coupling restricts the flexibility of those methods and can result in complex optimization problems that are difficult to solve. In this work we propose a flexible yet simple framework that is able to accommodate different types of loss functions and hash functions. The proposed framework allows a number of existing approaches to hashing to be placed in context, and simplifies the development of new problem-specific hashing methods. Our framework decomposes the hashing learning problem into two steps: binary code (hash bit) learning and hash function learning. The first step can typically be formulated as a binary quadratic problem, and the second step can be accomplished by training standard binary classifiers. For large-scale binary code inference, we show how to ensure that the binary quadratic problems are submodular, so that an efficient graph cut method can be applied. To achieve both efficiency and efficacy on large-scale high-dimensional data, we propose to use boosted decision trees as the hash functions, which are nonlinear, highly descriptive, and very fast to train and evaluate. Experiments demonstrate that our proposed method significantly outperforms most state-of-the-art methods, especially on high-dimensional data.
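The two-step decomposition described above can be illustrated with a minimal sketch. This is not the paper's method: the toy data, the class-derived target codes (standing in for the graph-cut-based binary quadratic optimization), and the least-squares linear classifiers (standing in for boosted decision trees) are all simplifying assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy labeled data: two well-separated Gaussian clusters
# (a hypothetical stand-in for image features).
X = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(3, 1, (50, 8))])
y = np.array([0] * 50 + [1] * 50)

# Step 1 (binary code learning): assign target hash bits so that
# same-label samples share a code. In the paper this step is a
# submodular binary quadratic problem solved with graph cuts; here
# we shortcut it with a fixed codebook for clarity.
n_bits = 4
codebook = np.array([[1, 1, 1, 1],
                     [-1, -1, -1, -1]])
B = codebook[y]  # (100, 4) target bits, one row per sample

# Step 2 (hash function learning): train one binary classifier per
# bit. A least-squares linear classifier replaces the paper's
# boosted decision trees to keep the sketch short.
Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias term
W, *_ = np.linalg.lstsq(Xb, B, rcond=None)     # (9, 4) weights

def hash_codes(X_new):
    """Map features to binary codes with the learned per-bit classifiers."""
    Xb_new = np.hstack([X_new, np.ones((X_new.shape[0], 1))])
    return np.sign(Xb_new @ W)

# Retrieval is then a Hamming-distance ranking between codes.
codes = hash_codes(X)
query = hash_codes(X[:1])                 # use the first sample as a query
hamming = (codes != query).sum(axis=1)    # small for class 0, large for class 1
```

The point of the decomposition is that step 2 is an ordinary supervised classification problem per bit, so any classifier family (linear, kernel, or the boosted trees the paper advocates) can be plugged in without changing the code-inference step.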
Keywords
binary codes, decision trees, graph cuts, hashing, image retrieval, nearest neighbour search, kernel, Hamming distance, optimization