Online Hashing.

IEEE Transactions on Neural Networks and Learning Systems (2018)

Cited by 131
Abstract
Although hash function learning algorithms have achieved great success in recent years, most existing hash models are trained offline and are therefore unsuitable for processing sequential or online data. To address this problem, this paper proposes an online hash model that accommodates data arriving in a stream for online learning. Specifically, a new loss function is proposed to measure the similarity loss between a pair of data samples in Hamming space. A structured hash model is then derived and optimized in a passive-aggressive way. A theoretical upper bound on the cumulative loss of the proposed online hash model is provided. Furthermore, the online hashing (OH) method is extended from a single model to a multi-model OH that trains and retains multiple diverse OH models in order to avoid biased updates. The competitive efficiency and effectiveness of the proposed online hash models are verified through extensive experiments on several large-scale data sets, in comparison with related hashing methods.
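To make the abstract's description more concrete, below is a minimal sketch of an online, passive-aggressive pairwise hashing update. It is not the paper's exact formulation: the linear hash functions h(x) = sign(W^T x), the hinge-style pairwise loss on code agreement, and the PA-I style capped step size are all assumptions introduced here for illustration.

```python
import numpy as np

# Minimal sketch of online hashing with a passive-aggressive pairwise update.
# Assumptions (not taken from the paper): linear hash functions h(x) = sign(W^T x),
# a hinge-style loss on the agreement of a pair's codes, and a PA-I style step
# size capped by an aggressiveness parameter C.

class OnlineHashSketch:
    def __init__(self, dim, n_bits, C=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(dim, n_bits))  # hash projections
        self.C = C  # cap on the update step (passive-aggressive style)

    def encode(self, X):
        # Map each row of X to an n_bits binary code in {-1, +1}.
        return np.sign(X @ self.W)

    def update(self, x1, x2, similar):
        """One online step on a labeled pair.

        similar: +1 if (x1, x2) should share a code, -1 otherwise.
        """
        h1, h2 = np.sign(self.W.T @ x1), np.sign(self.W.T @ x2)
        # Normalized code agreement in [-1, 1]; a proxy for Hamming-space similarity.
        agreement = float(h1 @ h2) / len(h1)
        # Hinge-style loss: penalize when the codes disagree with the pair label.
        loss = max(0.0, 1.0 - similar * agreement)
        if loss == 0.0:
            return loss  # passive: the pair is already encoded consistently

        # Aggressive: move the projections so the pair's codes (dis)agree more.
        # Surrogate gradient of -similar * <W^T x1, W^T x2>, with the binary codes
        # standing in for the real-valued projections.
        grad = -similar * (np.outer(x1, h2) + np.outer(x2, h1))
        tau = min(self.C, loss / (np.linalg.norm(grad) ** 2 + 1e-12))
        self.W -= tau * grad
        return loss


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    model = OnlineHashSketch(dim=16, n_bits=32)
    for _ in range(1000):  # labeled pairs arriving as a stream
        x1 = rng.normal(size=16)
        similar = 1 if rng.random() < 0.5 else -1
        # Similar pairs are nearby points; dissimilar pairs are independent draws.
        x2 = x1 + 0.05 * rng.normal(size=16) if similar == 1 else rng.normal(size=16)
        model.update(x1, x2, similar)
    print("codes shape:", model.encode(rng.normal(size=(5, 16))).shape)
```

The multi-model extension mentioned in the abstract could, under the same assumptions, maintain several such models with different initializations and select or combine their codes, so that a single biased update does not dominate the learned hash functions.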
Keywords
Data models,Loss measurement,Binary codes,Hash functions,Upper bound,Learning (artificial intelligence)