Improved deep embedding learning based on stochastic symmetric triplet loss and local sampling

Neurocomputing (2020)

Abstract
The pursuit of more powerful feature representations has motivated the development of deep metric learning algorithms over the last few years. The idea is to transform data into a representation space where prior similarity relationships between examples are preserved, e.g., distances between similar examples are smaller than those between dissimilar examples. While such approaches have produced some impressive results, they often suffer from difficulties in training. In this paper, we introduce an improved triplet-based loss for deep metric learning. Our method aims to minimize distances between similar examples, while maximizing distances between dissimilar ones under a stochastic selection rule. Additionally, we propose a simple sampling strategy that focuses on locally maintaining the similarity relationships of examples within their neighborhoods. This technique aims to reduce the local overlap between different classes in different parts of the embedding space. Experimental results on three standard benchmark data sets confirm that our method provides more accurate and faster training than other state-of-the-art methods.
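For orientation, the sketch below shows a generic triplet margin loss in PyTorch. It is not the paper's stochastic symmetric triplet loss or its local sampling strategy (neither is specified in detail in this abstract); it only illustrates the standard triplet formulation that such methods build on. All names and values here are illustrative assumptions.

```python
# Minimal sketch of a standard triplet margin loss (not the paper's exact method).
import torch
import torch.nn.functional as F

def triplet_margin_loss(anchor, positive, negative, margin=0.2):
    """Pull the anchor toward a similar (positive) example and push it away
    from a dissimilar (negative) one, up to a fixed margin.

    anchor, positive, negative: (batch, dim) embedding tensors.
    """
    d_ap = F.pairwise_distance(anchor, positive)  # distance to similar example
    d_an = F.pairwise_distance(anchor, negative)  # distance to dissimilar example
    # Hinge: penalize triplets where the negative is not at least `margin`
    # farther from the anchor than the positive.
    return F.relu(d_ap - d_an + margin).mean()

# Illustrative usage with random embeddings.
a, p, n = (torch.randn(32, 128) for _ in range(3))
loss = triplet_margin_loss(a, p, n)
```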
Keywords
Deep learning, Representation learning, Metric learning, Loss function