Kullback–Leibler Divergence Metric Learning

IEEE Transactions on Cybernetics (2022)

Citations 23 | Views 137
Abstract
The Kullback–Leibler divergence (KLD), which is widely used to measure the similarity between two distributions, plays an important role in many applications. In this article, we address the KLD metric-learning task, which aims at learning the best KLD-type metric from the distributions of datasets. Concretely, first, we extend the conventional KLD by introducing a linear mapping and obtain the be...
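To make the quantity being learned concrete, here is a minimal sketch of a KLD evaluated after a linear mapping is applied to both distributions. It assumes both inputs are multivariate Gaussians (for which the KLD has a closed form) and that `L` is a full-rank learned map; the Gaussian assumption, the map `L`, and all function names are illustrative, since the truncated abstract above does not spell out the paper's actual parameterization.

```python
import numpy as np

def gaussian_kld(mu0, cov0, mu1, cov1):
    """Closed-form KL(N0 || N1) between two multivariate Gaussians."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(cov1_inv @ cov0)          # trace term
        + diff @ cov1_inv @ diff           # Mahalanobis term
        - k                                # dimensionality offset
        + np.log(np.linalg.det(cov1) / np.linalg.det(cov0))  # log-det ratio
    )

def mapped_kld(mu0, cov0, mu1, cov1, L):
    """KLD after pushing both Gaussians through the linear map x -> L x,
    i.e., N(mu, cov) becomes N(L mu, L cov L^T)."""
    return gaussian_kld(L @ mu0, L @ cov0 @ L.T, L @ mu1, L @ cov1 @ L.T)

# Sanity check: the identity map recovers the plain Gaussian KLD.
rng = np.random.default_rng(0)
mu0, mu1 = rng.normal(size=3), rng.normal(size=3)
A, B = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))
cov0, cov1 = A @ A.T + np.eye(3), B @ B.T + np.eye(3)  # SPD covariances
print(mapped_kld(mu0, cov0, mu1, cov1, np.eye(3)))
```

In a metric-learning setting of the kind the abstract describes, `L` would be the free parameter optimized so that the resulting KLD-type metric best separates the dataset's distributions; here it is fixed to the identity purely for demonstration.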
Keywords
Task analysis, Manifolds, Learning systems, Loss measurement, Speech recognition, Feature extraction