Two improved attribute weighting schemes for value difference metric

Knowledge and Information Systems (2018)

Abstract
Due to its simplicity, efficiency, and efficacy, the value difference metric (VDM) has continued to perform well against more sophisticated newcomers and thus remains of great interest to the distance metric learning community. Among the numerous approaches that improve VDM by weakening its attribute independence assumption, attribute weighting has received comparatively little attention (only two schemes have been proposed) yet has demonstrated remarkable class probability estimation performance; of the two existing schemes, one is non-symmetric and the other is symmetric. In this paper, we propose two simple improvements for setting attribute weights in VDM: the non-symmetric Kullback–Leibler divergence weighted value difference metric (KLD-VDM) and the symmetric gain ratio weighted value difference metric (GR-VDM). Extensive evaluations on a large number of datasets show that KLD-VDM and GR-VDM significantly outperform the two existing attribute weighting schemes in terms of negative conditional log-likelihood and root relative squared error, while maintaining the computational simplicity and robustness that characterize VDM.
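To make the attribute-weighted VDM referred to above concrete, the following is a minimal sketch assuming the standard VDM distance (per-attribute sums of class-conditional probability differences) and the textbook gain ratio from decision-tree learning as the weight. The function names (`conditional_probs`, `gain_ratio_weights`, `weighted_vdm`) and the toy data are illustrative assumptions, not the paper's implementation, and the paper's KLD-based weighting is not reproduced here.

```python
# A minimal sketch of an attribute-weighted VDM over categorical data, assuming
# the standard VDM distance and the textbook gain-ratio weight; the paper's own
# KLD-VDM and GR-VDM weight definitions may differ in detail.
from collections import Counter, defaultdict

import numpy as np


def conditional_probs(X, y):
    """Estimate P(class | attribute i = value) by relative frequency."""
    classes = sorted(set(y))
    probs = []
    for i in range(X.shape[1]):
        counts = defaultdict(lambda: np.zeros(len(classes)))
        for v, c in zip(X[:, i], y):
            counts[v][classes.index(c)] += 1
        probs.append({v: cnt / cnt.sum() for v, cnt in counts.items()})
    return probs


def gain_ratio_weights(X, y):
    """Per-attribute gain ratio: information gain divided by split information."""
    def entropy(labels):
        p = np.array(list(Counter(labels).values()), dtype=float)
        p /= p.sum()
        return float(-(p * np.log2(p)).sum())

    base = entropy(y)
    weights = []
    for i in range(X.shape[1]):
        col = X[:, i]
        cond, split = 0.0, 0.0
        for v in set(col):
            mask = col == v
            frac = mask.mean()
            cond += frac * entropy(y[mask])
            split -= frac * np.log2(frac)
        gain = base - cond
        weights.append(gain / split if split > 0 else 0.0)
    return np.array(weights)


def weighted_vdm(x1, x2, probs, weights, q=2):
    """Weighted VDM: sum_i w_i * sum_c |P(c|x1_i) - P(c|x2_i)|^q."""
    dist = 0.0
    for i, (a, b) in enumerate(zip(x1, x2)):
        pa, pb = probs[i].get(a), probs[i].get(b)
        if pa is None or pb is None:
            dist += weights[i]  # unseen value: count as maximal difference
        else:
            dist += weights[i] * float(np.sum(np.abs(pa - pb) ** q))
    return dist


# Tiny usage example on a toy categorical dataset.
X = np.array([["sunny", "hot"], ["sunny", "mild"], ["rain", "mild"], ["rain", "hot"]])
y = np.array(["no", "no", "yes", "yes"])
probs = conditional_probs(X, y)
w = gain_ratio_weights(X, y)
print(weighted_vdm(X[0], X[2], probs, w))  # the informative attribute differs, so distance > 0
```

In this sketch the uninformative attribute receives weight zero, so it contributes nothing to the distance, which is exactly the effect attribute weighting is meant to achieve relative to the unweighted VDM.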
Keywords
Distance metric learning, Value difference metric, Attribute weighting, Gain ratio, Kullback–Leibler divergence