H-divergence: A Decision-Theoretic Probability Discrepancy Measure

2021

Abstract
Measuring the discrepancy between two probability distributions is a fundamental problem in machine learning and statistics. Based on ideas from decision theory, we investigate a new class of discrepancies that are based on the optimal decision loss: two probability distributions differ if the optimal decision loss is higher on their mixture than on each individual distribution. We show that this generalizes popular notions of discrepancy such as the Jensen–Shannon divergence and the maximum mean discrepancy. We apply our approach to two-sample tests, which evaluate whether two sets of samples come from the same distribution. On various benchmark and real datasets, we demonstrate that tests based on our generalized notion of discrepancy achieve superior test power. We also apply our approach to sample quality evaluation as an alternative to the FID score, and to understanding the effects of climate change on different social and economic activities.
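For intuition, here is a minimal sketch (not the paper's implementation): instantiating the decision task with a squared-error loss over constant actions, the optimal decision loss of a sample is its variance, and the discrepancy reduces to comparing the variance of the mixture with the average of the individual variances. The function names below (optimal_decision_loss, h_divergence_sq) are illustrative, not from the paper.

import numpy as np

def optimal_decision_loss(samples):
    # Optimal expected squared-error loss over constant actions a in R:
    # the best action is the sample mean, so the optimal loss is the variance.
    return np.var(samples)

def h_divergence_sq(x, y):
    # Discrepancy as described in the abstract, specialized to squared-error loss:
    # optimal loss on the 50/50 mixture minus the average optimal loss on each sample.
    # Non-negative; zero for this loss when the two samples share a mean
    # (richer decision families detect richer differences between distributions).
    mixture = np.concatenate([x, y])
    return optimal_decision_loss(mixture) - 0.5 * (
        optimal_decision_loss(x) + optimal_decision_loss(y)
    )

# Toy usage: samples with different means yield a positive discrepancy.
rng = np.random.default_rng(0)
p_samples = rng.normal(0.0, 1.0, size=5000)
q_samples = rng.normal(0.5, 1.0, size=5000)
print(h_divergence_sq(p_samples, p_samples[::-1]))  # ~0: same distribution
print(h_divergence_sq(p_samples, q_samples))        # > 0: distributions differ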
Keywords
Jensen–Shannon divergence, Probability distribution, Optimal decision, Mixture distribution, Decision theory, Statistics, Divergence, Mathematics, Maximum mean discrepancy, Sample quality