Relative Entropy Derivative Bounds

Entropy (2013)

Abstract
We show that the derivative of the relative entropy with respect to its parameters is bounded below and above. We characterize the conditions under which this derivative can reach zero. We use these results to explain when the minimum relative entropy and maximum log-likelihood approaches are valid. We show that these approaches naturally emerge in the presence of large data sets and that they are inherent properties of any density estimation process involving large numbers of random variables.
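
For orientation only, and not as a reproduction of the paper's own derivation: for a fixed density p and a parametric family q_θ (regular enough to differentiate under the integral sign), the relative entropy and the parameter derivative the abstract refers to take the standard form

\[
D(p \,\|\, q_\theta) = \int p(x)\,\log\frac{p(x)}{q_\theta(x)}\,dx,
\qquad
\frac{\partial}{\partial\theta}\, D(p \,\|\, q_\theta)
= -\int p(x)\,\frac{\partial}{\partial\theta}\,\log q_\theta(x)\,dx .
\]

Since the term \(\int p(x)\log p(x)\,dx\) does not depend on θ, minimizing \(D(p \,\|\, q_\theta)\) over θ is equivalent to maximizing \(\mathbb{E}_p[\log q_\theta(X)]\); replacing p with the empirical distribution of a sample recovers the maximum log-likelihood criterion, which is the standard equivalence underlying the connection the abstract draws between the two approaches.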
Keywords
relative entropy, Kullback-Leibler divergence, Shannon differential entropy, asymptotic equipartition principle, typical set, Fisher information, maximum log likelihood