Generalized Exponential Concentration Inequality for Rényi Divergence Estimation

ICML'14: Proceedings of the 31st International Conference on Machine Learning, Volume 32 (2014)

Cited 62 | Views 29
Abstract
Estimating divergences in a consistent way is of great importance in many machine learning tasks. Although this is a fundamental problem in nonparametric statistics, to the best of our knowledge no finite-sample exponential-inequality convergence bound has been derived for any divergence estimator. The main contribution of our work is to provide such a bound for an estimator of the Rényi-α divergence for a smooth Hölder class of densities on the d-dimensional unit cube [0, 1]^d. We also illustrate our theoretical results with a numerical experiment.
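To make the quantity concrete: the Rényi-α divergence between densities p and q is D_α(p‖q) = (α−1)⁻¹ log ∫ p(x)^α q(x)^(1−α) dx. The sketch below is an illustrative histogram plug-in estimate on [0, 1] from samples; it is *not* the estimator analyzed in the paper (which uses a different nonparametric construction over Hölder-smooth densities), and the bin count, α, and test densities are arbitrary choices for the demo.

```python
import numpy as np

def renyi_divergence_histogram(x, y, alpha=0.5, bins=20):
    """Naive plug-in Rényi-alpha divergence estimate on [0, 1].

    Illustrative only: estimates p and q with histograms, then plugs
    them into D_alpha = log(sum p^alpha * q^(1-alpha) * width) / (alpha - 1).
    This is NOT the paper's estimator, just a sketch of the quantity.
    """
    edges = np.linspace(0.0, 1.0, bins + 1)
    p, _ = np.histogram(x, bins=edges, density=True)
    q, _ = np.histogram(y, bins=edges, density=True)
    width = 1.0 / bins
    mask = (p > 0) & (q > 0)  # skip empty bins to avoid 0^negative
    integral = np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha)) * width
    return np.log(integral) / (alpha - 1.0)

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 50_000)   # samples from p = Uniform[0, 1]
y = rng.beta(2.0, 2.0, 50_000)      # samples from q = Beta(2, 2)

d_same = renyi_divergence_histogram(x, x, alpha=0.5)  # ~0: identical samples
d_diff = renyi_divergence_histogram(x, y, alpha=0.5)  # > 0: distinct densities
```

With α = 0.5 the true divergence between Uniform[0, 1] and Beta(2, 2) works out to −2 log(√6 · π/8) ≈ 0.078, and the plug-in estimate lands near that value at this sample size, though with a bias that motivates the finite-sample analysis the abstract describes.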