Kernel Stein Discrepancy on Lie Groups: Theory and Applications

arXiv (Cornell University), 2023

Abstract
Distributional approximation is a fundamental problem in machine learning, with numerous applications across science and engineering. The key challenge in most approximation methods is tackling the intractable normalization constant of the parametrized distributions used to model the data. In this paper, we present a novel Stein operator on Lie groups, leading to a kernel Stein discrepancy (KSD) that serves as a normalization-free loss function. We present several theoretical results characterizing the properties of this new KSD on Lie groups and of its minimizer, namely the minimum KSD estimator (MKSDE). Proofs of several properties of the MKSDE are presented, including strong consistency, a central limit theorem (CLT), and a closed form of the MKSDE for the von Mises-Fisher distribution on SO(N). Finally, we present experimental evidence of the advantages of minimizing the KSD over maximum likelihood estimation.
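To make the "normalization-free loss" idea concrete, the following is a minimal Euclidean sketch (1D, Gaussian model, RBF kernel) of a KSD V-statistic built from the Langevin Stein operator. This is an illustration only, not the paper's Lie-group Stein operator: the function name `ksd_vstat`, the bandwidth `h`, and the choice of kernel and model are all assumptions for the example. The key point it shows is that the estimator needs only the score `d/dx log q(x)`, in which the normalization constant of `q` cancels.

```python
import numpy as np

def ksd_vstat(x, score, h=1.0):
    """V-statistic estimate of the squared kernel Stein discrepancy in 1D
    with an RBF kernel k(x, x') = exp(-(x - x')^2 / (2 h^2)).
    A Euclidean sketch; the paper's operator lives on Lie groups."""
    x = np.asarray(x, dtype=float)
    d = x[:, None] - x[None, :]            # pairwise differences x_i - x_j
    k = np.exp(-d**2 / (2 * h**2))         # kernel matrix
    dkx = -d / h**2 * k                    # d k / d x_i
    dky = d / h**2 * k                     # d k / d x_j
    dkxy = (1 / h**2 - d**2 / h**4) * k    # d^2 k / (d x_i d x_j)
    s = score(x)                           # score s_q(x) = d/dx log q(x)
    # Stein kernel u_q(x_i, x_j); its mean over all pairs is the V-statistic
    u = (s[:, None] * s[None, :] * k
         + s[:, None] * dky
         + s[None, :] * dkx
         + dkxy)
    return u.mean()

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)              # samples from N(0, 1)
score = lambda t: -t                       # score of the model q = N(0, 1)
print(ksd_vstat(x, score))                 # near 0: model matches the data
print(ksd_vstat(x + 3.0, score))           # much larger: model misspecified
```

Because `u_q` is a positive semi-definite kernel, the V-statistic is always nonnegative, and minimizing it over a parametric family of scores gives a (Euclidean analogue of the) MKSDE without ever evaluating a normalization constant.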
Keywords
Lie groups, Stein