Gaussian-Smoothed Sliced Probability Divergences
arXiv (2024)
Abstract
Gaussian smoothed sliced Wasserstein distance has been recently introduced
for comparing probability distributions, while preserving privacy on the data.
It has been shown to provide performance similar to its non-smoothed
(non-private) counterpart. However, the computational and statistical properties
of such a metric have not yet been well-established. This work investigates the
theoretical properties of this distance as well as those of generalized
versions denoted as Gaussian-smoothed sliced divergences. We first show that
smoothing and slicing preserve the metric property and the weak topology. To
study the sample complexity of such divergences, we then introduce
μ̂_n, the double empirical distribution of the smoothed-projected μ. The
distribution μ̂_n results from a double sampling process: first sampling
according to the original distribution μ, and then sampling according to the
convolution of the projection of μ on
the unit sphere and the Gaussian smoothing. We particularly focus on the
Gaussian smoothed sliced Wasserstein distance and prove that it converges with
a rate O(n^{-1/2}). We also derive other properties, including continuity, of
different divergences with respect to the smoothing parameter. We support our
theoretical findings with empirical studies in the context of
privacy-preserving domain adaptation.
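The double sampling process described in the abstract can be sketched numerically: draw random directions on the unit sphere, project both sample sets onto each direction, add Gaussian noise of standard deviation σ (the smoothing/privacy step), and average the resulting one-dimensional Wasserstein distances. The following is a minimal Monte Carlo sketch, not the authors' implementation; the function name and parameters are illustrative, and `scipy.stats.wasserstein_distance` is used for the 1-D optimal transport cost.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def gaussian_smoothed_sliced_wasserstein(x, y, sigma=1.0,
                                         n_projections=50, seed=None):
    """Monte Carlo sketch of the Gaussian-smoothed sliced Wasserstein
    distance between empirical samples x and y (shape (n, d)).

    For each random direction theta on the unit sphere S^{d-1}, the samples
    are projected to 1-D and perturbed with N(0, sigma^2) noise (this is the
    'double sampling': once from the data, once from the smoothing Gaussian),
    then the 1-D Wasserstein distances are averaged over directions.
    """
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)  # uniform direction on the sphere
        # project, then smooth by adding independent Gaussian noise
        px = x @ theta + sigma * rng.standard_normal(len(x))
        py = y @ theta + sigma * rng.standard_normal(len(y))
        total += wasserstein_distance(px, py)
    return total / n_projections
```

With this sketch, two samples from the same distribution yield a small value dominated by the smoothing noise, while shifted distributions yield a clearly larger one, consistent with the metric property discussed in the paper.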