Weighted least-squares approximation with determinantal point processes and generalized volume sampling
CoRR (2023)
Abstract
We consider the problem of approximating a function from L^2 by an element
of a given m-dimensional space V_m, associated with some feature map
φ, using evaluations of the function at random points x_1,…,x_n.
After recalling some results on optimal weighted least-squares using
independent and identically distributed points, we consider weighted
least-squares using projection determinantal point processes (DPP) or volume
sampling. These distributions introduce dependence between the points that
promotes diversity in the selected features φ(x_i). We first provide a
generalized version of volume-rescaled sampling yielding quasi-optimality
results in expectation with a number of samples n = O(m log m), meaning
that the expected L^2 error is bounded by a constant times the best
approximation error in L^2. Further, assuming that the function belongs to
some normed vector space H continuously embedded in L^2, we prove
that the approximation error is almost surely bounded by the best approximation error
measured in the H-norm. This includes the cases of functions from L^∞
or reproducing kernel Hilbert spaces. Finally, we present an alternative
strategy consisting of using independent repetitions of projection DPP (or
volume sampling), yielding error bounds similar to those for i.i.d. or volume
sampling, but in practice with a much smaller number of samples. Numerical
experiments illustrate the performance of the different strategies.
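To make the baseline scheme concrete, the following is a minimal sketch of optimal weighted least-squares with i.i.d. points, the method recalled at the start of the abstract (not the authors' DPP or volume-sampling code). It assumes the uniform probability measure on [-1, 1] and an orthonormal Legendre basis for V_m; points are drawn from the inverse-Christoffel density rho(x) = (1/m) sum_j phi_j(x)^2 and weighted by w(x) = 1/rho(x).

```python
import numpy as np

def legendre_features(x, m):
    """Orthonormal Legendre basis phi_0..phi_{m-1} w.r.t. the uniform
    probability measure on [-1, 1], evaluated at points x; shape (len(x), m)."""
    V = np.polynomial.legendre.legvander(x, m - 1)
    return V * np.sqrt(2 * np.arange(m) + 1)

def sample_optimal_density(m, n, rng):
    """Draw n i.i.d. points from rho(x) = (1/m) sum_j phi_j(x)^2 by
    rejection sampling against the uniform proposal; since |P_k| <= 1
    on [-1, 1], sum_j phi_j^2 <= m^2 and rho <= m is a valid envelope."""
    pts = np.empty(n)
    k = 0
    while k < n:
        x = rng.uniform(-1.0, 1.0)
        phi = legendre_features(np.array([x]), m)[0]
        rho = phi @ phi / m
        if rng.uniform() < rho / m:
            pts[k] = x
            k += 1
    return pts

def weighted_least_squares(f, m, n, rng):
    """Project f onto V_m = span(phi_0..phi_{m-1}) using weights
    w_i = 1/rho(x_i), which make the weighted Gram matrix unbiased."""
    x = sample_optimal_density(m, n, rng)
    Phi = legendre_features(x, m)
    rho = (Phi ** 2).sum(axis=1) / m      # density at the sampled points
    sw = 1.0 / np.sqrt(rho)               # square roots of the weights
    coef, *_ = np.linalg.lstsq(Phi * sw[:, None], f(x) * sw, rcond=None)
    return coef

# Usage: a degree-2 polynomial lies in V_4, so it is recovered exactly
# (up to round-off) whenever the weighted design matrix has full rank.
rng = np.random.default_rng(0)
coef = weighted_least_squares(lambda x: x ** 2, m=4, n=100, rng=rng)
xt = np.linspace(-1.0, 1.0, 201)
err = np.max(np.abs(legendre_features(xt, 4) @ coef - xt ** 2))
```

The weights compensate for the non-uniform sampling density, so the weighted empirical inner product is an unbiased estimate of the L^2 inner product; this is what underlies the quasi-optimality bounds with n = O(m log m) samples.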