Multi-index antithetic stochastic gradient algorithm

arXiv (2023)

Abstract
Stochastic Gradient Algorithms (SGAs) are ubiquitous in computational statistics, machine learning and optimisation. Recent years have brought an influx of interest in SGAs, and the non-asymptotic analysis of their bias is by now well-developed. However, relatively little is known about the optimal choice of the random approximation (e.g. mini-batching) of the gradient in SGAs, since this relies on the analysis of the variance and is problem-specific. While there have been numerous attempts to reduce the variance of SGAs, these typically exploit a particular structure of the sampled distribution by requiring a priori knowledge of its density's mode. In this paper, we construct a Multi-index Antithetic Stochastic Gradient Algorithm (MASGA) whose implementation is independent of the structure of the target measure. Our rigorous theoretical analysis demonstrates that for log-concave targets, MASGA achieves performance on par with Monte Carlo estimators that have access to unbiased samples from the distribution of interest. In other words, MASGA is an optimal estimator from the perspective of mean-square error versus computational cost within the class of Monte Carlo estimators. To illustrate the robustness of our approach, we also apply MASGA to some simple non-log-concave numerical examples, albeit without providing theoretical guarantees on its performance in such settings.
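
To fix ideas, below is a minimal, illustrative Python sketch of the core ingredient the abstract alludes to: an antithetic multilevel Monte Carlo (MLMC) coupling of Euler discretisations of Langevin dynamics. This is not the authors' implementation: it refines only the time-discretisation index (the paper's multi-index estimator also refines the gradient-approximation/mini-batch index), and all names and parameters (grad_log_pi, antithetic_level_sample, step size h0, horizon T, the 1D standard Gaussian target) are hypothetical choices made for this demo.

import numpy as np

rng = np.random.default_rng(0)

def grad_log_pi(x):
    # Gradient of the log-density of N(0, 1); stands in for the
    # (possibly mini-batched) stochastic gradient in the general setting.
    return -x

def f(x):
    # Test functional whose expectation under the target we estimate.
    return x ** 2

def antithetic_level_sample(level, T=5.0, h0=0.5):
    # One MLMC correction sample at `level`: a coarse Langevin chain with
    # step h is coupled to an antithetic pair of fine chains with step h/2.
    # The fine pair reuses the same Brownian half-increments in swapped
    # order (Giles-Szpruch-style antithetic coupling); the coarse chain
    # uses their sum.
    h = h0 / 2 ** (level - 1)  # coarse step size at this level
    xc = 0.0                   # coarse chain
    xf1 = xf2 = 0.0            # antithetic pair of fine chains
    for _ in range(int(round(T / h))):
        dw1 = rng.normal(0.0, np.sqrt(h / 2))
        dw2 = rng.normal(0.0, np.sqrt(h / 2))
        # fine chain 1: half-increments in order (dw1, dw2)
        xf1 += (h / 2) * grad_log_pi(xf1) + np.sqrt(2.0) * dw1
        xf1 += (h / 2) * grad_log_pi(xf1) + np.sqrt(2.0) * dw2
        # fine chain 2: antithetic order (dw2, dw1)
        xf2 += (h / 2) * grad_log_pi(xf2) + np.sqrt(2.0) * dw2
        xf2 += (h / 2) * grad_log_pi(xf2) + np.sqrt(2.0) * dw1
        # coarse chain: one step driven by the summed increment
        xc += h * grad_log_pi(xc) + np.sqrt(2.0) * (dw1 + dw2)
    return 0.5 * (f(xf1) + f(xf2)) - f(xc)

def base_level_sample(T=5.0, h0=0.5):
    # Coarsest approximation: plain Euler discretisation with step h0.
    x = 0.0
    for _ in range(int(round(T / h0))):
        x += h0 * grad_log_pi(x) + np.sqrt(2.0 * h0) * rng.normal()
    return f(x)

# Telescoping MLMC estimator: E[f] ~ E[P_0] + sum_l E[P_l - P_{l-1}].
max_level, n = 4, 2000
estimate = np.mean([base_level_sample() for _ in range(n)])
for level in range(1, max_level + 1):
    estimate += np.mean([antithetic_level_sample(level) for _ in range(n)])
print(f"antithetic MLMC estimate of E[X^2]: {estimate:.3f} (target value ~1.0)")

The design point the sketch illustrates: swapping the order of the Brownian half-increments between the two fine chains cancels the leading-order term of the coupling variance without requiring any structural knowledge of the target, such as the location of its density's mode, which is what lets this style of estimator remain agnostic to the target measure.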
Keywords
Stochastic gradient, Multi-level Monte Carlo, Langevin dynamics, Approximate sampling