A Class of Stochastic Variance Reduced Methods with an Adaptive Stepsize

Semantic Scholar (2019)

Abstract
Stochastic variance reduced methods have recently surged into prominence for solving large-scale optimization problems in machine learning. Tan et al. first combined the stochastic variance reduced gradient (SVRG) method with the Barzilai-Borwein (BB) method to compute step sizes automatically, and the resulting method performs well in practice. Building on this idea, we propose a class of stochastic variance reduced methods with an adaptive stepsize based on a local estimate of the Lipschitz constant. Specifically, we apply this stepsize to SVRG and to the stochastic recursive gradient algorithm (SARAH), which yields two algorithms: SVRG-AS and SARAH-AS. We prove that both SVRG-AS and SARAH-AS converge linearly for strongly convex objective functions. Numerical experiments on standard datasets indicate that our algorithms are effective and robust: SVRG-AS outperforms SVRG-BB, and SARAH-AS is comparable to SARAH with best-tuned stepsizes. The proposed stepsize is also suitable for some other stochastic variance reduced methods.

This work was supported by the National Natural Science Foundation of China (11731013, 11571014, 11331012).

Corresponding author: Congying Han, School of Mathematical Sciences, Key Laboratory of Big Data Mining and Knowledge Management, UCAS, Beijing, 100049, China. Tel.: +86-010-88256908. E-mail: hancy@ucas.ac.cn

Yan Liu, School of Mathematical Sciences, University of Chinese Academy of Sciences (UCAS), Beijing, 100049, China. E-mail: liuyan23ucas@outlook.com

Tiande Guo, School of Mathematical Sciences, Key Laboratory of Big Data Mining and Knowledge Management, UCAS, Beijing, 100049, China. E-mail: tdguo@ucas.ac.cn
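The abstract does not spell out the SVRG-AS update, so the following is only a minimal sketch of the idea it describes: an SVRG-style inner loop whose stepsize is set from a local estimate of the Lipschitz constant of the sampled component gradient. The estimation rule eta = ||x - x_prev|| / ||grad_i(x) - grad_i(x_prev)||, the clipping safeguards, and the names (svrg_adaptive_stepsize, grad_i) are illustrative assumptions, not the paper's exact formulas.

```python
# Minimal sketch (not the paper's exact SVRG-AS): SVRG whose inner-loop
# stepsize comes from a local Lipschitz-constant estimate of the sampled
# component gradient. The estimate L_hat = ||g_i(x) - g_i(x_prev)|| / ||x - x_prev||
# and the safeguards below are illustrative assumptions.
import numpy as np


def svrg_adaptive_stepsize(grad_i, x0, n, epochs=20, inner_steps=None,
                           eta0=0.1, eta_min=1e-6, eta_max=1.0, seed=0):
    """grad_i(x, i): gradient of the i-th component function at x, i in [0, n)."""
    rng = np.random.default_rng(seed)
    m = inner_steps if inner_steps is not None else 2 * n
    x_tilde = np.asarray(x0, dtype=float).copy()
    eta = eta0
    for _ in range(epochs):
        # Full gradient at the snapshot point (standard SVRG outer step).
        mu = np.mean([grad_i(x_tilde, i) for i in range(n)], axis=0)
        x = x_tilde.copy()
        x_prev = None
        for _ in range(m):
            i = rng.integers(n)
            g = grad_i(x, i)
            # SVRG variance-reduced gradient estimator.
            v = g - grad_i(x_tilde, i) + mu
            # Local Lipschitz estimate from the same component at two points
            # (assumed rule; clipped to keep the stepsize in a safe range).
            if x_prev is not None:
                dx = np.linalg.norm(x - x_prev)
                dg = np.linalg.norm(g - grad_i(x_prev, i))
                if dx > 1e-12 and dg > 1e-12:
                    eta = np.clip(dx / dg, eta_min, eta_max)  # eta = 1 / L_hat
            x_prev = x.copy()
            x = x - eta * v
        x_tilde = x
    return x_tilde


# Usage on a toy ridge-regression problem:
# f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2 + 0.5 * lam * ||x||^2.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, d, lam = 200, 10, 1e-2
    A, b = rng.normal(size=(n, d)), rng.normal(size=n)
    g = lambda x, i: (A[i] @ x - b[i]) * A[i] + lam * x
    x_hat = svrg_adaptive_stepsize(g, np.zeros(d), n)
    print("final full-gradient norm:",
          np.linalg.norm(np.mean([g(x_hat, i) for i in range(n)], axis=0)))
```

A SARAH-AS variant would presumably keep the same stepsize rule but replace the SVRG estimator v with SARAH's recursive estimator v_t = grad_i(x_t) - grad_i(x_{t-1}) + v_{t-1}; the sketch above does not implement that variant.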