Accelerated Coordinate Descent With Arbitrary Sampling And Best Rates For Minibatches

22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89 (2019)

Cited by 40 | Views 37
Abstract
Accelerated coordinate descent is a widely popular optimization algorithm due to its efficiency on large-dimensional problems. It achieves state-of-the-art complexity on an important class of empirical risk minimization problems. In this paper we design and analyze an accelerated coordinate descent (ACD) method which in each iteration updates a random subset of coordinates according to an arbitrary but fixed probability law, which is a parameter of the method. While minibatch variants of ACD are more popular and relevant in practice, there is no importance sampling for ACD that outperforms the standard uniform minibatch sampling. Through insights enabled by our general analysis, we design a new importance sampling for minibatch ACD which significantly outperforms the previous state-of-the-art minibatch ACD in practice. We prove a rate that is at most $O(\sqrt{\tau})$ times worse than the rate of minibatch ACD with uniform sampling, but can be $O(n/\tau)$ times better, where $\tau$ is the minibatch size. Since in modern supervised learning training systems it is standard practice to choose $\tau \ll n$, and often $\tau = O(1)$, our method can lead to dramatic speedups. Lastly, we obtain similar results for minibatch non-accelerated CD as well, achieving improvements on previous best rates.
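To make the minibatch-with-arbitrary-sampling idea concrete, the following is a minimal Python sketch, not the paper's exact algorithm: it runs non-accelerated minibatch coordinate descent on an assumed quadratic objective f(x) = (1/2) x^T A x - b^T x, sampling a minibatch of tau coordinates with non-uniform probabilities. The probability choice (proportional to the coordinate-wise Lipschitz constants diag(A)) and the function name minibatch_cd are illustrative assumptions; acceleration and the paper's specific sampling law are omitted for brevity.

    import numpy as np

    def minibatch_cd(A, b, tau=4, iters=2000, seed=0):
        # Sketch of minibatch coordinate descent with importance sampling
        # on the quadratic f(x) = 0.5 * x^T A x - b^T x (assumed objective).
        rng = np.random.default_rng(seed)
        n = A.shape[0]
        L = np.diag(A).copy()              # coordinate-wise Lipschitz constants
        probs = L / L.sum()                # illustrative importance sampling
        x = np.zeros(n)
        for _ in range(iters):
            # Sample a minibatch S of tau coordinates according to probs.
            S = rng.choice(n, size=tau, replace=False, p=probs)
            grad_S = A[S] @ x - b[S]       # partial gradient on the minibatch
            x[S] -= grad_S / L[S]          # coordinate-wise steps of size 1/L_i
            # An accelerated variant would additionally maintain extrapolation
            # sequences (y, z); omitted to keep the sketch short.
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        M = rng.standard_normal((50, 20))
        A = M.T @ M + np.eye(20)           # positive definite test matrix
        b = rng.standard_normal(20)
        x = minibatch_cd(A, b, tau=4)
        print("residual norm:", np.linalg.norm(A @ x - b))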