Improving the Stochastic Gradient Descent’s Test Accuracy by Manipulating the ℓ∞ Norm of its Gradient Approximation

ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2023

Abstract
The stochastic gradient descent (SGD) is a simple yet very influential algorithm used to find the minimum of a loss (cost) function that depends on datasets with large cardinality, as in cases typically associated with deep learning (DL). There exist several variants/improvements over the "vanilla" SGD which, from a high-level perspective, may be understood as using an adaptive element-wise step size (SS). Moreover, from an algorithmic point of view, there is a clear "incremental improvement path" relating all of them: from simple alternatives such as SG Clipping (SGC) to the well-known variance correction (AdaGrad), followed by an exponential moving average (EMA) of the squared gradients (RMSprop), and further refinements such as a Newton-like correction (AdaDelta) or bias correction along with different EMA options for the gradient itself (Adam, AdaMax, AdaBelief, etc.). In this paper, inspired by previous non-stochastic results on how to avoid divergence for an ill-chosen SS (for the accelerated proximal gradient algorithm), instead of directly using the standard SGD gradient's EMA $\bar{\mathbf{g}}_k$, we propose to modify its entries so as to force the moving average of $\{\|\bar{\mathbf{g}}_k\|_\infty\}$ to be non-increasing. Our reproducible computational results compare our proposed algorithm, called SGD-ℓ∞, with several optimizers (such as Adam, AdaMax, SGC, etc.); as expected, SGD-ℓ∞ allows us to use larger SSs without divergence problems, and in addition (i) it matches a well-tuned Adam's performance (superior to "default parameters" Adam), and (ii) heuristically, its convergence properties (rate, oscillations, etc.) are superior when compared to other well-known algorithms.
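The abstract only sketches the update rule, but its core mechanism can be illustrated with a short snippet. The following Python sketch is a hypothetical reading of the idea, not the paper's actual implementation: maintain an EMA of the stochastic gradient and uniformly rescale its entries whenever its ℓ∞ norm would push the norm's moving average upward. All names and parameters here (`sgd_linf_step`, `beta`, `gamma`, the `state` dictionary) are illustrative assumptions.

```python
import numpy as np

def sgd_linf_step(w, grad, state, lr=0.1, beta=0.9, gamma=0.9):
    """One step of a hypothetical SGD-l_inf-style update (illustrative only).

    Maintains an EMA of the stochastic gradient (the abstract's g_bar_k) and
    rescales its entries whenever its l_inf norm exceeds the norm's running
    moving average, so that this moving average is non-increasing.
    """
    # EMA of the stochastic gradient.
    state["g_bar"] = beta * state["g_bar"] + (1.0 - beta) * grad

    norm = np.max(np.abs(state["g_bar"]))
    if state["n_avg"] is not None and norm > state["n_avg"]:
        # Shrink all entries uniformly so that ||g_bar||_inf equals the
        # previous moving average; the averaged norm then cannot increase.
        state["g_bar"] *= state["n_avg"] / norm
        norm = state["n_avg"]

    # Moving average of the l_inf norm (non-increasing by construction).
    if state["n_avg"] is None:
        state["n_avg"] = norm
    else:
        state["n_avg"] = gamma * state["n_avg"] + (1.0 - gamma) * norm

    return w - lr * state["g_bar"]

# Usage sketch on a toy quadratic loss f(w) = 0.5 * ||w||^2.
w = np.array([5.0, -3.0])
state = {"g_bar": np.zeros_like(w), "n_avg": None}
for _ in range(100):
    grad = w + 0.1 * np.random.randn(*w.shape)  # noisy gradient of f
    w = sgd_linf_step(w, grad, state)
```

Because a too-large gradient EMA is shrunk rather than applied as-is, a larger nominal step size can be used without the iterates blowing up, which is consistent with the divergence-avoidance motivation stated in the abstract.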
Keywords
stochastic gradient descent, Adam, ℓ∞ norm