Regularized Adaptive Momentum Dual Averaging with an Efficient Inexact Subproblem Solver for Training Structured Neural Network
arXiv (2024)
Abstract
We propose a Regularized Adaptive Momentum Dual Averaging (RAMDA) algorithm
for training structured neural networks. Similar to existing regularized
adaptive methods, the subproblem for computing the update direction of RAMDA
involves a nonsmooth regularizer and a diagonal preconditioner, and therefore
does not possess a closed-form solution in general. We thus also carefully
devise an implementable inexactness condition that retains convergence
guarantees similar to the exact versions, and propose a companion efficient
solver for the subproblems of both RAMDA and existing methods to make them
practically feasible. We leverage the theory of manifold identification in
variational analysis to show that, even in the presence of such inexactness,
the iterates of RAMDA attain the ideal structure induced by the regularizer at
the stationary point of asymptotic convergence. This structure is locally
optimal near the point of convergence, so RAMDA is guaranteed to obtain the
best structure possible among all methods converging to the same point, making
it the first regularized adaptive method outputting models that possess
outstanding predictive performance while being (locally) optimally structured.
Extensive numerical experiments in large-scale modern computer vision, language
modeling, and speech tasks show that the proposed RAMDA is efficient and
consistently outperforms the state of the art for training structured neural
networks. An implementation of our algorithm is available at
http://www.github.com/ismoptgroup/RAMDA/.
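The abstract describes a subproblem combining a nonsmooth regularizer with a diagonal preconditioner, which in general has no closed-form solution and must be solved inexactly. The abstract does not give the subproblem's exact form or the paper's inexactness condition, so the following is only an illustrative sketch: a proximal-gradient inner solver for a hypothetical subproblem with a diagonal preconditioner `v` and a group-ℓ2 (group-LASSO-style) regularizer, stopped by a simple fixed-point residual as a stand-in inexactness test. All names and the tolerance choice here are assumptions, not the authors' method.

```python
import numpy as np

def group_prox(z, lam):
    # Exact proximal operator of lam * ||.||_2 on a single group:
    # shrinks the vector's norm by lam, zeroing it if the norm <= lam.
    norm = np.linalg.norm(z)
    if norm <= lam:
        return np.zeros_like(z)
    return (1.0 - lam / norm) * z

def inexact_subproblem_solver(g, w, v, lam, max_iter=200, tol=1e-8):
    """Approximately minimize (hypothetical subproblem form)
        Q(x) = g^T (x - w) + 0.5 (x - w)^T diag(v) (x - w) + lam * ||x||_2
    by proximal gradient on the smooth quadratic part.

    The stopping rule below (small fixed-point residual) is a simple
    proxy for an implementable inexactness condition, not the one
    devised in the paper.
    """
    step = 1.0 / v.max()   # 1 / Lipschitz constant of the smooth part
    x = w.copy()
    for _ in range(max_iter):
        grad = g + v * (x - w)               # gradient of the quadratic
        x_new = group_prox(x - step * grad, step * lam)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x
```

Because proximal gradient is a descent method and the iteration starts at `x = w`, the returned point never increases the subproblem objective relative to the current iterate, which is the minimal property an inexact inner solver must preserve.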