Negative eigenvalues of the Hessian in deep neural networks.
ICLR (2019)
Abstract
The loss function of deep networks is known to be non-convex but the precise nature of this nonconvexity is still an active area of research. In this work, we study the loss landscape of deep networks through the eigendecompositions of their Hessian matrix. In particular, we examine how important the negative eigenvalues are and the benefits one can observe in handling them appropriately.
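The abstract's central objects, the eigendecomposition of the loss Hessian and its negative eigenvalues, can be illustrated with a minimal sketch (not from the paper): a two-parameter toy network whose Hessian is computed by symmetric finite differences with NumPy and then eigendecomposed; the toy model, data point, and evaluation point are all hypothetical choices made here for illustration.

```python
import numpy as np

# Hypothetical toy model: y_hat = w2 * tanh(w1 * x), squared-error loss
# on a single data point (x, y). The weight vector w = [w1, w2].
x, y = 1.0, 0.5

def loss(w):
    return 0.5 * (w[1] * np.tanh(w[0] * x) - y) ** 2

def hessian(f, w, eps=1e-5):
    """Symmetric finite-difference estimate of the Hessian of f at w."""
    n = len(w)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = eps
            ej = np.zeros(n); ej[j] = eps
            H[i, j] = (f(w + ei + ej) - f(w + ei - ej)
                       - f(w - ei + ej) + f(w - ei - ej)) / (4 * eps ** 2)
    return H

w = np.array([0.3, -0.8])           # an arbitrary point in weight space
H = hessian(loss, w)
eigvals = np.linalg.eigvalsh(H)     # ascending eigenvalues of the symmetric Hessian
print("eigenvalues:", eigvals)
print("negative directions:", int((eigvals < 0).sum()))
```

At this particular point the spectrum contains one negative eigenvalue, i.e. a descent direction along which the loss is locally concave, which is exactly the kind of structure the paper's eigendecomposition analysis examines (at scale, and with exact rather than finite-difference Hessians).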