Free Probability, Newton lilypads and hyperbolicity of Jacobians as a solution to the problem of tuning the architecture of neural networks

Semantic Scholar (2021)

Abstract
Gradient descent during the learning process of a neural network can be subject to many instabilities. The spectral density of the Jacobian is a key component for analyzing robustness. Following the work of Pennington et al., such Jacobians are modeled using free multiplicative convolutions from Free Probability Theory (FPT). We present a reliable and very fast method for computing the associated spectral densities, with controlled and proven convergence. Our technique is based on a homotopy method: an adaptive Newton-Raphson scheme which chains basins of attraction. We find contiguous lilypad-like basins and step from one to the next, heading towards the objective. To demonstrate the applicability of our method, we show that the relevant FPT metrics computed before training are highly correlated with final test losses (up to 85%). We also give evidence that a very desirable feature for neural networks is the hyperbolicity of their Jacobian at initialization.
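
To make the basin-chaining idea concrete, here is a minimal, purely illustrative Python sketch (not the paper's implementation). It substitutes a simple toy problem, the semicircle law's fixed-point equation G(z) = 1/(z - G(z)), for the paper's free-multiplicative-convolution equations, and recovers the density via rho(x) = -Im G(x + i*0)/pi by stepping the imaginary part eta toward zero, warm-starting each Newton solve from the previous solution so each converged point seeds the next basin. All names and parameters (newton_solve, stieltjes_at, eta_start, shrink) are hypothetical.

import numpy as np

def newton_solve(f, df, g0, tol=1e-12, max_iter=50):
    """Plain complex Newton iteration; returns (root, converged)."""
    g = g0
    for _ in range(max_iter):
        step = f(g) / df(g)
        g = g - step
        if abs(step) < tol:
            return g, True
    return g, False

def stieltjes_at(x, eta_start=4.0, eta_end=1e-6):
    """Track G(x + i*eta) as eta -> 0+ by hopping between Newton basins.

    Toy fixed-point equation (semicircle law): G(z) = 1/(z - G(z)).
    The paper's actual equations come from free multiplicative
    convolutions of Jacobian spectra; the continuation logic, not the
    specific equation, is the point of this sketch.
    """
    eta = eta_start
    g = 1.0 / (x + 1j * eta)      # G(z) ~ 1/z: a guaranteed basin far from the real axis
    shrink = 0.5                  # multiplicative step on eta (the hop size)
    while eta > eta_end:
        eta_try = max(eta * shrink, eta_end)
        z = x + 1j * eta_try
        f = lambda G: G * (z - G) - 1.0     # G = 1/(z - G)  <=>  f(G) = 0
        df = lambda G: z - 2.0 * G
        g_new, ok = newton_solve(f, df, g)  # warm-start from the previous lilypad
        if ok:
            g, eta = g_new, eta_try
            shrink = max(shrink * 0.8, 0.1)  # basins overlap: take bolder hops
        else:
            shrink = shrink ** 0.5           # overshot the basin: take gentler hops
    return g

# Recover the density on a small grid.
xs = np.linspace(-2.5, 2.5, 11)
rho = np.array([-stieltjes_at(x).imag / np.pi for x in xs])
print(np.round(rho, 4))  # ~ sqrt(4 - x^2) / (2*pi) inside [-2, 2], ~0 outside

The adaptive hop size mirrors the "contiguous lilypads" picture: after a successful Newton solve the scheme takes a bolder step toward the target, and after a failure it backs off so the next warm start lands inside an overlapping basin of attraction.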