Statistical Mechanics of Deep Learning

Annual Review of Condensed Matter Physics, Vol. 11 (2020)

Cited 232 | Views 238
Abstract
The recent striking success of deep neural networks in machine learning raises profound questions about the theoretical principles underlying their success. For example, what can such deep networks compute? How can we train them? How does information propagate through them? Why can they generalize? And how can we teach them to imagine? We review recent work in which methods of physical analysis rooted in statistical mechanics have begun to provide conceptual insights into these questions. These insights yield connections between deep learning and diverse physical and mathematical topics, including random landscapes, spin glasses, jamming, dynamical phase transitions, chaos, Riemannian geometry, random matrix theory, free probability, and nonequilibrium statistical mechanics. Indeed, the fields of statistical mechanics and machine learning have long enjoyed a rich history of strongly coupled interactions, and recent advances at the intersection of statistical mechanics and deep learning suggest these interactions will only deepen going forward.
Keywords
neural networks, machine learning, dynamical phase transitions, chaos, spin glasses, jamming, random matrix theory, interacting particle systems, nonequilibrium statistical mechanics