Function-Space Regularization in Neural Networks: A Probabilistic Perspective
ICML 2023
Abstract
Parameter-space regularization in neural network optimization is a
fundamental tool for improving generalization. However, standard
parameter-space regularization methods make it challenging to encode explicit
preferences about desired predictive functions into neural network training. In
this work, we approach regularization in neural networks from a probabilistic
perspective and show that by viewing parameter-space regularization as
specifying an empirical prior distribution over the model parameters, we can
derive a probabilistically well-motivated regularization technique that allows
explicitly encoding information about desired predictive functions into neural
network training. This method – which we refer to as function-space empirical
Bayes (FSEB) – includes both parameter- and function-space regularization, is
mathematically simple, easy to implement, and incurs only minimal computational
overhead compared to standard regularization techniques. We evaluate the
utility of this regularization technique empirically and demonstrate that the
proposed method leads to near-perfect semantic shift detection,
well-calibrated predictive uncertainty estimates, successful task adaptation
from pre-trained models, and improved generalization under covariate shift.
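To make the structure of such an objective concrete, below is a minimal PyTorch sketch of a training loss that combines a data-fit term, an L2 parameter-space term (corresponding to a Gaussian empirical prior over the parameters), and a function-space term that penalizes deviation of the network's predictions from a reference predictive function on a set of context inputs. This is a sketch under stated assumptions, not the paper's exact FSEB objective: the names `reference_fn`, `context_x`, `lam`, and `tau` are illustrative and do not follow the paper's notation.

```python
# Minimal sketch: data fit + parameter-space + function-space regularization.
# Assumed/hypothetical: `reference_fn` (a callable encoding the desired
# predictive function, e.g. a pre-trained model), `context_x` (unlabelled
# context inputs), and the weights `lam` and `tau`.
import torch
import torch.nn.functional as F

def regularized_loss(model, x, y, context_x, reference_fn, lam=1e-4, tau=1.0):
    # Data-fit term: negative log-likelihood for classification.
    nll = F.cross_entropy(model(x), y)

    # Parameter-space term: an isotropic Gaussian prior over the parameters
    # reduces to an L2 (weight-decay) penalty.
    l2 = sum((p ** 2).sum() for p in model.parameters())

    # Function-space term: penalize deviation of the model's outputs from
    # the reference function's outputs on the context points.
    with torch.no_grad():
        target = reference_fn(context_x)
    fs = F.mse_loss(model(context_x), target)

    return nll + lam * l2 + tau * fs
```

Because the function-space term only requires forward passes on the context batch, such an objective adds little computational overhead on top of standard regularized training, consistent with the claim above.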