How to Keep Cool While Training

ICLR 2023

Abstract
Modern classification neural networks are notoriously prone to being overly confident in their predictions. With multiple calibration methods having been proposed so far, there has been noteworthy progress in reducing this overconfidence. However, to the best of our knowledge, prior methods have exclusively focused on the factors that affect calibration, leaving open the reverse question of how (mis)calibration impacts network training. Aiming for a better understanding of this interplay, we propose a temperature-based Cooling method for calibrating classification neural networks during training. Cooling has a substantial effect on the gradients and reduces the need for a learning rate schedule. We investigate different variants of Cooling, with the simplest one, last layer Cooling, being also the best-performant one, improving network performance on a range of datasets, network architectures, and hyperparameter settings.
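The abstract does not spell out the exact Cooling operation, but the keywords point to temperature scaling, where logits are divided by a temperature before the softmax so that higher temperatures yield softer, less confident predictions. As a minimal illustrative sketch (the function name and the specific temperatures are assumptions, not the paper's implementation):

```python
import numpy as np

def cooled_softmax(logits, temperature=1.0):
    """Softmax with temperature scaling.

    Dividing logits by a temperature T > 1 "cools" the output
    distribution, reducing the confidence of the top prediction.
    (Illustrative sketch; not the paper's exact Cooling method.)
    """
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

logits = [4.0, 1.0, 0.5]
sharp = cooled_softmax(logits, temperature=1.0)  # standard softmax
soft = cooled_softmax(logits, temperature=3.0)   # cooled: less peaked
print(sharp, soft)
```

At T = 1 this reduces to the ordinary softmax; as T grows, the probability mass spreads across classes, which is the calibration effect the paper studies during training rather than only post hoc.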
Keywords
neural network,calibration,network calibration,cooling,temperature scaling,classification