Training feedforward neural nets in Hopfield-energy-based configuration: A two-step approach

PATTERN RECOGNITION(2024)

Abstract
We introduce Hopfield-Energy-Based Learning, a general learning framework inspired by energy-based models, for training feedforward neural nets. Our approach consists of two training phases applied iteratively: first, the internal energy, which captures dependencies between input samples and network parameters, is minimized in an unsupervised manner; second, the problem-dependent supervised external energy (e.g., cross-entropy loss), combined with partially reversed internal-energy gradients, is back-propagated in the standard manner. The intuition is that the first stage lets the parameters settle into a state that simply partitions the data into clusters, while in the second stage the network is allowed to deviate somewhat from that clustering (hence the gradient reversal) in order to converge to parameters that ultimately perform well on the task at hand. Notably, the data used in the two steps need not be the same (e.g., they can come from different domains), so the approach naturally lends itself to unsupervised domain adaptation without requiring any distribution-alignment techniques. We also show that the proposed training strategy substantially improves the performance of several ConvNets on standard supervised classification tasks, with improvements of at least 1.2% (2.64% on CIFAR-10, 4.5% on CIFAR-100, and 1.35% on ImageNet). Our formulation is general, performs well in practice, and holds promise for scenarios where labeled data is limited.
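The two-phase scheme described above can be illustrated with a minimal sketch. The specific energy function, model, and `reversal` coefficient below are illustrative assumptions (the paper's actual Hopfield-style internal energy and network are not given in the abstract): a toy linear classifier alternates between an unsupervised internal-energy step that pushes samples away from the decision boundary (clustering them), and a supervised cross-entropy step from which a fraction of the internal-energy gradient is subtracted (the "partial reversal").

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs standing in for real inputs.
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

W = rng.normal(0, 0.1, (2,))  # single-logit linear "network"
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def internal_energy_grad(W, b, X):
    # Assumed internal energy E_int = -sum tanh(xW+b)^2: minimized when
    # samples are pushed away from the boundary, i.e. into two clusters.
    t = np.tanh(X @ W + b)
    dz = -2 * t * (1 - t ** 2)          # dE_int/dz
    return X.T @ dz / len(X), dz.mean()

def external_energy_grad(W, b, X, y):
    # External energy: standard cross-entropy for logistic regression.
    dz = sigmoid(X @ W + b) - y         # dE_ext/dz
    return X.T @ dz / len(X), dz.mean()

lr, reversal = 0.1, 0.3  # reversal weight is an illustrative choice
for step in range(200):
    # Phase 1: unsupervised internal-energy minimization.
    gW, gb = internal_energy_grad(W, b, X)
    W -= lr * gW
    b -= lr * gb
    # Phase 2: supervised external energy with a partially reversed
    # internal-energy gradient (a fraction of it is subtracted).
    gW_ext, gb_ext = external_energy_grad(W, b, X, y)
    gW_int, gb_int = internal_energy_grad(W, b, X)
    W -= lr * (gW_ext - reversal * gW_int)
    b -= lr * (gb_ext - reversal * gb_int)

acc = ((sigmoid(X @ W + b) > 0.5) == y).mean()
```

In the actual method the two phases may draw on different datasets (e.g., unlabeled target-domain data in phase 1 and labeled source-domain data in phase 2), which is what makes the framework applicable to unsupervised domain adaptation.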
Keywords
Hopfield-based energy, Feedforward neural nets, Learning algorithm, Supervised learning, Unsupervised domain adaptation, Computer vision