Contraction Analysis of Hopfield Neural Networks with Hebbian Learning

CDC (2022)

Abstract
Motivated by advances in neuroscience and machine learning, this paper is concerned with the modeling and analysis of Hopfield neural networks with dynamic recurrent connections undergoing Hebbian learning. To capture the synaptic sparsity of neural circuits, we propose a low-dimensional formulation for the model and then characterize its key dynamical properties. First, we give a biologically-inspired forward invariance result. Then, we give sufficient conditions for the non-Euclidean contractivity of the model. Our contraction analysis leads to stability and robustness of time-varying trajectories, for networks with both excitatory and inhibitory synapses governed by both Hebbian and anti-Hebbian rules. Our proposed contractivity test is based upon biologically meaningful quantities, e.g., the neural and synaptic decay rates, the maximum out-degree, and the maximum synaptic strength. Finally, we show that the model satisfies Dale's principle. The effectiveness of our results is illustrated via a numerical example.
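To make the class of models concrete, the following is a minimal sketch of a Hopfield network whose recurrent weights themselves evolve under a Hebbian rule with synaptic decay. The specific equations, decay rates `g_x`/`g_w`, and `tanh` activation are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def phi(x):
    # Saturating activation; a common choice, not necessarily the paper's.
    return np.tanh(x)

def simulate(n=4, T=5.0, dt=1e-3, g_x=1.0, g_w=0.5, seed=0):
    """Forward-Euler simulation of coupled neuron/synapse dynamics.

    Hypothetical dynamics, chosen only to illustrate the structure of
    "Hopfield network + Hebbian plasticity with decay":
        dx/dt = -g_x * x + W @ phi(x) + u          (neural states)
        dW/dt = -g_w * W + phi(x) phi(x)^T          (Hebbian weight update)
    """
    rng = np.random.default_rng(seed)
    x = 0.1 * rng.standard_normal(n)        # neural state
    W = 0.1 * rng.standard_normal((n, n))   # dynamic recurrent weights
    u = 0.2 * np.ones(n)                    # constant external input
    for _ in range(int(T / dt)):
        y = phi(x)
        x = x + dt * (-g_x * x + W @ y + u)
        W = W + dt * (-g_w * W + np.outer(y, y))
    return x, W
```

Because `phi` is bounded and both `x` and `W` have strictly positive decay rates, trajectories of this sketch remain bounded, mirroring the role the neural and synaptic decay rates play in the paper's contractivity test.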