A dynamic neural field model of continuous input integration

Wojtak Weronika, Coombes Stephen, Inria Sophia Antipolis Méditerranée Research Centre, Bicho Estela, Erlhagen Wolfram

Biological Cybernetics (2021)

Abstract
The ability of neural systems to turn transient inputs into persistent changes in activity is thought to be a fundamental requirement for higher cognitive functions. In continuous attractor networks frequently used to model working memory or decision-making tasks, the persistent activity settles into a stable pattern with the stereotyped shape of a “bump”, independent of integration time or input strength. Here, we investigate a new bump attractor model in which the bump width and amplitude not only reflect qualitative and quantitative characteristics of a preceding input but also the continuous integration of evidence over longer timescales. The model is formalized by two coupled dynamic field equations of Amari type which combine recurrent interactions mediated by a Mexican-hat connectivity with local feedback mechanisms that balance excitation and inhibition. We analyze the existence, stability and bifurcation structure of single- and multi-bump solutions and discuss the relevance of their input dependence to modeling cognitive functions. We then systematically compare the pattern formation process of the two-field model with that of the classical Amari model. The results reveal that the balanced local feedback mechanisms facilitate the encoding and maintenance of multi-item memories. The existence of stable subthreshold bumps suggests that, unlike in the Amari model, the suppression effect of neighboring bumps within the range of lateral competition need not lead to a complete loss of information. Moreover, bumps with larger amplitude are less vulnerable to noise-induced drifts and distance-dependent interaction effects, resulting in more faithful memory representations over time.
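The abstract does not give the explicit form of the two coupled field equations. As an illustration only, the following is a minimal numerical sketch of a generic two-field Amari-type model with a Mexican-hat (difference-of-Gaussians) kernel and a local feedback coupling chosen so that u + v is conserved in the absence of input, which is consistent with the "conservation law" keyword but not necessarily the paper's exact formulation. All parameter values, the sigmoid firing-rate function, and the input shape are assumptions.

```python
import numpy as np

# Minimal sketch of a generic two-field Amari-type model with Mexican-hat
# connectivity. The specific coupling below (chosen so that u + v is
# conserved when no input is present) and all parameter values are
# assumptions for illustration, not the paper's exact equations.

L, N = 20.0, 512                          # half-length of periodic domain, grid points
x = np.linspace(-L, L, N, endpoint=False)
dx = x[1] - x[0]

def mexican_hat(x, a_e=2.0, s_e=1.0, a_i=1.0, s_i=2.0):
    """Difference-of-Gaussians (Mexican-hat) lateral connectivity kernel."""
    return (a_e * np.exp(-x**2 / (2 * s_e**2))
            - a_i * np.exp(-x**2 / (2 * s_i**2)))

w_hat = np.fft.fft(np.fft.ifftshift(mexican_hat(x)))   # kernel in Fourier space

def f(u, theta=0.5, beta=10.0):
    """Smooth sigmoid firing rate (approximates a Heaviside at threshold theta)."""
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))

def S(x, t, t_off=2.0, A=1.5, sigma=1.0):
    """Transient localized input, switched off at t = t_off."""
    return A * np.exp(-x**2 / (2 * sigma**2)) if t < t_off else np.zeros_like(x)

dt, T = 0.01, 10.0
u = np.zeros(N)            # excitatory field
v = np.zeros(N)            # local feedback field

t = 0.0
while t < T:
    conv = dx * np.real(np.fft.ifft(w_hat * np.fft.fft(f(u))))  # w * f(u)
    du = -u + v + conv + S(x, t)   # assumed field equation for u
    dv = -v + u - conv             # assumed feedback: u + v conserved for S = 0
    u, v = u + dt * du, v + dt * dv
    t += dt

print("peak activity of u after input offset:", u.max())
```

Whether a bump actually persists after the input is removed, and how its amplitude and width scale with input strength and duration, depends on the kernel, threshold and coupling parameters, which would have to be matched to those analyzed in the paper.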
Keywords
dynamic neural field,conservation law,localized states,stability,input integration