Learning Timescales in Gated and Adaptive Continuous Time Recurrent Neural Networks.

SMC (2020)

Abstract
Recurrent neural networks that can capture temporal characteristics on multiple timescales are a key architecture in machine learning solutions as well as in neurocognitive models. A crucial open question is how these architectures can capture both long-term dependencies and systematic fluctuations from the data or from sensory input, similar to the adaptation and abstraction capabilities of the human brain. In this paper, we propose an extension of the classic Continuous Time Recurrent Neural Network (CTRNN) that learns to gate its timescale characteristic during activation and can thus dynamically change its timescales while processing sequences. This mechanism is simple yet biologically plausible, as it is motivated by the modulation of oscillation modes between neural populations. We test how the novel Gating Adaptive CTRNNs solve difficult synthetic sequence prediction problems, and we explore the development of the timescale characteristics as well as the interplay of multiple timescales. As a particularly interesting finding, we report that timescale distributions emerge which simultaneously capture systematic patterns and spontaneous fluctuations. Our extended architecture is of interest for cognitive models that aim to investigate the development of specific timescale characteristics under temporally complex perception and action, and vice versa.
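The gating mechanism described in the abstract can be illustrated with a minimal sketch: a standard Euler-discretized CTRNN update in which the fixed inverse timescale 1/tau is replaced by a learned, state- and input-dependent gate. All parameter names, the sigmoid gate, and the (tau_min, tau_max) mapping below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_ctrnn_step(h, x, W, U, b, Wg, Ug, bg, tau_min=1.0, tau_max=100.0):
    """One discretized CTRNN update with a gated timescale (sketch).

    Classic CTRNN (Euler step, dt = 1):
        h_t = (1 - 1/tau) * h_{t-1} + (1/tau) * tanh(W h_{t-1} + U x_t + b)
    Here the constant 1/tau is replaced by a learned gate g mapped into
    the inverse-timescale range (1/tau_max, 1/tau_min), so each unit's
    effective timescale adapts to the current state and input.
    """
    # gate in (0, 1), mapped to an inverse-timescale range (assumption)
    g = sigmoid(Wg @ h + Ug @ x + bg)
    inv_tau = 1.0 / tau_max + g * (1.0 / tau_min - 1.0 / tau_max)
    pre = np.tanh(W @ h + U @ x + b)
    # leaky integration: large tau -> slow drift, small tau -> fast update
    return (1.0 - inv_tau) * h + inv_tau * pre

# toy usage with random weights (hypothetical sizes)
rng = np.random.default_rng(0)
n, m = 4, 2
h = np.zeros(n)
W, U = 0.1 * rng.standard_normal((n, n)), 0.1 * rng.standard_normal((n, m))
Wg, Ug = 0.1 * rng.standard_normal((n, n)), 0.1 * rng.standard_normal((n, m))
b, bg = np.zeros(n), np.zeros(n)
for t in range(5):
    h = gated_ctrnn_step(h, rng.standard_normal(m), W, U, b, Wg, Ug, bg)
```

In training, the gate parameters (Wg, Ug, bg here) would be learned end-to-end by backpropagation through time, letting different units settle on different timescale distributions, which is the development the paper analyzes.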
Keywords
CTRNN, Recurrent Neural Network, Timescale, Adaptive, Gating, Cognitive Model