Sparse Distributed Memory is a Continual Learner

ICLR 2023

Abstract
Continual learning is a problem for artificial neural networks that their biological counterparts are adept at solving. Building on work using Sparse Distributed Memory (SDM) to connect a core neural circuit with the powerful Transformer model, we create a modified Multi-Layered Perceptron (MLP) that is a strong continual learner. We find that every component of our MLP variant translated from biology is necessary for continual learning. Our solution is also free from any memory replay or task information, and introduces novel methods to train sparse networks that may be broadly applicable.
Keywords
Sparse Distributed Memory, Sparsity, Top-K Activation, Continual Learning, Biologically Inspired
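
To make the Top-K activation mechanism named in the keywords concrete, here is a minimal sketch of a Top-K layer inside an MLP, assuming PyTorch. The layer sizes, the value of k, and the class name `TopKActivation` are illustrative assumptions, not the paper's reported configuration.

```python
import torch
import torch.nn as nn


class TopKActivation(nn.Module):
    """Keeps only the k largest pre-activations per example; zeroes the rest.

    Illustrative sketch of Top-K sparsity, not the paper's exact layer.
    """

    def __init__(self, k: int):
        super().__init__()
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, hidden). Find the indices of the k largest values per row.
        _, topk_idx = x.topk(self.k, dim=-1)
        # Build a binary mask that is 1 only at the top-k positions.
        mask = torch.zeros_like(x).scatter_(-1, topk_idx, 1.0)
        # All other units are set to zero, enforcing a fixed activation sparsity.
        return x * mask


# An MLP with a Top-K sparse hidden layer (hypothetical sizes and k).
model = nn.Sequential(
    nn.Linear(784, 1024),
    TopKActivation(k=10),
    nn.Linear(1024, 10),
)
```

Enforcing a hard Top-K constraint means each input activates only a small, input-dependent subset of hidden units, which limits interference between tasks; this is the general intuition behind using sparsity for continual learning.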