Emergent latent symbol systems in recurrent neural networks

Connection Science (2012)

Abstract
Fodor and Pylyshyn [1988. Connectionism and cognitive architecture: A critical analysis. Cognition, 28(1–2), 3–71] famously argued that neural networks cannot behave systematically short of implementing a combinatorial symbol system. A recent response from Frank et al. [2009. Connectionist semantic systematicity. Cognition, 110(3), 358–379] claimed to have trained a neural network to behave systematically without implementing a symbol system and without any in-built predisposition towards combinatorial representations. We believe systems like theirs may in fact implement a symbol system on a deeper and more interesting level: one where the symbols are latent – not visible at the level of network structure. In order to illustrate this possibility, we demonstrate our own recurrent neural network that learns to understand sentence-level language in terms of a scene. We demonstrate our model's learned understanding by testing it on novel sentences and scenes. By paring down our model into an architecturally minimal version, we demonstrate how it supports combinatorial computation over distributed representations by using the associative memory operations of Vector Symbolic Architectures. Knowledge of the model's memory scheme gives us tools to explain its errors and construct superior future models. We show how the model designs and manipulates a latent symbol system in which the combinatorial symbols are patterns of activation distributed across the layers of a neural network, instantiating a hybrid of classical symbolic and connectionist representations that combines advantages of both.
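The abstract refers to "the associative memory operations of Vector Symbolic Architectures" without specifying which VSA is used. As a minimal illustrative sketch only, the snippet below shows how one common VSA, Holographic Reduced Representations (Plate, 1995), binds roles to fillers with circular convolution and bundles bindings by addition, so that structured "scenes" can live in a single distributed vector and be queried back out. The role and filler names (agent, patient, woman, ball) are hypothetical, not taken from the paper.

```python
# Sketch of VSA-style binding/bundling with Holographic Reduced Representations.
# This is an assumption for illustration; the paper's exact memory operations
# may differ.
import numpy as np

rng = np.random.default_rng(0)
D = 1024  # dimensionality of the distributed representations (assumed)

def rand_vec():
    # High-dimensional random vector; near-orthogonal to others with high probability.
    return rng.normal(0.0, 1.0 / np.sqrt(D), D)

def bind(a, b):
    # Circular convolution: combines a role and a filler into one vector.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(trace, role):
    # Circular correlation (binding with the approximate inverse) recovers the filler.
    return np.real(np.fft.ifft(np.fft.fft(trace) * np.conj(np.fft.fft(role))))

# Hypothetical role and filler symbols.
agent, patient = rand_vec(), rand_vec()
woman, ball = rand_vec(), rand_vec()

# A "scene" encoded as a superposition (bundling) of role-filler bindings.
scene = bind(agent, woman) + bind(patient, ball)

# Query: who is the agent? The noisy result is cleaned up against known fillers.
query = unbind(scene, agent)
candidates = {"woman": woman, "ball": ball}
print(max(candidates, key=lambda k: np.dot(query, candidates[k])))  # -> woman
```

The cleanup step at the end plays the role of an associative memory: the unbound vector is only approximately equal to the stored filler, so it is matched against a lexicon of known item vectors by dot product.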
Keywords
model design,combinatorial computation,network structure,latent symbol system,combinatorial symbol,combinatorial symbol system,symbol system,neural network,combinatorial representation,emergent latent symbol system,own recurrent neural network,associative memory,long short term memory,recurrent neural network