Self-Attention Based Semantic Decomposition in Vector Symbolic Architectures
arXiv (2024)
Abstract
Vector Symbolic Architectures (VSAs) have emerged as a novel framework for enabling interpretable machine learning algorithms equipped with the ability to reason about and explain their decision processes. The basic idea is to represent discrete information through high-dimensional random vectors. Complex data structures can be built up with operations over vectors, such as the "binding" operation involving element-wise vector multiplication, which associates data together. The reverse task of decomposing the associated elements is combinatorially hard, with an exponentially large search space. The main algorithm for performing this search is the resonator network, inspired by Hopfield network-based memory search operations.
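To make the binding and factorization problem concrete, here is a minimal sketch using bipolar (+1/-1) hypervectors; the dimensionality, codebook sizes, and variable names are illustrative assumptions, not values from the paper.

```python
# A minimal sketch of VSA binding with bipolar (+1/-1) hypervectors.
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality (illustrative)
K = 50      # entries per codebook (illustrative)

# Three codebooks (e.g., color, shape, position), each holding K random hypervectors.
codebooks = [rng.choice([-1, 1], size=(K, D)) for _ in range(3)]

# Bind one vector from each codebook via element-wise multiplication.
true_idx = [int(rng.integers(K)) for _ in range(3)]
s = np.prod([cb[i] for cb, i in zip(codebooks, true_idx)], axis=0)

# Factorizing s back into its components by brute force requires checking
# K**3 candidate triples; the search space grows exponentially with the
# number of bound factors, which is the problem the resonator network
# addresses iteratively.
```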
In this work, we introduce a new variant of the resonator network that uses self-attention-based update rules in the iterative search. This update rule, derived from the Hopfield network with a log-sum-exp energy function and norm-bounded states, is shown to substantially improve performance and the rate of convergence. As a result, our algorithm supports a larger associative-memory capacity, enabling applications in tasks such as perception-based pattern recognition, scene decomposition, and object reasoning. We substantiate our algorithm with a thorough evaluation and comparisons against baselines.
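As a hedged illustration of how such an update could look, the following sketch replaces the classical resonator's cleanup step with a softmax readout over codebook similarities, in the spirit of the log-sum-exp (modern Hopfield) energy; the inverse temperature beta, the sqrt(D) scaling, and the iteration budget are assumptions rather than the paper's exact formulation.

```python
# A sketch of an attention-style resonator update, assuming a softmax
# readout in place of the classical sign/cleanup step.
import numpy as np

def softmax(z):
    z = z - z.max()  # stabilize the exponentials
    e = np.exp(z)
    return e / e.sum()

def attention_resonator(s, codebooks, beta=8.0, iters=100):
    # Initialize each factor estimate as the superposition of its codebook.
    est = [cb.sum(axis=0) for cb in codebooks]
    D = s.shape[0]
    for _ in range(iters):
        for i, cb in enumerate(codebooks):
            # Approximately unbind the other current estimates from the
            # composite vector s (element-wise multiplication inverts
            # binding for bipolar vectors).
            others = np.prod([e for j, e in enumerate(est) if j != i], axis=0)
            target = s * others
            # Attention-style update: softmax over similarities to the
            # codebook, then read out a convex combination of its vectors,
            # which keeps the estimate norm-bounded.
            attn = softmax(beta * (cb @ target) / np.sqrt(D))
            est[i] = cb.T @ attn
    # Decode each factor as its nearest codebook entry.
    return [int(np.argmax(cb @ e)) for cb, e in zip(codebooks, est)]
```

With the codebooks and composite vector from the previous sketch, `attention_resonator(s, codebooks)` would be expected to recover `true_idx`, provided the codebook sizes stay within the network's operational capacity.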