Narrowing Attention in Capsule Networks

ICPR (2022)

Abstract
Despite their recent success, capsule networks (CapsNets) are still computationally intensive and fail to achieve state-of-the-art performance on advanced datasets. As a consequence, CapsNets are usually combined with additional conventional feature extraction layers to solve complex tasks. Based on the hypothesis that more efficient and distinct routing can alleviate these drawbacks, we propose a novel CapsNet algorithm that utilises narrowed attention to determine the coupling coefficients between lower- and higher-level capsules. In particular, we employ tiny subnetworks with sigmoid activation functions to enforce concise routing decisions, thus reducing the tendency of CapsNets to explain the entire image rather than focusing on the information essential for a given task. This non-iterative routing strategy is computationally fast and memory efficient, yields interpretable coupling decisions, and can be easily integrated into existing models due to its strong alignment with capsule theory. In addition, these purely capsule-based models are robust to a wide range of image transformations, have stable convergence characteristics, and can be further improved by capsule-specific yet straightforward applications of dropout and batch normalisation. In a series of experiments, we demonstrate that narrowed attention routing enables the training of deep capsule networks without the need for additional feature extraction layers, while outperforming existing CapsNet architectures on a variety of well-known benchmark datasets.
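To make the routing idea concrete, below is a minimal PyTorch sketch of what such a non-iterative, sigmoid-gated routing layer could look like. It is an assumption-laden illustration, not the authors' implementation: the class name NarrowedAttentionRouting, the capsule dimensions, and the exact form of the tiny scoring subnetwork are all hypothetical, since the abstract does not specify them.

```python
import torch
import torch.nn as nn


def squash(s, dim=-1, eps=1e-8):
    """Standard capsule squashing non-linearity (Sabour et al., 2017)."""
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    return (sq_norm / (1.0 + sq_norm)) * s / torch.sqrt(sq_norm + eps)


class NarrowedAttentionRouting(nn.Module):
    """Illustrative non-iterative routing layer (hypothetical sketch).

    Coupling coefficients come from a tiny scoring subnetwork with a
    sigmoid output, so each lower-level capsule makes a concise, nearly
    binary decision about each higher-level capsule. All names and
    dimensions are assumptions; the abstract gives no exact spec.
    """

    def __init__(self, in_caps, out_caps, in_dim, out_dim, hidden=16):
        super().__init__()
        # One learned transformation matrix per (higher, lower) capsule pair.
        self.W = nn.Parameter(0.01 * torch.randn(out_caps, in_caps, out_dim, in_dim))
        # Tiny subnetwork: vote vector -> scalar coupling coefficient in (0, 1).
        self.score = nn.Sequential(
            nn.Linear(out_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
            nn.Sigmoid(),
        )

    def forward(self, u):
        # u: (batch, in_caps, in_dim) lower-level capsule poses.
        # votes[b, o, i] = W[o, i] @ u[b, i] -> (batch, out_caps, in_caps, out_dim)
        votes = torch.einsum('oijd,bid->boij', self.W, u)
        # Coupling coefficients: one sigmoid score per vote, no iterative refinement.
        c = self.score(votes)              # (batch, out_caps, in_caps, 1)
        s = (c * votes).sum(dim=2)         # weighted sum over lower-level capsules
        return squash(s)                   # (batch, out_caps, out_dim)


# Toy usage: 32 lower-level capsules of size 8 routed to 10 capsules of size 16.
routing = NarrowedAttentionRouting(in_caps=32, out_caps=10, in_dim=8, out_dim=16)
v = routing(torch.randn(4, 32, 8))
print(v.shape)  # torch.Size([4, 10, 16])
```

Unlike softmax-normalised dynamic routing, the sigmoid gate scores each vote independently, which matches the abstract's claim that routing decisions become concise and interpretable while requiring only a single forward pass.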
Keywords
capsule, attention, networks