HATS: A Hierarchical Sequence-Attention Framework for Inductive Set-of-Sets Embeddings

KDD '19: The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Anchorage, AK, USA, August 2019.

Abstract
In many complex domains, the input data are often not suited for the typical vector representations used in deep learning models. For example, in relational learning and computer vision tasks, the data are often better represented as sets (e.g., the neighborhood of a node, a cloud of points). In these cases, a key challenge is to learn an embedding function that is invariant to permutations of the input. While there has been some recent work on principled methods for learning permutation-invariant representations of sets, these approaches are limited in their applicability to set-of-sets (SoS) tasks, such as subgraph prediction and scene classification. In this work, we develop a deep neural network framework to learn inductive SoS embeddings that are invariant to SoS permutations, i.e., to reorderings of the elements within each set and of the sets themselves. Specifically, we propose HATS, a hierarchical sequence model with attention mechanisms for inductive set-of-sets embeddings. We develop stochastic optimization and inference methods for learning HATS, and our experiments demonstrate that HATS achieves superior performance across a wide range of set-of-sets tasks.
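To make the two-level idea concrete, below is a minimal PyTorch sketch of a hierarchical sequence-attention encoder in the spirit of the abstract: an inner LSTM with attention pooling embeds each set, and an outer LSTM with attention pooling embeds the collection of set embeddings. This is an illustrative sketch, not the authors' implementation; all module names, dimensions, and the permutation-averaging trick at the end are hypothetical assumptions, since LSTMs are order-sensitive and the paper's actual invariance mechanism comes from its stochastic optimization and inference procedures.

```python
# Hypothetical sketch of a two-level (hierarchical) sequence-attention
# encoder for set-of-sets inputs. Not the authors' architecture.
import torch
import torch.nn as nn


class AttentionPool(nn.Module):
    """Attention-weighted pooling over a sequence of hidden states."""

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (seq_len, dim) -> attention weights: (seq_len, 1)
        w = torch.softmax(self.score(h), dim=0)
        return (w * h).sum(dim=0)  # pooled embedding: (dim,)


class SoSEncoder(nn.Module):
    """Inner LSTM + attention embeds each set; outer LSTM + attention
    embeds the resulting set of set embeddings."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.inner_lstm = nn.LSTM(in_dim, hid_dim)
        self.inner_pool = AttentionPool(hid_dim)
        self.outer_lstm = nn.LSTM(hid_dim, hid_dim)
        self.outer_pool = AttentionPool(hid_dim)

    def forward(self, sos: list) -> torch.Tensor:
        # sos: list of tensors, each (set_size_i, in_dim), one per inner set.
        set_embs = []
        for s in sos:
            h, _ = self.inner_lstm(s.unsqueeze(1))          # (n_i, 1, hid)
            set_embs.append(self.inner_pool(h.squeeze(1)))  # (hid,)
        outer = torch.stack(set_embs)                       # (num_sets, hid)
        h, _ = self.outer_lstm(outer.unsqueeze(1))
        return self.outer_pool(h.squeeze(1))                # (hid,)


# Usage: since an LSTM is order-sensitive, approximate permutation
# invariance here by averaging over a few random input orderings.
enc = SoSEncoder(in_dim=8, hid_dim=16)
sos = [torch.randn(5, 8), torch.randn(3, 8)]
embs = [enc([s[torch.randperm(len(s))] for s in sos]) for _ in range(4)]
z = torch.stack(embs).mean(dim=0)  # approximately order-insensitive embedding
```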
Keywords
hierarchical attention, inductive embedding, long short-term memory network, permutation invariance, sequence model, set-of-sets