Triplet Interaction Improves Graph Transformers: Accurate Molecular Graph Learning with Triplet Graph Transformers
CoRR (2024)
Abstract
Graph transformers typically lack direct pair-to-pair communication, instead
forcing neighboring pairs to exchange information via a common node. We propose
the Triplet Graph Transformer (TGT) that enables direct communication between
two neighboring pairs in a graph via novel triplet attention and aggregation
mechanisms. TGT is applied to molecular property prediction by first predicting
interatomic distances from 2D graphs and then using these distances for
downstream tasks. A novel three-stage training procedure and stochastic
inference further improve training efficiency and model performance. Our model
achieves new state-of-the-art (SOTA) results on open challenge benchmarks
PCQM4Mv2 and OC20 IS2RE. We also obtain SOTA results on QM9, MOLPCBA, and
LIT-PCBA molecular property prediction benchmarks via transfer learning. We
also demonstrate the generality of TGT with SOTA results on the traveling
salesman problem (TSP).
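The abstract does not give the exact formulation of triplet attention, but the core idea is pair-to-pair communication: the embedding of pair (i, j) is updated by attending over intermediate nodes k, mixing the states of the neighboring pairs (i, k) and (k, j). A minimal NumPy sketch of that general pattern is below; the projection matrices and the residual update are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def triplet_attention(E):
    """Sketch of pair-to-pair (triplet) attention.

    E: (n, n, d) array of pair embeddings e_ij.
    Pair (i, j) attends over intermediate nodes k, scoring against
    pair (i, k) and aggregating values from pair (k, j).
    Returns an updated (n, n, d) array.
    """
    n, _, d = E.shape
    rng = np.random.default_rng(0)
    # Hypothetical learned projections (random here for illustration).
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q = E @ Wq  # queries from pair (i, j)
    K = E @ Wk  # keys from pair (i, k)
    V = E @ Wv  # values from pair (k, j)
    # scores[i, j, k] = <q_ij, k_ik> / sqrt(d)
    scores = np.einsum('ijd,ikd->ijk', Q, K) / np.sqrt(d)
    # Softmax over the intermediate node k.
    A = np.exp(scores - scores.max(axis=-1, keepdims=True))
    A /= A.sum(axis=-1, keepdims=True)
    # Aggregate values from the complementary pair (k, j).
    out = np.einsum('ijk,kjd->ijd', A, V)
    return E + out  # residual update (an assumption of this sketch)
```

In the real model these updates run per attention head inside each transformer layer; the sketch collapses that to a single head to show only the triplet routing of information.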