When Transformer Meets Graph Neural Networks

Semantic Scholar (2021)

Abstract
In this paper, we present our solution to the OGB Large-Scale Challenge (OGB-LSC) at KDD Cup 2021. We mainly use three types of models: (1) a standard Transformer; (2) a two-branch Transformer, where one branch performs regression and the other classification, and the two branches learn from each other; (3) the GIN models with virtual nodes provided by the organizers. The Transformer models take raw SMILES sequences as input, and the GIN models take the 2D graphs obtained through RDKit as input. We first verify our proposed network architecture and tune the hyperparameters according to performance on the official validation sets, and then merge all validation data into the training sets to finalize the models. We obtain an MAE of 0.1253 on the test set. Our code is available at https://github.com/TransfromerMeetsGraph/GNNLearner.
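
The abstract states that the GIN models consume 2D graphs obtained from SMILES strings through RDKit. The snippet below is a minimal sketch of that conversion step, not the authors' exact pipeline: the choice of atom features (atomic number, formal charge, degree) is an illustrative assumption, and the real solution likely uses a richer featurization.

```python
# Minimal sketch: SMILES -> simple 2D molecular graph via RDKit.
# Feature choices are illustrative assumptions, not the paper's exact setup.
from rdkit import Chem


def smiles_to_graph(smiles: str):
    """Parse a SMILES string and return per-atom features and directed edges."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"RDKit could not parse SMILES: {smiles!r}")

    # Node features: one tuple per atom (atomic number, formal charge, degree).
    atom_features = [
        (atom.GetAtomicNum(), atom.GetFormalCharge(), atom.GetDegree())
        for atom in mol.GetAtoms()
    ]

    # Each undirected bond becomes two directed edges, the layout most
    # GNN libraries (and GIN implementations) expect.
    edges = []
    for bond in mol.GetBonds():
        i, j = bond.GetBeginAtomIdx(), bond.GetEndAtomIdx()
        edges.append((i, j))
        edges.append((j, i))

    return atom_features, edges


if __name__ == "__main__":
    feats, edges = smiles_to_graph("CCO")  # ethanol
    print(len(feats), "atoms,", len(edges), "directed edges")
```

A graph built this way can be fed to any message-passing model such as GIN; the Transformer branches of the solution bypass this step entirely and operate on the raw SMILES token sequence.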