CasANGCL: pre-training and fine-tuning model based on cascaded attention network and graph contrastive learning for molecular property prediction

Zixi Zheng, Yanyan Tan, Hong Wang, Shengpeng Yu, Tianyu Liu, Cheng Liang

BRIEFINGS IN BIOINFORMATICS (2023)

Abstract
Motivation: Molecular property prediction is a core requirement in AI-driven drug design and discovery, aiming to predict molecular property information (e.g. toxicity) from mined biomolecular knowledge. Although graph neural networks have proven powerful for predicting molecular properties, imbalanced labeled data and poor generalization to newly synthesized molecules remain key issues that hinder further improvement of molecular encoding performance. Results: We propose a novel self-supervised representation learning scheme based on a Cascaded Attention Network and Graph Contrastive Learning (CasANGCL). We design a new graph network variant, designated the cascaded attention network, to encode local-global molecular representations. We construct a two-stage contrast predictor framework, an integrated end-to-end learning scheme, to tackle the label imbalance of training molecular samples. Moreover, we train our network with an information-flow scheme that explicitly captures edge information in the node/graph representations and obtains more fine-grained knowledge. Our model achieves an average ROC-AUC of 81.9% on 661 tasks from seven challenging benchmarks, showing better portability and generalization. Further visualization studies indicate our model's stronger representation capacity and provide interpretability.
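The abstract does not spell out the contrastive objective used in CasANGCL's pre-training stage. A common choice in graph contrastive learning is the NT-Xent loss, which pulls together embeddings of two augmented views of the same molecular graph and pushes apart embeddings of different graphs. The sketch below is a minimal NumPy illustration of that standard loss, not the authors' exact implementation; the function name, temperature value, and embedding shapes are assumptions for illustration.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss over two views of a batch of graphs.

    z1, z2: (N, d) arrays of graph-level embeddings, where row i of z1
    and row i of z2 come from two augmentations of the same graph.
    This is the generic loss used in much graph contrastive learning;
    the paper's precise objective may differ.
    """
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)               # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine-similarity space
    sim = (z @ z.T) / temperature                      # (2N, 2N) scaled similarities
    np.fill_diagonal(sim, -np.inf)                     # exclude self-comparisons
    # The positive for row i is its other view: i+N (first half) or i-N (second half).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    # Cross-entropy of each row against its positive index.
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return float(-log_prob[np.arange(2 * n), pos].mean())
```

As a sanity check, embeddings from perfectly aligned views should yield a lower loss than independent random embeddings, since the positive-pair similarity dominates the softmax.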
Keywords
molecular representation,cascaded attention network,graph contrastive learning,self-supervised learning,molecular property prediction