Crystal Transformer Based Universal Atomic Embedding for Accurate and Transferable Prediction of Materials Properties

Luozhijie Jin, Zijian Du, Le Shu, Yongfeng Mei, Hao Zhang

arXiv (2024)

Abstract
In this work, we propose a novel approach to generating universal atomic embeddings, significantly enhancing the representational capacity and accuracy of atomic embeddings and thereby improving the accuracy of property prediction. We also demonstrate the strong transferability of these universal atomic embeddings across different databases and property tasks. Our approach centers on the CrystalTransformer model. Unlike traditional methods, this model has no underlying graph network architecture; instead, it uses the Transformer architecture to extract latent atomic features. This allows CrystalTransformer to mitigate the inherent topological-information bias of graph neural networks while maximally preserving atomic chemical information, making it more accurate in encoding complex atomic features and offering a deeper understanding of the atoms in materials. We highlight the advantages of CrystalTransformer in generating universal atomic embeddings through comparisons with current mainstream graph neural network models. Furthermore, we validate the effectiveness of universal atomic embeddings in improving the accuracy of property predictions and demonstrate their transferability across different databases and property tasks through various experiments. As another key aspect of our study, we uncover the strong physical interpretability embedded in the universal atomic embeddings through clustering and correlation analysis, indicating their great potential as atomic fingerprints.
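The abstract's central idea is to replace a graph network's message passing with Transformer-style self-attention over per-atom features, yielding one latent embedding per atom. The paper's own architecture and code are not reproduced here; the following is only a minimal numpy sketch of a single self-attention layer, with hypothetical dimensions and random weights, to illustrate how attention can mix atomic features into context-aware embeddings without any explicit graph topology.

```python
import numpy as np

def self_attention_embed(X, Wq, Wk, Wv):
    """Map per-atom features X (n_atoms, d_in) to latent embeddings (n_atoms, d_out).

    Every atom attends to every other atom, so no bond graph or adjacency
    structure is required -- the point made in the abstract.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])            # scaled pairwise logits
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)      # row-wise softmax
    return weights @ V                                 # context-mixed embeddings

# Hypothetical sizes, purely for illustration.
rng = np.random.default_rng(0)
n_atoms, d_in, d_out = 4, 8, 16
X = rng.normal(size=(n_atoms, d_in))                   # stand-in atom features
Wq, Wk, Wv = (rng.normal(size=(d_in, d_out)) for _ in range(3))
emb = self_attention_embed(X, Wq, Wk, Wv)
print(emb.shape)  # (4, 16): one latent embedding per atom
```

In the actual model these embeddings would be trained end-to-end on property-prediction targets and then reused across databases and tasks, which is the "universal" aspect the abstract emphasizes.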