Post-Pretraining Large Language Model Enabled Reverse Design of MOFs for Hydrogen Storage

Zhimeng Liu, Yuqiao Su, Yujie Guo, Jing Lin, Shulin Wang, Zeyang Song, Zuoshuai Xi, Hongyi Gao, Lei Shi, Ge Wang

Crossref (2024)

Abstract. Although large language models (LLMs) have achieved remarkable performance in general domains, they still face significant challenges when applied to specialized problems in fields such as materials science. In this study, we enhance the performance of LLMs in the specific field of metal-organic frameworks (MOFs) for hydrogen storage by employing a post-pretraining approach that customizes the LLM with domain-specific learning. By incorporating a comprehensive dataset comprising more than 2,000 MOF structures, over 7,000 related scientific papers, and a corpus exceeding 210 million tokens of specialized materials and chemical knowledge, we developed a domain-specific LLM for MOFs, referred to as MOFs-LLM. Through supervised fine-tuning, we unlocked the potential of MOFs-LLM in various tasks, including performance prediction, reverse design, mechanistic studies, and application-prospect analysis, with a specific focus on hydrogen storage material design challenges. In the practical application of reverse design, we used MOFs-LLM to mutate numerous ligands and select suitable building blocks, yielding a structural space of more than 100,000 MOFs, from which a MOF structure with highly promising hydrogen storage performance was ultimately identified. This work demonstrates the successful application of LLMs in a specific materials science domain and provides a methodological pathway that can serve as a valuable reference for future research.
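The reverse-design workflow described above (LLM-proposed ligand mutations, assembly with candidate building blocks, and screening of the resulting structural space by predicted hydrogen uptake) can be sketched as a generate-and-rank loop. Everything below is an illustrative stand-in, assuming hypothetical `mutate_ligand` and `predict_h2_uptake` functions in place of MOFs-LLM's actual generation and prediction steps; it is not the authors' pipeline.

```python
import random

def mutate_ligand(ligand: str, rng: random.Random) -> str:
    """Stand-in for an LLM-proposed ligand mutation, e.g. a
    functional-group substitution (hypothetical, not MOFs-LLM)."""
    substituents = ["F", "Cl", "NH2", "OH", "CH3"]
    return f"{ligand}-{rng.choice(substituents)}"

def predict_h2_uptake(mof: tuple[str, str]) -> float:
    """Stand-in for a property predictor returning uptake in wt%;
    the paper uses MOFs-LLM for this role. Toy score in [0, 10)."""
    return (hash(mof) % 1000) / 100.0

def reverse_design(seed_ligands, metal_nodes, n_mutations=10, top_k=5, seed=0):
    """Enumerate (node, mutated-ligand) candidates and keep the
    top_k by predicted hydrogen uptake."""
    rng = random.Random(seed)
    candidates = []
    for ligand in seed_ligands:
        for _ in range(n_mutations):
            variant = mutate_ligand(ligand, rng)
            for node in metal_nodes:
                mof = (node, variant)
                candidates.append((predict_h2_uptake(mof), mof))
    candidates.sort(key=lambda c: c[0], reverse=True)
    return candidates[:top_k]

top = reverse_design(["BDC", "BTC"], ["Zn4O", "Cu2"], n_mutations=20)
for score, (node, ligand) in top:
    print(f"{node}/{ligand}: predicted uptake {score:.2f} wt%")
```

At the paper's scale the same loop would run over thousands of seed ligands to span the reported space of over 100,000 MOFs, with the top-ranked structures passed to validation.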