ChipNeMo: Domain-Adapted LLMs for Chip Design

Mingjie Liu, Teodor-Dumitru Ene, Robert Kirby, Chris Cheng, Nathaniel Pinckney, Rongjian Liang, Jonah Alben, Himyanshu Anand, Sanmitra Banerjee, Ismet Bayraktaroglu, Bonita Bhaskaran, Bryan Catanzaro, Arjun Chaudhuri, Sharon Clay, Bill Dally, Laura Dang, Parikshit Deshpande, Siddhanth Dhodhi, Sameer Halepete, Eric Hill, Jiashang Hu, Sumit Jain, Ankit Jindal, Brucek Khailany, George Kokai, Kishor Kunal, Xiaowei Li, Charley Lind, Hao Liu, Stuart Oberman, Sujeet Omar, Ghasem Pasandi, Sreedhar Pratty, Jonathan Raiman, Ambar Sarkar, Zhengjiang Shao, Hanfei Sun, Pratik P Suthar, Varun Tej, Walker Turner, Kaizhe Xu, Haoxing Ren

arXiv (2023)

Abstract
ChipNeMo aims to explore the applications of large language models (LLMs) for industrial chip design. Instead of directly deploying off-the-shelf commercial or open-source LLMs, we adopt the following domain adaptation techniques: domain-adaptive tokenization, domain-adaptive continued pretraining, model alignment with domain-specific instructions, and domain-adapted retrieval models. We evaluate these methods on three selected LLM applications for chip design: an engineering assistant chatbot, EDA script generation, and bug summarization and analysis. Our evaluations demonstrate that domain-adaptive pretraining of language models can lead to superior performance on domain-related downstream tasks compared to their base LLaMA2 counterparts, without degradation in generic capabilities. In particular, our largest model, ChipNeMo-70B, outperforms the highly capable GPT-4 on two of our use cases, namely the engineering assistant chatbot and EDA script generation, while exhibiting competitive performance on bug summarization and analysis. These results underscore the potential of domain-specific customization for enhancing the effectiveness of large language models in specialized applications.
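
The abstract names domain-adaptive tokenization as one of its techniques without describing the procedure. As a rough illustration only (not the paper's actual method), the sketch below shows one common way to adapt a pretrained tokenizer to a new domain using a HuggingFace-style API: domain terms are registered as new tokens and the embedding matrix is resized so their vectors can be learned during continued pretraining. The model name and token list are hypothetical.

```python
# Minimal sketch of domain-adaptive tokenization, assuming a
# HuggingFace-style workflow; ChipNeMo's exact procedure is not
# specified in the abstract.
from transformers import AutoTokenizer, AutoModelForCausalLM

# Hypothetical base model; ChipNeMo builds on LLaMA2 foundation models.
base = "meta-llama/Llama-2-7b-hf"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# Hypothetical chip-design terms that a general-purpose tokenizer
# would otherwise fragment into many subword pieces.
domain_tokens = ["clock_gating", "scan_chain", "netlist", "floorplan"]
tokenizer.add_tokens(domain_tokens)

# Grow the embedding matrix so each new token gets a trainable vector;
# these vectors would then be learned during domain-adaptive
# continued pretraining on chip-design data.
model.resize_token_embeddings(len(tokenizer))
```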