DDK: Distilling Domain Knowledge for Efficient Large Language Models

Jiaheng Liu, Chenchen Zhang, Jinyang Guo, Yuanxing Zhang, Haoran Que, Ken Deng, Zhiqi Bai, Jie Liu, Ge Zhang, Jiakai Wang, Yanan Wu, Congnan Liu, Jiamang Wang, Lin Qu, Wenbo Su, Bo Zheng

NeurIPS 2024 (2024)

Keywords: Knowledge Distillation, Large Language Models, Model Acceleration