Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Vol. 1
Abstract
It has been commonly observed that a teacher model with superior performance
does not necessarily result in a stronger student, highlighting a discrepancy
between current teacher training practices and effective knowledge transfer. In
order to enhance the guidance of the teacher training process, we introduce the
concept of distillation influence to determine the impact of distillation from
each training sample on the student's generalization ability. In this paper, we
propose Learning Good Teacher Matters (LGTM), an efficient training technique
for incorporating distillation influence into the teacher's learning process.
By prioritizing samples that are likely to enhance the student's generalization
ability, our LGTM outperforms 10 common knowledge distillation baselines on 6
text classification tasks in the GLUE benchmark.
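The core idea of weighting distillation by each sample's expected benefit to the student can be illustrated with a minimal sketch. This is not the paper's LGTM algorithm: the function `weighted_kd_loss` and the per-sample `weights` (standing in for distillation-influence scores) are hypothetical names introduced here for illustration, assuming the standard temperature-scaled KL formulation of knowledge distillation.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax over a single logit row."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def weighted_kd_loss(student_logits, teacher_logits, weights, T=2.0):
    """Per-sample KL(teacher || student) on temperature-softened
    distributions, weighted by a per-sample influence score.

    `weights` is a stand-in for distillation-influence estimates:
    samples believed to help the student's generalization get
    larger weight, so the teacher/student focus on them more.
    """
    total = 0.0
    for s_row, t_row, w in zip(student_logits, teacher_logits, weights):
        p = softmax(t_row, T)  # teacher distribution
        q = softmax(s_row, T)  # student distribution
        kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
        total += w * kl * (T * T)  # T^2 rescaling, as in standard KD
    return total / len(weights)
```

With all weights equal, this reduces to the usual distillation objective; non-uniform weights tilt training toward the samples judged most useful for the student.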
Keywords
learning levels, knowledge, student, instructions