Unity is Strength: Cross-Task Knowledge Distillation to Improve Code Review Generation

CoRR (2023)

Abstract
Code review is a fundamental process in software development that plays a critical role in ensuring code quality and reducing the likelihood of errors and bugs. However, code review can be complex, subjective, and time-consuming. Comment generation and code refinement are two key tasks of this process, and their automation has traditionally been addressed separately in the literature using different approaches. In this paper, we propose a novel deep-learning architecture, DISCOREV, based on cross-task knowledge distillation, that addresses these two tasks simultaneously. In our approach, the fine-tuning of the comment generation model is guided by the code refinement model. We implemented this guidance using two strategies: a feedback-based learning objective and an embedding alignment objective. We evaluated our cross-task knowledge distillation approach by comparing it to state-of-the-art methods based on independent training and fine-tuning. Our results show that our approach generates better review comments as measured by the BLEU score.
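
The training scheme described in the abstract, one model's fine-tuning guided by another through an embedding alignment term, can be sketched as a combined loss. The snippet below is a minimal illustration, not the authors' implementation: it assumes HuggingFace-style seq2seq models with matching hidden sizes (e.g., two CodeT5 checkpoints), and the function name, mean pooling, and the weight `lam` are all assumptions made for the example.

```python
import torch
import torch.nn.functional as F

def distillation_step(comment_model, refinement_model, batch, lam=0.5):
    """One training step combining comment generation loss with an
    embedding alignment objective (hypothetical sketch)."""
    # Standard cross-entropy loss for generating the review comment.
    out = comment_model(
        input_ids=batch["code_ids"],
        labels=batch["comment_ids"],
        output_hidden_states=True,
    )
    generation_loss = out.loss

    # Pool the decoder's last hidden states into one embedding per example.
    student_emb = out.decoder_hidden_states[-1].mean(dim=1)

    # The code refinement model acts as a frozen guide; its gradients
    # are not needed for updating the comment generation model.
    with torch.no_grad():
        teacher_out = refinement_model(
            input_ids=batch["code_ids"],
            labels=batch["refined_code_ids"],
            output_hidden_states=True,
        )
        teacher_emb = teacher_out.decoder_hidden_states[-1].mean(dim=1)

    # Embedding alignment objective: pull the comment model's
    # representation toward the refinement model's representation.
    alignment_loss = F.mse_loss(student_emb, teacher_emb)

    return generation_loss + lam * alignment_loss
```

Freezing the guiding model during this step is one plausible design choice; the paper's actual objectives (including the feedback-based variant) may couple the two models differently.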
Keywords
review, code, knowledge, unity, cross-task