Enhancing Code Prediction Transformer with AST Structural Matrix

Yongyue Yang, Liting Huang, Chunyang Ye, Fenghang Li, Hui Zhou

2023 IEEE 23rd International Conference on Software Quality, Reliability, and Security (QRS)(2023)

Abstract
Deep learning Transformer architectures play a critical role in developing advanced code prediction models, which are essential in modern Integrated Development Environments (IDEs). Nevertheless, these architectures struggle to effectively capture and utilize the structural information present in Abstract Syntax Trees (ASTs). To tackle this challenge, we propose an approach that leverages AST structural matrices to enhance Transformers for source code prediction. Specifically, we integrate three types of AST structural matrices - the R matrix, A&S matrix, and MVG matrix - into the attention module of the Transformer to capture the structural information within ASTs. To ensure optimal utilization, we devise a range of strategies and integration methods tailored to the attention module. To assess the effectiveness of our proposal, we conduct empirical studies on a standard Python dataset. The results demonstrate that incorporating AST structural matrices significantly enhances the accuracy of code prediction models, improving overall accuracy from 73.18% to 75.10%. Furthermore, we conduct an in-depth analysis of the impact of each matrix type, offering a comprehensive understanding of their application scenarios and providing guidance on how to leverage each matrix type in different contexts.
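The core idea of injecting a structural matrix into the Transformer's attention module can be illustrated with a minimal sketch. This is not the paper's actual implementation: the construction of the R, A&S, and MVG matrices and the specific integration strategies are described in the paper itself; the sketch below assumes the simplest integration, an additive pairwise bias added to the attention scores before the softmax.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structure_aware_attention(Q, K, V, S):
    """Scaled dot-product attention with an additive structural bias S.

    S[i, j] is assumed to encode a pairwise AST relation between nodes
    i and j (e.g. a relation, ancestor/sibling, or graph-based score,
    in the spirit of the R, A&S, and MVG matrices).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n, n) token-pair similarities
    scores = scores + S              # inject AST structural information
    return softmax(scores, axis=-1) @ V

# Toy example: 4 AST nodes with 8-dim embeddings and a hypothetical
# structural matrix that strongly links nodes 0 and 1.
rng = np.random.default_rng(0)
n, d = 4, 8
Q, K, V = rng.normal(size=(3, n, d))
S = np.zeros((n, n))
S[0, 1] = S[1, 0] = 2.0
out = structure_aware_attention(Q, K, V, S)
print(out.shape)  # (4, 8)
```

Because the bias is added before the softmax, entries of S act multiplicatively on the attention weights, so structurally related node pairs attend to each other more strongly without changing the rest of the architecture.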
Keywords
AST, code prediction, transformer