TCNAS: Transformer Architecture Evolving in Code Clone Detection

ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Abstract
Code clone detection aims to find code fragments with syntactic or semantic similarity. Most current approaches focus on detecting syntactic similarity while ignoring long-term semantic context alignment, and they encode the source code with human-designed models, a process that requires expert input and significant time for experimentation and refinement. To address these challenges, we introduce Transformer Code Neural Architecture Search (TCNAS), an approach designed to optimize transformer-based architectures for clone detection. In TCNAS, all channels are trained and evaluated equitably to enhance search efficiency. In addition, we incorporate the code's dataflow by extracting semantic information from the code fragments. TCNAS discovers an optimal model structure geared towards detection, eliminating the need for manual design, and the searched architecture is then used to classify code pairs. We conduct empirical experiments on a benchmark covering all four types of code clones. The results demonstrate that our approach consistently yields competitive detection scores across a range of evaluations.
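To make the search idea concrete, below is a minimal sketch of evolutionary architecture search over transformer encoder hyperparameters for code-pair classification. This is not the paper's implementation: the search space, the Genome and CloneClassifier names, and the placeholder fitness function (which ranks candidates by parameter count instead of actually training and measuring validation F1) are all illustrative assumptions.

```python
# Hypothetical sketch of evolutionary NAS over transformer hyperparameters.
# Not the TCNAS implementation; all names and the fitness proxy are assumptions.
import random
from dataclasses import dataclass

import torch
import torch.nn as nn

# Assumed search space: encoder depth, attention heads, and model width.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "num_heads": [2, 4, 8],
    "d_model": [128, 256, 512],
}

@dataclass(frozen=True)
class Genome:
    num_layers: int
    num_heads: int
    d_model: int

def random_genome() -> Genome:
    return Genome(*(random.choice(v) for v in SEARCH_SPACE.values()))

def mutate(g: Genome) -> Genome:
    """Resample one hyperparameter at random."""
    fields = dict(vars(g))
    key = random.choice(list(SEARCH_SPACE))
    fields[key] = random.choice(SEARCH_SPACE[key])
    return Genome(**fields)

class CloneClassifier(nn.Module):
    """Transformer encoder over token IDs with a binary clone/not-clone head."""
    def __init__(self, g: Genome, vocab_size: int = 10_000):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, g.d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=g.d_model, nhead=g.num_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=g.num_layers)
        self.head = nn.Linear(g.d_model, 2)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        h = self.encoder(self.embed(tokens))
        return self.head(h.mean(dim=1))  # mean-pool over tokens, then classify

def evaluate_fitness(g: Genome) -> float:
    """Placeholder: a real search would train briefly and return validation F1."""
    model = CloneClassifier(g)
    n_params = sum(p.numel() for p in model.parameters())
    return -float(n_params)  # stand-in objective: prefer smaller models

# Simple (mu + lambda)-style evolutionary loop over candidate architectures.
population = [random_genome() for _ in range(8)]
for generation in range(5):
    scored = sorted(population, key=evaluate_fitness, reverse=True)
    parents = scored[:4]                       # keep the fittest half
    population = parents + [mutate(random.choice(parents)) for _ in range(4)]

print("best genome:", max(population, key=evaluate_fitness))
```

In a realistic setting, evaluate_fitness would be the expensive step (a short training run per candidate), which is why weight-sharing schemes that train all channels equitably, as the abstract describes, are used to avoid retraining each candidate from scratch.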
Keywords
Code clone detection, deep learning, Transformer, neural architecture search