Improving domain-specific neural code generation with few-shot meta-learning

Information and Software Technology (2024)

Abstract
Context: Neural code generation aims to automatically generate code snippets guided by Natural Language Descriptions (NLDs). In recent years, various neural code generation models for mainstream Programming Languages (PLs), such as Java and Python, have been proposed and have demonstrated significant success in prior studies. Nonetheless, due to the scarcity of available training examples for some domain-specific PLs, such as Solidity, Bash, and Clojure, simply adopting previous neural models may lead to overfitting and inadequate learning.

Objective: To overcome this challenge, we propose MetaCoder, a novel meta-learning code generation approach that efficiently extracts general-purpose knowledge from a large-scale source language and rapidly adapts to domain-specific scenarios, even with relatively few samples.

Method: MetaCoder employs MAML, a powerful few-shot meta-learning method, to construct a transfer learning framework that learns general-purpose knowledge from large-scale source languages and applies it to domain-specific target languages. To acquire more general-purpose knowledge, heterogeneous sub-tasks are constructed from the source language during the pre-training phase of MAML: combining CodeBERT and K-means, we design an unsupervised category assignment method for code generation samples, and then exploit the n-way k-shot rule to construct the heterogeneous sub-tasks. As a result, MetaCoder can be applied to the code generation field.

Results: We evaluate MetaCoder with both tree-based (e.g., TreeGen) and sequence-based (e.g., CodeGPT) backbones on two domain-specific PLs, Solidity and Bash. Extensive experiments demonstrate that our approach outperforms the baselines, and qualitative examples verify its code generation capability in practice.

Conclusion: MetaCoder effectively extracts general-purpose knowledge from large-scale source languages, thereby enhancing model performance. Therefore, we highly recommend MetaCoder as a code generation approach for domain-specific PLs.
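The task-construction step described in the Method can be illustrated with a minimal sketch: samples are clustered into pseudo-categories (the paper uses CodeBERT embeddings with K-means; here random vectors and a plain NumPy K-means stand in for both), and then n categories with k support examples each are drawn to form one heterogeneous n-way k-shot sub-task. All names and parameter values below are illustrative, not the paper's actual implementation.

```python
import random
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain NumPy K-means as a stand-in for a library clusterer."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each sample to its nearest center.
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)
        # Recompute centers from the assigned samples.
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def sample_n_way_k_shot(labels, n, k, seed=0):
    """Draw one n-way k-shot sub-task: n pseudo-categories, k sample indices each."""
    rng = random.Random(seed)
    # Only categories with at least k members are eligible.
    eligible = sorted(c for c in set(labels.tolist()) if (labels == c).sum() >= k)
    chosen = rng.sample(eligible, n)
    return {c: rng.sample([i for i, l in enumerate(labels) if l == c], k)
            for c in chosen}

# Toy stand-in for CodeBERT embeddings of 60 NLD-code pairs (8-dim).
emb = np.random.default_rng(1).normal(size=(60, 8))
labels = kmeans(emb, k=5)
task = sample_n_way_k_shot(labels, n=3, k=4)  # one heterogeneous sub-task
```

Each such sub-task would then serve as one episode in MAML's pre-training phase; clustering replaces manual category labels, which is what makes the n-way k-shot rule usable for unlabeled code generation corpora.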
Keywords
Code generation, Few-shot learning, Meta-learning, Transfer learning