An Effective Low-Dimensional Software Code Representation using BERT and ELMo.

QRS (2022)

Abstract
Contextualised word representations (e.g., ELMo and BERT) have been shown to outperform static representations (e.g., Word2vec, FastText, and GloVe) on many NLP tasks. In this paper, we investigate the use of contextualised embeddings for code search and classification, an area that has received less attention. We construct CodeELMo by training ELMo from scratch and fine-tuning CodeBERT embeddings using masked language modeling on natural language (NL) texts related to software development concepts and programming language (PL) texts consisting of method–comment pairs from open-source codebases. The dimensionality of the fine-tuned CodeBERT embeddings is reduced using linear transformations and augmented with the CodeELMo representation to develop CodeELBE, a low-dimensional contextualised software code representation. Results on binary classification and retrieval tasks show that CodeELBE considerably improves retrieval performance on standard deep code search datasets compared to CodeBERT and baseline BERT models.
Keywords
Code Search, Contextualised Word Embeddings, ELMo, CodeBERT