Automated Intention Mining with Comparatively Fine-tuning BERT

2021 5th International Conference on Natural Language Processing and Information Retrieval (NLPIR), 2021

Abstract
In the field of software engineering, intention mining is an interesting but challenging task whose goal is to understand user-generated texts well enough to capture the requirements they express, which are useful for software maintenance and evolution. Recently, BERT and its variants have achieved state-of-the-art performance on various natural language processing tasks such as machine translation, machine reading comprehension, and natural language inference. However, few studies have investigated the efficacy of pre-trained language models on this task. In this paper, we present a new baseline based on a fine-tuned BERT model. Our method achieves state-of-the-art results on three benchmark data sets, outperforming prior baselines by a substantial margin. We further investigate the efficacy of the pre-trained BERT model at shallower network depths through a simple layer-selection strategy.
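The abstract describes two ingredients: fine-tuning BERT as a sequence classifier for intention categories, and evaluating shallower variants by selecting a subset of encoder layers. The sketch below illustrates one plausible realization using the Hugging Face transformers library; it is not the authors' released code, and the label set, layer count, and truncation-based layer-selection scheme are all illustrative assumptions.

```python
# Minimal sketch: fine-tuning BERT for intention classification with a
# reduced encoder depth. Assumptions (not from the paper): 4 intention
# labels, keeping the first 6 of BERT-base's 12 layers, a toy input.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

num_labels = 4   # assumed number of intention categories
num_layers = 6   # assumed shallower depth; BERT-base has 12 layers

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=num_labels
)

# Keep only the first `num_layers` Transformer blocks. This is one simple
# layer-selection strategy; the paper's exact scheme may differ.
model.bert.encoder.layer = model.bert.encoder.layer[:num_layers]
model.config.num_hidden_layers = num_layers

# One fine-tuning step on a single toy example.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
inputs = tokenizer(
    "Please add an option to disable auto-save.",  # hypothetical user text
    return_tensors="pt", truncation=True, padding=True,
)
labels = torch.tensor([1])  # hypothetical "feature request" label
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()
```

Truncating to the first k layers is only one way to build a shallower model; alternatives such as selecting every other layer, or the last k layers, fit the same interface by slicing `model.bert.encoder.layer` differently.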