Enhancing Relation Classification Using Focal Loss

2022 China Automation Congress (CAC)

Abstract
Relation classification (RC) is becoming an increasingly significant topic in natural language processing. However, public datasets suffer from class imbalance, which limits the performance and generalizability of models. In this paper, we propose an approach that integrates the bidirectional encoder representations from transformers (BERT) model with focal loss to alleviate the influence of the class imbalance problem in RC datasets. In this way, the model decreases the contribution of high-frequency and easy instances during training, allowing it to concentrate quickly on low-frequency and hard instances. Experiments on the SemEval-2010 Task 8 benchmark dataset show that our method achieves a considerable improvement over previous methods, with a Macro-F1 score of 90.62%.
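
To illustrate the loss described above, here is a minimal PyTorch sketch of a multi-class focal loss of the form FL(p_t) = -alpha_t (1 - p_t)^gamma log(p_t), where the (1 - p_t)^gamma factor down-weights easy, high-confidence examples. The class name FocalLoss, the default gamma = 2, and the toy 19-class batch are illustrative assumptions, not the authors' released implementation.

```python
from typing import Optional

import torch
import torch.nn as nn
import torch.nn.functional as F


class FocalLoss(nn.Module):
    """Multi-class focal loss: FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t).

    The (1 - p_t)^gamma factor shrinks the loss of easy, well-classified
    examples, so training gradients concentrate on hard and low-frequency
    classes. Setting gamma = 0 recovers standard cross-entropy.
    """

    def __init__(self, gamma: float = 2.0, alpha: Optional[torch.Tensor] = None):
        super().__init__()
        self.gamma = gamma   # focusing parameter (assumed default; tune per dataset)
        self.alpha = alpha   # optional per-class weights, shape (num_classes,)

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # log p_t of the gold class for each example: shape (batch,)
        log_pt = F.log_softmax(logits, dim=-1).gather(1, targets.unsqueeze(1)).squeeze(1)
        pt = log_pt.exp()
        loss = -((1.0 - pt) ** self.gamma) * log_pt
        if self.alpha is not None:
            loss = loss * self.alpha.to(logits.device)[targets]
        return loss.mean()


if __name__ == "__main__":
    # Toy check: batch of 4 examples over the 19 relation labels of SemEval-2010 Task 8.
    logits = torch.randn(4, 19)           # e.g. a classifier head on top of a BERT sentence encoding
    labels = torch.tensor([0, 3, 7, 18])
    criterion = FocalLoss(gamma=2.0)
    print(criterion(logits, labels).item())
```

In practice, such a loss simply replaces the cross-entropy criterion when fine-tuning the BERT-based classifier; no other change to the training loop is required.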
Keywords
Relation classification, Pre-trained language model, Focal loss