NTAM: A New Transition-Based Attention Model for Nested Named Entity Recognition.

Nan Gao, Bowei Yang, Yongjian Wang, Peng Chen

NLPCC (2) (2023)

Abstract
Traditional Named Entity Recognition (NER) research deals only with flat entities and ignores nested entities. Transition-based methods map a sentence to a designated forest and recognize nested entities by predicting an action sequence through a state transition system that consists of transition actions and a state of structures. However, each transition action depends on the preceding actions, which leads to error propagation, and the method ignores the correlation between the structures. To tackle these issues, we propose a new transition-based attention model (NTAM) for recognizing nested entities. First, the structures and transition actions of the state transition system are redefined to eliminate error propagation: the prediction of an action sequence is converted into the prediction of a series of states, each of which decides whether the words between the structures form an entity. Second, we introduce an attention mechanism that strengthens the association between the structures. Experiments on two public nested NER datasets show that NTAM outperforms previous state-of-the-art models.
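Below is a minimal sketch of the state-prediction step the abstract describes: a classifier that decides whether the words between two structures form an entity, with an attention layer linking the span to the structure representations. It assumes a PyTorch setup; the module name (StatePredictor), the choice of two structure vectors (e.g. a stack top and a buffer front), and the pooling layout are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch, NOT the authors' code: word and structure vectors are assumed
# to come from an external encoder; shapes and layer sizes are placeholders.
import torch
import torch.nn as nn


class StatePredictor(nn.Module):
    """Predicts, for one state, whether the words between two structures form an entity."""

    def __init__(self, hidden_dim: int = 256, num_labels: int = 2):
        super().__init__()
        # Attention that lets the candidate span attend to the structure vectors,
        # strengthening the association between span and structures.
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(hidden_dim * 2, num_labels)

    def forward(self, span_words: torch.Tensor, structures: torch.Tensor) -> torch.Tensor:
        # span_words:  (batch, span_len, hidden_dim)  words between the structures
        # structures:  (batch, n_struct, hidden_dim)  representations of the structures
        attended, _ = self.attn(span_words, structures, structures)
        span_repr = attended.mean(dim=1)      # pool the attended span words
        struct_repr = structures.mean(dim=1)  # pool the structure vectors
        return self.classifier(torch.cat([span_repr, struct_repr], dim=-1))


if __name__ == "__main__":
    model = StatePredictor()
    span = torch.randn(1, 5, 256)      # 5 candidate words between the structures
    structs = torch.randn(1, 2, 256)   # e.g. stack top and buffer front
    print(model(span, structs).shape)  # torch.Size([1, 2]) -> entity / non-entity logits
```

Because each state is classified independently from the current structures rather than from previously predicted actions, a sketch of this form avoids conditioning on earlier decisions, which is the error-propagation issue the abstract targets.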
Keywords
nested named entity recognition,entity recognition,attention model,transition-based