Position-aware self-attention based neural sequence labeling

Pattern Recognition (2021)

Citations 18 | Views 60
Abstract
• This paper identifies the problem of modeling discrete context dependencies in sequence labeling tasks.
• This paper develops a well-designed self-attentional context fusion network that provides complementary context information on top of a Bi-LSTM.
• This paper proposes a novel position-aware self-attention that incorporates three different positional factors to exploit relative position information among tokens.
• The proposed model achieves state-of-the-art performance on part-of-speech (POS) tagging, named entity recognition (NER), and phrase chunking tasks.
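The abstract does not spell out the paper's exact formulation of its three positional factors, but the core idea of position-aware self-attention can be illustrated with a minimal sketch: a single-head self-attention whose scores are shifted by a learned bias indexed by the clipped relative distance between tokens (in the style of relative-position attention). All names and shapes below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relative_position_attention(X, Wq, Wk, Wv, rel_bias, max_dist):
    """Single-head self-attention with a relative-position bias.

    X        : (n, d) token representations
    rel_bias : (2*max_dist + 1,) learned bias, one entry per
               clipped relative distance j - i
    (Hypothetical sketch -- not the paper's exact model.)
    """
    n, d = X.shape
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = (Q @ K.T) / np.sqrt(d)
    # Relative distance j - i for every token pair, clipped to
    # [-max_dist, max_dist] and shifted to a valid bias index.
    idx = np.clip(np.arange(n)[None, :] - np.arange(n)[:, None],
                  -max_dist, max_dist) + max_dist
    scores = scores + rel_bias[idx]  # the position-aware term
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
n, d, max_dist = 5, 8, 3
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
rel_bias = rng.normal(size=(2 * max_dist + 1,))
out = relative_position_attention(X, Wq, Wk, Wv, rel_bias, max_dist)
print(out.shape)  # (5, 8)
```

In a full labeling model, the output of such an attention layer would be fused with the Bi-LSTM representation before the tag classifier, which is the role the abstract assigns to the self-attentional context fusion network.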
Keywords
Sequence labeling, Self-attention, Discrete context dependency