Chinese Event Temporal Relation Extraction on Multi-Dimensional Attention.

IJCNN (2023)

Abstract
Extracting event temporal relations is an important task in natural language processing. It is even more challenging when no auxiliary annotated information, which is time-consuming and expensive to obtain, is used. We therefore propose ETEMA, an event temporal relation extraction model for Chinese text based on BERT and a multi-dimensional attention mechanism, which combines contextual-information interaction with tensor matching. Specifically, ETEMA uses BERT to mine the semantic information of event sentences, and uses attention together with this semantic information to interactively combine event information with its context. Experimental results on ACE2005-extended, a Chinese temporal-relation corpus, show that ETEMA achieves the best performance without using any auxiliary annotated information.
Keywords
event temporal relation extraction, multi-dimensional attention, neural network
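For orientation, below is a minimal PyTorch sketch of the kind of pipeline the abstract describes: BERT encoding of the event sentence, a multi-dimensional (feature-wise) attention pooling of each event span and its context, and a bilinear layer standing in for tensor matching before relation classification. This is an illustrative sketch assuming HuggingFace `transformers`; all class, parameter, and mask names (`MultiDimAttention`, `ETEMASketch`, `mask_e1`, `k`) are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class MultiDimAttention(nn.Module):
    """Feature-wise (multi-dimensional) attention: each hidden dimension of each
    token gets its own weight, instead of one scalar weight per token."""
    def __init__(self, hidden):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(hidden, hidden), nn.Tanh(),
                                  nn.Linear(hidden, hidden))

    def forward(self, states, mask):
        # states: (batch, seq, hidden); mask: (batch, seq), 1 for tokens to attend over
        scores = self.proj(states)                              # (batch, seq, hidden)
        scores = scores.masked_fill(mask.unsqueeze(-1) == 0, -1e9)
        weights = torch.softmax(scores, dim=1)                  # per-dimension weights over tokens
        return (weights * states).sum(dim=1)                    # (batch, hidden)

class ETEMASketch(nn.Module):
    """Toy pipeline in the spirit of the abstract: BERT -> multi-dimensional
    attention pooling of each event's context -> bilinear (tensor-style) matching."""
    def __init__(self, num_relations, bert_name="bert-base-chinese", k=64):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        h = self.bert.config.hidden_size
        self.attn = MultiDimAttention(h)
        self.tensor_match = nn.Bilinear(h, h, k)                # stand-in for tensor matching
        self.classifier = nn.Linear(k, num_relations)

    def forward(self, input_ids, attention_mask, mask_e1, mask_e2):
        # mask_e1 / mask_e2: (batch, seq) masks selecting each event and its context window
        states = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        e1 = self.attn(states, mask_e1)
        e2 = self.attn(states, mask_e2)
        return self.classifier(torch.tanh(self.tensor_match(e1, e2)))
```

The bilinear interaction is used here only as a compact proxy for the paper's tensor-matching step; the actual formulation may differ.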