Distantly Supervised Neural Network Model For Relation Extraction

Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data: 14th China National Conference, CCL 2015 and Third International Symposium, NLP-NABD 2015, Guangzhou, China, November 13-14, 2015, Proceedings (2015)

Abstract
For the task of relation extraction, distant supervision is an efficient approach to generate labeled data by aligning a knowledge base (KB) with free text. Although it scales easily to thousands of different relations, this procedure introduces wrong labels because a relation in the knowledge base may not be expressed by the aligned sentences (mentions). In this paper, we propose a novel approach that alleviates this problem of distant supervision through representation learning in a deep neural network framework. Our model - Distantly Supervised Neural Network (DSNN) - constructs a more powerful mention-level representation via a tensor-based transformation and further learns an entity-pair-level representation that aggregates and denoises the features of the associated mentions. With this denoised representation, all of the relation labels can be learned jointly. Experimental results show that, with minimal feature engineering, our model generally outperforms state-of-the-art methods for distantly supervised relation extraction.
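The abstract does not give the model's equations, but the two-stage idea it describes - a tensor-based transformation producing a mention-level vector, followed by an aggregation step that pools over all mentions of an entity pair - can be sketched roughly as follows. This is an illustrative sketch only, not the DSNN architecture: the bilinear tensor form, the max-pooling aggregator, and all names (`mention_representation`, `entity_pair_representation`, the dimensions `d` and `k`) are assumptions for the example.

```python
import numpy as np

def mention_representation(e1, e2, T):
    """Tensor-based transformation (illustrative): each slice T[k]
    scores a bilinear interaction between the two entity embeddings,
    yielding one component of the mention-level vector."""
    return np.tanh(np.array([e1 @ T[k] @ e2 for k in range(T.shape[0])]))

def entity_pair_representation(mentions):
    """Aggregate mention vectors with element-wise max pooling, one
    simple way to let strong mentions dominate noisy ones."""
    return np.max(np.stack(mentions), axis=0)

rng = np.random.default_rng(0)
d, k = 4, 3                         # embedding size, tensor slices (arbitrary)
T = rng.normal(size=(k, d, d))      # relation tensor (random, for illustration)
e1, e2 = rng.normal(size=d), rng.normal(size=d)

# Two mentions of the same entity pair, pooled into one denoised vector.
m1 = mention_representation(e1, e2, T)
m2 = mention_representation(e2, e1, T)
pair = entity_pair_representation([m1, m2])
print(pair.shape)  # (3,)
```

The pooled `pair` vector would then feed a classifier over all relation labels jointly; the actual DSNN aggregation and training objective are described in the paper itself.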