Text synthesis from keywords: a comparison of recurrent-neural-network-based architectures and hybrid approaches

Neural Computing and Applications (2019)

Abstract
This paper concerns an application of recurrent neural networks to text synthesis at the word level, with the help of keywords. First, a part-of-speech (POS) tagging library is employed to extract verbs and nouns from the texts used in our work; after automatic eliminations, a subset of these is retained as the aforementioned keywords. Our ultimate aim is to train a recurrent neural network to map the keyword sequence of a text to the entire text. Successive reformulations of the keyword and full-text word sequences are performed so that they can serve as the network's input and target as efficiently as possible. The predicted texts are reasonably understandable. Model performance depends on the problem difficulty, determined by the percentage of full-text words treated as keywords (approximately 1/3 to 1/2); on the training memory cost, which is mainly affected by the network architecture; and on the similarity between different texts, which determines the best-performing architecture.
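The abstract describes a keyword-extraction step based on POS tagging of verbs and nouns, but does not name the library used or the exact filtering rules. The following is a minimal sketch of such a step, assuming NLTK as the tagger and a simple tag-prefix filter; both choices are illustrative stand-ins, not the authors' method.

```python
# Hypothetical keyword-extraction sketch: keep nouns and verbs, in order of
# appearance, as candidate keywords. NLTK and the filtering heuristic are
# assumptions; the paper does not specify its POS-tagging library.
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)


def extract_keywords(text: str) -> list[str]:
    """Return the nouns and verbs of `text`, preserving word order."""
    tokens = nltk.word_tokenize(text)
    tagged = nltk.pos_tag(tokens)
    # Penn Treebank tags: NN* = nouns, VB* = verbs.
    return [word for word, tag in tagged if tag.startswith(("NN", "VB"))]


if __name__ == "__main__":
    sample = "The network learns to reconstruct the full text from its keywords."
    print(extract_keywords(sample))
    # e.g. ['network', 'learns', 'reconstruct', 'text', 'keywords']
```

In the paper's setting, the resulting keyword sequence would serve as the input of the recurrent network, with the full word sequence of the original text as the target.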
Keywords
Deep machine learning, Sequence modeling, Natural language processing, Text mining