Reinforced Zero-Shot Cross-Lingual Neural Headline Generation

IEEE/ACM Transactions on Audio, Speech, and Language Processing (2020)

Abstract
Cross-lingual neural headline generation (CNHG), which aims to train a single, large neural network that directly generates a target-language headline from a source-language news document, has received considerable attention in recent years. Unlike conventional neural headline generation, CNHG faces the problem that no large-scale parallel corpora of source-language articles and target-language headlines are available; CNHG is therefore a zero-shot scenario. To solve this problem, we propose zero-resource CNHG with reinforcement learning. We develop a reinforcement learning framework composed of two modules: a neural machine translation (NMT) module and a CNHG module. The translation module translates the input document into a target-language document, and the headline generation module takes that output as input to generate a target-language headline. Both modules then receive a reward for joint training. The experimental results show that our method significantly outperforms baseline models.
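To make the two-module pipeline concrete, the following is a minimal sketch of how an NMT module and a headline generation module could be sampled in sequence and jointly updated from one shared reward via a REINFORCE-style gradient. The tiny GRU encoder-decoders, the hyperparameters, and the placeholder reward_fn are illustrative assumptions, not the paper's actual architecture or reward design.

```python
# Hedged sketch: two sampled modules share one reward; both are updated
# with a REINFORCE-style loss. All module names and sizes are assumptions.
import torch
import torch.nn as nn

VOCAB, HIDDEN, MAX_LEN = 1000, 64, 20

class Seq2Seq(nn.Module):
    """Tiny encoder-decoder used as a stand-in for the NMT / CNHG modules."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HIDDEN)
        self.encoder = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.decoder = nn.GRUCell(HIDDEN, HIDDEN)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def sample(self, src):
        """Sample an output sequence; return tokens and their log-probs."""
        _, h = self.encoder(self.embed(src))
        h = h.squeeze(0)
        tok = torch.zeros(src.size(0), dtype=torch.long)  # <bos> token id 0
        tokens, log_probs = [], []
        for _ in range(MAX_LEN):
            h = self.decoder(self.embed(tok), h)
            dist = torch.distributions.Categorical(logits=self.out(h))
            tok = dist.sample()
            tokens.append(tok)
            log_probs.append(dist.log_prob(tok))
        return torch.stack(tokens, 1), torch.stack(log_probs, 1)

def reward_fn(headline, doc):
    """Placeholder reward; the paper's actual reward is not reproduced here."""
    return torch.rand(headline.size(0))

translator, headliner = Seq2Seq(), Seq2Seq()
optim = torch.optim.Adam(
    list(translator.parameters()) + list(headliner.parameters()), lr=1e-4)

src_doc = torch.randint(1, VOCAB, (4, 30))        # batch of source-language documents
trans_doc, lp_trans = translator.sample(src_doc)  # NMT module: source -> target language
headline, lp_head = headliner.sample(trans_doc)   # CNHG module: translated doc -> headline
reward = reward_fn(headline, src_doc)             # one scalar reward per example

# Joint training: the same reward weights the log-probs of both modules.
loss = -(reward.unsqueeze(1) * (lp_trans + lp_head)).sum(1).mean()
optim.zero_grad()
loss.backward()
optim.step()
```

In this sketch the translated document is a discrete sample, so gradients reach the translator only through its own log-probabilities, which is what makes the shared-reward REINFORCE update a natural fit for coupling the two modules.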
Keywords
Data models, Training data, Training, Learning (artificial intelligence), Speech processing, Neural networks, Task analysis, cross-lingual headline generation (CNHG), reinforcement learning