Attention With Sparsity Regularization for Neural Machine Translation and Summarization

IEEE/ACM Transactions on Audio, Speech, and Language Processing (2019)

Cited by 40 | Views 465
Abstract
The attention mechanism has become the de facto standard component in neural sequence-to-sequence tasks, such as machine translation and abstractive summarization. It dynamically determines which parts of the input sentence should be focused on when generating each word of the output sequence. Ideally, only a few relevant input words should be attended to at each decoding time step, and the attention...
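The truncated abstract describes attention concentrating on a few relevant source words per decoding step, and the keywords mention entropy. Below is a minimal sketch, assuming scaled dot-product attention whose entropy is added to the training loss as a sparsity penalty; the function names and the penalty weight lambda_sparsity are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over attention scores.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_entropy_penalty(query, keys, values, lambda_sparsity=0.1):
    """Return the context vector and an entropy-based regularization term.

    query:  (d,)   decoder state at the current time step
    keys:   (n, d) encoder states
    values: (n, d) encoder states (often identical to keys)
    """
    scores = keys @ query / np.sqrt(keys.shape[-1])  # (n,) alignment scores
    weights = softmax(scores)                        # attention distribution
    context = weights @ values                       # (d,) context vector
    # Low entropy means a peaked (sparse) distribution, so penalizing
    # entropy encourages attending to only a few source words.
    entropy = -np.sum(weights * np.log(weights + 1e-9))
    return context, lambda_sparsity * entropy

# Toy usage: the returned regularizer would be added to the decoding loss.
rng = np.random.default_rng(0)
q, K, V = rng.normal(size=4), rng.normal(size=(6, 4)), rng.normal(size=(6, 4))
ctx, reg = attention_with_entropy_penalty(q, K, V)
print(ctx.shape, reg)
```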
Keywords
Decoding, Speech processing, Linear programming, Entropy, Task analysis, Standards, Training