Large-Scale Transfer Learning For Natural Language Generation

57th Annual Meeting of the Association for Computational Linguistics (ACL 2019)

Abstract
Large-scale pretrained language models define the state of the art in natural language processing, achieving outstanding performance on a variety of tasks. We study how these architectures can be applied and adapted for natural language generation, comparing a number of architectural and training schemes. We focus in particular on open-domain dialog as a typical high-entropy generation task, presenting and comparing different architectures for adapting pretrained models, with state-of-the-art results.
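As a concrete illustration of the kind of adaptation the abstract describes, the sketch below fine-tunes a pretrained language model on a single dialog (context, reply) pair with a standard causal language-modeling loss. The choice of a GPT-2 checkpoint, the Hugging Face `transformers` API, and the example pair are assumptions for illustration only; the paper's specific adaptation architectures are not reproduced here.

```python
# Minimal sketch: adapting a pretrained LM for open-domain dialog generation.
# Assumptions (not from the paper): a GPT-2 checkpoint loaded via the
# Hugging Face `transformers` library, fine-tuned with a next-token LM loss.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Hypothetical training pair: dialog context and target reply.
context = "Hi! How are you today?"
reply = "I'm doing great, thanks for asking!"

# Concatenate context and reply into one sequence; the model learns to
# continue the context with the reply under the causal LM objective.
ids = tokenizer.encode(context + tokenizer.eos_token + reply,
                       return_tensors="pt")

model.train()
outputs = model(ids, labels=ids)  # labels == inputs -> next-token LM loss
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```

In practice one would mask the context tokens out of the loss (setting their labels to -100) so the model is penalized only on the reply, and iterate over a full dialog corpus rather than a single pair.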
Keywords
Natural Language Generation, Language Modeling, Pretrained Models, Neural Machine Translation, Spoken Dialogue Systems