Positioning yourself in the maze of Neural Text Generation: A Task-Agnostic Survey

arXiv (Cornell University), 2020

Abstract
Neural text generation has metamorphosed into several critical natural language applications, ranging from text completion to free-form narrative generation. In order to advance research in text generation, it is critical to absorb the existing body of work and position ourselves within this massively growing field. Specifically, this paper surveys the fundamental components of modeling approaches that carry task-agnostic impact across various generation tasks such as storytelling, summarization, and translation. In this context, we present an abstraction of the imperative techniques with respect to learning paradigms, pretraining, modeling approaches, and decoding, along with the key outstanding challenges in each of them. We thereby deliver a one-stop destination for researchers in the field, facilitating a perspective on where to situate their work and how it impacts other closely related generation tasks.
Keywords
neural text generation,positioning,maze,task-agnostic