A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation

Transactions of the Association for Computational Linguistics, pp. 93-108, 2020.

We present a knowledge-enhanced pretraining model with multi-task learning for commonsense story generation.

Abstract:

Story generation, namely, generating a reasonable story from a leading context, is an important but challenging task. In spite of the success in modeling fluency and local coherence, existing neura...
