A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation
Transactions of the Association for Computational Linguistics, pp. 93-108, 2020.
We present a knowledge-enhanced pretraining model with multi-task learning for commonsense story generation.
Story generation, namely, generating a reasonable story from a leading context, is an important but challenging task. In spite of the success in modeling fluency and local coherence, existing neura...