Improving Conditioning in Context-Aware Sequence to Sequence Models
Abstract:
Neural sequence-to-sequence models are well established for applications that can be cast as mapping a single input sequence to a single output sequence. In this work, we focus on cases where generation is conditioned on both a short query and a long context, such as abstractive question answering or document-level translation. We mo...