Graph Convolutional Encoders for Syntax-aware Neural Machine Translation

EMNLP 2017, pp. 1957–1967.

Highlights:
The convolutional neural network + GCN model improves over the CNN baseline by +1.9 BLEU1 and +1.1 BLEU4.

Abstract:

We present a simple and effective approach to incorporating syntactic structure into neural attention-based encoder-decoder models for machine translation. We rely on graph-convolutional networks (GCNs), a recent class of neural networks developed for modeling graph-structured data. Our GCNs use predicted syntactic dependency trees of source sentences to produce representations of words (i.e., hidden states of the encoder) that are sensitive to their syntactic neighborhoods. GCNs take word representations as input and produce word representations as output, so they can easily be incorporated as layers into standard encoders (e.g., on top of bidirectional RNNs or convolutional neural networks). We evaluate their effectiveness with English-German and English-Czech translation experiments for different types of encoders and report substantial improvements over their syntax-agnostic versions in all cases.
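Because the GCN layers take word representations in and produce word representations out, they can be stacked on top of any encoder. Below is a minimal PyTorch sketch of one such syntactic GCN layer, assuming direction-specific weight matrices, dependency-label-specific biases, and scalar edge gates, following the paper's description of syntactic GCNs. The class name, tensor layouts, and initialisation are illustrative assumptions, not the authors' released implementation.

```python
# A minimal sketch of a syntactic GCN layer: direction-specific weights,
# label-specific biases, and scalar edge gates applied over a predicted
# dependency graph. Names and shapes are assumptions for illustration.
import torch
import torch.nn as nn


class SyntacticGCNLayer(nn.Module):
    def __init__(self, dim, num_labels, num_directions=3):
        super().__init__()
        # directions: e.g. 0 = self-loop, 1 = head -> dependent, 2 = dependent -> head
        self.W = nn.Parameter(torch.randn(num_directions, dim, dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(num_labels, dim))       # bias per dependency label
        self.gate_w = nn.Parameter(torch.randn(num_directions, dim) * 0.01)
        self.gate_b = nn.Parameter(torch.zeros(num_labels))       # gate bias per label

    def forward(self, h, edge_index, edge_dir, edge_label):
        """
        h:          (num_words, dim) word representations from the layer below
        edge_index: (2, num_edges) long tensor of (source, target) pairs,
                    including a self-loop for every word
        edge_dir:   (num_edges,) direction id of each edge
        edge_label: (num_edges,) dependency-label id of each edge
        """
        src, tgt = edge_index
        # message along each edge: W_dir(e) @ h_src(e) + b_label(e)
        msg = torch.einsum('ed,edk->ek', h[src], self.W[edge_dir]) + self.b[edge_label]
        # scalar gate per edge, which can down-weight unreliable predicted edges
        gate = torch.sigmoid((h[src] * self.gate_w[edge_dir]).sum(-1) + self.gate_b[edge_label])
        # gated sum over incoming edges, followed by a ReLU nonlinearity
        out = torch.zeros_like(h).index_add(0, tgt, gate.unsqueeze(-1) * msg)
        return torch.relu(out)
```

In use, `h` would be the hidden states of a bidirectional RNN or CNN encoder over the source sentence, and the layer's output can be fed to the attention-based decoder or to a further GCN layer.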
