Unsupervised Recurrent Neural Network Grammars
arXiv: Computation and Language, 2019.
Abstract:
Recurrent neural network grammars (RNNGs) are generative models of language which jointly model syntax and surface structure by incrementally generating a syntax tree and sentence in a top-down, left-to-right order. Supervised RNNGs achieve strong language modeling and parsing performance, but require an annotated corpus of parse trees. In this work, we experiment with unsupervised learning of RNNGs. Since directly marginalizing over the space of latent trees is intractable, we instead apply amortized variational inference. To maximize the evidence lower bound, we develop an inference network parameterized as a neural CRF constituency parser. On language modeling, unsupervised RNNGs perform as well as their supervised counterparts on benchmarks in English and Chinese. On constituency grammar induction, they are competitive with recent neural language models that induce tree structures.
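
The top-down, left-to-right generation order described in the abstract can be made concrete with a small sketch. The snippet below linearizes a constituency tree into the RNNG action sequence, using the standard action inventory from the original RNNG formulation: NT(X) opens a nonterminal, GEN(w) emits a terminal, and REDUCE closes the current constituent. This is an illustrative sketch, not the authors' code; the tree_to_actions helper and the example tree are hypothetical.

    # Illustrative sketch: linearize a constituency tree into the
    # top-down, left-to-right action sequence an RNNG generates.
    # A tree is either a terminal string or a pair (label, children).
    def tree_to_actions(tree):
        if isinstance(tree, str):                 # terminal word
            return [f"GEN({tree})"]
        label, children = tree
        actions = [f"NT({label})"]                # open the constituent
        for child in children:
            actions += tree_to_actions(child)     # recurse left-to-right
        actions.append("REDUCE")                  # close the constituent
        return actions

    # "(S (NP the hungry cat) (VP meows))"
    example = ("S", [("NP", ["the", "hungry", "cat"]),
                     ("VP", ["meows"])])
    print(tree_to_actions(example))
    # ['NT(S)', 'NT(NP)', 'GEN(the)', 'GEN(hungry)', 'GEN(cat)', 'REDUCE',
    #  'NT(VP)', 'GEN(meows)', 'REDUCE', 'REDUCE']

In the unsupervised setting the tree is latent, so instead of reading actions off an annotated parse as above, the inference network proposes trees and the model is trained on the evidence lower bound.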