Unsupervised Recurrent Neural Network Grammars

Adhiguna Kuncoro
Gábor Melis

arXiv: Computation and Language, 2019.


Abstract:

Recurrent neural network grammars (RNNGs) are generative models of language that jointly model syntax and surface structure by incrementally generating a syntax tree and sentence in a top-down, left-to-right order. Supervised RNNGs achieve strong language modeling and parsing performance, but require an annotated corpus of parse trees. In ...
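To make the joint, incremental generation concrete, the following is a minimal sketch (not the authors' implementation) of the RNNG action space: a tree and its sentence are produced together, top-down and left-to-right, via NT(X) (open a nonterminal), GEN(w) (emit the next word), and REDUCE (close the most recent open nonterminal). The `replay` helper and the example action sequence are illustrative assumptions, not code from the paper.

```python
def replay(actions):
    """Replay an RNNG-style action sequence; return (bracketed tree, sentence).

    This sketch only reconstructs the output of a fixed action sequence;
    the actual model scores each action with a neural network.
    """
    stack, words = [], []
    for act in actions:
        kind, _, arg = act.partition(" ")
        if kind == "NT":
            stack.append(("open", arg))        # open a new constituent
        elif kind == "GEN":
            stack.append(("word", arg))        # emit the next surface word
            words.append(arg)
        elif kind == "REDUCE":
            children = []
            while stack[-1][0] != "open":      # pop completed children
                children.append(stack.pop()[1])
            label = stack.pop()[1]             # pop the matching open marker
            subtree = "(%s %s)" % (label, " ".join(reversed(children)))
            stack.append(("tree", subtree))
    return stack[0][1], " ".join(words)

tree, sentence = replay(
    ["NT S", "NT NP", "GEN the", "GEN cat", "REDUCE",
     "NT VP", "GEN sleeps", "REDUCE", "REDUCE"]
)
# tree     == "(S (NP the cat) (VP sleeps))"
# sentence == "the cat sleeps"
```

Note how the sentence falls out as a by-product of the tree-building actions: this is what "jointly model syntax and surface structure" means operationally, and it is why supervised training needs gold parse trees to supervise the NT/REDUCE decisions.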
