Exploring Pretrained Models for Joint Morpho-Syntactic Parsing of Russian

Computational Linguistics and Intellectual Technologies (2020)

Abstract
In this paper, we build a joint morpho-syntactic parser for Russian. We describe a method to train a joint model that is significantly faster than, and as accurate as, a traditional pipeline of models. We explore various ways to encode word-level information and how they affect the parser's performance. To this end, we utilize character-level word embeddings learned from scratch and grammeme embeddings, which have shown state-of-the-art results on similar tasks for Russian in the past. We compare them with pretrained contextualized word embeddings, such as ELMo and BERT, known to have led to breakthroughs in a variety of tasks in English. As a result, we show that their usage can significantly improve parsing quality.
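As an illustration of the character-level word representations the abstract mentions, the sketch below builds a word vector by embedding each character and max-pooling over the character axis. This is a minimal stand-in for a learned-from-scratch character encoder, not the paper's actual architecture; the class name, alphabet, and dimensionality are all hypothetical, and the embedding table is randomly initialized where a real model would train it.

```python
import numpy as np

CHAR_EMB_DIM = 24  # hypothetical embedding dimensionality


class CharWordEncoder:
    """Toy character-level word encoder: looks up one vector per
    character and max-pools over characters to get a word vector."""

    def __init__(self, alphabet, dim=CHAR_EMB_DIM, seed=0):
        rng = np.random.default_rng(seed)
        # one randomly initialized vector per character (trained in practice)
        self.table = {c: rng.standard_normal(dim) for c in alphabet}
        self.dim = dim

    def encode(self, word):
        vecs = [self.table[c] for c in word if c in self.table]
        if not vecs:  # no known characters -> zero vector
            return np.zeros(self.dim)
        return np.max(np.stack(vecs), axis=0)  # max-pool over characters


enc = CharWordEncoder("абвгдежзийклмнопрстуфхцчшщъыьэюя")
vec = enc.encode("москва")
print(vec.shape)  # (24,)
```

In a full parser such word vectors would be concatenated with grammeme or pretrained contextual embeddings (ELMo, BERT) before being fed to the tagging and dependency-scoring layers.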
Keywords
pretrained models, morpho-syntactic