T5G2P: Multilingual Grapheme-to-Phoneme Conversion with Text-to-Text Transfer Transformer.

Pattern Recognition: 7th Asian Conference, ACPR 2023, Kitakyushu, Japan, November 5–8, 2023, Proceedings, Part III (2023)

Abstract
In recent years, the Text-to-Text Transfer Transformer (T5) neural network has proved powerful for many text-related tasks, including grapheme-to-phoneme conversion (G2P). The paper describes the training process of T5-base models for several languages. It shows the advantages of training G2P models from a language-specific base model over G2P models fine-tuned from the multilingual base model. The paper also explains the reasons for training G2P models on whole sentences (rather than a dictionary) and evaluates the trained G2P models on unseen sentences and words.
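As a minimal sketch of the sentence-level framing the abstract describes: T5-style models consume a task-prefixed input string and emit a target string, so a G2P training pair can be built as plain text. The prefix format (`g2p {lang}:`) and the ARPAbet-style phoneme notation below are illustrative assumptions, not the paper's exact scheme.

```python
# Hedged sketch: framing sentence-level G2P as a text-to-text task.
# T5-style models take a task-prefixed input string and produce a
# target string; the "g2p en:" prefix and ARPAbet phonemes here are
# illustrative assumptions, not the paper's actual format.

def make_g2p_example(sentence: str, phonemes: str, lang: str = "en") -> tuple[str, str]:
    """Build one (input, target) training pair for a T5-style model."""
    return (f"g2p {lang}: {sentence}", phonemes)

# Example pair for a whole sentence (not a dictionary entry):
inp, tgt = make_g2p_example("hello world", "HH AH L OW  W ER L D")
```

Training on whole sentences, as the paper advocates, lets the model use cross-word context (e.g. liaison or heteronym disambiguation) that dictionary-entry training cannot provide.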
Keywords
conversion, transfer, grapheme-to-phoneme, text-to-text