Analysis on the number of layers in the transformer-based model for neural machine translation

Dongxing Li, Zuying Luo

Third International Conference on Computer Science and Communication Technology (ICCSCT 2022), 2022

Abstract
Recently, transformer-based models have been widely used for sequence-to-sequence (seq2seq) tasks, especially neural machine translation (NMT). In the original transformer, the encoder and decoder have the same number of layers. However, the decoder has a more complex structure and a harder task than the encoder, so the two layer counts need not be equal. To determine how many layers the encoder and decoder should each have, we modify the transformer as our model and conduct four experiments on four translation tasks from IWSLT2017. The experimental results show that the decoder should have more layers than the encoder, which yields better translation performance.
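For illustration, a minimal sketch of a transformer with unequal encoder and decoder depths, assuming PyTorch's nn.Transformer; the layer counts 4 and 8 and the tensor shapes are illustrative choices, not the paper's configuration:

```python
import torch
import torch.nn as nn

# Transformer with an asymmetric encoder/decoder, as the abstract suggests:
# a shallower encoder and a deeper decoder (illustrative values).
model = nn.Transformer(
    d_model=512,
    nhead=8,
    num_encoder_layers=4,   # shallower encoder
    num_decoder_layers=8,   # deeper decoder
    dim_feedforward=2048,
    batch_first=True,
)

# Dummy source/target embeddings with shape (batch, seq_len, d_model).
src = torch.randn(2, 10, 512)
tgt = torch.randn(2, 7, 512)

out = model(src, tgt)       # -> (2, 7, 512)
print(out.shape)
```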
Keywords
neural machine translation, layers, model, transformer-based