The NiuTrans System for the WMT21 Efficiency Task

Chenglong Wang, Chi Hu, Yongyu Mu, Zhongxiang Yan, Shi Wu, Mingxiang Hu, Hongxin Cao, Bei Li, Yandan Lin, Tong Xiao, Jun Zhu

arXiv (Cornell University), 2021

Abstract
This paper describes the NiuTrans system for the WMT21 translation efficiency task (http://statmt.org/wmt21/efficiency-task.html). Following last year's work, we explore various techniques to improve efficiency while maintaining translation quality. We investigate combinations of lightweight Transformer architectures and knowledge distillation strategies. We also improve translation efficiency with graph optimization, low precision, dynamic batching, and parallel pre/post-processing. Our system translates 247,000 words per second on an NVIDIA A100, 3× faster than last year's system. Our system is the fastest and has the lowest memory consumption on the GPU-throughput track. The code, model, and pipeline will be available at NiuTrans.NMT (https://github.com/NiuTrans/NiuTrans.NMT).
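The dynamic batching mentioned in the abstract can be sketched as follows. This is a minimal illustration of the general technique, not the actual NiuTrans.NMT implementation; the function name, token budget, and batching policy here are our assumptions. The idea is to sort sentences by length and fill each batch up to a fixed token budget, so that padding to the longest sentence in a batch wastes as little compute as possible:

```python
def dynamic_batches(sentences, max_tokens=4096):
    """Group tokenized sentences into batches capped by a token budget.

    Each batch is padded to its longest sentence, so sorting by length
    first keeps batches dense and padding minimal. Returns batches of
    sentence indices (hypothetical helper, not NiuTrans.NMT's API).
    """
    order = sorted(range(len(sentences)), key=lambda i: len(sentences[i]))
    batches, current, max_len = [], [], 0
    for i in order:
        new_max = max(max_len, len(sentences[i]))
        # Padded cost of the batch = (num sentences) x (longest sentence).
        if current and new_max * (len(current) + 1) > max_tokens:
            batches.append(current)
            current, new_max = [], len(sentences[i])
        current.append(i)
        max_len = new_max
    if current:
        batches.append(current)
    return batches
```

Because batch size adapts to sentence length, short sentences are packed many to a batch while long sentences get small batches, keeping GPU utilization high at a fixed memory cost.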
Keywords
WMT21 efficiency task, NiuTrans system