Combiner: Full Attention Transformer with Sparse Computation Cost

Annual Conference on Neural Information Processing Systems (2021)
