
Synthesizer: Rethinking Self-Attention in Transformer Models

ICLR 2021

Cited: 424 | Views: 385

Key words: Transformers, Deep Learning, Attention
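The paper's core idea is to replace the query-key dot product of standard self-attention with attention weights synthesized directly from the input. Below is a minimal NumPy sketch of the Dense Synthesizer variant, under the assumption of a single head and illustrative dimensions; all weight names (`W1`, `W2`, `V_proj`) are placeholders, not identifiers from the paper or any released code.

```python
import numpy as np

def dense_synthesizer_attention(X, W1, b1, W2, b2, V_proj):
    """Dense Synthesizer head: each token's attention logits over the
    sequence are produced by a two-layer MLP on that token alone,
    replacing the query-key dot product of standard self-attention."""
    H = np.maximum(0, X @ W1 + b1)            # ReLU hidden layer, (seq_len, d_hidden)
    B = H @ W2 + b2                           # synthesized logits, (seq_len, seq_len)
    A = np.exp(B - B.max(axis=-1, keepdims=True))
    A = A / A.sum(axis=-1, keepdims=True)     # row-wise softmax over positions
    return A @ (X @ V_proj)                   # mix projected values with synthesized weights

# Toy usage with illustrative sizes.
rng = np.random.default_rng(0)
seq_len, d_model, d_hidden = 4, 8, 16
X = rng.standard_normal((seq_len, d_model))
W1 = rng.standard_normal((d_model, d_hidden)) * 0.1
b1 = np.zeros(d_hidden)
W2 = rng.standard_normal((d_hidden, seq_len)) * 0.1
b2 = np.zeros(seq_len)
V_proj = rng.standard_normal((d_model, d_model)) * 0.1
Y = dense_synthesizer_attention(X, W1, b1, W2, b2, V_proj)
print(Y.shape)  # (4, 8)
```

Note that the output projection `W2` maps to `seq_len` logits, so this sketch ties the model to a fixed sequence length, a limitation the paper also discusses for the dense variant.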