A 17.1 TOPS/W FP-INT Transformer Inference Accelerator with Sparsity Boosting and Output Importance-Aware Processing
2025 IEEE International Symposium on Circuits and Systems (ISCAS), 2025
Keywords
Digital Processor, Transformer Inference, FP-INT, Energy Efficiency, Adder Tree, Booth Encoding, Block Floating Point