22.1 A 1.1V 16GB 640GB/s HBM2E DRAM with a Data-Bus Window-Extension Technique and a Synergetic On-Die ECC Scheme

2020 IEEE International Solid-State Circuits Conference (ISSCC)

Abstract
Rapidly evolving artificial intelligence (AI) technology, such as deep learning, has been successfully deployed in various applications, such as image recognition, health care, and autonomous driving. Such rapid evolution and successful deployment of AI technology have been possible owing to the emergence of accelerators, such as GPUs and TPUs, that have a higher data throughput. This, in turn, requires an enhanced memory system with large capacity and high bandwidth [1]. HBM has been the most preferred high-bandwidth memory technology due to its high-speed and low-power characteristics, its 1024 IOs facilitated by 2.5D silicon interposer technology, and its large capacity realized by through-silicon via (TSV) stack technology [2]. The previous-generation HBM2 supports 8GB capacity with a stack of 8 DRAM dies (i.e., an 8-high stack) and 341GB/s (2.7Gb/s/pin) bandwidth [3]. The HBM industry trend has been a speed improvement of 15~20% every year, while capacity increases by 1.5-2x every two years. In this paper, we present a 16GB HBM2E with circuit and design techniques that increase its bandwidth up to 640GB/s (5Gb/s/pin) while providing stable bit-cell operation in the 2nd generation of a 10nm DRAM process, featuring: (1) a data-bus window-extension technique to cope with the reduced $t_{cco}$, (2) a power delivery network (PDN) designed for stable operation at a high speed, (3) a synergetic on-die ECC scheme to reliably provide large capacity, and (4) an MBIST solution to efficiently test large-capacity memory at a high speed.
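As a quick sanity check of the headline figures (a worked calculation from the numbers quoted above, not part of the abstract itself), the peak bandwidth follows directly from the IO count and the per-pin data rate:

$$640\,\mathrm{GB/s} = \frac{1024\ \text{IOs} \times 5\,\mathrm{Gb/s/pin}}{8\,\mathrm{bits/byte}}, \qquad 341\,\mathrm{GB/s} \approx \frac{1024\ \text{IOs} \times 2.66\,\mathrm{Gb/s/pin}}{8\,\mathrm{bits/byte}}$$

(the HBM2 per-pin rate quoted as 2.7Gb/s corresponds to roughly 2.66Gb/s when back-calculated from the 341GB/s figure).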
Keywords
HBM2E DRAM,data-bus window-extension technique,artificial intelligence technology,deep learning,image recognition,health care,autonomous driving,data throughput,enhanced memory system,preferred high-bandwidth memory technology,low-power characteristics,2.5D silicon interposer technology,16GB HBM2E,design techniques,DRAM process,capacity memory,HBM industry,synergetic on-die ECC scheme,accelerators,through-silicon via stack technology,TSV stack technology,DRAM dies,stable bit-cell operation,power delivery network,MBIST solution,voltage 1.1 V,size 10.0 nm,storage capacity 16.0 Gbit,bit rate 640.0 Gbit/s,bit rate 341.0 Gbit/s,storage capacity 8.0 Gbit,Si