An Integral Methodology for Predicting Long-Term RTN

IEEE Transactions on Electron Devices (2022)

Abstract
Random telegraph noise (RTN) adversely impacts circuit performance, and this impact grows as devices shrink and operating voltages decrease. To optimize circuit design, many efforts have been made to model RTN. RTN is highly stochastic, with significant device-to-device variations (DDVs). Early works often characterize individual traps first and then group them to extract their statistical distributions. This bottom-up approach is limited by the number of traps that can be measured, especially for the capture and emission time constants, calling the reliability of the extracted distributions into question. Several compact models have been proposed, but their ability to predict long-term RTN has not been verified. Many early works measured RTN for only tens of seconds, although a longer time window increases RTN by capturing slower traps. The aim of this work is to propose an integral methodology for modeling RTN and, for the first time, to verify its capability to predict long-term RTN. Instead of characterizing the properties of individual traps/devices, the RTN of multiple devices is integrated into one dataset for extracting their statistical properties. This allows using the concept of effective charged traps (ECTs) and replaces the need for time-constant distributions with the kinetics of ECTs, making long-term RTN prediction similar to predicting aging. The proposed methodology opens the way for assessing the RTN impact within a ten-year window by efficiently evaluating the probability that a device parameter reaches a given level.
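The pooling idea in the abstract can be illustrated with a minimal sketch: rather than fitting trap parameters per device, samples from many devices are merged into one dataset and the probability of a parameter shift reaching a given level is evaluated empirically. This is not the paper's actual algorithm; the lognormal ΔVth samples, the device count, and the `prob_at_level` helper are all hypothetical stand-ins for measured RTN data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-device RTN-induced threshold-voltage shifts (mV).
# Stand-in for measured traces; the paper pools real device data instead
# of characterizing each trap individually.
device_traces = [rng.lognormal(mean=0.5, sigma=0.8, size=500) for _ in range(20)]

# Integrate all devices into a single dataset, as in the top-down approach.
pooled = np.concatenate(device_traces)

def prob_at_level(samples, level):
    """Empirical probability that the parameter shift reaches `level`."""
    return float(np.mean(samples >= level))

# Probability that the shift reaches, e.g., 5 mV across the population.
p = prob_at_level(pooled, 5.0)
print(f"P(shift >= 5 mV) = {p:.3f}")
```

The empirical exceedance probability is monotonically non-increasing in the level, so sweeping `level` yields a tail curve of the pooled population, which is the kind of quantity the methodology evaluates over a long time window.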
Keywords
Device variations, fluctuation, jitters, noise, random telegraph noise (RTN), time-dependent variations, yield