A Hybrid Delay Model for Interconnected Multi-Input Gates

2023 26th Euromicro Conference on Digital System Design (DSD) (2024)

Abstract
Dynamic digital timing analysis aims to substitute highly accurate but slow analog simulations of digital circuits with less accurate but fast digital approaches, in order to facilitate tracing timing relations between individual transitions in a signal trace. This primarily requires gate delay models in which the input-to-output delay of a transition also depends on the signal history. We focus on a recently proposed hybrid delay model for CMOS multi-input gates, exemplified by a 2-input gate, which is the only delay model known to us that faithfully captures both single-input switching (SIS) and multi-input switching (MIS) effects, also known as “Charlie effects”. Despite its simplicity as a first-order model, simulations have revealed that suitably parametrized versions of the model predict the actual delays of NOR gates accurately. However, the approach considers isolated gates without their interconnect. In this work, we augment the existing model and its theoretical analysis with a first-order interconnect, and conduct a systematic evaluation of the resulting modeling accuracy: using SPICE simulations, we study both SIS and MIS effects on the overall delay of gates under variation of input driving strength, wire length, load capacitance, and CMOS technology, and compare them to the predictions of appropriately parametrized versions of our model. Overall, our results reveal a surprisingly good accuracy of our fast delay model.
Keywords
Digital circuit, delay model, dynamic timing analysis, interconnect