Scalable trace signal selection using machine learning

Computer Design (2013)

Abstract
A key problem in post-silicon validation is to identify a small set of traceable signals that are effective for debug during silicon execution. The structural analysis used by traditional signal selection techniques leads to poor restoration quality. In contrast, simulation-based selection techniques provide superior restorability but incur significant computation overhead. In this paper, we propose an efficient signal selection technique using machine learning that retains the advantages of simulation-based signal selection while significantly reducing the simulation overhead. Our approach uses (1) bounded mock simulations to generate the training vector set for the machine learning technique, and (2) an elimination approach to identify the most profitable set of signals. Experimental results indicate that our approach can improve restorability by up to 63.3% (17.2% on average) with a faster or comparable runtime.
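To make the two-step flow concrete, the sketch below illustrates one plausible reading of the abstract: a cheap surrogate model is trained on restorability values obtained from bounded mock simulations, and an elimination loop then drops the least profitable signal until the trace-buffer budget is met. This is a minimal illustration under assumed names and data shapes (NUM_SIGNALS, TRACE_WIDTH, mock_restorability, the scikit-learn regressor), not the authors' implementation; the mock simulation is replaced by a synthetic stand-in purely so the example runs.

```python
# Hypothetical sketch of ML-guided, elimination-based trace signal selection.
# All identifiers and the regressor choice are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

NUM_SIGNALS = 32      # candidate flip-flops in the design (assumed)
TRACE_WIDTH = 8       # trace-buffer width: how many signals we can record
NUM_MOCK_SIMS = 200   # bounded mock simulations used to build training data

# --- Step 1: bounded mock simulations -> training vectors (synthetic stand-in) ---
# Each training sample is a 0/1 mask of selected signals; its label is the
# restoration ratio a short (bounded) simulation would report. Here the label
# is faked with a hidden per-signal weight so the sketch is self-contained.
true_profit = rng.random(NUM_SIGNALS)

def mock_restorability(mask: np.ndarray) -> float:
    """Stand-in for a bounded mock simulation returning the state
    restoration ratio achieved by tracing the masked signals."""
    return float(mask @ true_profit / true_profit.sum()) + rng.normal(0.0, 0.01)

X = rng.integers(0, 2, size=(NUM_MOCK_SIMS, NUM_SIGNALS)).astype(float)
y = np.array([mock_restorability(m) for m in X])

# Train a cheap surrogate that predicts restorability from a selection mask.
model = LinearRegression().fit(X, y)

def predicted_restorability(signals: list[int]) -> float:
    """Query the surrogate model for a candidate set of traced signals."""
    mask = np.zeros(NUM_SIGNALS)
    mask[signals] = 1.0
    return float(model.predict(mask.reshape(1, -1))[0])

# --- Step 2: elimination -- drop signals until the trace budget is satisfied ---
selected = list(range(NUM_SIGNALS))
while len(selected) > TRACE_WIDTH:
    # Remove the signal whose elimination hurts predicted restorability the least.
    victim = max(
        selected,
        key=lambda s: predicted_restorability([t for t in selected if t != s]),
    )
    selected.remove(victim)

print("Selected trace signals:", sorted(selected))
```

In this reading, the surrogate model replaces the expensive per-candidate simulations that make purely simulation-based selection slow, which is consistent with the runtime claim in the abstract.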
Keywords
computer debugging, electronic engineering computing, formal verification, integrated circuit design, learning (artificial intelligence), monolithic integrated circuits, signal restoration, silicon, Si, bounded mock simulation, computation overhead, debugging, machine learning technique, post-silicon validation, restoration quality, scalable trace signal selection, silicon execution, simulation-based selection technique, simulation-based signal selection, structural analysis, traceable signal identification, training vector set generation, post-silicon, machine learning, signal selection