Improving Error Resilience Analysis Methodology of Iterative Workloads for Approximate Computing

Conf. Computing Frontiers (2017)

Abstract
Assessing the error resilience inherent in digital processing workloads provides application-specific insights for approximate computing strategies that improve power efficiency and/or performance. Using radio astronomy calibration as a case study, our contributions to error resilience analysis focus primarily on iterative methods that use a convergence criterion as a quality metric to terminate the iterative computation. We propose an adaptive statistical approximation model for high-level resilience analysis that makes it possible to divide a workload into exact and approximate iterations. This improves the existing error resilience analysis methodology by quantifying the number of approximate iterations (23% of the total iterations in our case study) in addition to the parameters used in state-of-the-art techniques. In this way, heterogeneous architectures comprising exact and inexact computing cores, as well as adaptive-accuracy architectures, can be exploited efficiently. Moreover, we demonstrate the importance of reconsidering the quality function for convergence-based iterative processes, since the original quality function (the convergence criterion) is not necessarily sufficient in the resilience analysis phase. In such cases, an additional quality function has to be defined to assess the viability of the approximation techniques.
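The idea of splitting an iterative workload into an exact phase and an approximate tail, gated by the convergence criterion, can be sketched with a toy solver. The example below is purely illustrative and not the paper's actual model: it uses Jacobi iteration, a hypothetical residual threshold `switch_tol` to mark where approximate iterations may begin, and float32 arithmetic as a crude stand-in for an inexact compute core.

```python
import numpy as np

def solve_with_adaptive_approximation(A, b, tol=1e-5, switch_tol=1e-2, max_iter=10_000):
    """Jacobi iteration split into exact and approximate phases.

    Iterations run in full float64 precision until the residual drops below
    `switch_tol`; the remaining iterations (the "approximate" tail) run in
    float32, standing in for an inexact core. All names, thresholds, and the
    precision-reduction scheme are illustrative assumptions.
    """
    D = np.diag(A)              # diagonal of A (1-D vector)
    R = A - np.diagflat(D)      # off-diagonal remainder
    x = np.zeros_like(b, dtype=np.float64)
    exact_iters = approx_iters = 0
    for _ in range(max_iter):
        residual = np.linalg.norm(b - A @ x)
        if residual < tol:          # convergence criterion (quality metric)
            break
        if residual >= switch_tol:  # exact phase: full float64 precision
            x = (b - R @ x) / D
            exact_iters += 1
        else:                       # approximate phase: reduced precision
            x32 = (b - R @ x).astype(np.float32) / D.astype(np.float32)
            x = x32.astype(np.float64)
            approx_iters += 1
    total = exact_iters + approx_iters
    frac_approx = approx_iters / total if total else 0.0
    return x, exact_iters, approx_iters, frac_approx
```

In a resilience analysis, the returned fraction of approximate iterations (analogous to the 23% reported for the calibration case study) indicates how much of the workload could be offloaded to inexact cores while still meeting the convergence criterion.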
Keywords
Error resilience analysis, iterative workloads, quality function, approximate computing, heterogeneous architectures