Energy Prediction for Cache Tuning in Embedded Systems

Ruben Vazquez, Ann Gordon-Ross, Greg Stitt

2019 IEEE 37th International Conference on Computer Design (ICCD) (2019)

Cited by 4
Abstract
Modern embedded systems are no longer tasked with running a single application or function and are increasingly required to operate more like general-purpose desktop computers. Conforming to modern usage demands is extremely challenging given an embedded system's stringent design constraints, such as power, energy, and performance. These constraints can be met by specializing/tuning the underlying system to application-specific execution requirements and characteristics, i.e., by tuning the system's configurable parameters, such as voltage, frequency, cache size, line size, and associativity. However, given the complexity of modern systems, exhaustively exploring these large design spaces becomes infeasible once the number of configurable parameters and valid parameter values grows beyond a trivial amount. In this paper, we propose using machine learning in lieu of traditional design space exploration techniques. Specifically, we evaluate the potential of an artificial neural network (ANN)-based prediction module for energy prediction. Since the cache hierarchy has a large impact on total energy consumption, without loss of generality, we study a configurable cache hierarchy with configurable cache size, associativity, and line size. We design and train an energy prediction module to infer the best cache configuration for an application based on the application's execution characteristics. Our approach requires only a single profiling run of the application to collect these characteristics. The energy prediction module then predicts the energy consumption of every configuration in the cache design space based on these characteristics and outputs the configuration with the lowest predicted energy consumption, thus essentially performing exhaustive design space exploration with a single execution.
Our results show that our prediction module predicts the best instruction and data cache configurations for the majority of the applications, yielding an average energy degradation of less than 2% for both the instruction and data caches as compared to the optimal configuration determined by exhaustive design space exploration.
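The tuning flow the abstract describes (profile once, predict energy for every cache configuration, pick the minimum) can be sketched as below. This is a minimal illustration, not the paper's implementation: the design-space values, the profile features (`accesses`, `miss_pressure`), and the toy linear model standing in for the trained ANN are all assumptions made for the example.

```python
from itertools import product

# Hypothetical configurable-cache design space (example values,
# not the paper's exact parameter set).
CACHE_SIZES_KB = [2, 4, 8]    # total cache size
ASSOCIATIVITIES = [1, 2, 4]   # number of ways
LINE_SIZES_B = [16, 32, 64]   # bytes per cache line

def predict_energy(profile, config):
    """Stand-in for the trained ANN: maps one profiling run's
    execution characteristics plus a cache configuration to a
    predicted energy value. A toy linear model is used here."""
    size_kb, assoc, line_b = config
    static = 0.05 * size_kb * assoc            # larger caches cost more static energy
    dynamic = profile["accesses"] * 0.001      # per-access energy, configuration-independent
    misses = profile["miss_pressure"] / (size_kb * line_b / 16)  # misses shrink with capacity
    return static + dynamic + misses

def best_configuration(profile):
    """Predict energy for every configuration in the design space
    (one inference each) and return the minimum-energy one —
    effectively exhaustive exploration from a single profiling run."""
    space = product(CACHE_SIZES_KB, ASSOCIATIVITIES, LINE_SIZES_B)
    return min(space, key=lambda cfg: predict_energy(profile, cfg))

# Characteristics collected from the single profiling run (made-up numbers).
profile = {"accesses": 1_000_000, "miss_pressure": 50.0}
print(best_configuration(profile))
```

In the paper's setting the instruction and data caches would each get their own prediction, and `predict_energy` would be a trained ANN rather than a hand-written formula; the selection loop itself stays this simple.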
Keywords
embedded systems, configurable caches, machine learning, tuning, artificial neural network, prediction, energy consumption, optimization