Courteous Cache Sharing: Being Nice To Others In Capacity Management
DAC '12: The 49th Annual Design Automation Conference, San Francisco, California, June 2012
Abstract
This paper proposes a cache management scheme for multiprogrammed, multithreaded applications, with the objective of maximizing performance both for each individual application and for the multithreaded workload mix as a whole. In this scheme, each individual application's performance is improved by raising the priority of its slowest thread, while overall system performance is preserved by ensuring that no application's benefit comes at the cost of a significant degradation to other applications' threads sharing the same cache. Averaged over six workloads, our shared cache management scheme improves the performance of the combined applications by 18%. These improvements are also distributed fairly across the applications in each mix, as indicated by an average fair speedup improvement of 10% across the threads of each application (averaged over all workloads).
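The two ideas in the abstract (boost the slowest thread within an application, but cap the harm done to co-running applications) can be sketched as follows. This is a minimal illustration under assumptions not stated in the abstract: cache capacity is modeled as discrete ways, thread speed as a progress counter, and the "courtesy" bound as a fixed fraction of each application's baseline allocation. All names here (`rebalance`, `courteous_donate`, `MAX_LOSS`) are hypothetical, not from the paper.

```python
# Hypothetical courtesy bound (assumed, not from the paper): a donor
# application may never drop below 75% of its baseline cache ways.
MAX_LOSS = 0.25

def rebalance(apps):
    """apps: list of dicts {thread_id: (progress, ways)}.
    Within each application, shift one cache way from its fastest thread
    to its slowest, since the lagging thread gates the application's
    completion time. Returns the new per-thread way counts."""
    out = []
    for threads in apps:
        ways = {t: w for t, (_, w) in threads.items()}
        slowest = min(threads, key=lambda t: threads[t][0])
        fastest = max(threads, key=lambda t: threads[t][0])
        if fastest != slowest and ways[fastest] > 1:
            ways[fastest] -= 1
            ways[slowest] += 1
        out.append(ways)
    return out

def courteous_donate(donor_ways, donor_baseline, recipient_ways):
    """Move one way from a donor application to a recipient only if the
    donor keeps at least (1 - MAX_LOSS) of its baseline allocation --
    the 'being nice to others' constraint."""
    if donor_ways - 1 >= (1 - MAX_LOSS) * donor_baseline:
        return donor_ways - 1, recipient_ways + 1
    return donor_ways, recipient_ways
```

For example, `rebalance([{"t0": (100, 4), "t1": (60, 4)}])` shifts one way to the lagging thread `t1`, and a second call to `courteous_donate` with a donor already at its courtesy floor leaves both allocations unchanged.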
Keywords
Shared Cache Management, Multithreaded Applications