NV-FogStore: Device-aware hybrid caching in fog computing environments

arXiv (2020)

Abstract
Edge caching via the placement of distributed storage throughout the network is a promising solution for reducing the latency and network costs of content delivery. With the advent of 5G, billions of F-RAN (Fog-Radio Access Network) nodes will be created and used for edge caching, and the total amount of memory deployed at the edge is expected to increase 100-fold. The DRAM-based caches currently used in CDNs (Content Delivery Networks) are extremely power-hungry and costly. Our goal is to reduce the cost of ownership and the recurring power costs of an F-RAN node while maintaining Quality of Service. To this end, we propose NV-FogStore, a scalable hybrid key-value storage architecture that exploits Non-Volatile Memories (such as RRAM, MRAM, and Intel Optane) for edge caching. We further describe in detail H-GREEDY, a novel hierarchical, write-damage-, size- and frequency-aware content caching policy for this architecture. We show that the policy can be tuned to performance objectives, lowering power, energy consumption, and total cost relative to a DRAM-only system at the expense of only a relatively small increase in average access latency.
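The abstract describes H-GREEDY, a write-damage-, size- and frequency-aware policy for a hybrid DRAM/NVM key-value cache, but no code is given here. The sketch below is only an illustration of how such a greedy, device-aware eviction score might be structured; the class names, scoring weights, and capacities (HybridCache, write_damage_weight, etc.) are assumptions for the example, not the authors' implementation.

```python
from dataclasses import dataclass


@dataclass
class Entry:
    key: str
    size: int          # object size in bytes
    freq: int = 1      # observed access count


class HybridCache:
    """Two-tier DRAM + NVM cache with greedy, device-aware eviction (illustrative only)."""

    def __init__(self, dram_capacity: int, nvm_capacity: int,
                 write_damage_weight: float = 0.5):
        self.dram_capacity = dram_capacity
        self.nvm_capacity = nvm_capacity
        self.w_damage = write_damage_weight   # hypothetical penalty for NVM write churn
        self.dram: dict[str, Entry] = {}
        self.nvm: dict[str, Entry] = {}

    @staticmethod
    def _used(tier: dict) -> int:
        return sum(e.size for e in tier.values())

    def _score(self, e: Entry, on_nvm: bool) -> float:
        # Greedy utility: frequent, small objects are worth keeping; objects on the
        # NVM tier carry an extra write-damage penalty.
        return e.freq / e.size - (self.w_damage if on_nvm else 0.0)

    def get(self, key: str):
        for tier in (self.dram, self.nvm):
            if key in tier:
                tier[key].freq += 1
                return tier[key]
        return None

    def put(self, key: str, size: int):
        # Admit new objects to DRAM; demote the lowest-scoring victims down to NVM.
        self._make_room(self.dram, self.dram_capacity, size, on_nvm=False)
        self.dram[key] = Entry(key, size)

    def _make_room(self, tier: dict, capacity: int, incoming: int, on_nvm: bool):
        while tier and self._used(tier) + incoming > capacity:
            victim = min(tier.values(), key=lambda e: self._score(e, on_nvm))
            del tier[victim.key]
            if not on_nvm:
                # Demoting to NVM costs one NVM write; evict from NVM by score if needed.
                self._make_room(self.nvm, self.nvm_capacity, victim.size, on_nvm=True)
                self.nvm[victim.key] = victim


# Example usage with hypothetical capacities (bytes):
cache = HybridCache(dram_capacity=64_000, nvm_capacity=512_000)
cache.put("video-chunk-1", 48_000)
cache.put("thumbnail-7", 2_000)
```

The key design point the sketch tries to capture is that placement is not decided by recency alone: the eviction score trades off hit value (frequency per byte) against the wear and energy cost of writing to the non-volatile tier, which is the kind of tunable trade-off between cost and access latency the abstract refers to.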
Keywords
nv-fogstore, device-aware