Making Serverless Not So Cold in Edge Clouds: A Cost-Effective Online Approach

IEEE Transactions on Mobile Computing (2024)

Abstract
Applying the serverless paradigm to edge computing improves edge resource utilization while bringing the benefits of flexible scaling and pay-as-you-go to latency-sensitive applications. This extends the boundaries of serverless computing and improves the quality of service for Function-as-a-Service users. However, as an emerging cloud computing paradigm, serverless edge computing faces pressing challenges, one of the biggest being the delay caused by excessively long container cold starts. Cold start delay is the time between when a serverless function is triggered and when it begins to execute, and it seriously degrades resource utilization and Quality of Service (QoS). In this paper, we study how to minimize the total system cost by caching function containers and selecting routes for neighboring functions via edge or public clouds. We prove that the problem is NP-hard even in the special case where the user request contains only one function, and that the unpredictability of user requests and the coupling between decisions in adjacent time slots require the problem to be solved in an online fashion. We then design the Online Lazy Caching algorithm, an online algorithm that achieves a worst-case competitive ratio by using a randomized dependent rounding procedure. Extensive simulation results show that the proposed online algorithm achieves close-to-optimal performance in terms of both total cost and cold start cost compared to other existing algorithms, with average improvements of 31.6% and 51.7%, respectively.
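The paper's own algorithm is not reproduced here, but the randomized dependent rounding primitive it builds on is a standard technique: fractional variables are rounded in pairs so that each marginal is preserved in expectation and the total is nearly preserved. A minimal, generic sketch (an illustration of the classic pairwise scheme, not the authors' implementation; the function name and interface are hypothetical) might look like:

```python
import random

def dependent_round(x, eps=1e-9):
    """Round fractional values in [0, 1] to {0, 1} via pairwise
    dependent rounding: each step moves probability mass between
    two fractional entries so that their sum is preserved exactly
    and each marginal is preserved in expectation."""
    x = list(x)
    while True:
        frac = [i for i, v in enumerate(x) if eps < v < 1 - eps]
        if len(frac) < 2:
            # At most one fractional entry left: round it independently.
            for i in frac:
                x[i] = 1.0 if random.random() < x[i] else 0.0
            return [int(round(v)) for v in x]
        i, j = frac[0], frac[1]
        d1 = min(1 - x[i], x[j])  # mass movable from j to i
        d2 = min(x[i], 1 - x[j])  # mass movable from i to j
        if random.random() < d2 / (d1 + d2):
            x[i], x[j] = x[i] + d1, x[j] - d1  # at least one hits {0, 1}
        else:
            x[i], x[j] = x[i] - d2, x[j] + d2
```

Each iteration makes at least one of the paired entries integral, so the loop terminates in a linear number of steps, and the final integral vector's sum differs from the fractional sum by less than one.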
Keywords
Cold start, container caching, edge computing, serverless computing