From warm to hot starts: leveraging runtimes for the serverless era

HotOS 2021

Abstract
The serverless computing model leverages high-level languages, such as JavaScript and Java, to raise the level of abstraction for cloud programming. However, today's design of serverless computing platforms around stateless, short-lived functions prevents modern runtimes from optimizing serverless functions through techniques such as JIT compilation and code profiling. In this paper, we show that modern serverless platforms, such as AWS Lambda, do not fully leverage language runtime optimizations. We find that a significant number of function invocations running on warm containers are executed with unoptimized code (warm-starts), leading to orders-of-magnitude performance slowdowns. We explore the idea of exploiting the runtime knowledge spread throughout potentially thousands of nodes to profile and optimize code. To that end, we propose Ignite, a serverless platform that orchestrates runtimes across machines to run optimized code from the start (hot-start). We present evidence that runtime orchestration has the potential to greatly reduce the cost and latency of serverless workloads by running optimized code across thousands of serverless functions.
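To make the warm-start effect described above concrete, the following is a minimal Java sketch (not from the paper; the class, method, and parameter values are illustrative). On a freshly started JVM, the handler body runs interpreted or lightly compiled for its first invocations and only reaches peak speed once the JIT has profiled and optimized it, which is the gap between warm-starts and the hot-starts that Ignite aims to provide from the first invocation.

```java
// Toy microbenchmark: per-invocation latency drops sharply once the JIT
// compiles the hot method, mimicking the warm-start vs. hot-start gap.
public class WarmVsHotStart {
    // Hypothetical stand-in for a serverless function's handler body.
    static long handler(int iterations) {
        long a = 0, b = 1;
        for (int i = 0; i < iterations; i++) {
            long t = a + b;
            a = b;
            b = t;
        }
        return a;
    }

    public static void main(String[] args) {
        for (int invocation = 1; invocation <= 10_000; invocation++) {
            long start = System.nanoTime();
            handler(50_000);
            long micros = (System.nanoTime() - start) / 1_000;
            // Early invocations (interpreted / baseline-compiled) are markedly
            // slower than later, fully optimized ones.
            if (invocation <= 5 || invocation % 2_000 == 0) {
                System.out.printf("invocation %5d: %d us%n", invocation, micros);
            }
        }
    }
}
```

In a serverless setting, each new runtime instance repeats this warm-up from scratch; the paper's proposal is to orchestrate runtimes so that profiling and optimization work done elsewhere is reused instead.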