Numerical and Simulation Verification for Optimal Server Allocation in Edge Computing

2021 IEEE International IoT, Electronics and Mechatronics Conference (IEMTRONICS), 2021

Abstract
In this paper, we consider the server allocation problem in edge computing. We consider a system model with a number of areas or locations, each of which has an associated Base Station (BS) where we can deploy an edge cloud with multiple servers. Each edge cloud processes the application requests received at the corresponding BS from users in the corresponding area. The system manager/operator has a budget to deploy a given number of servers across the BSs. Our goal is to produce a server allocation plan, i.e., how many servers to deploy at each BS, such that the overall average turnaround time of application requests generated by all users is minimized. To obtain the optimal solution, we resort to queueing theory and model each edge cloud as an M/M/c queue. Analysis of the problem motivates a Largest Weighted Reduction Time First (LWRTF) algorithm for assigning servers to edge clouds. Numerical comparisons among various algorithms verify that Algorithm LWRTF has near-optimal performance in terms of minimizing the average turnaround time. Simulation results using the CloudSim Plus simulation tool also verify that Algorithm LWRTF achieves better performance than other reasonably designed heuristic algorithms.
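The abstract does not give the algorithm's details, so the following is a minimal Python sketch of the modeling idea it describes: each edge cloud is treated as an M/M/c queue (the Erlang C formula yields the average turnaround time), and a greedy LWRTF-style rule repeatedly assigns the next server to the edge cloud with the largest weighted reduction in turnaround time. The weighting by per-cloud request rate, the initial stable allocation, the function names, and the example numbers are assumptions made for illustration, not the authors' implementation.

```python
import math

def mmc_response_time(lam, mu, c):
    """Average response (turnaround) time of an M/M/c queue via the Erlang C formula.
    lam: arrival rate, mu: per-server service rate, c: number of servers."""
    rho = lam / (c * mu)
    if rho >= 1.0:
        return math.inf  # unstable queue: turnaround time grows without bound
    a = lam / mu  # offered load in Erlangs
    # Erlang C: probability that an arriving request has to wait
    summation = sum(a**k / math.factorial(k) for k in range(c))
    top = a**c / (math.factorial(c) * (1.0 - rho))
    p_wait = top / (summation + top)
    wq = p_wait / (c * mu - lam)   # mean waiting time in queue
    return wq + 1.0 / mu           # waiting time + service time = turnaround time


def lwrtf_allocate(lams, mus, total_servers):
    """Greedy LWRTF-style allocation sketch: give each cloud the minimum servers
    needed for stability, then repeatedly assign the next server to the cloud
    whose request-rate-weighted turnaround time drops the most (an assumption
    about the weighting, for illustration only)."""
    n = len(lams)
    alloc = [math.floor(l / m) + 1 for l, m in zip(lams, mus)]
    assert sum(alloc) <= total_servers, "budget too small for a stable system"
    for _ in range(total_servers - sum(alloc)):
        reductions = [
            lams[i] * (mmc_response_time(lams[i], mus[i], alloc[i])
                       - mmc_response_time(lams[i], mus[i], alloc[i] + 1))
            for i in range(n)
        ]
        alloc[reductions.index(max(reductions))] += 1
    return alloc


if __name__ == "__main__":
    # Hypothetical example: 3 edge clouds sharing a budget of 14 servers.
    print(lwrtf_allocate(lams=[40.0, 25.0, 10.0], mus=[10.0, 10.0, 10.0], total_servers=14))
```

Minimizing the overall average turnaround time amounts to minimizing the request-rate-weighted sum of per-cloud M/M/c response times, which is why the greedy step above scores each candidate server by the weighted reduction it produces.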
Keywords
Edge computing, edge cloud, queueing theory, numerical method, simulation approach