Machine Learning Driven Latency Optimization for Internet of Things Applications in Edge Computing

ZTE Communications (2023)

Abstract
Emerging Internet of Things (IoT) applications require fast execution and response times to achieve optimal performance. However, most IoT devices have limited or no computing capability to meet such stringent application requirements. To this end, computation offloading in edge computing has been used in IoT systems to achieve the desired performance. Nevertheless, randomly offloading applications to any available edge node without considering their resource demands, inter-application dependencies, and edge resource availability may eventually result in execution delay and performance degradation. In this paper, we introduce Edge-IoT, a machine learning-enabled orchestration framework that uses the states of edge resources and application resource requirements to facilitate a resource-aware offloading scheme that minimizes average latency. We further propose a variant bin-packing optimization model that tightly co-locates applications on edge resources to fully utilize the available capacity. Extensive experiments demonstrate the effectiveness and resource efficiency of the proposed approach.
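The abstract does not specify the paper's variant bin-packing formulation, so as a point of reference, the sketch below shows the classic first-fit-decreasing (FFD) heuristic for packing application resource demands onto edge nodes of fixed capacity; the function and parameter names are illustrative, not from the paper.

```python
# Hedged sketch: classic first-fit-decreasing (FFD) bin packing as a baseline
# for co-locating application demands on uniform-capacity edge nodes.
# The paper's actual "variant bin-packing" model may differ.

def pack_ffd(demands, capacity):
    """Place each demand on the first edge node with enough spare capacity.

    demands  -- per-application resource requirements (single dimension)
    capacity -- uniform resource capacity of each edge node
    Returns a list of nodes, each a list of the demands placed on it.
    """
    bins = []  # each entry: [remaining_capacity, [placed demands]]
    for d in sorted(demands, reverse=True):   # largest demands first
        for b in bins:
            if b[0] >= d:                     # first node that still fits
                b[0] -= d
                b[1].append(d)
                break
        else:
            bins.append([capacity - d, [d]])  # open a new edge node
    return [b[1] for b in bins]

# Example: co-locating six applications on nodes of capacity 10
print(pack_ffd([5, 7, 3, 2, 4, 6], 10))  # → [[7, 3], [6, 4], [5, 2]]
```

Sorting demands in decreasing order before placement is what distinguishes FFD from plain first-fit and is the usual starting point for tighter co-location heuristics.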
Keywords
edge computing,execution time,IoT,machine learning,resource efficiency