Joint Task Offloading and Resource Allocation for Accuracy-Aware Machine-Learning-Based IIoT Applications

IEEE Internet of Things Journal (2023)

Cited by 9 | Views 31
Abstract
Machine learning (ML) plays a key role in intelligent Industrial Internet of Things (IIoT) applications. Processing of computation-intensive ML tasks can be greatly accelerated by augmenting traditional cloud-based schemes with edge computing (EC). However, system optimizations in existing works largely ignore the inference accuracy of ML models of different complexities and the impact of erroneous task inference. In this article, we propose a joint task offloading and resource allocation scheme for accuracy-aware ML-based IIoT applications in an edge-cloud network architecture. We aim to minimize the long-term average system cost, which is determined by the task offloading decisions, the computing resource allocation, and the inference accuracy of the ML models deployed on the sensors, the edge server, and the cloud server. The Lyapunov optimization technique is applied to convert the long-term stochastic optimization problem into a sequence of short-term deterministic problems. To solve each problem efficiently, we propose an optimal algorithm based on generalized Benders decomposition (GBD) and a heuristic algorithm based on proportional computing resource allocation and comparison of task offloading strategies. The performance of our scheme is established by theoretical analysis and evaluated through extensive simulations in multiple scenarios. Simulation results demonstrate the effectiveness and superiority of both algorithms in comparison with several schemes from existing works.
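The core step the abstract names is the Lyapunov drift-plus-penalty conversion: a long-term average-cost problem with a long-term constraint is reduced to a per-slot deterministic minimization driven by a virtual queue. Below is a minimal Python sketch of that generic technique only; the control parameter V, the virtual queue Q, and the cost/constraint models are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch of Lyapunov drift-plus-penalty (not the paper's model).
# Each time slot, pick the action minimizing V*cost + Q*violation, then
# update the virtual queue Q that enforces the long-term constraint.

V = 10.0   # control parameter: trades per-slot cost against queue backlog
Q = 0.0    # virtual queue backlog (assumed scalar constraint)

def system_cost(action):
    # placeholder per-slot cost of an offloading/allocation decision
    return action["cpu_cost"] + action["offload_cost"]

def constraint_violation(action):
    # placeholder: amount by which this slot's decision exceeds its budget
    return action["resource_use"] - action["resource_budget"]

candidate_actions = [
    {"cpu_cost": 2.0, "offload_cost": 1.0, "resource_use": 3.0, "resource_budget": 2.5},
    {"cpu_cost": 1.0, "offload_cost": 2.5, "resource_use": 2.0, "resource_budget": 2.5},
]

for t in range(100):  # each time slot solves a deterministic subproblem
    best = min(candidate_actions,
               key=lambda a: V * system_cost(a) + Q * constraint_violation(a))
    # queue update: backlog grows on violations, drains on slack
    Q = max(Q + constraint_violation(best), 0.0)

print("final virtual-queue backlog:", Q)
```

As Q grows, the per-slot objective increasingly penalizes constraint violations, so the decisions drift back toward feasibility; larger V weights cost more heavily at the price of a larger backlog.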
Keywords
Cloud computing, edge computing (EC), machine learning (ML), resource allocation, task offloading