Logistics-involved task scheduling in cloud manufacturing with offline deep reinforcement learning.

J. Ind. Inf. Integr. (2023)

Abstract
As an application of industrial information integration engineering (IIIE) in manufacturing, cloud manufacturing (CMfg) integrates enterprises' manufacturing information and provides an open, shared platform for processing manufacturing tasks with distributed manufacturing services. Assigning tasks to manufacturing enterprises on the CMfg platform calls for effective scheduling algorithms. In recent years, deep reinforcement learning (DRL) has been widely applied to cloud manufacturing scheduling problems (CMfg-SPs) because of its strong generalization and fast response capability. However, current DRL algorithms must learn through trial and error via online interaction with the environment, which is costly and not permitted on a real CMfg platform. This paper proposes a novel offline DRL scheduling algorithm that alleviates the online trial-and-error issue while retaining DRL's original advantages. First, we describe the system model of CMfg-SPs and propose a sequential Markov decision process modeling strategy in which all tasks are regarded as one agent. Then, we introduce the framework of the decision transformer (DT), which converts the online scheduling decision-making problem into an offline classification problem. Finally, we construct an attention-based model as the agent's policy and train it offline under the DT architecture. Experimental results indicate that the proposed method consistently matches or exceeds online DRL algorithms, including double deep Q-network (DDQN), deep recurrent Q-network (DRQN), and proximal policy optimization (PPO), as well as the offline learning algorithm behavior cloning (BC), in terms of scheduling performance and model generalization.
Keywords
Cloud manufacturing, Offline reinforcement learning, Scheduling problems, Decision transformer, Attention mechanism