Dynamic energy scheduling and routing of multiple electric vehicles using deep reinforcement learning

ENERGY(2022)

Abstract
The demand for energy is uncertain and changes over time due to several factors, including the emergence of new technologies, entertainment, divergence in people's consumption habits, changing weather conditions, etc. Moreover, energy demand grows every day with the world's population and the global economy, which substantially increases the chances of disruptions in power supply. This makes the security of power supply a more challenging task, especially during peak seasons (e.g., summer and winter). This paper proposes a reinforcement learning model to address the uncertainties in power supply and demand by dispatching a set of electric vehicles to supply energy to consumers at different locations. Each electric vehicle is mounted with various energy resources (e.g., a PV panel and energy storage) that share power generation units and storage among different consumers to power their premises and reduce energy costs. The performance of the reinforcement learning model is assessed under different configurations of consumers and electric vehicles, and compared to the results from CPLEX and three heuristic algorithms. The simulation results demonstrate that the reinforcement learning algorithm can reduce energy costs by up to 22.05%, 22.57%, and 19.33% compared to the genetic algorithm, particle swarm optimization, and artificial fish swarm algorithm, respectively. (c) 2021 Elsevier Ltd. All rights reserved.
Keywords
Mobile energy network, Electric vehicle, Vehicle routing, Energy scheduling, Deep reinforcement learning