A Reinforcement Learning Algorithm for Resource Provisioning in Mobile Edge Computing Network

2020 International Joint Conference on Neural Networks (IJCNN), 2020

Abstract
Mobile edge computing (MEC) is a paradigm that integrates computing power into telecommunications networks to improve communication and data-processing efficiency. Supplying power to the edge servers in an MEC network is critical, yet a continuous power supply is often impossible because servers are deployed in hard-to-reach locations such as remote areas, forests, and islands. In these cases, renewable energy becomes a viable power source for ensuring stable operation. This paper addresses resource provisioning in an MEC network powered by renewable energy. We formulate the problem as a Markov decision process and introduce a new approach that optimizes energy and time costs using a reinforcement learning technique. Our simulations validate the efficacy of the algorithm, which achieves a cost three times better than the compared methods.
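The abstract does not give the paper's exact state, action, and reward design, so the following is only a minimal sketch of how such a provisioning problem might be cast as a Markov decision process in Python. The environment class, its quantities (battery level, task load, harvest forecast, per-server energy draw, cost weights), and the random rollout policy are all illustrative assumptions; the paper instead trains a Proximal Policy Optimization agent on its own formulation.

```python
# Minimal sketch (assumption: the paper's exact state/action/reward design is not
# given in the abstract; the quantities below are illustrative placeholders).
import numpy as np

class MECProvisioningEnv:
    """Toy MDP for provisioning edge servers powered by harvested energy.

    State  : (battery level, pending task load, energy harvest)   -- assumed
    Action : number of servers switched on in this time slot      -- assumed
    Reward : negative weighted sum of energy cost and task delay  -- assumed
    """
    def __init__(self, max_servers=8, battery_capacity=100.0, seed=0):
        self.max_servers = max_servers
        self.battery_capacity = battery_capacity
        self.rng = np.random.default_rng(seed)
        self.reset()

    def reset(self):
        self.battery = self.battery_capacity / 2.0
        self.load = self.rng.uniform(0.0, 10.0)
        self.harvest = self.rng.uniform(0.0, 5.0)
        return self._obs()

    def _obs(self):
        return np.array([self.battery, self.load, self.harvest], dtype=np.float32)

    def step(self, action):
        servers_on = int(np.clip(action, 0, self.max_servers))
        energy_used = 2.0 * servers_on                      # assumed per-server draw
        served = min(self.load, 3.0 * servers_on)           # assumed service rate
        delay = self.load - served                          # unserved work waits

        # Battery dynamics with renewable energy harvesting.
        self.battery = np.clip(self.battery - energy_used + self.harvest,
                               0.0, self.battery_capacity)
        # New task arrivals and next harvest amount (stochastic).
        self.load = delay + self.rng.uniform(0.0, 10.0)
        self.harvest = self.rng.uniform(0.0, 5.0)

        reward = -(0.5 * energy_used + 0.5 * delay)         # assumed cost weights
        done = self.battery <= 0.0
        return self._obs(), reward, done, {}

# Rollout with a random policy as a stand-in; the paper trains a PPO agent instead.
env = MECProvisioningEnv()
obs, total = env.reset(), 0.0
for t in range(50):
    action = np.random.randint(0, env.max_servers + 1)
    obs, reward, done, _ = env.step(action)
    total += reward
    if done:
        break
print(f"episode return under random policy: {total:.2f}")
```

Any standard policy-gradient learner can be dropped in place of the random policy above; the environment interface mirrors the usual reset/step convention so that swap requires no structural change.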
Keywords
Mobile Edge Computing, Fog Computing, Resource Provisioning, Markov Decision Process, Energy Harvesting, Proximal Policy Optimization