Controlling Action Space of Reinforcement-Learning-Based Energy Management in Batteryless Applications

IEEE Internet of Things Journal(2023)

Abstract
Duty cycle management is critical for the energy-neutral operation of batteryless devices. Many efforts have been made to develop an effective duty cycling method, including machine-learning-based approaches, but existing methods can barely handle the dynamic harvesting environments of batteryless devices. Specifically, most machine-learning-based methods require the harvesting patterns to be collected in advance, as well as manual configuration of the duty-cycle boundaries. In this article, we propose a configuration-free duty cycling scheme for batteryless devices, called CTRL, with which energy-harvesting nodes tune the duty cycle themselves, adapting to the surrounding environment without user intervention. This approach combines reinforcement learning (RL) with a control system, allowing the learning algorithm to explore the entire search space automatically. The learning algorithm sets the target State of Charge (SoC) of the energy storage, instead of explicitly setting the target task frequency at a given time. The control system then satisfies the target SoC by controlling the duty cycle. An evaluation based on a real implementation of the system using publicly available trace data shows that CTRL outperforms state-of-the-art approaches, resulting in 40% less frequent power failures in energy-scarce environments while achieving more than ten times the task frequency in energy-rich environments.
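The two-layer idea described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the RL policy's action space is assumed to be a set of discrete target-SoC levels, and the inner control loop is assumed to be a simple proportional controller; all names and constants are hypothetical.

```python
# Hypothetical sketch of CTRL's structure: an RL policy picks a target
# state of charge (SoC), and an inner control loop tracks that target
# by adjusting the duty cycle. Constants and names are illustrative.

# Assumed discrete action space: the RL agent selects one of these
# target SoC levels rather than a task frequency directly.
TARGET_SOC_LEVELS = [0.2, 0.4, 0.6, 0.8]

def duty_cycle_controller(soc, target_soc, duty, k_p=0.5,
                          min_duty=0.0, max_duty=1.0):
    """Proportional control: raise the duty cycle when stored energy
    exceeds the target SoC (spend the surplus on more task activations),
    and lower it when energy runs short of the target."""
    error = soc - target_soc           # positive error => energy surplus
    new_duty = duty + k_p * error
    return max(min_duty, min(max_duty, new_duty))
```

For example, with the current SoC at 0.8 and a target of 0.5, the controller increases a duty cycle of 0.5 toward 0.65, spending surplus energy on additional task executions; with the SoC below target, it throttles the duty cycle down, reducing the risk of a power failure.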
Keywords
reinforcement learning, energy management, action space