Energy Optimization and Lightweight Design for Efficient Federated Learning in Wireless Edge Systems

IEEE Transactions on Vehicular Technology (2024)

Abstract
Energy-efficient federated learning (FL) is important for decentralized learning-based edge computing. The energy consumption of FL is largely determined by the efficiency of local training on edge devices and the efficiency of their communication with the central server. Most studies have focused on improving the latter while continuing to use traditional neural networks for local training, which can be computationally heavy for edge devices given their limited energy supply and computation capabilities. In this paper, we propose a joint lightweight design and energy-saving algorithm (LDES) for efficient FL training and parameter uploading at the wireless edge. In the lightweight-design part of LDES, we investigate lightweight models for local training by deploying sparse or binary neural networks (SNN or BNN) at the edge, in order to reduce both the uploaded data volume and the energy consumption. We develop an enhanced stochastic gradient descent algorithm with guaranteed convergence (NSC-SGD) to handle the non-smoothness and weight constraints that arise when training sparse models. In the energy-optimization part of LDES, we optimize power, bandwidth, and learning parameters to jointly minimize computing, uploading, and broadcasting energy. Numerical results show that LDES reduces energy consumption by 56.21% compared to benchmark FL schemes.
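The abstract does not spell out how NSC-SGD handles the non-smooth sparsity term and the weight constraints, so the following is only a minimal, generic sketch of one standard way to do this: a proximal SGD step (soft-thresholding for an assumed L1 sparsity penalty) followed by projection onto an assumed box constraint. The function names (`soft_threshold`, `proximal_sgd_step`) and parameters (`lam`, `w_max`) are illustrative and are not taken from the paper.

```python
import numpy as np

def soft_threshold(w, lam):
    """Proximal operator of the L1 norm: shrinks weights toward zero and
    sets small ones exactly to zero, which is what induces sparsity."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def proximal_sgd_step(w, grad, lr, lam, w_max=1.0):
    """One update: SGD step on the smooth loss, a proximal step for the
    non-smooth L1 term, then projection onto the box |w_i| <= w_max
    (a hypothetical stand-in for the paper's weight constraints)."""
    w = w - lr * grad                 # gradient step on the smooth part
    w = soft_threshold(w, lr * lam)   # handle the non-smooth sparsity term
    return np.clip(w, -w_max, w_max)  # enforce the weight-magnitude constraint

# Toy usage: recover a sparse linear model y = X @ w_true + noise.
rng = np.random.default_rng(0)
n, d = 200, 50
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:5] = rng.normal(size=5)
y = X @ w_true + 0.01 * rng.normal(size=n)

w = np.zeros(d)
for step in range(2000):
    i = rng.integers(n)               # stochastic sample
    grad = (X[i] @ w - y[i]) * X[i]   # gradient of the squared loss at sample i
    w = proximal_sgd_step(w, grad, lr=0.01, lam=0.01)

print("nonzero weights:", np.count_nonzero(w), "of", d)
```

In a lightweight-FL setting of the kind described above, the appeal of such sparse updates is that only the nonzero weights (plus their indices) need to be uploaded, which is where the reduction in communication volume and uploading energy would come from.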