Communication-Efficient Satellite-Ground Federated Learning through Progressive Weight Quantization

IEEE Transactions on Mobile Computing (2024)

Abstract
Large constellations of Low Earth Orbit (LEO) satellites have been launched for Earth observation and satellite-ground communication, collecting massive imagery and sensor data. These data can enhance the AI capabilities of satellites to address global challenges such as real-time disaster navigation and mitigation. Prior studies proposed leveraging federated learning (FL) across satellite-ground links to collaboratively train a shared machine learning (ML) model in a privacy-preserving manner. However, they mostly focus on a single challenge, such as limited ground-to-satellite bandwidth, short connection windows, or long connection cycles, while ignoring that all of these challenges must be addressed together to deploy efficient FL frameworks in space. In this paper, we propose an efficient satellite-ground FL framework, SatelliteFL, that addresses these three challenges collectively. Its key idea is to ensure that each satellite completes per-round training within each connection window. Moreover, we design a progressive block-wise quantization algorithm that determines a unique bitwidth for each block of the ML model to maximize model utility without exceeding the connection window. We evaluate SatelliteFL by plugging an implemented FL platform into real-world satellite networks and satellite images. The results show that SatelliteFL accelerates convergence by up to 2.8× and improves the bandwidth utilization ratio by up to 9.3× compared to state-of-the-art methods.
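The per-block bitwidth selection described in the abstract can be sketched as a greedy, knapsack-style assignment: every block starts at the lowest bitwidth, and the bit budget implied by the connection window is spent where it reduces quantization error the most. This is a minimal illustrative sketch, not SatelliteFL's actual algorithm; the uniform quantizer, the candidate bitwidths, and all function names are assumptions.

```python
def quantize_block(values, bits):
    """Uniform quantization of one block of weights to the given bitwidth.

    Illustrative assumption: SatelliteFL's actual quantizer is not specified
    in the abstract."""
    lo, hi = min(values), max(values)
    if hi == lo or bits <= 0:
        return [lo] * len(values)
    levels = (1 << bits) - 1
    step = (hi - lo) / levels
    return [lo + round((v - lo) / step) * step for v in values]


def assign_bitwidths(blocks, bit_budget, choices=(2, 4, 8)):
    """Greedy per-block bitwidth assignment under a total-bit budget.

    The budget stands in for (connection window x uplink bandwidth).
    Each step upgrades the block whose quantization error drops the most
    per extra bit spent, until no upgrade fits in the budget."""
    def err(block, bits):
        q = quantize_block(block, bits)
        return sum((a - b) ** 2 for a, b in zip(block, q))

    bits = [choices[0]] * len(blocks)
    used = sum(b * len(blk) for b, blk in zip(bits, blocks))
    while True:
        best = None  # (error reduction per extra bit, block index, new bits, extra bits)
        for i, blk in enumerate(blocks):
            higher = [c for c in choices if c > bits[i]]
            if not higher:
                continue
            nb = higher[0]
            extra = (nb - bits[i]) * len(blk)
            if used + extra > bit_budget:
                continue
            gain = err(blk, bits[i]) - err(blk, nb)
            if best is None or gain / extra > best[0]:
                best = (gain / extra, i, nb, extra)
        if best is None:
            return bits, used
        _, i, nb, extra = best
        bits[i] = nb
        used += extra
```

For example, with two blocks and a 40-bit budget, the assignment spends most of the budget on the block whose values cannot be represented exactly at low bitwidths, while a block that quantizes losslessly at 2 bits is left cheap.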
Keywords
In-orbit computing, satellite network, federated learning