Federated Optimization Based on Compression and Event-triggered Communication

2021 36th Youth Academic Annual Conference of Chinese Association of Automation (YAC), 2021

Abstract
Federated learning is regarded as a promising solution to large-scale machine learning problems: it enables multiple edge users to cooperatively train a global parameter model while guaranteeing users a basic level of privacy. However, despite growing interest, communication cost is usually a major bottleneck for scaling up distributed algorithms, especially in unreliable or rate-limited network environments. In practice, users or clients synchronize their models periodically regardless of whether the current model differs significantly from the last transmitted one, which wastes communication resources. Considering both how much information to transmit in each communication round and when to communicate, in this paper we propose FedCET, a compression- and event-triggered algorithm for federated learning. We present a convergence analysis with rigorous proofs for smooth nonconvex, strongly convex (or PL-condition), and general convex objective functions, respectively, showing that this communication scheme is efficient and does not affect the convergence properties of the algorithm. Further, we evaluate the proposed FedCET on several datasets to demonstrate its effectiveness compared with other methods.
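The abstract does not spell out FedCET's exact update rule, but the two ingredients it names, compressed uploads and event-triggered communication, can be illustrated with a minimal sketch. The function names (top_k_compress, client_round) and the parameters top_k fraction and trigger_threshold below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def top_k_compress(update, k_fraction=0.1):
    """Keep only the k largest-magnitude entries of the update (sparsification)."""
    flat = update.ravel()
    k = max(1, int(k_fraction * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    compressed = np.zeros_like(flat)
    compressed[idx] = flat[idx]
    return compressed.reshape(update.shape)

def client_round(global_model, local_model, last_sent_update, trigger_threshold=1e-3):
    """Send a compressed update only when the change since the last
    transmission is large enough; otherwise skip communication."""
    update = local_model - global_model
    # Event trigger: communicate only if the update deviates enough
    # from what the server already holds.
    if np.linalg.norm(update - last_sent_update) <= trigger_threshold:
        return None  # skip this round, saving bandwidth
    return top_k_compress(update)
```

In this sketch a client that returns None stays silent for the round, and a client that does communicate transmits only a sparse, compressed difference, which reflects the paper's stated goal of reducing both the frequency and the size of transmissions.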
Keywords
compression, event-triggered, federated learning, machine learning