A Statistical Framework for Personalized Federated Learning and Estimation: Theory, Algorithms, and Privacy

ICLR 2023 (2023)

Abstract
A distinguishing characteristic of federated learning is that the (local) client data can be statistically heterogeneous. This heterogeneity has motivated the design of personalized learning, where individual (personalized) models are trained through collaboration. Various personalization methods have been proposed in the literature, with seemingly very different forms, ranging from the use of a single global model for local regularization and model interpolation to the use of multiple global models for personalized clustering. In this work, we begin with a generative framework that could potentially unify several different algorithms as well as suggest new ones. We apply our generative framework to personalized estimation and connect it to the classical empirical Bayes methodology. We develop private personalized estimation under this framework. We then use our generative framework to propose new personalized learning algorithms, including AdaPeD, which is based on knowledge distillation and numerically outperforms several known algorithms. We develop privacy-preserving personalized learning methods with user-level privacy and composition guarantees. We numerically evaluate the performance as well as the privacy of both the estimation and learning methods, demonstrating the advantages of our proposed approaches.
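To make the empirical Bayes connection concrete, the following is a minimal sketch of personalized mean estimation under an assumed Gaussian-Gaussian hierarchical model. The model, function name, and parameters are illustrative assumptions, not the paper's exact algorithm: each client's personalized estimate shrinks its local mean toward a shared population mean, with the shrinkage weight learned from all clients' data.

```python
import numpy as np

# Assumed hierarchical model (illustrative, not the paper's exact setup):
#   theta_i ~ N(mu, tau^2)         -- client i's true parameter
#   x_ij    ~ N(theta_i, sigma^2)  -- local samples at client i
# The shared prior (mu, tau^2) is estimated from all clients, then each
# client's personalized estimate shrinks its local mean toward mu.

def personalized_estimates(client_samples, sigma2):
    """client_samples: list of 1-D arrays, one per client; sigma2: known noise variance."""
    local_means = np.array([s.mean() for s in client_samples])
    n = np.array([len(s) for s in client_samples])

    # Method-of-moments estimates of the shared prior parameters.
    mu_hat = local_means.mean()
    tau2_hat = max(local_means.var(ddof=1) - np.mean(sigma2 / n), 1e-12)

    # Per-client shrinkage: weight local data against the shared prior.
    w = tau2_hat / (tau2_hat + sigma2 / n)
    return w * local_means + (1.0 - w) * mu_hat

# Example: 5 clients with heterogeneous true means and 20 samples each.
rng = np.random.default_rng(0)
true_means = rng.normal(0.0, 1.0, size=5)
data = [rng.normal(m, 0.5, size=20) for m in true_means]
print(personalized_estimates(data, sigma2=0.25))
```

A privacy-preserving variant of this kind of estimator would add calibrated noise to the per-client statistics before aggregation; the paper develops such user-level differentially private versions.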
Keywords
Personalized Federated Learning, Personalized Statistical Estimation, Differential Privacy, Empirical/Hierarchical Bayes