A One-shot Framework for Distributed Clustered Learning in Heterogeneous Environments

arXiv (2023)

Abstract
The paper proposes a family of communication-efficient methods for distributed learning in heterogeneous environments in which users obtain data from one of $K$ different data distributions. In the proposed setup, the grouping of users based on the data distributions they sample from, as well as the underlying statistical properties of the distributions, are a priori unknown. A family of One-shot Distributed Clustered Learning methods (ODCL-$\mathcal{C}$) is proposed, parametrized by the set of admissible clustering algorithms $\mathcal{C}$, with the objective of learning the true model at each user. The admissible clustering methods include $K$-means (KM) and convex clustering (CC), giving rise to various one-shot methods within the proposed family, such as ODCL-KM and ODCL-CC. The proposed one-shot approach, based on local computations at the users and a clustering-based aggregation step at the server, is shown to provide strong learning guarantees. In particular, for strongly convex problems it is shown that, as long as the number of data points per user is above a threshold, the proposed approach achieves order-optimal mean-squared error (MSE) rates in terms of the sample size. An explicit characterization of the threshold is provided in terms of the problem parameters. Numerical experiments illustrate the findings and corroborate the performance of the proposed methods. We also highlight the trade-offs involved in selecting different clustering methods (ODCL-CC, ODCL-KM) and demonstrate significant improvements over the state of the art.
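To make the one-shot pipeline concrete, here is a minimal NumPy sketch of the ODCL-KM idea as described in the abstract: each user fits a model from its local data only, the server clusters the uploaded local models with $K$-means, and each user receives the average model of its cluster. The linear-regression data model, the farthest-point initialization, and all parameter values below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np


def local_estimate(X, y):
    # User-side step: fit a model from local data only (one shot, no iteration).
    return np.linalg.lstsq(X, y, rcond=None)[0]


def farthest_point_init(points, K):
    # Deterministic initialization: greedily pick points far from chosen centers.
    centers = [points[0]]
    for _ in range(K - 1):
        dists = np.min([np.linalg.norm(points - c, axis=1) for c in centers], axis=0)
        centers.append(points[np.argmax(dists)])
    return np.array(centers)


def kmeans(points, K, iters=50):
    # Plain Lloyd's iterations on the uploaded local models.
    centers = farthest_point_init(points, K)
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(points[:, None] - centers[None], axis=2), axis=1
        )
        centers = np.array([points[labels == k].mean(axis=0) for k in range(K)])
    return labels


def odcl_km(local_models, K):
    # Server-side step: cluster local models, then average within each cluster;
    # every user is sent the averaged model of the cluster it was assigned to.
    labels = kmeans(local_models, K)
    cluster_avg = np.array([local_models[labels == k].mean(axis=0) for k in range(K)])
    return cluster_avg[labels]


# Hypothetical demo: 10 users, 2 underlying distributions (K = 2).
rng = np.random.default_rng(0)
true_w = np.array([[1.0, 1.0], [-1.0, -1.0]])  # assumed ground-truth models
models = []
for u in range(10):
    w = true_w[u % 2]                     # user u samples from distribution u mod 2
    X = rng.normal(size=(200, 2))         # 200 local data points per user
    y = X @ w + 0.1 * rng.normal(size=200)
    models.append(local_estimate(X, y))
refined = odcl_km(np.array(models), K=2)  # one model per user after aggregation
```

With enough data points per user the local estimates concentrate around their cluster's true model, so the clustering step recovers the grouping and the within-cluster average reduces the MSE, matching the abstract's intuition for why a sample-size threshold appears.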
Keywords
Distributed learning, communication efficiency, one-shot methods, statistical heterogeneity, clustered learning, mean-squared analysis, order-optimality