A Unified Framework for Gradient-based Clustering of Distributed Data

CoRR (2024)

Abstract
We develop a family of distributed clustering algorithms that work over networks of users. In the proposed scenario, each user holds a local dataset and communicates only with its immediate neighbours, with the aim of finding a clustering of the full, joint data. The proposed family, termed Distributed Gradient Clustering (DGC-ℱ_ρ), is parametrized by ρ ≥ 1, controlling the proximity of users' center estimates, with ℱ determining the clustering loss. Specialized to popular clustering losses like K-means and Huber loss, DGC-ℱ_ρ gives rise to novel distributed clustering algorithms DGC-KM_ρ and DGC-HL_ρ, while a novel clustering loss based on the logistic function leads to DGC-LL_ρ. We provide a unified analysis and establish several strong results under mild assumptions. First, the sequence of centers generated by the methods converges to a well-defined notion of fixed point, for any center initialization and any value of ρ. Second, as ρ increases, the family of fixed points produced by DGC-ℱ_ρ converges to a notion of consensus fixed points. We show that consensus fixed points of DGC-ℱ_ρ are equivalent to fixed points of gradient clustering over the full data, guaranteeing that a clustering of the full data is produced. For the special case of Bregman losses, we show that our fixed points converge to the set of Lloyd points. Numerical experiments on real data confirm our theoretical findings and demonstrate strong performance of the methods.