Consistent k-Clustering for General Metrics

SODA (2021)

Citations: 13 | Views: 79

Abstract
Given a stream of points in a metric space, is it possible to maintain a constant approximate clustering by changing the cluster centers only a small number of times during the entire execution of the algorithm? This question received attention in recent years in the machine learning literature and, before our work, the best known algorithm performs $\widetilde{O}(k^2)$ center swaps (the $\widetilde{O}(\cdot)$ notation hides polylogarithmic factors in the number of points $n$ and the aspect ratio $\Delta$ of the input instance). This is a quadratic increase compared to the offline case -- where the whole stream is known in advance and one is interested in keeping a constant approximation at any point in time -- for which $\widetilde{O}(k)$ swaps are known to be sufficient, and simple examples show that $\Omega(k \log(n \Delta))$ swaps are necessary. We close this gap by developing an algorithm that, perhaps surprisingly, matches the guarantees in the offline setting. Specifically, we show how to maintain a constant-factor approximation for the $k$-median problem by performing an optimal (up to polylogarithmic factors) number $\widetilde{O}(k)$ of center swaps. To obtain our result we leverage new structural properties of $k$-median clustering that may be of independent interest.
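To make the "number of center swaps" measure concrete, here is a small hypothetical sketch. It is not the paper's algorithm: it streams 1-D points, re-runs Gonzalez's farthest-point greedy (a standard 2-approximation for k-center, used here purely for illustration) on each prefix, and counts how many centers are newly opened at each step. The function name, the toy metric, and the use of k-center greedy in place of k-median are all assumptions for the sake of the demo.

```python
import random


def greedy_k_centers(points, k):
    # Gonzalez's farthest-point greedy: repeatedly open the point
    # farthest from the current centers. A 2-approximation for
    # k-center; used here only to illustrate counting swaps.
    centers = [points[0]]
    while len(centers) < k and len(centers) < len(points):
        far = max(points, key=lambda p: min(abs(p - c) for c in centers))
        centers.append(far)
    return centers


random.seed(0)
stream = [random.uniform(0, 100) for _ in range(200)]
k = 5

prev = set()
swaps = 0
for t in range(1, len(stream) + 1):
    cur = set(greedy_k_centers(stream[:t], k))
    swaps += len(cur - prev)  # centers newly opened at this step
    prev = cur

print(f"total center swaps over the stream: {swaps}")
```

A naive strategy like this can swap far more often than necessary; the point of the paper is that a constant-factor k-median approximation can be maintained with only $\widetilde{O}(k)$ swaps in total, matching the offline bound.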
Keywords

metrics, k-clustering