Parameterized Approximation Algorithms for K-center Clustering and Variants.

AAAI Conference on Artificial Intelligence (2022)

Abstract
k-center is one of the most popular clustering models. While it admits a simple 2-approximation in polynomial time in general metrics, the Euclidean version is NP-hard to approximate within a factor of 1.93, even in the plane, if one insists that the dependence on k in the running time be polynomial. Without this restriction, a classic algorithm yields a $2^{O((k \log k)/\epsilon)}\, dn$-time $(1+\epsilon)$-approximation for Euclidean k-center, where d is the dimension. In this work, we give a faster algorithm for small dimensions: roughly speaking, an $O^*\big(2^{O((1/\epsilon)^{O(d)} k^{1-1/d} \log k)}\big)$-time $(1+\epsilon)$-approximation. In particular, the running time is roughly $O^*\big(2^{O((1/\epsilon)^{O(1)} \sqrt{k} \log k)}\big)$ in the plane. We complement our algorithmic result with a matching hardness lower bound. We also consider a well-studied generalization of k-center, called Non-uniform k-center (NUkC), in which clusters may have different radii. NUkC is NP-hard to approximate within any factor, even in the Euclidean case. We design a $2^{O(k \log k)} n^2$-time 3-approximation for NUkC, and a $2^{O((k \log k)/\epsilon)}\, dn$-time $(1+\epsilon)$-approximation for Euclidean NUkC. The latter time bound matches the bound for k-center.
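As background for the "simple 2-approximation in polynomial time in general metrics" mentioned above, the following is a minimal sketch of the classic greedy farthest-first traversal (commonly attributed to Gonzalez); it is not the paper's parameterized algorithm, and the function names and the Euclidean metric used here are illustrative assumptions.

```python
# Illustrative sketch (not from the paper): the classic greedy 2-approximation
# for k-center via farthest-first traversal. Assumes points are coordinate
# tuples and uses Euclidean distance; any metric could be substituted.
import math

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def greedy_k_center(points, k, dist=euclidean):
    """Pick k centers so the maximum point-to-nearest-center distance
    is at most twice the optimum (holds in any metric)."""
    centers = [points[0]]                       # start from an arbitrary point
    # distance from each point to its nearest chosen center so far
    nearest = [dist(p, centers[0]) for p in points]
    while len(centers) < k:
        # next center: the point farthest from all chosen centers
        i = max(range(len(points)), key=lambda j: nearest[j])
        centers.append(points[i])
        nearest = [min(nearest[j], dist(points[j], points[i]))
                   for j in range(len(points))]
    radius = max(nearest)                       # clustering cost achieved
    return centers, radius

# Example usage:
# centers, r = greedy_k_center([(0, 0), (1, 0), (5, 5), (6, 5)], k=2)
```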
Keywords
Data Mining & Knowledge Management (DMKM), Search and Optimization (SO), Machine Learning (ML)