Fully Dynamic $k$-Center Clustering With Improved Memory Efficiency

IEEE Transactions on Knowledge and Data Engineering (2022)

Abstract
Static and dynamic clustering algorithms are a fundamental tool in any machine learning library. Most of the effort in developing dynamic machine learning and data mining algorithms has focused on the sliding window model or on more simplistic models. However, in many real-world applications one needs to deal with arbitrary insertions and deletions. For example, one might need to remove data items that are not necessarily the oldest ones, because they have been flagged as containing inappropriate content or due to privacy concerns. Clustering trajectory data may also require handling such more general update operations. We develop a $(2+\epsilon)$-approximation algorithm for the $k$-center clustering problem with “small” amortized cost under the fully dynamic adversarial model. In this model, points can be added or removed arbitrarily, provided that the adversary does not have access to the random choices of our algorithm. The amortized cost of our algorithm is poly-logarithmic when the ratio between the maximum and minimum distance between any two input points is bounded by a polynomial, while $k$ and $\epsilon$ are constants. Furthermore, we significantly improve the memory requirement of our fully dynamic algorithm, although at the cost of a worse approximation ratio of $4+\epsilon$. Our theoretical results are complemented by an extensive experimental evaluation on dynamic data from Twitter and Flickr, as well as on trajectory data, demonstrating the effectiveness of our approach.
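To make the problem setting concrete, the sketch below illustrates the fully dynamic $k$-center interface (arbitrary insertions and deletions) with a naive baseline that recomputes centers from scratch after every update using the classical farthest-first traversal (Gonzalez, 1985), which is a 2-approximation in the static case. This is not the paper's algorithm and does not achieve its poly-logarithmic amortized cost or memory bounds; the class and function names are hypothetical and chosen only for illustration.

```python
import math
import random

def gonzalez_k_center(points, k):
    """Farthest-first traversal: a classical 2-approximation for static
    k-center. Shown only to illustrate the objective that a dynamic
    algorithm must maintain; it is not the paper's method."""
    if not points or k <= 0:
        return []
    centers = [random.choice(points)]
    # Distance from each point to its nearest chosen center so far.
    dist = [math.dist(p, centers[0]) for p in points]
    while len(centers) < min(k, len(points)):
        idx = max(range(len(points)), key=lambda i: dist[i])
        centers.append(points[idx])
        dist = [min(d, math.dist(p, points[idx])) for p, d in zip(points, dist)]
    return centers

class NaiveDynamicKCenter:
    """Hypothetical baseline for the fully dynamic adversarial model:
    points may be inserted or deleted arbitrarily, and the center set is
    recomputed from scratch after each update. Each update costs O(k * n)
    distance evaluations, far from the poly-logarithmic amortized cost
    targeted by the paper."""

    def __init__(self, k):
        self.k = k
        self.points = set()

    def insert(self, p):
        self.points.add(p)
        return gonzalez_k_center(list(self.points), self.k)

    def delete(self, p):
        self.points.discard(p)
        return gonzalez_k_center(list(self.points), self.k)

# Example usage: stream of insertions followed by an arbitrary deletion.
if __name__ == "__main__":
    dyn = NaiveDynamicKCenter(k=2)
    for p in [(0.0, 0.0), (1.0, 0.0), (10.0, 10.0), (9.5, 10.5)]:
        centers = dyn.insert(p)
    centers = dyn.delete((1.0, 0.0))  # delete a non-oldest point
    print(centers)
```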
Keywords
Clustering, k-center, approximation algorithm, fully-dynamic