A quasi-polynomial time algorithm for Multi-Dimensional Scaling via LP hierarchies
arXiv (2023)
Abstract
Multi-dimensional Scaling (MDS) is a family of methods for embedding an
n-point metric into low-dimensional Euclidean space. We study the
Kamada-Kawai formulation of MDS: given a set of non-negative dissimilarities
{d_{i,j}}_{i,j ∈ [n]} over n points, the goal is to find an embedding
x_1, …, x_n ∈ ℝ^k that minimizes

OPT = min_x 𝔼_{i,j ∈ [n]}[(1 − ‖x_i − x_j‖/d_{i,j})²].
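As an illustrative sketch (not part of the paper), the objective above can be evaluated directly for a candidate embedding; the function below assumes d_{i,j} > 0 for all i ≠ j and averages over unordered pairs, which matches the expectation over pairs up to the symmetric convention:

```python
import numpy as np

def kamada_kawai_cost(x, d):
    """Average Kamada-Kawai stress of an embedding.

    x : (n, k) array of embedded points x_1, ..., x_n in R^k.
    d : (n, n) symmetric array of dissimilarities d_{i,j} (positive off-diagonal).
    Returns E_{i,j}[(1 - ||x_i - x_j|| / d_{i,j})^2] over pairs i < j.
    """
    n = len(x)
    # Pairwise Euclidean distances ||x_i - x_j|| via broadcasting.
    diff = x[:, None, :] - x[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    i, j = np.triu_indices(n, k=1)  # unordered pairs i < j
    terms = (1.0 - dist[i, j] / d[i, j]) ** 2
    return terms.mean()
```

A perfect embedding (one realizing every d_{i,j} exactly) has cost 0; any distortion contributes a squared relative error per pair.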
Kamada-Kawai provides a more relaxed measure of the quality of a
low-dimensional metric embedding than the traditional bi-Lipschitz-ness measure
studied in theoretical computer science; this is advantageous because, whereas
strong hardness-of-approximation results are known for the latter, Kamada-Kawai
admits nontrivial approximation algorithms. Despite its popularity, our theoretical
understanding of MDS is limited. Recently, Demaine, Hesterberg, Koehler, Lynch,
and Urschel (arXiv:2109.11505) gave the first approximation algorithm with
provable guarantees for Kamada-Kawai in the constant-k regime, with cost
OPT + ϵ in n² 2^poly(Δ/ϵ) time, where
Δ is the aspect ratio of the input. In this work, we give the first
approximation algorithm for MDS with quasi-polynomial dependency on Δ:
we achieve a solution with cost Õ(log Δ)·OPT^Ω(1) + ϵ in time
n^O(1) 2^poly(log(Δ)/ϵ).
Our approach is based on a novel analysis of a conditioning-based rounding
scheme for the Sherali-Adams LP Hierarchy. Crucially, our analysis exploits the
geometry of low-dimensional Euclidean space, allowing us to avoid an
exponential dependence on the aspect ratio. We believe our geometry-aware
treatment of the Sherali-Adams Hierarchy is an important step towards
developing general-purpose techniques for efficient metric optimization
algorithms.