Support Recovery For Orthogonal Matching Pursuit: Upper And Lower Bounds
Advances in Neural Information Processing Systems 31 (NIPS 2018)
Abstract
We study the problem of sparse regression where the goal is to learn a sparse vector that best optimizes a given objective function. Under the assumption that the objective function satisfies restricted strong convexity (RSC), we analyze orthogonal matching pursuit (OMP), a greedy algorithm that is used heavily in applications, and obtain a support recovery result as well as a tight generalization error bound for the OMP estimator. Further, we show a lower bound for OMP, demonstrating that both our results on support recovery and generalization error are tight up to logarithmic factors. To the best of our knowledge, these are the first such tight upper and lower bounds for any sparse regression algorithm under the RSC assumption.
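The greedy procedure the abstract refers to can be sketched as follows. This is a minimal illustration of standard orthogonal matching pursuit for the linear model, not the paper's exact estimator or analysis: at each step it selects the column most correlated with the current residual, then refits by least squares on the chosen support. The function name and interface are illustrative.

```python
import numpy as np

def omp(X, y, s):
    """Orthogonal matching pursuit (illustrative sketch).

    Greedily selects s columns of X: at each step, pick the column
    most correlated with the current residual, then refit y by least
    squares restricted to the chosen support.
    """
    n, d = X.shape
    support = []
    residual = y.copy()
    coef = np.zeros(0)
    for _ in range(s):
        # Correlation of every column with the current residual.
        correlations = np.abs(X.T @ residual)
        correlations[support] = -np.inf  # exclude already-chosen columns
        j = int(np.argmax(correlations))
        support.append(j)
        # Refit on the full support so far (the "orthogonal" step).
        coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ coef
    beta = np.zeros(d)
    beta[support] = coef
    return beta, sorted(support)
```

On well-conditioned data (e.g. Gaussian designs with enough samples), this sketch recovers the true support of a noiseless sparse signal exactly, which is the noiseless analogue of the support-recovery guarantees studied in the paper.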
Keywords
strong convexity, objective function, sparse vector, sparse regression, matrix representation, upper and lower bounds, generalization error