RIP-based Performance Guarantee for Low Rank Matrix Recovery via $L_{*-F}$ Minimization

arxiv(2023)

Abstract
In the underdetermined linear system $\bm{b}=\mathcal{A}(\bm{X})+\bm{s}$, the vector $\bm{b}$ and the linear operator $\mathcal{A}$ are known measurements, and $\bm{s}$ is unknown noise. In this paper, we investigate sufficient conditions for exactly reconstructing a desired matrix $\bm{X}$ that is low-rank or approximately low-rank. We use the difference of the nuclear norm and the Frobenius norm ($L_{*-F}$) as a surrogate for the rank function and establish a new nonconvex relaxation of low-rank matrix recovery, called $L_{*-F}$ minimization, in order to approximate the rank function more closely. For this nonconvex, nonsmooth, constrained $L_{*-F}$ minimization problem, we give upper bound estimates of the recovery error in both the noisy and noise-free settings. In particular, in the noise-free case, we present a sufficient condition for exact recovery: if the linear operator $\mathcal{A}$ satisfies the restricted isometry property with $\delta_{4r}<\frac{\sqrt{2r}-1}{\sqrt{2r}-1+\sqrt{2}(\sqrt{2r}+1)}$, then any rank-$r$ matrix $\bm{X}$ can be exactly recovered without further assumptions. In addition, we study the regularized $L_{*-F}$ minimization model, since regularized models are more widely used in algorithm design, and provide a recovery error estimate for it via RIP tools. To our knowledge, this is the first result on exact reconstruction of low-rank matrices via regularized $L_{*-F}$ minimization.
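As a small illustration of the surrogate itself (not of the recovery algorithm in the paper), the quantity $L_{*-F}(\bm{X}) = \|\bm{X}\|_* - \|\bm{X}\|_F$ is nonnegative and vanishes exactly when $\bm{X}$ has rank at most one, since the sum of singular values then equals their Euclidean norm. A minimal sketch in Python, assuming only NumPy:

```python
import numpy as np

def l_star_minus_f(X):
    """L_{*-F} surrogate: nuclear norm minus Frobenius norm of X."""
    s = np.linalg.svd(X, compute_uv=False)  # singular values
    return s.sum() - np.sqrt((s ** 2).sum())

rng = np.random.default_rng(0)

# Rank-1 matrix: only one nonzero singular value, so the surrogate is 0.
rank1 = np.outer(rng.standard_normal(5), rng.standard_normal(4))
print(l_star_minus_f(rank1))  # ~0 up to floating-point error

# Generic (full-rank) matrix: the surrogate is strictly positive.
full = rng.standard_normal((5, 4))
print(l_star_minus_f(full))
```

This illustrates why $L_{*-F}$ is a tighter rank surrogate than the nuclear norm alone: it is zero on rank-one matrices regardless of their scale, whereas the nuclear norm is not.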