Robust Low-Rank Tensor Recovery Via Nonconvex Singular Value Minimization

IEEE TRANSACTIONS ON IMAGE PROCESSING (2020)

Abstract
Tensor robust principal component analysis via tensor nuclear norm (TNN) minimization has recently been proposed to recover a low-rank tensor corrupted by sparse noise/outliers. The TNN has been shown to be a convex surrogate of the tensor rank; however, it tends to over-penalize large singular values and thus usually yields biased solutions. To address this issue, we propose a new definition of the tensor logarithmic norm (TLN) as a nonconvex surrogate of the rank, which simultaneously decreases the penalty on larger singular values and increases it on smaller ones, thereby preserving the low-rank structure of the tensor. A tensor factorization strategy is then incorporated into the TLN minimization to improve computational performance. To handle impulsive scenarios, we propose a nonconvex l_p-ball projection scheme with 0 < p < 1 in place of the conventional convex scheme with p = 1, which enhances robustness against outliers. By combining the TLN minimization with the l_p-ball projection, we finally propose two low-rank recovery algorithms whose resulting optimization problems are solved efficiently by the alternating direction method of multipliers (ADMM) with convergence guarantees. The proposed algorithms are applied to synthetic data recovery and to real-world image and video restoration. Experimental results demonstrate the superior performance of the proposed methods over several state-of-the-art algorithms in terms of tensor recovery accuracy and computational efficiency.
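
The abstract does not give the exact form of the TLN, but the key idea of a logarithmic singular-value penalty can be illustrated on a single matrix (e.g., one frontal slice of the tensor in the Fourier domain of the t-SVD). The Python sketch below is an assumption-laden illustration, not the authors' implementation: the epsilon offset, the slice-wise viewpoint, and the toy data are choices made here only to show why the logarithmic surrogate penalizes large singular values less than the nuclear norm does.

import numpy as np

def nuclear_norm(X):
    # Convex surrogate of rank: the sum of singular values.
    return np.linalg.svd(X, compute_uv=False).sum()

def log_surrogate(X, eps=1e-2):
    # Hypothetical nonconvex surrogate: log(sigma + eps) grows slowly for
    # large singular values (less bias) and steeply near zero (stronger
    # pressure toward low rank). The eps offset is an assumption here.
    s = np.linalg.svd(X, compute_uv=False)
    return np.log(s + eps).sum()

# Toy low-rank matrix plus sparse outliers, just to compare the two penalties.
rng = np.random.default_rng(0)
L = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 50))  # rank 5
S = np.zeros_like(L)
mask = rng.random(L.shape) < 0.05          # ~5% impulsive corruption
S[mask] = 10 * rng.standard_normal(mask.sum())
X = L + S

print("nuclear norm : clean =", round(nuclear_norm(L), 2), " corrupted =", round(nuclear_norm(X), 2))
print("log surrogate: clean =", round(log_surrogate(L), 2), " corrupted =", round(log_surrogate(X), 2))

Running the sketch shows that the nuclear norm inflates much more than the logarithmic surrogate when large outliers enter, which is the bias the TLN is designed to reduce; in the paper the penalty is combined with tensor factorization and an l_p-ball projection inside an ADMM solver, none of which is reproduced here.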
Keywords
Tensile stress, Robustness, Optimization, Minimization, Signal processing algorithms, Principal component analysis, Matrix decomposition, Low-rank, tensor recovery, tensor factorization, nonconvex optimization