Double Auto-weighted Tensor Robust Principal Component Analysis.

IEEE Transactions on Image Processing (2023)

Abstract
Tensor Robust Principal Component Analysis (TRPCA), which aims to recover the low-rank and sparse components from their sum, has drawn intensive interest in recent years. Most existing TRPCA methods adopt the tensor nuclear norm (TNN) and the tensor ℓ1 norm as the regularization terms for the low-rank and sparse components, respectively. However, TNN treats each singular value of the low-rank tensor L equally, and the tensor ℓ1 norm shrinks each entry of the sparse tensor S with the same strength. It has been shown that larger singular values generally correspond to prominent information in the data and should be penalized less; the same holds for entries of S with large absolute values. In this paper, we propose a Double Auto-weighted TRPCA (DATRPCA) method. Instead of using predefined, manually set weights merely for the low-rank tensor, as in previous works, DATRPCA automatically and adaptively assigns smaller weights, and hence lighter penalization, to the significant singular values of the low-rank tensor and the large entries of the sparse tensor simultaneously. We further develop an efficient algorithm to implement DATRPCA based on the Alternating Direction Method of Multipliers (ADMM) framework, and we establish a convergence analysis of the proposed algorithm. Results on both synthetic and real-world data demonstrate the effectiveness of DATRPCA for low-rank tensor recovery, color image recovery, and background modelling.
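The abstract does not specify the paper's exact update rules, which operate on the t-SVD of a tensor. As a minimal sketch of the auto-weighting idea only, the matrix analogue below assigns each singular value (and each sparse entry) a weight inversely proportional to its magnitude, so prominent components are shrunk less; the functions, the weight formula `lam / (x + eps)`, and all parameter names are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def auto_weighted_svt(X, lam=1.0, eps=1e-6):
    """Weighted singular value thresholding (matrix analogue).

    Each singular value sigma_i gets weight lam / (sigma_i + eps),
    so large singular values (prominent structure) are penalized
    lightly and small ones (noise-dominated) are shrunk heavily.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    w = lam / (s + eps)                # adaptive, data-driven weights
    s_shrunk = np.maximum(s - w, 0.0)  # weighted soft-thresholding
    return U @ np.diag(s_shrunk) @ Vt

def auto_weighted_soft_threshold(S, lam=1.0, eps=1e-6):
    """Entry-wise analogue for the sparse part: larger |S_ij|
    receives a smaller weight, hence lighter shrinkage."""
    w = lam / (np.abs(S) + eps)
    return np.sign(S) * np.maximum(np.abs(S) - w, 0.0)
```

In an ADMM loop these two operators would play the roles of the proximal steps for the low-rank and sparse variables, with the weights recomputed from the current iterate rather than fixed in advance.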
Keywords
Tensor robust PCA, tensor nuclear norm, double weight learning, low-dimensional structure