Robust PCA Via Nonconvex Rank Approximation

IEEE International Conference on Data Mining (2015)

Cited by 144
Abstract
Numerous applications in data mining and machine learning require recovering a matrix of minimal rank. Robust principal component analysis (RPCA) is a general framework for handling this kind of problem. The nuclear-norm-based convex surrogate of the rank function in RPCA is widely investigated. Under certain assumptions, it can recover the underlying true low-rank matrix with high probability. However, those assumptions may not hold in real-world applications. Since the nuclear norm approximates the rank by adding all singular values together, which is essentially an ℓ1-norm of the singular values, the resulting approximation error is not trivial and thus the resulting matrix estimator can be significantly biased. To seek a closer approximation and to alleviate the above-mentioned limitations of the nuclear norm, we propose a nonconvex rank approximation. This approximation to the matrix rank is tighter than the nuclear norm. To solve the associated nonconvex minimization problem, we develop an efficient augmented Lagrange multiplier based optimization algorithm. Experimental results demonstrate that our method outperforms current state-of-the-art algorithms in both accuracy and efficiency.
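The gap the abstract describes can be seen numerically: the nuclear norm sums all singular values, so its value scales with the magnitude of the data, while the true rank counts nonzero singular values regardless of scale. The sketch below (the paper's exact surrogate is not given in this abstract; the function f(σ) = (1+γ)σ/(γ+σ) used here is one common nonconvex family of this type, chosen for illustration) shows how a saturating surrogate tracks the rank far more closely than the nuclear norm does:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an exactly rank-2 matrix: each nonzero singular value
# contributes 1 to the rank, but its full magnitude to the nuclear norm.
A = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 50))
s = np.linalg.svd(A, compute_uv=False)

rank = int(np.sum(s > 1e-8))   # true rank: 2
nuclear = s.sum()              # nuclear norm: grows with data scale

# Illustrative nonconvex surrogate (an assumption, not necessarily the
# paper's formula): f(sigma) = (1+gamma)*sigma / (gamma + sigma).
# For sigma >> gamma it saturates near 1; f(0) = 0, so the sum over
# singular values approaches the rank as gamma shrinks.
gamma = 0.01
nonconvex = np.sum((1 + gamma) * s / (gamma + s))

print(f"rank={rank}, nuclear norm={nuclear:.1f}, surrogate={nonconvex:.2f}")
```

Here the surrogate lands near 2 (the rank) while the nuclear norm is an order of magnitude larger, which is the bias the abstract attributes to the ℓ1-style penalty on singular values.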
Keywords
PCA, nonconvex rank approximation, nuclear norm based convex surrogate, robust principal component analysis, RPCA, nonconvex minimization problem, augmented Lagrange multiplier, optimization algorithm