Linear Regression Problem Relaxations Solved by Nonconvex ADMM With Convergence Analysis

IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY (2024)

Abstract
In this work, we study differentiable relaxations of several linear regression problems whose original formulations are typically nonsmooth and contain one nonconvex term. Unfortunately, in most cases, the standard alternating direction method of multipliers (ADMM) cannot guarantee global convergence on such problems. To address this issue, we smooth the convex term and apply a linearization technique before designing the iteration procedures, and then employ nonconvex ADMM to optimize the resulting nonconvex-convex composite problems. In our theoretical analysis, we prove the boundedness of the generated variable sequence and show that it converges to a stationary point. Meanwhile, a potential function is derived from the augmented Lagrangian, and we further verify that the objective function is monotonically nonincreasing. Under the Kurdyka-Lojasiewicz (KL) property, global convergence is established step by step. Finally, experiments on face reconstruction, image classification, and subspace clustering show the superiority of our algorithms over several state-of-the-art ones.
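To make the splitting idea concrete, the following is a minimal sketch of a generic ADMM scheme for a nonconvex-convex composite problem. It is NOT the paper's algorithm: the objective (least squares plus an l0 penalty), the step size, and the stopping rule are all illustrative assumptions chosen so the subproblems have closed forms.

```python
import numpy as np

def hard_threshold(v, t):
    # Proximal operator of the nonconvex l0 penalty t*||z||_0:
    # keep entries with magnitude above sqrt(2*t), zero out the rest.
    out = v.copy()
    out[np.abs(v) < np.sqrt(2.0 * t)] = 0.0
    return out

def admm_sketch(A, b, lam=0.1, rho=1.0, iters=200):
    """Illustrative ADMM for min_x 0.5*||Ax - b||^2 + lam*||z||_0, s.t. x = z."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Factor the x-subproblem matrix once (it is fixed across iterations).
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: smooth least-squares subproblem, closed form via Cholesky
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: proximal step on the nonconvex term
        z = hard_threshold(x + u, lam / rho)
        # dual ascent on the consensus constraint x = z
        u = u + x - z
    return x, z
```

Because the z-update uses a nonconvex prox, plain ADMM of this form has no general convergence guarantee; this is exactly the gap the paper's smoothing and linearization steps, together with the KL-based analysis, are designed to close.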
Keywords
Convergence, Optimization, Task analysis, Sparse matrices, Minimization, Linear regression, Linear programming, Nonconvex composite problem, nonsmooth convex function, nonconvex ADMM, Kurdyka-Lojasiewicz (KL) property, global convergence analysis