Variable step-size convex regularized PRLS algorithms

SIGNAL PROCESSING (2024)

Abstract
The proportionate updating (PU) and zero-attracting (ZA) mechanisms have been applied independently in the development of sparsity-aware recursive least squares (RLS) algorithms. Recently, we proposed an enhanced l1-proportionate RLS (l1-PRLS) algorithm that combines the PU and ZA mechanisms. The l1-PRLS employs a fixed step size, which trades off transient (initial convergence) performance against steady-state performance. In this letter, the l1-PRLS is improved in two respects: first, we replace the l1-norm penalty with a general convex regularization (CR) function, yielding the CR-PRLS algorithm; second, we further introduce the variable step-size (VSS) technique into the CR-PRLS, leading to the VSS-CR-PRLS algorithm. Theoretical and numerical results are provided to corroborate the superiority of the improved algorithm.
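
Below is a minimal Python sketch of how the three ingredients named in the abstract (PU gains, a ZA/l1 attractor standing in for the general CR penalty, and a VSS factor) can be combined in a single RLS-style update. The gain normalization, the penalty weight rho, and the error-power step-size rule are illustrative assumptions, not the paper's exact VSS-CR-PRLS recursions.

```python
import numpy as np

def sketch_update(w, P, x, d, lam=0.99, rho=1e-4, delta=1e-2, mu_max=1.0):
    """One RLS-style iteration with PU gains, a ZA (l1) attractor, and a VSS factor.

    This is a hedged sketch: the proportionate gain rule, the l1 attractor as a
    stand-in for the general CR subgradient, and the error-power VSS rule are
    assumed forms chosen for illustration.
    """
    e = d - w @ x                                   # a priori estimation error
    # Standard RLS gain vector and inverse-correlation-matrix update.
    Px = P @ x
    k = Px / (lam + x @ Px)
    P_new = (P - np.outer(k, x @ P)) / lam
    # Proportionate gains: larger coefficients receive larger updates;
    # delta keeps inactive taps adapting (assumed normalization).
    g = np.abs(w) + delta
    g = g / g.sum() * len(w)
    # Variable step size: large corrections while the error is large,
    # smaller near steady state (assumed error-power rule).
    mu = mu_max * e**2 / (e**2 + 1.0)
    # Weight update: proportionate RLS correction plus a zero-attracting l1 term
    # standing in for the convex-regularization (CR) subgradient.
    w_new = w + mu * g * (k * e) - rho * np.sign(w)
    return w_new, P_new, e
```

In this sketch the proportionate gains g scale the RLS correction per coefficient, the -rho*sign(w) term pulls small coefficients toward zero, and mu shrinks as the error power decreases, so fast initial convergence need not come at the cost of steady-state accuracy.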
Keywords
Sparse RLS algorithms,Proportionate updating,Zero-attracting,Variable step-size