Lossy Compression via Sparse Regression Codes: An Approximate Message Passing Approach.

ITW (2023)

Abstract
This paper presents a low-complexity lossy compression scheme for Gaussian vectors, using sparse regression codes (SRC) and a novel decimated approximate message passing (AMP) encoder. The sparse regression codebook is characterized by a design matrix, and each codeword is a linear combination of selected columns of the matrix. To enable the convergence of AMP for lossy compression, we incorporate the concept of decimation into the AMP algorithm for the first time. Further, we show that a power allocation technique is beneficial for improving the rate-distortion performance. The computational complexity of the proposed encoding is O(log n) per source sample for a length-n source vector, using a sub-Fourier design matrix. Moreover, the proposed AMP encoder inherently supports successively refinable compression. Simulation results show that the proposed decimated AMP encoder significantly outperforms the existing successive-approximation encoding [1] and approaches the rate-distortion limit in the low-rate regime.
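
The codebook structure described in the abstract admits a compact illustration. Below is a minimal Python sketch of building a sparse regression codeword: the design matrix is split into L sections of M columns, one column is selected per section, and the selected columns are combined with per-section coefficients (the power allocation). The section sizes, the i.i.d. Gaussian design matrix, and the decaying coefficients are illustrative assumptions; the paper's sub-Fourier design matrix and its specific power allocation are not detailed in the abstract.

import numpy as np

n = 64          # source length (rows of the design matrix)
L, M = 8, 16    # L sections, M columns per section; rate R = L*log2(M)/n bits/sample

rng = np.random.default_rng(0)
# Illustrative i.i.d. Gaussian design matrix (the paper uses a sub-Fourier matrix).
A = rng.standard_normal((n, L * M)) / np.sqrt(n)

# Hypothetical exponentially decaying power allocation across sections.
c = np.sqrt(np.geomspace(1.0, 0.25, num=L))

# A message selects one column index per section.
message = rng.integers(0, M, size=L)

# Sparse coefficient vector beta: exactly one nonzero entry per section.
beta = np.zeros(L * M)
for l, idx in enumerate(message):
    beta[l * M + idx] = c[l]

codeword = A @ beta    # the codeword is a linear combination of the selected columns
print("rate:", L * np.log2(M) / n, "bits per source sample")

The encoder's task is to pick the message (one column per section) so that the resulting codeword is close to the source vector; the decimated AMP encoder performs this search iteratively, where decimation conventionally means progressively fixing the most reliable decisions between message-passing iterations.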
Keywords
Lossy compression, sparse regression codes, approximate message passing, Gaussian rate-distortion