Learned Belief-Propagation Decoding with Simple Scaling and SNR Adaptation

2019 IEEE International Symposium on Information Theory (ISIT), 2019

Abstract
We consider the weighted belief-propagation (WBP) decoder recently proposed by Nachmani et al., in which a separate weight is introduced for each Tanner-graph edge and optimized using machine-learning techniques. Our focus is on simple-scaling models that share the same weight across certain edges to reduce the storage and computational burden. The main contribution is to show that simple scaling with few parameters often achieves the same gain as the full parameterization. Moreover, several training improvements for WBP are proposed. For example, it is shown that minimizing average binary cross-entropy is, in general, suboptimal in terms of bit error rate (BER), and a new "soft-BER" loss is proposed that can lead to better performance. We also investigate parameter adapter networks (PANs) that learn the relation between the signal-to-noise ratio and the WBP parameters. As an example, for the (32, 16) Reed-Muller code with a highly redundant parity-check matrix, training a PAN with the soft-BER loss gives near-maximum-likelihood performance assuming simple scaling with only three parameters.
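To make the two central ideas concrete, the following is a minimal sketch of a weighted min-sum decoder with "simple scaling" (one shared weight per iteration rather than one per Tanner-graph edge) together with a smooth bit-error-rate surrogate in the spirit of the soft-BER loss. This is an illustrative reading of the abstract, not the paper's exact implementation: the small (7, 4) Hamming parity-check matrix `H`, the function names, and the particular sigmoid-based surrogate are all assumptions made for the example (the paper itself uses a redundant parity-check matrix for the (32, 16) Reed-Muller code).

```python
import numpy as np

# Illustrative parity-check matrix for the (7, 4) Hamming code
# (a stand-in; the paper uses a redundant matrix for RM(32, 16)).
H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
])

def wbp_min_sum(llr, H, weights, n_iter=5):
    """Weighted min-sum BP with 'simple scaling': every check-to-variable
    message in iteration t is scaled by the single shared weight weights[t],
    instead of carrying one learned weight per Tanner-graph edge."""
    m, n = H.shape
    c2v = np.zeros((m, n))  # check-to-variable messages
    for t in range(n_iter):
        # Extrinsic variable-to-check messages: total belief minus own input.
        v2c = (llr + c2v.sum(axis=0))[None, :] - c2v
        v2c = np.where(H == 1, v2c, 0.0)
        for i in range(m):
            idx = np.flatnonzero(H[i])
            for j in idx:
                others = idx[idx != j]
                mag = np.min(np.abs(v2c[i, others]))
                sign = np.prod(np.sign(v2c[i, others]))
                c2v[i, j] = weights[t] * sign * mag  # shared scaling weight
    return llr + c2v.sum(axis=0)  # output LLRs

def soft_ber(out_llr):
    """Assumed surrogate for the soft-BER idea: with the all-zero codeword
    and the convention that a positive LLR favors bit 0, sigmoid(-LLR) is a
    smooth estimate of each bit's error probability; averaging gives a
    differentiable stand-in for the BER."""
    return float(np.mean(1.0 / (1.0 + np.exp(out_llr))))
```

In a learned-decoder setting, `weights` (here three to five scalars, matching the "only three parameters" regime of simple scaling) would be trained by gradient descent on a loss such as `soft_ber`, and a parameter adapter network would map the SNR to these scalars.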
Keywords
SNR adaptation,weighted belief-propagation decoder,Tanner graph edge,soft-BER loss,parameter adapter networks,WBP parameters,machine learning,binary cross-entropy,learned belief-propagation decoding,bit error rate,signal-to-noise ratio,Reed-Muller code,redundant parity-check matrix,near-maximum-likelihood performance