Rejection-Sampled Universal Quantization for Smaller Quantization Errors
CoRR (2024)
Abstract
We construct a randomized vector quantizer which has a smaller maximum error
compared to all known lattice quantizers with the same entropy for dimensions
5, 6, ..., 48, and also has a smaller mean squared error compared to known
lattice quantizers with the same entropy for dimensions 35, ..., 48, in the
high resolution limit. Moreover, our randomized quantizer has the desirable
property that the quantization error is always uniform over the ball and
independent of the input. Our construction is based on applying rejection
sampling on universal quantization, which allows us to shape the error
distribution to be any continuous distribution, not only uniform distributions
over basic cells of a lattice as in conventional dithered quantization. We also
characterize the high SNR limit of one-shot channel simulation for any additive
noise channel under a mild assumption (e.g., the AWGN channel), up to an
additive constant of 1.45 bits.
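The core mechanism the abstract describes, rejection sampling applied to dithered (universal) quantization, can be illustrated in one dimension: with a shared subtractive dither, the quantization error is uniform over the basic cell and independent of the input, and resampling the dither with a rejection step reshapes the error to any bounded continuous density on that cell. The sketch below is an illustrative assumption, not the paper's construction (which operates on lattices in higher dimensions and shapes the error to be uniform over a ball); `tri` is a hypothetical target density chosen for the example.

```python
import random

def dithered_quantize(x, u, step=1.0):
    # Subtractive dither: reconstruction error (xhat - x) is Uniform(-step/2, step/2),
    # independent of x, by the standard dithered-quantization argument.
    return step * round((x + u) / step) - u

def rejection_sampled_quantize(x, rng, target_pdf, pdf_max, step=1.0):
    """Resample the shared dither until the quantization error is accepted
    under target_pdf, a density supported on [-step/2, step/2]."""
    while True:
        u = rng.uniform(-step / 2, step / 2)  # shared dither (common randomness)
        xhat = dithered_quantize(x, u, step)
        e = xhat - x                          # uniform on the cell before rejection
        # Standard rejection sampling: accept with probability target_pdf(e) / pdf_max.
        if rng.random() < target_pdf(e) / pdf_max:
            return xhat

# Example: shape the error to a triangular density on [-0.5, 0.5] (peak 2.0 at 0).
tri = lambda e: 2.0 * (1.0 - 2.0 * abs(e))
rng = random.Random(0)
errors = [rejection_sampled_quantize(3.7, rng, tri, 2.0) - 3.7 for _ in range(2000)]
```

In the paper's setting the same idea is applied to a lattice quantizer so that the accepted error is uniform over a ball rather than over the lattice's basic cell, which is what yields the smaller maximum error at equal entropy.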