Deterministic identification over channels with finite output: a dimensional perspective on superlinear rates

CoRR (2024)

Abstract
Following initial work by JaJa and Ahlswede/Cai, and inspired by a recent renewed surge in interest in deterministic identification via noisy channels, we consider the problem in its generality for memoryless channels with finite output, but arbitrary input alphabets. Such a channel is essentially given by (the closure of) the subset of its output distributions in the probability simplex. Our main findings are that the maximum number of messages thus identifiable scales super-exponentially as 2^{R n log n} with the block length n, and that the optimal rate R is upper and lower bounded in terms of the covering (aka Minkowski, or Kolmogorov, or entropy) dimension d of the output set: (1/4) d ≤ R ≤ d. Leading up to the general case, we treat the important special case of the so-called Bernoulli channel with input alphabet [0;1] and binary output, which has d=1, to gain intuition. Along the way, we show a certain Hypothesis Testing Lemma (generalising an earlier insight of Ahlswede regarding the intersection of typical sets) that implies that for the construction of a deterministic identification code, it is sufficient to ensure pairwise reliable distinguishability of the output distributions. These results are then shown to generalise directly to classical-quantum channels with finite-dimensional output quantum system (but arbitrary input alphabet), and in particular to quantum channels on finite-dimensional quantum systems under the constraint that the identification code can only use tensor product inputs.
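The covering (Minkowski) dimension that bounds the rate R is defined via the growth of the minimal number N(ε) of ε-balls needed to cover the output set, as d = lim_{ε→0} log N(ε) / log(1/ε). As a hedged illustration (not taken from the paper), the sketch below numerically estimates this quantity for the Bernoulli channel's output set, which is the full interval [0;1] in the one-dimensional probability simplex; the helper names are hypothetical.

```python
import math

def covering_number_interval(eps):
    # Minimal number of closed eps-balls (intervals of length 2*eps)
    # needed to cover [0, 1]; exact for this set.
    return math.ceil(1.0 / (2.0 * eps))

def dimension_estimate(eps):
    # Finite-eps approximation of the covering (Minkowski) dimension:
    #   d ~ log N(eps) / log(1/eps),
    # which tends to 1 for the interval [0, 1] as eps -> 0.
    return math.log(covering_number_interval(eps)) / math.log(1.0 / eps)

for eps in (1e-2, 1e-4, 1e-6):
    print(f"eps = {eps:.0e}: d-estimate = {dimension_estimate(eps):.4f}")
```

The estimates approach d = 1 as ε shrinks (the finite-ε error decays like log 2 / log(1/ε)), matching the value d=1 stated for the Bernoulli channel; for a general finite-output channel one would instead cover the closure of the output-distribution set inside the probability simplex.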