Large Deviations For Products Of Non-I.I.D. Stochastic Matrices With Application To Distributed Detection
2018 IEEE International Symposium on Information Theory (ISIT), 2018
Abstract
We derive the large deviation rate for convergence in probability of products of independent but not identically distributed stochastic matrices arising in time-varying, distributed consensus-type networks. More precisely, we consider a model in which a baseline topology describes all possible communications and nodes are activated sparsely: at any given time, each node is active with a certain time-dependent probability, and two nodes communicate if they are both active at that time. Under this model, we compute the exact rate of exponential decay of the probability that the matrix product stays bounded away from its limiting matrix. We show that the rate is given by the minimal vertex cut of the baseline topology, where the cost of each node is defined by its limiting activation probability. The computed rate has many potential applications in distributed inference with intermittent communications. We provide an application in the context of consensus+innovations distributed detection, where we show that the optimal error exponent is achievable under a very general model of sparsified activations, thus effectively constructing asymptotically optimal detectors with significant communication savings.
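The random-activation model in the abstract can be illustrated with a small simulation. The sketch below is not from the paper: the baseline topology (a path graph), the activation probabilities, and the lazy-averaging step size are all illustrative assumptions. At each step, every node is independently active with probability p, an edge of the baseline graph averages only if both of its endpoints are active, and the resulting symmetric doubly stochastic matrices are multiplied together; the product approaches the averaging matrix J = (1/n)11^T.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5
# Baseline topology (illustrative assumption): a path graph on n nodes.
edges = [(i, i + 1) for i in range(n - 1)]
p = np.full(n, 0.5)   # assumed per-node activation probabilities
alpha = 1.0 / n       # lazy-averaging step; keeps W_t doubly stochastic

def random_consensus_matrix():
    """One W_t: an edge contributes only if both endpoints are active."""
    active = rng.random(n) < p
    L = np.zeros((n, n))  # Laplacian of the active subgraph
    for i, j in edges:
        if active[i] and active[j]:
            L[i, i] += 1.0
            L[j, j] += 1.0
            L[i, j] -= 1.0
            L[j, i] -= 1.0
    return np.eye(n) - alpha * L  # symmetric, doubly stochastic

# Product W_t ... W_1 of the independent (non-identically distributable
# in general; here time-invariant for simplicity) random matrices.
W = np.eye(n)
for _ in range(3000):
    W = random_consensus_matrix() @ W

J = np.ones((n, n)) / n  # limiting matrix of the product
dist = np.max(np.abs(W - J))
print(dist)  # small: the product has converged to J
```

For this path graph, any single interior node is a minimal vertex cut, so the theory in the paper ties the exponential decay rate of the probability of non-convergence to the activation probabilities of those cut nodes; the simulation above only demonstrates the convergence itself, not the rate.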
Keywords
distributed inference, intermittent communications, sparsified activations, distributed detection, large deviation rate, time-varying distributed consensus-type networks, baseline topology, matrix products, limiting matrix, minimal vertex cut, node costs, limiting activation probabilities, communication savings, independent non-identically distributed stochastic matrices, convergence