Latency-Aware Generative Semantic Communications with Pre-Trained Diffusion Models
arXiv (2024)
Abstract
Generative foundation AI models have recently shown great success in
synthesizing natural signals with high perceptual quality using only textual
prompts and conditioning signals to guide the generation process. This enables
semantic communications at extremely low data rates in future wireless
networks. In this paper, we develop a latency-aware semantic communications
framework with pre-trained generative models. The transmitter performs
multi-modal semantic decomposition on the input signal and transmits each
semantic stream with the appropriate coding and communication schemes based on
the intent. For the prompt, we adopt a re-transmission-based scheme to ensure
reliable transmission, and for the other semantic modalities we use an adaptive
modulation/coding scheme to achieve robustness to the changing wireless
channel. Furthermore, we design a semantic and latency-aware scheme to allocate
transmission power to different semantic modalities based on their importance
subjected to semantic quality constraints. At the receiver, a pre-trained
generative model synthesizes a high fidelity signal using the received
multi-stream semantics. Simulation results demonstrate ultra-low-rate,
low-latency, and channel-adaptive semantic communications.
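The importance-based power allocation described above can be illustrated with a toy sketch. The abstract does not give the paper's actual optimization, so the objective below is an assumption: maximize an importance-weighted sum of per-stream log-rates, sum_i w_i * log(1 + g_i * p_i), under a total power budget, which admits a weighted water-filling solution solved here by bisection on the multiplier. The weights, gains, and budget are hypothetical.

```python
import math

def weighted_water_filling(weights, gains, p_total, iters=100):
    """Allocate p_total across semantic streams to maximize
    sum_i w_i * log(1 + g_i * p_i) -- a stand-in for
    importance-weighted semantic quality (assumed, not the
    paper's exact objective).

    Closed form per stream: p_i = max(0, w_i / lam - 1 / g_i);
    the multiplier lam is found by bisection so the budget binds.
    """
    def total_power(lam):
        # Total power used for a given multiplier; decreasing in lam.
        return sum(max(0.0, w / lam - 1.0 / g)
                   for w, g in zip(weights, gains))

    # Bracket lam: tiny lam over-spends the budget, huge lam spends nothing.
    lo, hi = 1e-9, 1e9
    for _ in range(iters):
        mid = math.sqrt(lo * hi)  # geometric bisection on the multiplier
        if total_power(mid) > p_total:
            lo = mid  # over budget -> raise lam
        else:
            hi = mid
    lam = math.sqrt(lo * hi)
    return [max(0.0, w / lam - 1.0 / g) for w, g in zip(weights, gains)]

# Hypothetical example: the prompt stream gets the largest weight,
# mirroring its higher importance in the framework.
powers = weighted_water_filling(weights=[3.0, 1.0, 1.0],
                                gains=[1.0, 0.8, 0.5],
                                p_total=10.0)
```

In this sketch the high-importance prompt stream receives the largest power share, while low-importance streams on weak channels may receive none, which is the qualitative behavior the abstract's allocation scheme aims for.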