Tailoring Semantic Communication at Network Edge: A Novel Approach Using Dynamic Knowledge Distillation
CoRR (2024)
Abstract
Semantic Communication (SemCom) systems, empowered by deep learning (DL),
represent a paradigm shift in data transmission. These systems prioritize the
significance of content over sheer data volume. However, existing SemCom
designs face challenges when applied to diverse computational capabilities and
network conditions, particularly in time-sensitive applications. A key
challenge is the assumption that diverse devices can uniformly benefit from a
standard, large DL model in SemCom systems. This assumption becomes
increasingly impractical, especially in high-speed, high-reliability
applications such as industrial automation or critical healthcare. Therefore,
this paper introduces a novel SemCom framework tailored for heterogeneous,
resource-constrained edge devices and computation-intensive servers. Our
approach employs dynamic knowledge distillation (KD) to customize semantic
models for each device, balancing computational and communication constraints
while ensuring Quality of Service (QoS). We formulate an optimization problem
and develop an adaptive algorithm that iteratively refines semantic knowledge
on edge devices, resulting in better models tailored to their resource
profiles. This algorithm strategically adjusts the granularity of distilled
knowledge, enabling devices to maintain high semantic accuracy for precise
inference tasks, even under unstable network conditions. Extensive simulations
demonstrate that our approach significantly reduces model complexity for edge
devices, leading to better semantic extraction and achieving the desired QoS.
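The framework's core mechanism, adjusting the granularity of knowledge distilled from a large server-side model into a compact per-device model, can be illustrated with the standard distillation loss (Hinton-style), where the temperature controls how "soft" the transferred knowledge is. This is a minimal sketch for illustration only, not the paper's actual algorithm; the `temperature` and `alpha` parameters and the single-example formulation are assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    z = [x / temperature for x in logits]
    m = max(z)  # subtract max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def distillation_loss(student_logits, teacher_logits, label,
                      temperature=2.0, alpha=0.5):
    """Illustrative KD loss for one example:
    alpha * T^2 * KL(teacher_soft || student_soft)
      + (1 - alpha) * CE(student, hard label).
    A higher temperature transfers coarser ("softer") knowledge;
    lowering it sharpens the targets -- one simple way to vary the
    granularity of the distilled knowledge per device.
    """
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = sum(pt * (math.log(pt + 1e-12) - math.log(ps + 1e-12))
             for pt, ps in zip(p_t, p_s))
    ce = -math.log(softmax(student_logits)[label] + 1e-12)
    return alpha * temperature ** 2 * kl + (1 - alpha) * ce
```

In an edge deployment, each device could be assigned its own temperature and loss weighting according to its compute budget and link quality, which is the kind of per-device customization the paper's adaptive algorithm optimizes.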