M3SC: A generic dataset for mixed multi-modal (MMM) sensing and communication integration

CHINA COMMUNICATIONS(2023)

Abstract
The sixth generation (6G) mobile communication system is witnessing a new paradigm shift, i.e., the integrated sensing-communication system. A comprehensive dataset is a prerequisite for 6G integrated sensing-communication research. This paper develops a novel simulation dataset, named M3SC, for mixed multi-modal (MMM) sensing-communication integration, and further presents the generation framework of the M3SC dataset. To obtain multi-modal sensory data in physical space and communication data in electromagnetic space, we utilize AirSim and WaveFarer to collect multi-modal sensory data and exploit Wireless InSite to collect communication data. Furthermore, the in-depth integration and precise alignment of AirSim, WaveFarer, and Wireless InSite are achieved. The M3SC dataset covers various weather conditions, multiple frequency bands, and different times of the day. Currently, the M3SC dataset contains 1500 snapshots, each including 80 RGB images, 160 depth maps, 80 LiDAR point clouds, 256 sets of mmWave waveforms with 8 radar point clouds, and 72 channel impulse response (CIR) matrices, thus totaling 120,000 RGB images, 240,000 depth maps, 120,000 LiDAR point clouds, 384,000 sets of mmWave waveforms with 12,000 radar point clouds, and 108,000 CIR matrices. The data processing results present the multi-modal sensory information and the statistical properties of the communication channel. Finally, the MMM sensing-communication applications that can be supported by the M3SC dataset are discussed.
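To make the reported dataset composition concrete, the sketch below tallies the per-snapshot counts stated in the abstract into the dataset-level totals. The `SnapshotComposition` structure is purely illustrative and is not the actual file layout or API of the released M3SC dataset.

```python
# Minimal sketch (illustrative only, not the authors' release format):
# per-snapshot composition of the M3SC dataset as reported in the abstract,
# and the totals over all 1500 snapshots.
from dataclasses import dataclass


@dataclass
class SnapshotComposition:
    """Number of samples of each modality contained in one M3SC snapshot."""
    rgb_images: int = 80             # RGB images (AirSim cameras)
    depth_maps: int = 160            # depth maps (AirSim)
    lidar_point_clouds: int = 80     # LiDAR point clouds (AirSim)
    mmwave_waveform_sets: int = 256  # mmWave waveform sets (WaveFarer)
    radar_point_clouds: int = 8      # radar point clouds derived from the waveforms
    cir_matrices: int = 72           # channel impulse response matrices (Wireless InSite)


NUM_SNAPSHOTS = 1500  # current dataset size reported in the abstract

per_snapshot = SnapshotComposition()
totals = {
    field: getattr(per_snapshot, field) * NUM_SNAPSHOTS
    for field in per_snapshot.__dataclass_fields__
}

for modality, count in totals.items():
    print(f"{modality}: {count:,}")
# Matches the abstract: 120,000 RGB images, 240,000 depth maps,
# 120,000 LiDAR point clouds, 384,000 mmWave waveform sets,
# 12,000 radar point clouds, and 108,000 CIR matrices.
```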
Keywords
multi-modal sensing, ray-tracing, sensing-communication integration, simulation dataset