LiDAR and Camera Raw Data Sensor Fusion in Real-Time for Obstacle Detection

2023 IEEE Sensors Applications Symposium (SAS)

Abstract
Light Detection and Ranging (LiDAR) and camera are the most widely used sensors in autonomous vehicles for object detection, classification, localization, and mapping. This paper proposes real-time fusion of raw data from the LiDAR and camera sensors. Multiple sample frames are collected to estimate the intrinsic and extrinsic calibration parameters so that the fusion operates on real-time data with minimal projection error. Most prior work on LiDAR and camera data fusion does not project point clouds onto images in real time. The colored point cloud obtained by back projection can be used for constructing a High-Definition (HD) map, and relying on a single sensor is avoided because any one sensor is prone to perception errors. In this paper, the LiDAR point cloud is projected onto the image, and back projection assigns color information to the point cloud to produce a colored point cloud. The LiDAR and camera data are fused in real time to perform classification using the camera and depth estimation using LiDAR. The raw data from both sensors are projected at a rate much higher than the data acquisition rate, so that data processing and the perception algorithms that detect obstacles run in real time on the autonomous vehicle.
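The projection and back-projection steps described above follow the standard pinhole model: LiDAR points are mapped into the camera frame with the extrinsic rotation and translation, projected with the intrinsic matrix, and the pixel color at each projected location is assigned back to the 3-D point. The sketch below is a minimal NumPy illustration of that pipeline, not the authors' implementation; the function names and the calibration values K, R, t in the demo are illustrative assumptions, with the real parameters coming from the calibration procedure the paper describes.

```python
import numpy as np

def project_lidar_to_image(points_lidar, K, R, t, image_shape):
    """Project 3-D LiDAR points onto the camera image plane.

    points_lidar : (N, 3) XYZ points in the LiDAR frame.
    K            : (3, 3) camera intrinsic matrix.
    R, t         : extrinsic rotation (3, 3) and translation (3,) taking
                   LiDAR coordinates into the camera frame.
    image_shape  : (height, width) of the camera image.
    Returns pixel coordinates (M, 2), depths (M,), and indices of the
    points that fall inside the image with positive depth.
    """
    # Transform points into the camera frame.
    pts_cam = points_lidar @ R.T + t

    # Keep only points in front of the camera.
    in_front = pts_cam[:, 2] > 0.1
    pts_cam = pts_cam[in_front]

    # Perspective projection with the intrinsic matrix.
    uvw = pts_cam @ K.T
    uv = uvw[:, :2] / uvw[:, 2:3]

    # Discard projections that land outside the image bounds.
    h, w = image_shape
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)

    idx = np.flatnonzero(in_front)[inside]
    return uv[inside], pts_cam[inside, 2], idx


def color_point_cloud(points_lidar, image, K, R, t):
    """Back-project image colors onto the visible LiDAR points."""
    uv, depth, idx = project_lidar_to_image(points_lidar, K, R, t, image.shape[:2])
    # Sample the RGB value at each projected pixel (row = v, col = u).
    cols = image[uv[:, 1].astype(int), uv[:, 0].astype(int)]
    # Stack XYZ with the sampled RGB values: one (x, y, z, r, g, b) row per point.
    return np.hstack([points_lidar[idx], cols.astype(float)])


if __name__ == "__main__":
    # Illustrative calibration values only; real K, R, t come from the
    # intrinsic/extrinsic calibration step described in the paper.
    K = np.array([[700.0, 0.0, 320.0],
                  [0.0, 700.0, 240.0],
                  [0.0,   0.0,   1.0]])
    R = np.eye(3)
    t = np.zeros(3)
    points = np.random.uniform([-5, -5, 1], [5, 5, 30], size=(1000, 3))
    image = np.random.randint(0, 255, size=(480, 640, 3), dtype=np.uint8)
    colored = color_point_cloud(points, image, K, R, t)
    print(colored.shape)  # (M, 6)
```

In a real-time setting this projection would be applied to each incoming LiDAR sweep and synchronized camera frame, with the fixed calibration matrices reused across frames so that only the matrix multiplications and pixel lookups run per frame.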
Keywords
LiDAR, Calibration, Sensor Fusion, Perception, Autonomous Vehicle, HD Map