DeepSpatial: Intelligent Spatial Sensor to Perception of Things

IEEE Sensors Journal (2021)

Abstract
This paper presents a spatial sensor for identifying and tracking objects in the environment. The sensor is composed of an RGB-D camera, which provides a point cloud and RGB images, and an egomotion sensor able to measure the sensor's displacement in the environment. The proposed sensor also incorporates a data processing strategy developed by the authors that confers different skills on the sensor. The adopted approach is based on four analysis steps: egomotion, lexical, syntax, and prediction analysis. As a result, the proposed sensor can identify objects in the environment, track them, compute their direction, speed, and acceleration, and predict their future positions. The online detector YOLO is used to identify objects, and its output is combined with the point cloud information to obtain the spatial location of each identified object. The sensor can operate with higher precision but a lower update rate using YOLOv2, or with a higher update rate but lower accuracy using YOLOv3-tiny. The object tracking, egomotion, and collision prediction skills are tested and validated using a mobile robot with precise speed control. The presented results show that the proposed sensor (hardware + software) achieves satisfactory accuracy and update rate, enabling its use in mobile robotics. The paper's contribution is an algorithm for identifying, tracking, and predicting the future position of objects, embedded in compact hardware; in other words, it converts raw data from traditional sensors into useful information.
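The abstract describes two concrete processing steps: combining each YOLO detection with the point cloud to obtain the spatial location of the object, and using the estimated speed and acceleration to predict its future position. The sketch below illustrates both steps under stated assumptions; the pinhole back-projection of the bounding-box centre, the constant-acceleration motion model, and all function and parameter names are illustrative choices, not the paper's actual implementation.

```python
import numpy as np

def detection_to_3d(bbox, depth_m, fx, fy, cx, cy):
    """Back-project the centre of a 2D detection into camera coordinates.

    bbox    -- (u_min, v_min, u_max, v_max) YOLO box in pixels
    depth_m -- aligned depth image in metres, shape (H, W)
    fx, fy, cx, cy -- RGB-D camera intrinsics (pinhole model)
    """
    u = int((bbox[0] + bbox[2]) / 2)          # box centre, image coordinates
    v = int((bbox[1] + bbox[3]) / 2)
    z = float(depth_m[v, u])                  # range read from the depth map
    x = (u - cx) * z / fx                     # pinhole back-projection
    y = (v - cy) * z / fy
    return np.array([x, y, z])

def predict_position(p, v, a, dt):
    """Constant-acceleration extrapolation of the position after dt seconds."""
    return p + v * dt + 0.5 * a * dt ** 2

# Toy usage with synthetic data; in the actual sensor the depth image would
# come from the RGB-D camera and v, a from the tracking step.
depth = np.full((480, 640), 2.0)              # flat scene 2 m away
p0 = detection_to_3d((300, 220, 340, 260), depth,
                     fx=525.0, fy=525.0, cx=319.5, cy=239.5)
p1 = predict_position(p0, v=np.array([0.5, 0.0, 0.0]), a=np.zeros(3), dt=0.5)
```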
Keywords
Spatial sensor, egomotion, YOLO, mobile robot