SynWoodScape: Synthetic Surround-View Fisheye Camera Dataset for Autonomous Driving

IEEE ROBOTICS AND AUTOMATION LETTERS (2022)

Abstract
Surround-view cameras are a primary sensor for automated driving, used for near-field perception. They are among the most commonly deployed sensors in commercial vehicles, primarily for parking visualization and automated parking. Four fisheye cameras with a 190 degree field of view cover the full 360 degrees around the vehicle. Due to their high radial distortion, standard algorithms do not extend to them easily. Previously, we released the first public fisheye surround-view dataset, WoodScape. In this work, we release a synthetic version of the surround-view dataset that addresses many of its weaknesses and extends it. Firstly, pixel-wise ground truth for optical flow and depth cannot be obtained from real-world recordings. Secondly, WoodScape did not have all four cameras annotated simultaneously, because frames were sampled for diversity; as a result, multi-camera algorithms producing a unified output in bird's-eye-view space could not be designed, which the new dataset enables. We implemented surround-view fisheye geometric projections in the CARLA Simulator matching WoodScape's configuration and created SynWoodScape. We release 80k images from the synthetic dataset with annotations for 10+ tasks. We also release the baseline code and supporting scripts.
Keywords
Automated driving, fisheye cameras, omnidirectional vision, synthetic datasets
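The abstract notes that surround-view fisheye geometric projections were implemented in the CARLA Simulator to match WoodScape's camera configuration. As a rough illustration only, below is a minimal sketch of a polynomial angular-distortion fisheye projection of the kind used for WoodScape-style calibrations; the coefficients, principal point, and function name are placeholder assumptions, not the released SynWoodScape projection code or calibration values.

# Minimal sketch, assuming a fisheye model that maps the incident angle theta
# to an image-plane radius via rho = a1*theta + a2*theta^2 + a3*theta^3 + a4*theta^4.
# All numeric values below are illustrative placeholders.
import numpy as np

def project_fisheye(point_cam, poly=(340.0, 10.0, -5.0, 0.5), cx=640.0, cy=480.0):
    """Project a 3D point (x, y, z) in the camera frame to pixel coordinates (u, v)."""
    x, y, z = point_cam
    # Angle between the viewing ray and the optical axis (z points forward).
    theta = np.arctan2(np.hypot(x, y), z)
    # Radial distance from the principal point, given by the distortion polynomial.
    rho = sum(a * theta ** (i + 1) for i, a in enumerate(poly))
    # Azimuth of the ray fixes the direction of the radial offset in the image.
    phi = np.arctan2(y, x)
    return cx + rho * np.cos(phi), cy + rho * np.sin(phi)

if __name__ == "__main__":
    # Example: a point 2 m ahead and 1 m to the right of the camera.
    print(project_fisheye((1.0, 0.0, 2.0)))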