Virtually Adapted Reality and Algorithm Visualization for Autonomous Robots.

RoboCup 2016

Abstract
Autonomous mobile robots are often videotaped during operation, whether for later evaluation by their developers or for demonstration of the robots to others. Watching such videos is engaging and interesting. However, plain videos clearly do not show detailed information about the algorithms running on the moving robots, leading to a rather limited visual understanding of the underlying autonomy. Researchers have resorted to following the autonomous robots' algorithms through a variety of methods, most commonly graphical user interfaces running on offboard screens, separated from the captured videos. Such methods enable considerable debugging, but still have limited effectiveness, as there is an inevitable visual mismatch with the video capture. In this work, we aim to break this disconnect, and we contribute the ability to overlay onto a video visualizations extracted from the robot's algorithms, in particular to follow its route planning and execution. We further provide mechanisms to create and visualize virtual adaptations of the real environment to enable the exploration of the behavior of the algorithms in new situations. We demonstrate the complete implementation with an autonomous quadrotor navigating in a lab environment using the rapidly-exploring random tree algorithm. We briefly motivate and discuss our follow-up visualization work for our complex small-size robot soccer team.