3D VSG: Long-term Semantic Scene Change Prediction through 3D Variable Scene Graphs

arXiv (2023)

Cited by 4 | 55 views

Abstract
Numerous applications require robots to operate in environments shared with other agents, such as humans or other robots. However, such shared scenes are typically subject to different kinds of long-term semantic scene changes. The ability to model and predict such changes is thus crucial for robot autonomy. In this work, we formalize the task of semantic scene variability estimation and identify three main varieties of semantic scene change: changes in the position of an object, its semantic state, or the composition of a scene as a whole. To represent this variability, we propose the Variable Scene Graph (VSG), which augments existing 3D Scene Graph (SG) representations with the variability attribute, representing the likelihood of discrete long-term change events. We present a novel method, DeltaVSG, to estimate the variability of VSGs in a supervised fashion. We evaluate our method on the 3RScan long-term dataset, showing notable improvements in this novel task over existing approaches. Our method DeltaVSG achieves an accuracy of 77.1% and a recall of 72.3%, often mimicking human intuition about how indoor scenes change over time. We further show the utility of VSG prediction in the task of active robotic change detection, speeding up task completion by 66.0% compared to a scene-change-unaware planner. We make our code available as open-source.
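To make the Variable Scene Graph concept concrete, the following is a minimal sketch, not the authors' implementation, of how a 3D scene graph node could be augmented with the variability attribute described in the abstract. The class and field names (VSGNode, VariableSceneGraph, p_position_change, etc.) are illustrative assumptions; the three probabilities mirror the three change types identified in the paper: change in object position, in semantic state, and in scene composition.

```python
# Illustrative sketch of a Variable Scene Graph (VSG) data structure.
# Names and fields are assumptions for exposition, not the paper's API.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class VSGNode:
    """One object node in a Variable Scene Graph."""
    node_id: int
    semantic_label: str                      # e.g. "chair"
    centroid: Tuple[float, float, float]     # object position in the scene frame
    # Variability attribute: estimated likelihood of each long-term change type.
    p_position_change: float = 0.0
    p_state_change: float = 0.0              # e.g. door open -> closed
    p_composition_change: float = 0.0        # object added to / removed from the scene


@dataclass
class VariableSceneGraph:
    """A 3D scene graph whose nodes carry variability estimates."""
    nodes: Dict[int, VSGNode] = field(default_factory=dict)
    edges: List[Tuple[int, int, str]] = field(default_factory=list)  # (src, dst, relation)

    def most_variable(self, k: int = 5) -> List[VSGNode]:
        """Return the k nodes most likely to change, e.g. so an active
        change-detection planner can prioritize re-observing them."""
        def score(n: VSGNode) -> float:
            return max(n.p_position_change, n.p_state_change, n.p_composition_change)
        return sorted(self.nodes.values(), key=score, reverse=True)[:k]
```

In this reading, a learned model such as DeltaVSG would fill in the per-node change probabilities, and a downstream planner could query `most_variable` to decide which objects to revisit first.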
Keywords
3D scene graph representations,3D variable scene graphs,3RScan long-term dataset,active robotic change detection,DeltaVSG,indoor scenes change,long-term change events,long-term dataset,long-term semantic scene change prediction,long-term semantic scene changes,robot autonomy,scene-change-unaware planner,semantic scene variability estimation,semantic state,shared scenes,variability attribute,VSG prediction