Providing Safety Assurances for Systems with Unknown Dynamics
arXiv (2024)
Abstract
As autonomous systems become more complex and integral in our society, the
need to accurately model and safely control these systems has increased
significantly. In the past decade, there has been tremendous success in using
deep learning techniques to model and control systems that are difficult to
model using first principles. However, providing safety assurances for such
systems remains difficult, partially due to the uncertainty in the learned
model. In this work, we aim to provide safety assurances for systems whose
dynamics are not readily derived from first principles and are therefore
better suited to being learned with deep learning techniques. Given the system of
interest and safety constraints, we learn an ensemble model of the system
dynamics from data. Leveraging ensemble uncertainty as a measure of uncertainty
in the learned dynamics model, we compute a maximal robust control invariant
set: starting from any state in this set, the system is guaranteed to satisfy
the safety constraints, provided that the realized model uncertainties remain
within the predefined admissible set. We demonstrate the
effectiveness of our method using a simulated case study with an inverted
pendulum and a hardware experiment with a TurtleBot. The experiments show that
our method robustifies the control actions of the system against model
uncertainty and generates safe behaviors without being overly restrictive. The
code and accompanying videos can be found on the project website.
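The abstract's core idea of using ensemble disagreement as an uncertainty measure can be illustrated with a minimal sketch. The function and toy dynamics below are hypothetical stand-ins, not the paper's implementation: each lambda plays the role of a learned dynamics model, and the elementwise worst-case deviation from the ensemble mean acts as the per-state uncertainty bound that a robust invariant-set computation would treat as an admissible disturbance.

```python
import numpy as np

def ensemble_uncertainty(models, x, u):
    """Spread of ensemble predictions at state x under control u.

    Returns the elementwise max deviation from the ensemble mean,
    a simple proxy for model uncertainty at this query point.
    """
    preds = np.stack([f(x, u) for f in models])  # (n_models, state_dim)
    mean = preds.mean(axis=0)
    return np.abs(preds - mean).max(axis=0)

# Toy ensemble: three pendulum-like models with perturbed gravity terms
# (hypothetical; real models would be trained neural networks).
models = [
    lambda x, u, k=k: x + 0.01 * np.array([x[1], -(9.8 + k) * np.sin(x[0]) + u])
    for k in (0.0, 0.1, -0.1)
]

x0 = np.array([0.3, 0.0])           # angle, angular velocity
unc = ensemble_uncertainty(models, x0, u=0.0)
```

Here the models agree exactly on the kinematic (angle) component but disagree on the torque component, so the uncertainty bound is nonzero only where the learned dynamics actually differ.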