Risk-Sensitive Diffusion for Perturbation-Robust Optimization
arXiv (2024)
Abstract
The essence of score-based generative models (SGMs) is to optimize a
score-based model towards the score function. However, we show that noisy
samples induce a different objective function, not the one based on the score
function, which misguides the optimization of the model. To address this
problem, we first consider a new setting in which every noisy sample is paired
with a risk vector indicating its data quality (e.g., noise level). This
setting is common in real-world applications, especially for medical and
sensor data. We then introduce the risk-sensitive SDE, a type of stochastic
differential equation (SDE) parameterized by the risk vector. With this tool,
we aim to minimize a measure called perturbation instability, which we define
to quantify the negative impact of noisy samples on optimization. We prove
that a zero instability measure is achievable only when the noisy samples
arise from Gaussian perturbation. For non-Gaussian cases, we also provide the
optimal coefficients that minimize the misguidance caused by noisy samples. To
apply risk-sensitive SDEs in practice, we extend widely used diffusion models
to their risk-sensitive versions and derive a risk-free loss that is efficient
to compute. We also conduct numerical experiments that confirm the validity of
our theorems and show that they make SGMs robust to noisy samples during
optimization.
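The misguidance the abstract describes can be illustrated with a toy 1D Gaussian sketch (all variances and the linear score model here are hypothetical illustrations, not the paper's method): for Gaussian data, the denoising-score-matching optimum is a linear score s(x) = a·x with slope -1/Var(x_t), so training on perturbed samples silently targets the wrong variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: clean data ~ N(0, 1); "noisy samples" carry an extra
# perturbation of variance r (their risk); the diffusion step adds variance t.
n, r, t = 200_000, 0.5, 0.25
x0 = rng.normal(0.0, 1.0, n)                    # clean data
x_noisy = x0 + rng.normal(0.0, np.sqrt(r), n)   # perturbed training samples

# Diffuse both sets of samples by the same noise level t.
xt_clean = x0 + rng.normal(0.0, np.sqrt(t), n)
xt_noisy = x_noisy + rng.normal(0.0, np.sqrt(t), n)

# For Gaussian marginals the optimal linear score slope is -1 / Var(x_t).
slope_clean = -1.0 / xt_clean.var()   # ≈ -1 / (1 + t)     = -0.8  (intended)
slope_noisy = -1.0 / xt_noisy.var()   # ≈ -1 / (1 + r + t) ≈ -0.57 (misguided)

print(slope_clean, slope_noisy)
```

The two slopes differ because the perturbed samples implicitly define a different (flatter) score target, which is the optimization error that a risk-sensitive SDE is designed to correct for.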