Optimal Parameter and Neuron Pruning for Out-of-Distribution Detection
NeurIPS 2023
Abstract
For a machine learning model deployed in real-world scenarios, the ability to detect out-of-distribution (OOD) samples is indispensable and challenging. Most existing OOD detection methods focus on advanced training techniques or training-free tricks that prevent the model from producing overconfident scores for unknown samples. Training-based methods incur high training costs and rely on OOD samples that are not always available, while most training-free methods cannot efficiently exploit prior information from the training data. In this work, we propose an Optimal Parameter and Neuron Pruning (OPNP) approach that identifies and removes the parameters and neurons responsible for over-fitting. The method consists of two steps. In the first step, we evaluate the sensitivity of the model's parameters and neurons by averaging gradients over all training samples. In the second step, the parameters and neurons with exceptionally large or near-zero sensitivities are removed at prediction time. Our proposal is training-free, compatible with other post-hoc methods, and exploits information from all training data. Extensive experiments on multiple OOD detection tasks and model architectures show that OPNP consistently outperforms existing methods by a large margin.
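The two steps described in the abstract can be sketched on a toy model. This is a minimal illustration, not the authors' implementation: it assumes a single linear layer with a logistic loss (so per-sample gradients have a closed form), and the 5%/95% quantile thresholds are placeholder choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained model: one linear layer with a sigmoid output.
W = rng.normal(size=(20, 1))                     # model parameters
X = rng.normal(size=(200, 20))                   # "training" inputs
y = (X @ W > 0).astype(float)                    # labels for the toy task

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Step 1: sensitivity = average absolute gradient over all training samples.
# For the logistic loss, the per-sample gradient w.r.t. W is (p - y) * x.
p = sigmoid(X @ W)
per_sample_grads = (p - y)[:, :, None] * X[:, None, :]   # shape (N, 1, 20)
sensitivity = np.abs(per_sample_grads).mean(axis=0).T    # shape (20, 1)

# Step 2: prune parameters whose sensitivity is near zero or
# exceptionally large (here: outside the [5%, 95%] quantile band).
lo, hi = np.quantile(sensitivity, [0.05, 0.95])
mask = (sensitivity > lo) & (sensitivity < hi)
W_pruned = W * mask                              # pruned weights for prediction
```

The same recipe extends to neurons by aggregating the sensitivity of each row (or column) of a weight matrix and masking whole units; in a deep network the gradients would come from backpropagation rather than a closed form.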