Reliable uncertainty with cheaper neural network ensembles: a case study in industrial parts classification
CoRR (2024)
Abstract
In operations research (OR), predictive models often encounter
out-of-distribution (OOD) scenarios where the data distribution differs from
the training data distribution. In recent years, neural networks (NNs) have
gained traction in OR for their exceptional performance in fields such as
image classification. However, NNs tend to make confident yet incorrect
predictions when confronted with OOD data. Uncertainty estimation offers a
solution to overconfident models, communicating when the output should (not) be
trusted. Hence, reliable uncertainty quantification in NNs is crucial in the OR
domain. Deep ensembles, composed of multiple independent NNs, have emerged as a
promising approach, offering not only strong predictive accuracy but also
reliable uncertainty estimation. However, their deployment is challenging due
to substantial computational demands. Recent fundamental research has proposed
more efficient NN ensembles, namely the snapshot, batch, and multi-input
multi-output ensemble. This study is the first to provide a comprehensive
comparison of a single NN, a deep ensemble, and the three efficient NN
ensembles. In addition, we propose a Diversity Quality metric to quantify the
ensembles' performance on the in-distribution and OOD sets in one single
metric. The OR case study discusses industrial parts classification to identify
and manage spare parts, important for timely maintenance of industrial plants.
The results highlight the batch ensemble as a cost-effective and competitive
alternative to the deep ensemble. It outperforms the deep ensemble in both
uncertainty and accuracy while exhibiting a training time speedup of 7x, a test
time speedup of 8x, and 9x memory savings.
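As a rough illustration of the uncertainty estimation the abstract describes, the sketch below averages the class probabilities of an ensemble's members and uses predictive entropy as an uncertainty score. This is a minimal, generic sketch of ensemble averaging, not the paper's implementation; the member outputs are hypothetical placeholders.

```python
import numpy as np

def ensemble_predict(member_probs):
    """Average per-member class probabilities, shaped (members, classes).

    Returns the mean probabilities and their predictive entropy; higher
    entropy signals a less trustworthy output, e.g. on OOD inputs.
    """
    member_probs = np.asarray(member_probs, dtype=float)
    mean_probs = member_probs.mean(axis=0)
    entropy = -np.sum(mean_probs * np.log(mean_probs + 1e-12))
    return mean_probs, entropy

# Hypothetical softmax outputs from three ensemble members.
# Members agree on class 0 -> low entropy (confident prediction).
confident = [[0.90, 0.05, 0.05], [0.85, 0.10, 0.05], [0.90, 0.08, 0.02]]
# Members disagree -> higher entropy (candidate for manual review).
uncertain = [[0.60, 0.30, 0.10], [0.10, 0.70, 0.20], [0.30, 0.20, 0.50]]

_, h_conf = ensemble_predict(confident)
_, h_unc = ensemble_predict(uncertain)
```

Disagreement among members is what drives the higher entropy in the second case; the efficient ensembles compared in the paper (snapshot, batch, multi-input multi-output) aim to retain this diversity at a fraction of a deep ensemble's cost.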