Calibrating Deep Neural Networks using Explicit Regularisation and Dynamic Data Pruning

WACV(2023)

Abstract
Deep neural networks (DNNs) are prone to miscalibrated predictions, often exhibiting a mismatch between predicted outputs and the associated confidence scores. Contemporary model calibration techniques mitigate overconfident predictions by pushing down the confidence of the winning class while increasing the confidence of the remaining classes across all test samples. However, from a deployment perspective, an ideal model should (i) generate well-calibrated predictions for high-confidence samples (predicted probability, say, > 0.95) and (ii) generate a higher proportion of legitimate high-confidence samples. To this end, we propose a novel regularization technique that can be used with classification losses, leading to state-of-the-art calibrated predictions at test time. From a deployment standpoint in safety-critical applications, only high-confidence samples from a well-calibrated model are of interest, as the remaining samples must undergo manual inspection. Reducing the predictive confidence of these potentially "high-confidence samples" is a downside of existing calibration approaches. We mitigate this by proposing a dynamic train-time data pruning strategy that prunes low-confidence samples every few epochs, yielding an increase in confident yet calibrated samples. We demonstrate state-of-the-art calibration performance across image classification benchmarks, reducing training time without much compromise in accuracy. We also provide insights into why our dynamic pruning strategy, which prunes low-confidence training samples, leads to an increase in high-confidence samples at test time.
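The dynamic train-time pruning idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the names `prune_low_confidence`, `training_loop`, `prune_every`, `keep_fraction`, and `confidence_fn` are assumptions introduced here, and the actual method may use a confidence threshold or schedule rather than a fixed keep fraction.

```python
# Hedged sketch of confidence-based dynamic data pruning (assumed API,
# not the authors' exact algorithm): every `prune_every` epochs, rank the
# currently active training samples by model confidence and drop the
# lowest-confidence fraction.

def prune_low_confidence(confidences, keep_fraction=0.9):
    """Return the (sorted) indices of the top `keep_fraction` of samples,
    ranked by confidence in descending order."""
    order = sorted(range(len(confidences)),
                   key=lambda i: confidences[i], reverse=True)
    n_keep = max(1, int(len(confidences) * keep_fraction))
    return sorted(order[:n_keep])

def training_loop(dataset, epochs, prune_every=5, keep_fraction=0.9,
                  confidence_fn=None):
    """Toy loop: `active` holds indices of samples still used for training.
    A real implementation would run an optimizer step each epoch; here we
    only model the pruning schedule."""
    active = list(range(len(dataset)))
    for epoch in range(1, epochs + 1):
        # ... one epoch of training on [dataset[i] for i in active] ...
        if epoch % prune_every == 0:
            confs = [confidence_fn(dataset[i]) for i in active]
            kept = prune_low_confidence(confs, keep_fraction)
            active = [active[j] for j in kept]
    return active
```

For example, with samples whose confidences are `[0.99, 0.2, 0.8, 0.1, 0.95]` and `keep_fraction=0.6`, a single pruning pass keeps indices `[0, 2, 4]`; repeated pruning progressively concentrates training on the most confident samples, which the paper argues increases the proportion of legitimate high-confidence predictions at test time.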
Key words
Algorithms: Explainable, fair, accountable, privacy-preserving, ethical computer vision; Image recognition and understanding (object detection, categorization, segmentation, scene modeling, visual reasoning); Social good