Hierarchical Dynamic Masks for Visual Explanation of Neural Networks

IEEE TRANSACTIONS ON MULTIMEDIA (2024)

Abstract
Despite the remarkable accomplishments of deep neural networks in computer vision tasks, the inherent opacity of their operations remains a pressing concern. Attribution methods, which generate visual explanatory maps representing the importance of image pixels to the model's classification, are a popular way to explain neural network decisions. However, the small and diverse decision regions in fine-grained or medical images limit the precision and comprehensiveness of existing attribution methods when explaining decisions on such data. This paper introduces a novel attribution method called hierarchical dynamic masks (HDM) to address these concerns and generate saliency maps with high recognition reliability and localization capability. Specifically, we propose dynamic masks (DM), in which multiple small-sized benchmark mask vectors coarsely learn the image's critical information through an optimization procedure. The benchmark mask vectors then guide the learning of larger combination mask vectors, so that their overlay mask accurately captures detailed pixel-importance information. Additionally, we construct the HDM by hierarchically concatenating DM modules. These DM modules search for and combine, in a learning-based way, the regions within the masked image that remain relevant to the neural network's classification decision. Because HDM forces each DM to perform importance analysis on different areas, the fused saliency map becomes more comprehensive. Experiments show that the proposed method significantly outperforms existing approaches in recognition credibility and localization ability, in both qualitative and quantitative evaluations on the CUB-200-2011 and iChallenge-PM datasets.
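The abstract only sketches the mask-optimization idea. As a rough, hypothetical illustration of the general principle behind learned perturbation masks (not the authors' exact DM/HDM procedure), the snippet below optimizes a single low-resolution mask so that the masked image preserves the target-class score while covering as little of the image as possible. All names here, such as `learn_saliency_mask`, `mask_size`, and `area_weight`, are assumptions introduced for illustration.

```python
import torch
import torch.nn.functional as F

def learn_saliency_mask(model, image, target_class, mask_size=8,
                        steps=300, lr=0.1, area_weight=0.05):
    """Sketch of a learned perturbation mask (illustrative only).

    A small mask (a stand-in for the paper's small-sized benchmark mask
    vectors) is upsampled to image resolution and optimized so that the
    masked image keeps the target-class score while staying sparse.
    """
    model.eval()
    # Low-resolution mask parameters, squashed to (0, 1) by a sigmoid.
    mask_logits = torch.zeros(1, 1, mask_size, mask_size, requires_grad=True)
    optimizer = torch.optim.Adam([mask_logits], lr=lr)

    for _ in range(steps):
        mask = torch.sigmoid(mask_logits)
        mask_up = F.interpolate(mask, size=image.shape[-2:],
                                mode="bilinear", align_corners=False)
        masked_image = image * mask_up  # keep only the masked-in pixels

        score = model(masked_image)[0, target_class]
        # Maximize the class score, penalize large masks.
        loss = -score + area_weight * mask_up.mean()

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    return torch.sigmoid(mask_logits).detach()
```

Under this reading, the hierarchical part of HDM would correspond to repeating such a step on the regions left unexplained by earlier masks and fusing the resulting maps, but the exact benchmark/combination mask interaction is described only in the full paper.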
Keywords
Neural networks,Decision making,Visualization,Reliability,Predictive models,Location awareness,Biological neural networks,Model interpretability,neural networks,image classification,model-agnostic