Accurate estimation of feature importance faithfulness for tree models
arXiv (2024)

Abstract
In this paper, we consider a perturbation-based metric of predictive
faithfulness of feature rankings (or attributions) that we call PGI squared.
When applied to decision tree-based regression models, the metric can be
computed accurately and efficiently for arbitrary independent feature
perturbation distributions. In particular, the computation does not involve
Monte Carlo sampling that has been typically used for computing similar metrics
and which is inherently prone to inaccuracies. Moreover, we propose a method of
ranking features by their importance for the tree model's predictions based on
PGI squared. Our experiments indicate that, in some respects, the method may
identify the globally important features better than the state-of-the-art SHAP
explainer.
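The paper's contribution is an exact computation of the metric for tree models; for context, the Monte Carlo approach it replaces can be sketched as below. This is a hedged illustration, not the authors' method: the function name `pgi_squared_mc`, the interpretation of PGI squared as the mean squared prediction gap under feature perturbation, and the `perturb_sample` interface are all assumptions for this sketch.

```python
import numpy as np

def pgi_squared_mc(model_predict, x, important_idx, perturb_sample,
                   n_samples=1000, rng=None):
    """Monte Carlo estimate of a squared prediction-gap style metric.

    Perturbs the features in `important_idx` (assumed independent) and
    returns the mean squared change in the model's prediction. This is
    the sampling-based estimator the paper's exact computation avoids.
    NOTE: illustrative sketch only; the exact definition of PGI squared
    is in the paper, not reproduced here.
    """
    rng = rng if isinstance(rng, np.random.Generator) else np.random.default_rng(rng)
    # Baseline prediction at the unperturbed point.
    base = model_predict(x[None, :])[0]
    # Draw n_samples perturbed copies, replacing each important feature
    # with independent samples from its perturbation distribution.
    X = np.tile(x, (n_samples, 1))
    for j in important_idx:
        X[:, j] = perturb_sample(j, n_samples, rng)
    preds = model_predict(X)
    return float(np.mean((preds - base) ** 2))

# Toy usage: a linear "model" that depends only on feature 0.
model = lambda X: X[:, 0]
x0 = np.zeros(3)
noise = lambda j, n, rng: rng.standard_normal(n)  # hypothetical perturbation

score_used = pgi_squared_mc(model, x0, [0], noise, rng=0)    # large gap
score_unused = pgi_squared_mc(model, x0, [2], noise, rng=0)  # gap is 0
```

Because the estimator relies on random sampling, its value fluctuates with `n_samples` and the seed; this sampling error is exactly the inaccuracy the paper's closed-form computation for tree models eliminates.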