Selective inference after feature selection via multiscale bootstrap

Annals of the Institute of Statistical Mathematics (2022)

Abstract
It is common to report confidence intervals or p-values for selected features, i.e., predictor variables in regression, but these often suffer from selection bias. The selective inference approach corrects this bias by conditioning on the selection event. Most existing studies of selective inference consider a specific feature selection algorithm, such as Lasso, and thus have difficulty handling more complicated algorithms. Moreover, existing studies often condition on unnecessarily restrictive events, leading to over-conditioning and lower statistical power. Our novel and widely applicable resampling method via multiscale bootstrap addresses these issues by computing an approximately unbiased selective p-value for the selected features. We also develop a simpler variant via the classical bootstrap. We prove that the p-value computed by the multiscale bootstrap method is more accurate than that of the classical bootstrap method. Furthermore, numerical experiments demonstrate that our algorithm works well even for more complicated feature selection methods such as non-convex regularization.
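The core idea of conditioning on the selection event can be illustrated with a minimal Monte Carlo sketch (not the paper's multiscale bootstrap method): under a known Gaussian null, a naive p-value ignores that the feature was only tested because it passed a selection rule, while the selective p-value counts only those null draws in which the same selection decision is made. The sample size, threshold, and data below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: conditioning a p-value on a selection event, under an
# assumed Gaussian null. This is an illustration, not the paper's algorithm.
rng = np.random.default_rng(0)
n, c = 30, 0.3                        # sample size and selection threshold (assumed)
x = rng.normal(0.5, 1.0, size=n)      # illustrative observed data
t_obs = np.sqrt(n) * x.mean()         # test statistic for H0: mu = 0

# Simulate the statistic under H0 and apply the same selection rule.
sims = rng.normal(0.0, 1.0, size=(200_000, n))
t_sim = np.sqrt(n) * sims.mean(axis=1)
sel = t_sim > np.sqrt(n) * c          # null draws where the feature is selected

p_naive = (t_sim >= t_obs).mean()          # ignores selection: biased downward
p_selective = (t_sim[sel] >= t_obs).mean() # conditions on the selection event
```

Because selection discards the unremarkable draws, `p_selective` is never smaller than `p_naive`; the gap is exactly the selection bias that the paper's bootstrap methods estimate for far more complicated selection algorithms.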
Keywords
Hypothesis testing, Confidence intervals, Variable selection, Bootstrap resampling, Selective inference