Asymptotic analysis of locally weighted jackknife prediction

Neurocomputing (2020)

Abstract
Locally weighted jackknife prediction (LW-JP) is a variant of conformal prediction that outputs interval predictions for regression problems. It is built on traditional learning algorithms for point prediction, referred to as underlying algorithms. Although the empirical validity and efficiency of LW-JP have been reported in several works, a theoretical understanding of the method has been lacking. This paper provides a theoretical analysis of LW-JP in the asymptotic setting, where the number of training samples approaches infinity. Under certain regularity assumptions and conditions, the asymptotic validity of LW-JP is proved for nonlinear regression with heteroscedastic errors. The proof extends the asymptotic analysis of leave-one-out prediction intervals in linear regression with homoscedastic errors. Based on this analysis, two conformal regressors built on LW-JP are proposed; experimental results show that the algorithms are not only valid interval predictors but also achieve state-of-the-art performance among conformal regressors.
Keywords
Locally weighted jackknife prediction, Interval prediction, Theoretical analysis, Asymptotic validity
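The general recipe behind locally weighted jackknife prediction can be sketched as follows: compute leave-one-out residuals of the underlying point predictor, normalize each by a local difficulty estimate, take a quantile of the normalized residuals, and rescale it at the test point to form the interval. This is a minimal illustrative sketch, not the paper's algorithm; the k-NN point predictor and the neighbourhood-spread difficulty estimate are assumptions chosen for self-containment.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=5):
    # Simple k-NN mean regressor standing in for the "underlying algorithm".
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    return y_train[idx].mean()

def local_sigma(X_train, y_train, x, k=5):
    # Hypothetical difficulty estimate: spread of the k nearest targets.
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    return y_train[idx].std() + 1e-8  # avoid division by zero

def lw_jackknife_interval(X, y, x_new, alpha=0.1, k=5):
    """Locally weighted jackknife prediction interval (illustrative sketch)."""
    n = len(y)
    scores = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        # Leave-one-out point prediction at the held-out sample.
        mu_i = knn_predict(X[mask], y[mask], X[i], k)
        # Residual normalized by the local difficulty estimate.
        scores[i] = abs(y[i] - mu_i) / local_sigma(X[mask], y[mask], X[i], k)
    # Empirical (1 - alpha) quantile of the normalized residuals.
    q = np.quantile(scores, 1 - alpha)
    mu = knn_predict(X, y, x_new, k)
    sigma = local_sigma(X, y, x_new, k)
    # Interval width adapts to the estimated local difficulty at x_new.
    return mu - q * sigma, mu + q * sigma
```

Because the residuals are divided by a local difficulty estimate, the resulting intervals are wider in regions where the targets are noisier, which is the intended behaviour under heteroscedastic errors.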