Statistical Inference in Sparse High-Dimensional Additive Models

Annals of Statistics (2021)

Cited by 4 | Viewed 1
Abstract
In this paper, we discuss the estimation of a nonparametric component f_1 of a nonparametric additive model Y = f_1(X_1) + ... + f_q(X_q) + ε. We allow the number q of additive components to grow to infinity and we make sparsity assumptions on the number of nonzero additive components. We compare this estimation problem with that of estimating f_1 in the oracle model Z = f_1(X_1) + ε, in which the additive components f_2, ..., f_q are known. We construct a two-step presmoothing-and-resmoothing estimator of f_1 and state finite-sample bounds for the difference between our estimator and a corresponding smoothing estimator f_1^(oracle) in the oracle model. In an asymptotic setting, these bounds can be used to show asymptotic equivalence of our estimator and the oracle estimator; the paper thus shows that, asymptotically, under strong enough sparsity conditions, knowledge of f_2, ..., f_q has no effect on estimation accuracy. Our first step is to estimate f_1 with an undersmoothed estimator based on near-orthogonal projections with a group Lasso bias correction. In the second step, we construct pseudo responses Ŷ by evaluating this undersmoothed estimator of f_1 at the design points and then apply the smoothing method of the oracle estimator f_1^(oracle) to the nonparametric regression problem with "responses" Ŷ and covariates X_1. Our mathematical exposition centers primarily on establishing properties of the presmoothing estimator. We present simulation results demonstrating close-to-oracle performance of our estimator in practical applications.
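To make the two-step pipeline concrete, the following Python sketch mirrors its overall structure on simulated data. It is not the authors' method: a plain Lasso on B-spline bases stands in for the paper's near-orthogonal projections with group Lasso bias correction, and a simple Nadaraya-Watson kernel smoother stands in for the oracle smoothing method; all names (kernel_smoother, Y_pseudo, the simulated design, the tuning constants) are hypothetical choices for illustration only.

```python
# Minimal sketch of the presmoothing-and-resmoothing idea (illustrative only).
import numpy as np
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulated sparse additive model: only two of the q components are nonzero.
n, q = 400, 50
X = rng.uniform(0.0, 1.0, size=(n, q))
f1 = lambda x: np.sin(2 * np.pi * x)      # target component f_1
f2 = lambda x: (x - 0.5) ** 2             # one other nonzero component
Y = f1(X[:, 0]) + f2(X[:, 1]) + 0.3 * rng.standard_normal(n)

# Step 1 (presmoothing): undersmoothed sparse fit of the full additive model
# on a rich B-spline basis for every covariate.  A plain l1 penalty is used
# here as a stand-in for the paper's group Lasso with bias correction.
transformers = [
    SplineTransformer(n_knots=12, degree=3, include_bias=False).fit(X[:, [j]])
    for j in range(q)
]
B = np.hstack([t.transform(X[:, [j]]) for j, t in enumerate(transformers)])
k = B.shape[1] // q                       # basis functions per covariate
pre = Lasso(alpha=0.01, max_iter=50_000).fit(B, Y)

# Pseudo responses: the undersmoothed estimate of f_1 evaluated at the
# design points X_{i1}.
B1 = transformers[0].transform(X[:, [0]])
Y_pseudo = B1 @ pre.coef_[:k]

# Step 2 (resmoothing): smooth the pseudo responses against X_1 with the kind
# of one-dimensional smoother one would use in the oracle model Z = f_1(X_1) + ε.
def kernel_smoother(x_train, y_train, x_eval, bandwidth=0.08):
    """Nadaraya-Watson smoother with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w @ y_train) / w.sum(axis=1)

grid = np.linspace(0.0, 1.0, 200)
f1_hat = kernel_smoother(X[:, 0], Y_pseudo, grid)

# Additive components are identified only up to constants, so compare
# centered versions of the estimate and the true f_1.
err = (f1_hat - f1_hat.mean()) - (f1(grid) - f1(grid).mean())
print("RMSE of centered estimate:", np.sqrt(np.mean(err ** 2)))
```

The sketch only reproduces the shape of the procedure: an undersmoothed high-dimensional first stage whose fitted first component supplies pseudo responses, followed by a conventional one-dimensional smoother. The paper's actual presmoothing step and its bias correction are more delicate than the surrogate used here.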
Keywords
Nonparametric curve estimation, additive models, bias correction, near-orthogonality, Lasso