A reproducing kernel Hilbert space approach to high dimensional partially varying coefficient model

Computational Statistics & Data Analysis (2020)

Abstract
The partially varying coefficient model (PVCM) provides a useful class of tools for modeling complex data by incorporating a combination of constant and time-varying covariate effects. A natural question is how to decide which covariates correspond to constant coefficients and which correspond to time-dependent coefficient functions. Existing methods for this two-type structure selection problem in PVCM are based either on a finite truncation of the coefficient functions or on a two-phase procedure that estimates the constant and functional parts separately. This paper attempts to provide a complete theoretical characterization of the estimation and structure selection issues of PVCM by proposing two new penalized methods for PVCM within a reproducing kernel Hilbert space (RKHS). The proposed strategy is partially motivated by the so-called “Non-Constant Theorem” for radial kernels, which ensures a unique and unified representation of each candidate component in the hypothesis space. Within a high-dimensional framework, minimax convergence rates for the prediction risk of the first method are established when each unknown time-dependent coefficient can be well approximated within a specified RKHS. On the other hand, under certain regularity conditions, it is shown that the second proposed estimator identifies the underlying structure correctly with high probability. Several simulation experiments examine the finite-sample performance of the proposed methods.
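The two-type structure described above can be illustrated with a small simulation. The sketch below is not the paper's RKHS-based procedure; it only generates data from a hypothetical PVCM with one constant and one time-varying coefficient (both chosen here for illustration), then fits ordinary least squares within local windows of the index variable to show how a varying coefficient path differs from a constant one.

```python
import numpy as np

# Hypothetical PVCM:  y_i = beta1 * x_i1 + f(t_i) * x_i2 + eps_i,
# where beta1 is a constant coefficient and f(t) = sin(2*pi*t) is a
# time-varying coefficient function (illustrative choices, not from the paper).
rng = np.random.default_rng(0)
n = 200
t = rng.uniform(0.0, 1.0, n)           # index variable (e.g. time)
X = rng.normal(size=(n, 2))            # two covariates
beta1 = 1.5                            # constant effect of covariate 1
f = np.sin(2 * np.pi * t)              # varying effect of covariate 2
y = beta1 * X[:, 0] + f * X[:, 1] + 0.1 * rng.normal(size=n)

# A crude structure check: regress y on X within local windows of t and
# inspect how the fitted coefficients move with t.  A near-constant path
# suggests a constant coefficient; a clearly moving path suggests a
# time-dependent coefficient function.
paths = []
for lo in (0.0, 0.25, 0.5, 0.75):
    mask = (t >= lo) & (t < lo + 0.25)
    coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
    paths.append(coef)
    print(f"t in [{lo:.2f}, {lo + 0.25:.2f}): beta_hat = {np.round(coef, 2)}")
```

The coefficient on the first covariate stays close to 1.5 across windows, while the coefficient on the second covariate tracks sin(2πt); the selection methods in the paper formalize this distinction via penalization in an RKHS rather than window-by-window fitting.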
Keywords
Varying coefficient models, Sparsity, Structure learning, High dimensions, Reproducing kernel Hilbert space (RKHS)