Penalized kernel quantile regression for varying coefficient models

Journal of Statistical Planning and Inference (2022)

Abstract
In nonparametric models, numerous penalization methods based on nonparametric series estimators have been developed for model selection and estimation. However, penalization combined with kernel smoothing remains poorly understood, owing to intrinsic technical and computational difficulties that demand treatments different from those developed for penalized series estimators. Kernel smoothing is a popular and useful nonparametric estimation method, so it is desirable to establish theoretical and computational analyses for penalized kernel smoothing. In this paper, we develop a novel penalized kernel quantile regression with attractive theoretical and computational properties for varying coefficient models. We show that the proposed method consistently identifies the partially linear structure of the varying coefficient model even when the number of covariates is allowed to increase with the sample size. We develop an efficient algorithm based on the alternating direction method of multipliers (ADMM) with a computational convergence guarantee. We derive a plug-in bandwidth selector using high-dimensional kernel regression theory, and the penalty parameter is selected by a proposed Bayesian information criterion. These developments require novel high-dimensional kernel regression and computational analyses. Simulations and real data analyses demonstrate the effectiveness of the proposed method and verify it numerically.
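To make the setting concrete, the following is a minimal sketch of an *unpenalized* local linear kernel quantile estimate for a varying coefficient model y = X·a(U) + ε, the building block the paper penalizes. It is not the authors' algorithm (no penalty, no ADMM, no plug-in bandwidth): the Gaussian kernel, the bandwidth value, and the direct minimization of the kernel-weighted check loss via a derivative-free optimizer are illustrative choices only.

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(r, tau):
    # Quantile (check) loss: rho_tau(r) = r * (tau - 1{r < 0})
    return r * (tau - (r < 0))

def local_linear_kq(u0, U, X, y, tau=0.5, h=0.15):
    """Local linear kernel quantile estimate of the varying coefficients
    a(u0) in y = X @ a(U) + error, at the index point u0.
    Illustrative sketch: Gaussian kernel, derivative-free minimization."""
    p = X.shape[1]
    # Gaussian kernel weights centered at u0 with bandwidth h
    w = np.exp(-0.5 * ((U - u0) / h) ** 2)

    def objective(theta):
        a, b = theta[:p], theta[p:]  # local intercepts and local slopes
        fit = X @ a + (X * (U - u0)[:, None]) @ b
        return np.sum(w * check_loss(y - fit, tau))

    res = minimize(objective, np.zeros(2 * p), method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-8})
    return res.x[:p]  # estimated a(u0)

# Toy data: a1(u) = sin(2*pi*u) varies, a2(u) = 1 is constant
# (the partially linear structure the paper's penalty is built to detect).
rng = np.random.default_rng(0)
n = 400
U = rng.uniform(0, 1, n)
X = np.column_stack([np.ones(n), rng.normal(size=n)])
a_true = np.column_stack([np.sin(2 * np.pi * U), np.ones(n)])
y = np.sum(X * a_true, axis=1) + 0.3 * rng.normal(size=n)

a_hat = local_linear_kq(0.5, U, X, y, tau=0.5)  # a1(0.5)=0, a2(0.5)=1
```

In the paper's method, a penalty on the local slopes shrinks coefficients such as a2(·) to exact constancy, and the resulting nonsmooth problem is solved by ADMM rather than by a generic optimizer as above.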
Keywords
Kernel smoothing, Local linear method, Alternating direction method of multipliers (ADMM) algorithm, Computational convergence, High-dimensional kernel quantile regression, Penalized methods