Support Vector Regression: Risk Quadrangle Framework

arXiv (2023)

Abstract
This paper investigates Support Vector Regression (SVR) in the context of the fundamental risk quadrangle paradigm. It is shown that both formulations of SVR, $\varepsilon$-SVR and $\nu$-SVR, correspond to the minimization of equivalent regular error measures (the Vapnik error and the superquantile (CVaR) norm, respectively) with a regularization penalty. These error measures, in turn, give rise to corresponding risk quadrangles. By constructing the fundamental risk quadrangle corresponding to SVR, we show that SVR is an asymptotically unbiased estimator of the average of two symmetric conditional quantiles. Furthermore, the technique used to construct the quadrangles serves as a powerful tool for proving the equivalence of $\varepsilon$-SVR and $\nu$-SVR. Additionally, by invoking the Error Shaping Decomposition of Regression, SVR is formulated as a regular deviation minimization problem with a regularization penalty, and the dual formulation of SVR is derived in the risk quadrangle framework.
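The abstract's claim that $\varepsilon$-SVR minimizes the Vapnik error (the $\varepsilon$-insensitive loss) plus a regularization penalty can be illustrated with a minimal sketch. This is not code from the paper: the linear model, the subgradient-descent solver, and all parameter values below are illustrative assumptions.

```python
import numpy as np

def vapnik_error(residuals, eps):
    # Vapnik (epsilon-insensitive) error: mean of max(|r| - eps, 0).
    # Residuals inside the eps-tube contribute zero loss.
    return np.mean(np.maximum(np.abs(residuals) - eps, 0.0))

def fit_linear_eps_svr(X, y, eps=0.1, C=10.0, lr=0.05, n_iter=3000):
    # Minimize (1/2)||w||^2 + C * sum_i max(|x_i'w + b - y_i| - eps, 0)
    # by plain subgradient descent (a simple stand-in for a QP solver).
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(n_iter):
        r = X @ w + b - y
        # Subgradient of the eps-insensitive loss w.r.t. the residuals:
        # +1 above the tube, -1 below it, 0 inside.
        g = np.where(r > eps, 1.0, np.where(r < -eps, -1.0, 0.0))
        grad_w = w + C * (X.T @ g)
        grad_b = C * g.sum()
        w -= lr * grad_w / n
        b -= lr * grad_b / n
    return w, b

# Tiny demo on synthetic linear data y = 2x + 1.
X = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
y = 2.0 * X[:, 0] + 1.0
w, b = fit_linear_eps_svr(X, y)
```

Residuals that land inside the $\varepsilon$-tube are not penalized at all, which is exactly what makes the error an "insensitive" measure; the paper's point is that this loss is a regular error measure in the risk quadrangle sense.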
Keywords
support vector regression, risk quadrangle framework