On the Noise Model of Support Vector Machines Regression

ALT(2000)

Citations 89 | Views 296
Abstract
Support Vector Machines Regression (SVMR) is a learning technique in which the goodness of fit is measured not by the usual quadratic loss function (the mean square error), but by a different loss function called the ε-insensitive loss function (ILF), which is similar to loss functions used in the field of robust statistics. The quadratic loss function is well justified under the assumption of Gaussian additive noise. However, the noise model underlying the choice of the ILF is not clear. In this paper the use of the ILF is justified under the assumption that the noise is additive and Gaussian, where the variance and mean of the Gaussian are themselves random variables. The probability distributions of the variance and mean are stated explicitly. While this work is presented in the framework of SVMR, it can be extended to justify non-quadratic loss functions in any Maximum Likelihood or Maximum A Posteriori (MAP) approach. It applies not only to the ILF, but to a much broader class of loss functions.
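For reference, the ε-insensitive loss discussed in the abstract is commonly written as follows (this is the standard textbook definition, not a formula quoted from the paper itself):

\[
L_\varepsilon\bigl(y, f(x)\bigr) \;=\; |y - f(x)|_\varepsilon \;=\;
\begin{cases}
0 & \text{if } |y - f(x)| \le \varepsilon,\\[4pt]
|y - f(x)| - \varepsilon & \text{otherwise,}
\end{cases}
\]

so residuals smaller than ε incur no penalty, while larger residuals are penalized linearly, in contrast to the quadratic penalty of the mean square error.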
Keywords
nonquadratic loss function, different loss function, noise model, maximum likelihood, loss function, support vector machines regression, gaussian additive noise, maximum a posteriori approach, mean square error, quadratic loss function, usual quadratic loss function, random variable, approximation, networks, probability distribution, support vector machine, robust statistics