
Machine Learning Techniques -6-Support Vector Regression

Posted: 2015-08-27 12:37:43

6-Support Vector Regression

For regression with squared error, we first discuss kernel ridge regression.

With the knowledge of kernel functions, can we find an analytic solution for kernel ridge regression?

Since the optimal w lies in the span of the transformed examples (w = Σn βn zn, by the representer theorem), it suffices to find the best βn. Substituting into the regularized squared-error objective and setting the gradient to zero yields the analytic solution

β = (λI + K)⁻¹ y,

where K is the N×N kernel matrix with Knm = K(xn, xm). The inverse always exists for λ > 0, because K is positive semi-definite.
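The analytic solution above can be sketched numerically. This is a minimal illustration, not code from the original post: the function names and the choice of a Gaussian kernel are assumptions for the example.

```python
import numpy as np

def gaussian_kernel(X1, X2, gamma=1.0):
    # K[n, m] = exp(-gamma * ||x_n - x_m||^2)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, lam=1.0, gamma=1.0):
    # Analytic solution: beta = (lam * I + K)^{-1} y
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(lam * np.eye(len(X)) + K, y)

def kernel_ridge_predict(X_train, beta, X_test, gamma=1.0):
    # g(x) = sum_n beta_n * K(x_n, x)
    return gaussian_kernel(X_test, X_train, gamma) @ beta
```

Fitting a small sine-curve dataset with this sketch also exhibits the problem discussed next: essentially every βn comes out nonzero, so β is dense.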

However, compared with the linear case, this formulation of β is expensive for a large number of data points: the matrix to be inverted is a dense N×N one, so solving for β takes O(N³) time.

Compared with soft-margin Gaussian SVM, whose αn are sparse (nonzero only for the support vectors), the βn of kernel ridge regression are dense: almost every βn is nonzero. That effectively means every example is a support vector, which slows down prediction. A sparse β is what we now want.


Thus we add a tube of width ε around the regression line and, using the familiar max function, ignore small deviations: err(y, s) = max(0, |s − y| − ε). Points with |s − y| ≤ ε contribute no error; points outside the tube are penalized linearly. Because the max function is not differentiable at some points, we cannot simply set a gradient to zero, and some further reformulation is needed.
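The tube (ε-insensitive) error just described is one line of code. A minimal sketch, with the function name chosen for this example:

```python
import numpy as np

def tube_error(y, s, epsilon=0.5):
    # epsilon-insensitive (tube) error:
    # zero inside the tube |s - y| <= epsilon, linear outside it
    return np.maximum(0.0, np.abs(s - y) - epsilon)
```

Small deviations inside the tube cost nothing, while large ones grow linearly rather than quadratically, which is what eventually makes sparse solutions possible.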


The remaining steps reshape the problem so that it looks like the standard SVM and can be handled by the standard tool of quadratic programming (QP). First, the bias is separated out as a constant: the score is written wTzn + b rather than folding b into w as w0.


Second, we add slack variables to describe the violations of the tube, one for the upper side and one for the lower side, which keeps the constraints linear.

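Putting the two steps together gives the standard SVR primal, as it is usually written (with ξn∨ and ξn∧ the lower- and upper-side tube violations):

```latex
\begin{aligned}
\min_{b,\,\mathbf{w},\,\boldsymbol{\xi}^{\vee},\,\boldsymbol{\xi}^{\wedge}}\quad
  & \tfrac{1}{2}\,\mathbf{w}^{T}\mathbf{w}
    + C\sum_{n=1}^{N}\bigl(\xi_n^{\vee} + \xi_n^{\wedge}\bigr) \\
\text{s.t.}\quad
  & -\varepsilon - \xi_n^{\vee} \;\le\; y_n - \mathbf{w}^{T}\mathbf{z}_n - b \;\le\; \varepsilon + \xi_n^{\wedge}, \\
  & \xi_n^{\vee} \ge 0,\qquad \xi_n^{\wedge} \ge 0,\qquad n = 1,\dots,N.
\end{aligned}
```

This is a QP in the variables b, w, ξ∨, ξ∧, with linear constraints, so it can be handed to any standard QP solver.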

Our next task: SVR primal -> SVR dual.

 


Original post: http://www.cnblogs.com/windniu/p/4762749.html
