
Multiclass Classifier

Posted: 2017-08-17 18:27:31

 

Multiclass Classification: One-vs-all

Now we will approach the classification of data when we have more than two categories. Instead of y = {0, 1} we will expand our definition so that y = {0, 1, ..., n}.

Since y = {0, 1, ..., n}, we divide our problem into n + 1 binary classification problems (the +1 is because the indexing starts at 0); in each one, we predict the probability that y is a member of one of our classes.

$$
\begin{aligned}
y &\in \{0, 1, \dots, n\} \\
h_\theta^{(0)}(x) &= P(y = 0 \mid x; \theta) \\
h_\theta^{(1)}(x) &= P(y = 1 \mid x; \theta) \\
&\ \ \vdots \\
h_\theta^{(n)}(x) &= P(y = n \mid x; \theta) \\
\text{prediction} &= \max_i \left( h_\theta^{(i)}(x) \right)
\end{aligned}
$$

We are basically choosing one class and then lumping all the others into a single second class. We do this repeatedly, applying binary logistic regression to each case, and then use the hypothesis that returned the highest value as our prediction.
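
Below is a minimal NumPy sketch of this procedure, assuming plain gradient-descent training; the function names (`train_one_vs_all`, `predict_one_vs_all`), learning rate, and iteration count are illustrative choices, not part of the original notes:

```python
# A minimal sketch of one-vs-all logistic regression in NumPy.
# Function names, learning rate, and iteration count are illustrative assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_vs_all(X, y, num_classes, lr=0.1, iters=2000):
    """Fit one binary logistic regression per class via gradient descent.

    Returns a (num_classes, n+1) matrix; row i holds theta for class i.
    """
    m, n = X.shape
    Xb = np.hstack([np.ones((m, 1)), X])      # prepend intercept column
    Theta = np.zeros((num_classes, n + 1))
    for i in range(num_classes):
        yi = (y == i).astype(float)           # class i vs. everything else
        theta = np.zeros(n + 1)
        for _ in range(iters):
            h = sigmoid(Xb @ theta)           # h_theta^{(i)}(x) = P(y = i | x; theta)
            grad = Xb.T @ (h - yi) / m        # gradient of the logistic cost
            theta -= lr * grad
        Theta[i] = theta
    return Theta

def predict_one_vs_all(Theta, X):
    """Return, for each row of X, the class whose hypothesis is largest."""
    m = X.shape[0]
    Xb = np.hstack([np.ones((m, 1)), X])
    probs = sigmoid(Xb @ Theta.T)             # shape (m, num_classes)
    return np.argmax(probs, axis=1)
```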

The following image shows how one could classify 3 classes:

[Figure: one-vs-all decision boundaries separating the 3 classes]

To summarize:

Train a logistic regression classifier hθ(i)(x) for each class i to predict the probability that y = i.

To make a prediction on a new x, pick the class i that maximizes hθ(i)(x).
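
For comparison, here is a self-contained usage sketch with scikit-learn's OneVsRestClassifier, which fits one LogisticRegression per class in the same spirit; the synthetic 3-class data and the random seed are made up for illustration:

```python
# Usage sketch: one-vs-rest logistic regression via scikit-learn.
# The synthetic data and seed are illustrative, not from the original post.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(0)
centers = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
X = rng.normal(scale=0.7, size=(300, 2)) + np.repeat(centers, 100, axis=0)
y = np.repeat([0, 1, 2], 100)

clf = OneVsRestClassifier(LogisticRegression()).fit(X, y)
print(clf.predict(X[:5]))   # predicted class for the first few points
```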


Original post: http://www.cnblogs.com/ne-zha/p/7383177.html
