Monday, May 4, 2015

Logistic Regression: One-vs-All Classification

Multiclass Classification: Logistic Regression

Examples:

  • Email folder tagging: Work (y=1), Friends (y=2), Family (y=3), Travel (y=4)
  • Weather: Sunny (y=1), Cloudy (y=2), Rain (y=3)
The outcome variable $y$ is no longer restricted to two outcomes, $y \in \{0, 1\}$, but instead takes values $y \in \{1, 2, 3, 4\}$, depending on the number of classes/groups we need to distinguish.


In multiclass classification, we train a separate binary classifier for $y=1$, $y=2$, $y=3$, and so on: for each class $i$, we relabel the data as "class $i$ versus everything else" and fit a logistic regression hypothesis $h_\theta^{(i)}(x)$.

$h_\theta^{(i)}(x) = P(y=i | x;\theta)$

Train a logistic regression classifier $h_\theta^{(i)}(x)$ for each class $i$ to predict the probability that $y=i$.

On a new input $x$, to make a prediction, pick the class $i$ that maximizes $h_\theta^{(i)}(x)$
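The training and prediction steps above can be sketched as follows. This is a minimal NumPy implementation, assuming batch gradient descent for each binary classifier; the function names (`train_one_vs_all`, `predict_one_vs_all`) and the learning-rate/iteration settings are illustrative choices, not part of the original post.

```python
import numpy as np

def sigmoid(z):
    """Logistic function, the hypothesis h_theta(x) = sigmoid(theta^T x)."""
    return 1.0 / (1.0 + np.exp(-z))

def train_one_vs_all(X, y, num_classes, alpha=0.1, iters=2000):
    """Train one binary logistic-regression classifier per class.

    X: (m, n) feature matrix; an intercept column of ones is added here.
    y: (m,) labels taking values in {1, ..., num_classes}.
    Returns Theta of shape (num_classes, n + 1), one parameter row per class.
    """
    m, n = X.shape
    Xb = np.hstack([np.ones((m, 1)), X])              # add intercept term
    Theta = np.zeros((num_classes, n + 1))
    for i in range(1, num_classes + 1):
        yi = (y == i).astype(float)                   # relabel: class i vs. the rest
        theta = np.zeros(n + 1)
        for _ in range(iters):
            h = sigmoid(Xb @ theta)                   # h_theta^{(i)}(x) = P(y=i | x; theta)
            theta -= (alpha / m) * (Xb.T @ (h - yi))  # gradient-descent step
        Theta[i - 1] = theta
    return Theta

def predict_one_vs_all(Theta, X):
    """On a new input x, pick the class i that maximizes h_theta^{(i)}(x)."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return np.argmax(sigmoid(Xb @ Theta.T), axis=1) + 1   # classes are 1-indexed
```

Note that the $K$ classifiers' outputs need not sum to one; each $h_\theta^{(i)}(x)$ is a separate estimate of $P(y=i \mid x;\theta)$, and the argmax simply picks the most confident classifier.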



