Logistic Regression: Decision Boundary
$h_\theta(x) = g(\theta^Tx)$
$g(z) = \frac 1 {1+e^{-z}}$
Threshold:
Predict y=1 if $h_\theta(x)\ge0.5$ or $\theta^Tx\ge0$
g(z) $\ge$ 0.5 when z $\ge$ 0
$h_\theta(x) = g(\theta^Tx)$
Predict y=0 if $h_\theta(x)\lt0.5$ or $\theta^Tx\lt0$
g(z) $\lt$ 0.5 when z $\lt$ 0
$h_\theta(x) = g(\theta^Tx)$
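The threshold rule above can be sketched in a few lines of NumPy. This is a minimal illustration, not course code; the helper names `sigmoid` and `predict` are my own:

```python
import numpy as np

def sigmoid(z):
    """g(z) = 1 / (1 + e^{-z})"""
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta, x):
    """Predict y=1 iff h_theta(x) = g(theta^T x) >= 0.5,
    which holds exactly when theta^T x >= 0."""
    return 1 if theta @ x >= 0 else 0
```

Note that the sigmoid never needs to be evaluated to make a prediction: the sign of $\theta^Tx$ alone decides the class.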
Decision Boundary:
The decision boundary is a property of the hypothesis and its parameters, not of the dataset.
In the example plot, the two classes of points (red crosses and blue circles) can be separated by a decision boundary whose equation is given by:
$h_\theta(x) = g(\theta_0 + \theta_1x_1 + \theta_2x_2)$
Suppose the parameter vector $\theta$ is given by
$\theta = \pmatrix {-3 \cr 1 \cr 1 \cr}$
The model becomes:
Predict "y=1" if $-3 + x_1 + x_2 \ge 0$
or $x_1 + x_2 \ge 3$
Similarly, predict "y=0" if $x_1 + x_2 \lt 3$
At the Decision Boundary, when $x_1 + x_2 =3$, $h_\theta(x) = 0.5$
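A quick numerical check of this example, using the $\theta = (-3, 1, 1)^T$ from above (the helper names are mine, not from the notes):

```python
import numpy as np

theta = np.array([-3.0, 1.0, 1.0])  # [theta_0, theta_1, theta_2] from the example

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def h(x1, x2):
    # h_theta(x) with feature vector [1, x1, x2], so theta^T x = -3 + x1 + x2
    return sigmoid(theta @ np.array([1.0, x1, x2]))
```

For a point with $x_1 + x_2 > 3$ (e.g. `h(2, 2)`) the hypothesis exceeds 0.5 and we predict y=1; for $x_1 + x_2 < 3$ (e.g. `h(1, 1)`) it falls below 0.5; and exactly on the boundary, `h(1, 2)` returns 0.5.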
A decision boundary can also be nonlinear, depending on the hypothesis. For example, a logistic regression whose hypothesis includes quadratic features will produce a circular decision boundary:
$h_\theta(x) = g(\theta_0 + \theta_1x_1 + \theta_2x_2 + \theta_3x_1^2 + \theta_4x_2^2)$
where the vector $\theta$ is given by $\theta = \pmatrix{-1 \cr 0\cr0\cr1\cr1\cr}$
In this case, predict "y=1" when $z\ge0$, where $z=-1 +x_1^2+x_2^2$
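The circular boundary can be verified the same way. With this $\theta$, the boundary is the unit circle $x_1^2 + x_2^2 = 1$ (helper names are illustrative):

```python
import numpy as np

theta = np.array([-1.0, 0.0, 0.0, 1.0, 1.0])  # [theta_0 .. theta_4] from the example

def z(x1, x2):
    # theta^T x with features [1, x1, x2, x1^2, x2^2], i.e. -1 + x1^2 + x2^2
    return theta @ np.array([1.0, x1, x2, x1**2, x2**2])

def predict(x1, x2):
    # y=1 on or outside the unit circle, y=0 strictly inside it
    return 1 if z(x1, x2) >= 0 else 0
```

Points inside the circle, such as the origin, are classified y=0; points outside, such as $(2, 0)$, are classified y=1.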
The cost function and the gradient descent algorithm work in much the same manner as in linear regression, except for the hypothesis $h_\theta(x)$, which is different for linear and logistic regression.
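To make that parallel concrete, here is a sketch of batch gradient descent for logistic regression. The update $\theta := \theta - \alpha\,\frac{1}{m}\sum_i (h_\theta(x^{(i)}) - y^{(i)})\,x^{(i)}$ has the same form as in linear regression, with the sigmoid swapped in as the hypothesis (function names, learning rate, and iteration count are illustrative choices, not from the notes):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, alpha=0.1, iters=1000):
    """Batch gradient descent for logistic regression.
    X: (m, n) design matrix whose first column is all ones.
    y: (m,) labels in {0, 1}.
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        h = sigmoid(X @ theta)           # h_theta(x) for every example
        grad = X.T @ (h - y) / m         # (1/m) * sum_i (h - y_i) * x_i
        theta -= alpha * grad
    return theta
```

On a tiny one-feature dataset where y=0 for small $x$ and y=1 for large $x$, the fitted $\theta$ places the decision boundary between the two groups.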