Quiz - Logistic Regression (2).ipynb
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Question 1"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<img src=\"images/lec6_pic36.png\">\n",
"\n",
"*Screenshot taken from
[Coursera](https://www.coursera.org/learn/machine-learning/exam/Fd9cu/
logistic-regression)*\n",
"\n",
"<!--TEASER_END-->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Question 2"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<img src=\"images/lec6_pic37.png\">\n",
"\n",
"*Screenshot taken from
[Coursera](https://www.coursera.org/learn/machine-learning/exam/Fd9cu/
logistic-regression)*\n",
"\n",
"<!--TEASER_END-->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Question 3"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<img src=\"images/lec6_pic38.png\">\n",
"\n",
"*Screenshot taken from
[Coursera](https://www.coursera.org/learn/machine-learning/exam/Fd9cu/
logistic-regression)*\n",
"\n",
"<!--TEASER_END-->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Question 4"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<img src=\"images/lec6_pic39.png\">\n",
"\n",
"*Screenshot taken from
[Coursera](https://www.coursera.org/learn/machine-learning/exam/Fd9cu/
logistic-regression)*\n",
"\n",
"<!--TEASER_END-->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Question 5"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<img src=\"images/lec6_pic40.png\">\n",
"<img src=\"images/lec6_pic41.png\">\n",
"\n",
"*Screenshot taken from
[Coursera](https://www.coursera.org/learn/machine-learning/exam/Fd9cu/
logistic-regression)*\n",
"\n",
"<!--TEASER_END-->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The hypothesis is \n",
"$$\\large h_{\\theta}(x) = g(\\theta_{0} + \\theta_{1}x_{1} + \\
theta_{2}x_{2})$$\n",
"- We choose $\\theta_{0}= - 6 $ , $\\theta_{1}= 1 $ , $\\theta_{2}=
0 $. This means that my parameter vector is going to be $ \\theta = \n",
"\\begin{bmatrix}\n",
" -6\\\\\n",
" 1\\\\\n",
" 0\\\\\n",
"\\end{bmatrix}\n",
"$\n",
"- $$ \\text{Predict \"y = 1\" if } -6 + x_{1} \\geq 0 \\
text{ or } \\theta^{T}x \\geq 0 $$\n",
"We have: \n",
"$$ -6 + x_{1} \\geq 0 \\\\\n",
" -6 + x_{1} + 6 \\geq 6 \\\\\n",
" x_{1} \\geq 6\n",
"$$\n",
"\n",
"- We can rewrite this as $ x_{1} \\geq 6$, we found that this
hypothesis would predict y=1 whenever x1 is greater than or equal to 3.\
n",
"- Let's see what that means on the figure, if I write down the
equation, $ x_{1} = 6$, , this defines\n",
"the equation of a straight line and if I draw what that straight, it
gives me the following line which passes through 6 on the x1 axis.\n",
"- So the part of the input space, we will have 2 regions: \n",
" - y = 1 region, where $ x_{1} \\geq 6$\n",
" - y = 0 region, where $ x_{1} \\leq 6$\n",
"- The line $ x_{1} = 6$, where $h_{\\theta}(x) = 0.5$ exactly, that
seperates 2 regions is called the **decision boundary**.\n"
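"\n",
"Below is a minimal sketch (not part of the quiz) of this decision rule in NumPy, assuming the parameters $\\theta = (-6, 1, 0)$ chosen above; the helper names `sigmoid` and `predict` are illustrative only.\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"def sigmoid(z):\n",
"    # logistic function g(z) = 1 / (1 + exp(-z))\n",
"    return 1.0 / (1.0 + np.exp(-z))\n",
"\n",
"def predict(theta, x):\n",
"    # h_theta(x) = g(theta^T x); predict y = 1 when h >= 0.5, i.e. theta^T x >= 0\n",
"    prob = sigmoid(np.dot(theta, x))\n",
"    return prob, 1 if prob >= 0.5 else 0\n",
"\n",
"theta = np.array([-6.0, 1.0, 0.0])   # theta_0, theta_1, theta_2 from above\n",
"\n",
"for x1 in (3.0, 6.0, 9.0):\n",
"    x = np.array([1.0, x1, 0.0])     # x = [1, x1, x2]\n",
"    prob, label = predict(theta, x)\n",
"    print('x1 = %.0f: h(x) = %.3f, predict y = %d' % (x1, prob, label))\n",
"```\n",
"\n",
"The boundary case $x_{1} = 6$ gives $h_{\\theta}(x) = 0.5$ and is classified as y = 1 under the $\\geq$ convention."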
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 2",
"language": "python",
"name": "python2"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 2
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.10"
}
},
"nbformat": 4,
"nbformat_minor": 0
}