Lecture 02 - Supervised Learning
Task: predicting whether a car is a family car.

Each observed car $i$ is described by a feature vector $x_i \in \mathbb{R}^2$ (two features, e.g. its price $x_{i1}$ and a second attribute $x_{i2}$) together with a label

$$ y_i := \begin{cases} 1 & \text{if car } i \text{ is a family car} \\ 0 & \text{otherwise.} \end{cases} $$

The $N$ observed cars are collected into a data matrix $X$ and a label vector $y$:

$$ X = \begin{pmatrix} x_{11} & x_{12} \\ \vdots & \vdots \\ x_{N1} & x_{N2} \end{pmatrix} \in \mathbb{R}^{N \times 2}, \qquad y = \begin{pmatrix} y_1 \\ \vdots \\ y_N \end{pmatrix} \in \mathbb{R}^{N \times 1}, $$

where $N$ is the number of data points and $2$ is the number of features.
[Figure: the $N$ observed cars plotted in the two-dimensional feature space, with price on the $x_1$ axis; family cars and other types of cars are marked with different symbols.]
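As a concrete illustration of this bookkeeping, here is a minimal NumPy sketch of such a data matrix and label vector; all numbers are made up for illustration and are not taken from the lecture.

```python
import numpy as np

# Hypothetical toy data set: N = 5 cars, 2 features each
# (column 0: price, column 1: a second attribute).
X = np.array([[12.0, 60.0],
              [25.0, 35.0],
              [18.0, 45.0],
              [40.0, 10.0],
              [22.0, 40.0]])      # data matrix, shape (N, 2)
y = np.array([0, 1, 1, 0, 1])     # labels, shape (N,): 1 = family car, 0 = otherwise

print(X.shape, y.shape)           # (5, 2) (5,)
```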
Model family: the FAMILY OF RECTANGLES $R = (p_1, p_2, e_1, e_2)$, where $p_1, p_2, e_1, e_2$ are the model parameters. A new car $x_{N+1}$ is classified as

$$ \hat{y}_{N+1} = \begin{cases} 1 & \text{if } p_1 \le x_{N+1,1} \le p_2 \text{ and } e_1 \le x_{N+1,2} \le e_2 \\ 0 & \text{otherwise.} \end{cases} $$
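A minimal sketch of this prediction rule, assuming hand-picked values for the rectangle parameters (how the parameters are learned from the data is not covered here, so the numbers below are arbitrary):

```python
import numpy as np

def predict_rectangle(x_new, p1, p2, e1, e2):
    """Rectangle model: predict 1 (family car) iff x_new lies inside the box."""
    inside = (p1 <= x_new[0] <= p2) and (e1 <= x_new[1] <= e2)
    return 1 if inside else 0

# Hypothetical, hand-picked model parameters (p1, p2, e1, e2).
p1, p2 = 15.0, 30.0      # bounds on the first feature (price)
e1, e2 = 25.0, 50.0      # bounds on the second feature

x_new = np.array([22.0, 40.0])                   # a new car x_{N+1}
print(predict_rectangle(x_new, p1, p2, e1, e2))  # -> 1
```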
Task: predicting the price of a car given its mileage (regression). Now $y_i$ denotes the price of car $i$ and $x_{i1}$ its mileage.

[Figure: scatter plot of the observed cars, price ($y$) on the vertical axis against mileage ($x$) on the horizontal axis; individual points $(x_{i1}, y_i)$ are marked, with $N$ = number of data points.]
MODEL FAMILY = FAMILY OF LINES, with parameters $\{w_1, w_0\}$ to be estimated:

$$ \hat{y}_i = w_1 x_{i1} + w_0 . $$

LEARNING = finding the best line (the best $w_0$ and $w_1$) given the observed data.

For each observed car the line makes an error $e_i = y_i - \hat{y}_i$; for example $\hat{y}_2 = w_1 x_{21} + w_0$ and $e_2 = y_2 - \hat{y}_2$. The fitted line can then be used to predict the price of a new car: $\hat{y}_{N+1} = w_1 x_{N+1,1} + w_0$.

[Figure: the line $\hat{y} = w_1 x + w_0$ drawn through the scatter of points, with the vertical errors $e_i = y_i - \hat{y}_i$ marked between the observed prices and the line, and the prediction $\hat{y}_{N+1}$ at the new car's mileage.]
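For any candidate line, i.e. any guessed pair $(w_0, w_1)$, the predictions $\hat{y}_i$ and errors $e_i$ can be computed directly. The sketch below uses made-up mileage/price values purely to show the bookkeeping; the candidate parameters are guesses, not learned values.

```python
import numpy as np

# Hypothetical data: mileage x_{i1} and price y_i of N = 6 observed cars.
x = np.array([10., 30., 50., 70., 90., 110.])
y = np.array([9.5, 8.2, 7.0, 5.9, 4.8, 3.5])

w1, w0 = -0.05, 9.0      # a candidate line (guessed, not yet learned)

y_hat = w1 * x + w0      # predictions  y_hat_i = w1 * x_{i1} + w0
e = y - y_hat            # errors       e_i = y_i - y_hat_i
print(y_hat)
print(e)

# The same line predicts the price of a new, (N+1)-th car:
x_new = 60.0
print(w1 * x_new + w0)   # y_hat_{N+1}
```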
Minimize the sum of squared errors

$$ \sum_i (y_i - \hat{y}_i)^2 = \sum_i e_i^2, $$

i.e.

$$ \text{minimize } \sum_i \big(y_i - (w_1 x_{i1} + w_0)\big)^2 \quad \text{with respect to } w_0 \text{ and } w_1 . $$

$$ \mathrm{Error}(w_0, w_1) = \sum_{i=1}^{N} \big(y_i - (w_1 x_{i1} + w_0)\big)^2 $$
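The objective above translates directly into code. As a sketch (with the same made-up data as before), a coarse grid search over $(w_0, w_1)$ stands in for "finding the best parameters"; the grid ranges are arbitrary:

```python
import numpy as np

def error(w0, w1, x, y):
    """Sum of squared errors  Error(w0, w1) = sum_i (y_i - (w1*x_i1 + w0))^2."""
    return np.sum((y - (w1 * x + w0)) ** 2)

# Hypothetical data: mileage x_{i1} and price y_i of N = 6 observed cars.
x = np.array([10., 30., 50., 70., 90., 110.])
y = np.array([9.5, 8.2, 7.0, 5.9, 4.8, 3.5])

# Learning = picking the (w0, w1) with the smallest error; here a coarse
# grid search stands in for the analytic minimization done in the notes.
best = min(((error(w0, w1, x, y), w0, w1)
            for w0 in np.linspace(0.0, 15.0, 151)
            for w1 in np.linspace(-0.2, 0.2, 401)),
           key=lambda t: t[0])
print(best)   # (smallest error, best w0, best w1)
```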
Set the partial derivative with respect to $w_0$ to zero:

$$ \frac{\partial\,\mathrm{Error}}{\partial w_0} = \frac{\partial}{\partial w_0} \sum_{i=1}^{N} \big(y_i - (w_1 x_{i1} + w_0)\big)^2 = \sum_{i=1}^{N} 2\,\big(y_i - (w_1 x_{i1} + w_0)\big)\cdot(-1) = 0 . $$
Solve for $w_0$. (Exercise: set $\partial\,\mathrm{Error}/\partial w_1 = 0$ and solve for $w_1$ in the same way.)
From the $w_0$ condition,

$$ \sum_{i=1}^{N} y_i - w_1 \sum_{i=1}^{N} x_{i1} - N\,w_0 = 0 $$

$$ \Rightarrow\quad w_0 = \Big(\sum_{i=1}^{N} y_i / N\Big) - w_1 \Big(\sum_{i=1}^{N} x_{i1} / N\Big) = \bar{y} - w_1\,\bar{x}, $$

where $\bar{x} = \sum_i x_{i1}/N$ and $\bar{y} = \sum_i y_i/N$ are the sample means.
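Carrying out the exercise (setting $\partial\,\mathrm{Error}/\partial w_1 = 0$) gives the standard least-squares slope. The sketch below combines that slope with the intercept formula $w_0 = \bar{y} - w_1\bar{x}$ derived above and cross-checks against NumPy's built-in line fit; the data are the same made-up values as in the earlier sketches.

```python
import numpy as np

# Hypothetical data: mileage x_{i1} and price y_i of N = 6 observed cars.
x = np.array([10., 30., 50., 70., 90., 110.])
y = np.array([9.5, 8.2, 7.0, 5.9, 4.8, 3.5])

x_bar, y_bar = x.mean(), y.mean()

# Slope from the w1 condition (the exercise), intercept from the w0 condition.
w1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
w0 = y_bar - w1 * x_bar
print(w1, w0)

# Cross-check against NumPy's own degree-1 least-squares fit: returns [w1, w0].
print(np.polyfit(x, y, deg=1))
```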