Lecture 04 - Parametric Methods

Maximum likelihood estimation (MLE) is a method for estimating the parameters of a statistical model. MLE chooses the parameters that maximize the likelihood function, or equivalently the log-likelihood function, given the data. For linear regression, MLE estimates the parameters by minimizing the sum of squared errors between the observed responses and the responses predicted by the linear function. Under certain regularity conditions, MLE yields consistent, efficient, and asymptotically normal estimates.

Uploaded by

Ata

Maximum Likelihood Estimation (MLE)

X : training data set
θ : parameters

θ_MLE = arg max_θ p(X | θ)

By the i.i.d. assumption, the likelihood factorizes:

p(X | θ) = ∏_{i=1}^{N} p(x_i | θ)
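As a numerical illustration (not from the lecture), here is a minimal sketch of MLE for a Gaussian model p(x | θ) with θ = (μ, σ); the data are simulated from an assumed N(3, 2²), and the closed-form estimates are the sample mean and the biased sample standard deviation:

```python
import numpy as np

# Hypothetical data: simulated from an assumed N(3, 2^2), for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(loc=3.0, scale=2.0, size=1000)

def log_likelihood(mu, sigma, X):
    # log p(X | theta) = sum_i log p(x_i | theta), by the i.i.d. assumption
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (X - mu)**2 / (2 * sigma**2))

# Closed-form Gaussian MLE: sample mean and biased sample std (ddof=0).
mu_mle = X.mean()
sigma_mle = X.std()

# No perturbation of the MLE should increase the log-likelihood.
best = log_likelihood(mu_mle, sigma_mle, X)
assert best > log_likelihood(mu_mle + 0.1, sigma_mle, X)
assert best > log_likelihood(mu_mle, sigma_mle + 0.1, X)
```

With enough samples the estimates land near the true (3, 2); the assertions check that the closed form really is the argmax.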
Maximum a Posteriori Estimation (MAP)

θ_MAP = arg max_θ p(θ | X) = arg max_θ [ p(X | θ) p(θ) / p(X) ]

Since p(X) does not depend on θ, it can be dropped from the maximization.
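A minimal sketch of MAP estimation (illustrative numbers, not from the lecture): estimating the mean of a Gaussian with known noise std σ, under a Gaussian prior μ ~ N(μ₀, τ²). In this conjugate case arg max_μ [log p(X|μ) + log p(μ)] has a closed form:

```python
import numpy as np

# Illustrative setup: known sigma, Gaussian prior mu ~ N(mu0, tau^2).
rng = np.random.default_rng(1)
sigma, mu0, tau = 2.0, 0.0, 1.0
X = rng.normal(loc=3.0, scale=sigma, size=20)
N = len(X)

mu_mle = X.mean()
# Closed-form argmax of log p(X|mu) + log p(mu) for this conjugate pair:
mu_map = (X.sum() / sigma**2 + mu0 / tau**2) / (N / sigma**2 + 1 / tau**2)

# The prior pulls the MAP estimate from the MLE toward mu0.
assert min(mu0, mu_mle) <= mu_map <= max(mu0, mu_mle)
```

The MAP estimate is a precision-weighted average of the data and the prior mean, which is why it always lies between the MLE and μ₀ here.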
Parametric Regression

Given a new input x_{N+1}, predict y_{N+1} = ?

Model: y = f(x) + ε, where f(x) is the underlying process and ε is noise. We approximate f(x) with a parametric function g(x | θ), so

y_{N+1} = f(x_{N+1}) + ε ≈ g(x_{N+1} | θ)

Assumption: ε ~ N(0, σ²), i.e. p(ε) = N(ε; 0, σ²).

Useful Gaussian properties: if x ~ N(x; μ, σ²) and c is a constant, then

E[x + c] = E[x] + c
VAR[x + c] = VAR[x]
x + c ~ N(x; μ + c, σ²)

Therefore p(y | x) = N(y; g(x | θ), σ²):

E[y | x] = E[g(x | θ) + ε] = E[g(x | θ)] + E[ε] = g(x | θ)    (g(x | θ) is constant given x, and E[ε] = 0)
VAR[y | x] = VAR[g(x | θ) + ε] = VAR[ε] = σ²
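A quick Monte Carlo check of the two identities E[y|x] = g(x|θ) and VAR[y|x] = σ²; the choices g(x|θ) = 2x + 1 and σ = 0.5 are made up for illustration:

```python
import numpy as np

# Monte Carlo check of E[y|x] = g(x|theta) and VAR[y|x] = sigma^2.
# g(x|theta) = 2x + 1 and sigma = 0.5 are assumed, illustrative choices.
rng = np.random.default_rng(0)
sigma = 0.5
g = lambda x: 2 * x + 1

x = 3.0                                   # condition on a fixed input
eps = rng.normal(0.0, sigma, size=200_000)
y = g(x) + eps                            # draws from p(y | x)

assert abs(y.mean() - g(x)) < 0.01        # E[y|x] close to g(x|theta) = 7
assert abs(y.var() - sigma**2) < 0.01     # VAR[y|x] close to sigma^2 = 0.25
```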
Training set: X = {(x_i, y_i)}_{i=1}^{N}, where the pairs (x_i, y_i) ~ p(x, y) are i.i.d.

The joint density factorizes two ways:

p(x, y) = p(y | x) p(x) = p(x | y) p(y)

and by independence,

p(x_1, y_1, ..., x_N, y_N) = ∏_{i=1}^{N} p(x_i, y_i)

Goal: given a new x_{N+1}, predict y_{N+1}.
Log-likelihood:

L(θ | X) = log [ ∏_{i=1}^{N} p(y_i | x_i) p(x_i) ]
         = Σ_{i=1}^{N} log p(y_i | x_i) + Σ_{i=1}^{N} log p(x_i)

The second sum does not depend on θ, so it is a constant; maximizing L(θ | X) amounts to maximizing Σ_i log p(y_i | x_i).

Since p(y_i | x_i) = N(y_i; g(x_i | θ), σ²) = (1 / (√(2π) σ)) exp[ -(y_i - g(x_i | θ))² / (2σ²) ],

maximize Σ_{i=1}^{N} log p(y_i | x_i)
⇔ maximize - Σ_{i=1}^{N} (y_i - g(x_i | θ))² / (2σ²)    (dropping constants)
⇔ minimize Σ_{i=1}^{N} (y_i - g(x_i | θ))²

So under Gaussian noise, maximum likelihood is least squares. g(x | θ) may be linear, 2nd-order, or a higher-order polynomial.
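A sketch of the equivalence just derived: under Gaussian noise, ranking parameter candidates by log-likelihood is exactly the reverse of ranking them by sum of squared errors. The model g(x|w) = w·x and the synthetic data are made up for illustration:

```python
import numpy as np

# Synthetic data from an assumed line y = 2x + noise, sigma = 0.3.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 * x + rng.normal(0, 0.3, size=50)
sigma = 0.3

def sse(w):
    return np.sum((y - w * x) ** 2)

def log_lik(w):
    # sum_i log N(y_i; w*x_i, sigma^2)
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (y - w * x) ** 2 / (2 * sigma**2))

candidates = [1.0, 1.5, 2.0, 2.5]
# The SSE minimizer and the log-likelihood maximizer coincide.
assert min(candidates, key=sse) == max(candidates, key=log_lik)
```

This holds for any candidate set, because log_lik(w) = const − sse(w)/(2σ²) is a strictly decreasing function of the SSE.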


Linear regression: g(x_i | w_0, w_1) = w_0 + w_1 x_i, with residuals e_i = y_i - (w_0 + w_1 x_i).

Minimize the sum of squared errors:

Error[θ | X] = Σ_{i=1}^{N} e_i² = Σ_{i=1}^{N} [ y_i - (w_0 + w_1 x_i) ]²

Setting the partial derivatives to zero gives the normal equations:

∂Error/∂w_0 = 0  ⇒  Σ_{i=1}^{N} y_i = N w_0 + w_1 Σ_{i=1}^{N} x_i
∂Error/∂w_1 = 0  ⇒  Σ_{i=1}^{N} x_i y_i = w_0 Σ_{i=1}^{N} x_i + w_1 Σ_{i=1}^{N} x_i²

In matrix form, A w = b with

A = [ N        Σ x_i  ]     w = [ w_0 ]     b = [ Σ y_i     ]
    [ Σ x_i    Σ x_i² ]         [ w_1 ]         [ Σ x_i y_i ]

so w = A⁻¹ b, when A is invertible.
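A minimal sketch of solving the 2×2 normal equations on synthetic data (the line y = 1.5 + 0.8x and the noise level are made up), cross-checked against NumPy's own polynomial fit:

```python
import numpy as np

# Synthetic data from an assumed line y = 1.5 + 0.8x + noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=30)
y = 1.5 + 0.8 * x + rng.normal(0, 0.2, size=30)
N = len(x)

# Normal equations A w = b, as derived above.
A = np.array([[N,       x.sum()],
              [x.sum(), (x**2).sum()]])
b = np.array([y.sum(), (x * y).sum()])
w0, w1 = np.linalg.solve(A, b)

# Cross-check: np.polyfit(degree=1) returns [slope, intercept].
w1_ref, w0_ref = np.polyfit(x, y, 1)
assert np.allclose([w0, w1], [w0_ref, w1_ref])
```

Both routes solve the same least-squares problem, so the coefficients agree to floating-point precision.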

Exercise: prove the statement that if x_1 = x_2 = ... = x_N (all inputs identical), then

det(A) = N Σ x_i² - (Σ x_i)² = 0,

so the matrix A is not invertible.
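A toy numerical check of the exercise (five identical inputs, an arbitrary made-up value x_i = 3):

```python
import numpy as np

# With x_1 = ... = x_N, det(A) = N * sum(x_i^2) - (sum x_i)^2 = 0.
x = np.full(5, 3.0)                       # five identical inputs
N = len(x)
A = np.array([[N,       x.sum()],
              [x.sum(), (x**2).sum()]])

det = N * (x**2).sum() - x.sum()**2       # 5*45 - 15^2 = 0
assert det == 0
assert np.isclose(np.linalg.det(A), 0.0)  # A is singular
```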
K-th order polynomial regression:

g(x_i | w_0, w_1, ..., w_K) = w_0 + w_1 x_i + ... + w_K x_i^K

K = 0 gives the constant model and K = 1 gives linear regression; larger K means more free parameters, which raises the question of how many the data (e.g. observations from the previous N years) can support.
The same derivation gives K + 1 normal equations, A w = b, with

A = [ N           Σ x_i        ...   Σ x_i^K      ]    w = [ w_0 ]    b = [ Σ y_i       ]
    [ Σ x_i       Σ x_i²       ...   Σ x_i^{K+1}  ]        [ w_1 ]        [ Σ x_i y_i   ]
    [ ...                            ...          ]        [ ... ]        [ ...         ]
    [ Σ x_i^K     Σ x_i^{K+1}  ...   Σ x_i^{2K}   ]        [ w_K ]        [ Σ x_i^K y_i ]

Define the design matrix D, of size N × (K+1), whose i-th row is [ 1, x_i, x_i², ..., x_i^K ]:

D = [ 1   x_1   x_1²  ...  x_1^K ]
    [ 1   x_2   x_2²  ...  x_2^K ]
    [ ...                        ]
    [ 1   x_N   x_N²  ...  x_N^K ]

Exercise: show that Dᵀ D = A, where Dᵀ is (K+1) × N, D is N × (K+1), and Dᵀ D is (K+1) × (K+1); likewise b = Dᵀ y.

Exercise:
- if N < K + 1, Dᵀ D is not invertible;
- if N ≥ K + 1 (with distinct x_i), Dᵀ D is invertible.
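A sketch of building D and checking Dᵀ D = A entrywise, i.e. A[j, k] = Σ_i x_i^(j+k); the data values and K = 2 are arbitrary:

```python
import numpy as np

# Design matrix D: N x (K+1), rows [1, x_i, ..., x_i^K].
x = np.array([0.0, 1.0, 2.0, 3.0])
K = 2
D = np.vander(x, K + 1, increasing=True)  # columns: x^0, x^1, ..., x^K
A = D.T @ D                               # (K+1) x (K+1)

# Entry (j, k) of A is sum_i x_i^j * x_i^k = sum_i x_i^(j+k).
for j in range(K + 1):
    for k in range(K + 1):
        assert np.isclose(A[j, k], np.sum(x ** (j + k)))
```

Note that the top-left entry is Σ x_i⁰ = N and the diagonal runs N, Σ x_i², ..., Σ x_i^{2K}, matching the matrix above.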

Example: N = 2, K = 5. Then D is 2 × 6 and Dᵀ D is 6 × 6, but rank(Dᵀ D) ≤ rank(D) ≤ 2 < 6, so Dᵀ D is rank-deficient and not invertible.
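A quick check of the N = 2, K = 5 example (the two input values are arbitrary):

```python
import numpy as np

# N = 2 points, K = 5: D is 2 x 6, so the 6 x 6 Gram matrix D^T D
# has rank at most 2 and cannot be inverted.
x = np.array([1.0, 2.0])                  # arbitrary distinct inputs
D = np.vander(x, 6, increasing=True)      # 2 x 6
G = D.T @ D                               # 6 x 6

assert G.shape == (6, 6)
assert np.linalg.matrix_rank(G) <= 2      # rank-deficient
```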
