ML - T. Shivani
UNIT-4, CSE-V
Assignment
a) Explain about Bayesian learning methods?
Bayesian learning methods are relevant to our study of machine learning for two different reasons:
First, Bayesian learning algorithms that calculate explicit probabilities for hypotheses, such as the naive Bayes classifier, are among the most practical approaches to certain types of learning problems.
Second, Bayesian methods are important to our study of machine learning because they provide a useful perspective for understanding many learning algorithms that do not explicitly manipulate probabilities.
Features of Bayesian learning methods include:
- Each observed training example can incrementally decrease or increase the estimated probability that a hypothesis is correct.
- Prior knowledge can be combined with observed data to determine the final probability of a hypothesis.
- Bayesian methods can accommodate hypotheses that make probabilistic predictions.
- New instances can be classified by combining the predictions of multiple hypotheses, weighted by their probabilities.
- They can provide a standard of optimal decision making against which other practical methods can be measured.
What is the Naive Bayes Classifier?
The naive Bayes classifier applies to learning tasks where each instance x is described by a conjunction of attribute values and where the target function f(x) can take on any value from some finite set V.
A set of training examples of the target function is provided, and a new instance is presented, described by the tuple of attribute values <a1, a2, ..., an>.
The learner is asked to predict the target value, or classification, for this new instance.
The Bayesian approach to classifying the new instance is to assign the most probable target value, v_MAP, given the attribute values <a1, a2, ..., an> that describe the instance:
v_MAP = argmax_{vj ∈ V} P(vj | a1, a2, ..., an)
We can use Bayes theorem to rewrite this expression as:

v_MAP = argmax_{vj ∈ V} P(a1, a2, ..., an | vj) P(vj) / P(a1, a2, ..., an)
      = argmax_{vj ∈ V} P(a1, a2, ..., an | vj) P(vj)
The naive Bayes assumption is that, given the target value vj of the instance, the probability of observing the conjunction a1, a2, ..., an is just the product of the probabilities for the individual attributes:

P(a1, a2, ..., an | vj) = ∏_i P(ai | vj)
Substituting this into the expression above gives the naive Bayes classifier:

v_NB = argmax_{vj ∈ V} P(vj) ∏_i P(ai | vj)
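To make the formula concrete, the following is a minimal Python sketch of training and applying the classifier (the function names and the toy (Outlook, Wind) data are assumed for illustration; probabilities are raw frequency counts with no smoothing, so unseen attribute values get probability zero):

from collections import Counter, defaultdict

def train_naive_bayes(examples):
    # Estimate P(vj) and P(ai | vj) from (attribute-tuple, target-value) pairs.
    class_counts = Counter(v for _, v in examples)
    attr_counts = defaultdict(Counter)   # (attribute position, vj) -> counts of ai
    for attrs, v in examples:
        for i, a in enumerate(attrs):
            attr_counts[(i, v)][a] += 1
    priors = {v: c / len(examples) for v, c in class_counts.items()}
    return priors, attr_counts, class_counts

def classify_nb(instance, priors, attr_counts, class_counts):
    # v_NB = argmax_{vj in V} P(vj) * prod_i P(ai | vj)
    best_v, best_score = None, -1.0
    for v, prior in priors.items():
        score = prior
        for i, a in enumerate(instance):
            score *= attr_counts[(i, v)][a] / class_counts[v]   # estimate of P(ai | vj)
        if score > best_score:
            best_v, best_score = v, score
    return best_v

# Hypothetical toy data: (Outlook, Wind) -> PlayTennis
data = [(("sunny", "weak"), "no"), (("rain", "weak"), "yes"),
        (("overcast", "strong"), "yes"), (("sunny", "strong"), "no")]
priors, attr_counts, class_counts = train_naive_bayes(data)
print(classify_nb(("rain", "weak"), priors, attr_counts, class_counts))   # -> yes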
6a) Discuss in detail about the EM algorithm and its significance in estimating the means of k Gaussians?
The EM algorithm is a widely used approach to learning in the presence of unobserved variables.
- The EM algorithm can be used even for variables whose value is never directly observed, provided the general form of the probability distribution governing these variables is known.
- This algorithm has been used to train Bayesian belief networks as well as radial basis function networks.
Estimating Means of k Gaussians:

Consider a problem in which the data D is a set of instances generated by a probability distribution that is a mixture of k distinct Normal distributions (here k = 2). Each instance is generated in two steps:
First, one of the k Normal distributions is selected at random. Second, a single random instance x_i is generated according to this selected distribution. Each of the k Normal distributions has the same variance σ².
The learning task is to output a hypothesis h = <μ1, ..., μk> that describes the means of each of the k distributions. We would like to find a maximum likelihood hypothesis, that is, an h ∈ H that maximizes P(D|h).

Given data instances x1, x2, ..., xm drawn from a single Normal distribution, the maximum likelihood hypothesis is the one that minimizes the sum of squared errors:

μ_ML = argmin_μ Σ_{i=1}^{m} (x_i − μ)²

The sum of squared errors is minimized by the sample mean:

μ_ML = (1/m) Σ_{i=1}^{m} x_i
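A minimal Python sketch of the EM procedure for this task, assuming (as above) k Gaussians with a known, shared variance σ²; the function name and parameters are illustrative:

import numpy as np

def em_gaussian_means(x, k=2, sigma=1.0, n_iters=50, seed=0):
    # Estimate the means <mu_1, ..., mu_k> of k equal-variance Gaussians.
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, size=k, replace=False)   # initial guesses for the means
    for _ in range(n_iters):
        # E-step: E[z_ij], the probability that instance x_i came from Gaussian j
        resp = np.exp(-(x[:, None] - mu[None, :]) ** 2 / (2 * sigma ** 2))
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate each mean as a responsibility-weighted sample mean
        mu = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)
    return mu

# Hypothetical mixture: two Gaussians with true means -2 and 3
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1.0, 200), rng.normal(3, 1.0, 200)])
print(em_gaussian_means(x, k=2, sigma=1.0))   # approximately [-2, 3]

Each iteration alternates the two steps: the E-step fills in the expected values of the hidden variables z_ij (which Gaussian generated each instance), and the M-step picks the means that maximize the likelihood given those expectations.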
Q) Describe the maximum likelihood hypothesis for predicting probabilities.
Here the hypothesis h(x_i) gives the probability that the binary target value d_i equals 1. Assuming that each training example <x_i, d_i> is drawn independently, we can write P(D|h) as:

P(D|h) = ∏_{i=1}^{m} P(x_i, d_i | h)

Since x_i is independent of h, this becomes:

P(D|h) = ∏_{i=1}^{m} P(d_i | h, x_i) P(x_i)    ... (1)
The probability of observing d_i given h and x_i is h(x_i) if d_i = 1, and 1 − h(x_i) if d_i = 0. This can be written more compactly as:

P(d_i | h, x_i) = h(x_i)^{d_i} (1 − h(x_i))^{1−d_i}    ... (2)
Substituting (2) in (1):

P(D|h) = ∏_{i=1}^{m} h(x_i)^{d_i} (1 − h(x_i))^{1−d_i} P(x_i)    ... (3)
The expression for the maximum likelihood hypothesis is:

h_ML = argmax_{h ∈ H} ∏_{i=1}^{m} h(x_i)^{d_i} (1 − h(x_i))^{1−d_i} P(x_i)

The last term P(x_i) can be dropped because it is a constant independent of h:

h_ML = argmax_{h ∈ H} ∏_{i=1}^{m} h(x_i)^{d_i} (1 − h(x_i))^{1−d_i}
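In practice the logarithm of this product is maximized instead, which turns it into a sum; maximizing it is equivalent to minimizing the cross-entropy between the labels d_i and the predictions h(x_i). A small illustrative sketch in Python (the labels and the two candidate hypotheses' predictions below are made up):

import numpy as np

def log_likelihood(h_of_x, d):
    # sum_i [ d_i * ln h(x_i) + (1 - d_i) * ln(1 - h(x_i)) ]
    h_of_x = np.clip(h_of_x, 1e-12, 1 - 1e-12)   # guard against log(0)
    return np.sum(d * np.log(h_of_x) + (1 - d) * np.log(1 - h_of_x))

d  = np.array([1, 0, 1, 1])                # observed binary target values d_i
h1 = np.array([0.9, 0.2, 0.8, 0.7])        # predictions h(x_i) of hypothesis 1
h2 = np.array([0.5, 0.5, 0.5, 0.5])        # predictions h(x_i) of hypothesis 2
print(log_likelihood(h1, d) > log_likelihood(h2, d))   # True: h1 is more likely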
Q) Explain in detail about the Gibbs algorithm?

Gibbs Algorithm:
1. Choose a hypothesis h from H at random, according to the posterior probability distribution over H.
2. Use h to predict the classification of the next instance x.
Given a new instance to classify, the Gibbs algorithm simply applies a hypothesis drawn at random according to the current posterior probability distribution.
Surprisingly, it can be shown that under certain conditions the expected misclassification error of the Gibbs algorithm is at most twice the expected error of the Bayes optimal classifier.
- More precisely, the expected value is taken over target concepts drawn at random according to the prior probability distribution assumed by the learner.
-> Under this condition, the expected value of the error of the Gibbs algorithm is at worst twice the expected value of the error of the Bayes optimal classifier.
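A minimal Python sketch of the two steps; the three threshold hypotheses and the posterior values below are assumed purely for illustration:

import numpy as np

def gibbs_classify(hypotheses, posterior, x, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    # Step 1: draw one hypothesis at random with probability P(h | D)
    h = hypotheses[rng.choice(len(hypotheses), p=posterior)]
    # Step 2: use the sampled hypothesis to classify the new instance x
    return h(x)

# Hypothetical hypothesis space: three threshold classifiers over the reals
hs = [lambda x: int(x > 0), lambda x: int(x > 1), lambda x: int(x > -1)]
posterior = [0.5, 0.3, 0.2]   # assumed posterior distribution P(h | D)
print(gibbs_classify(hs, posterior, x=0.5))   # 1 with probability 0.7, else 0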