Bayesian Statistics
In the classical, or frequentist, approach to Statistics we consider a parameter θ a fixed although unknown quantity. A random sample X1,..,Xn is drawn from a population indexed by θ and, based on the observed values in the sample, knowledge about the true value of θ is obtained. In the Bayesian approach θ is considered a quantity whose variation can be described by a probability distribution (called a prior distribution), which is a subjective distribution describing the experimenter's belief and which is formulated before the data is seen. A sample is then taken from a population indexed by θ and the prior distribution is updated with this new information. The updated distribution is called the posterior distribution. This updating is done using Bayes' formula, hence the name Bayesian Statistics.

Let's denote the prior distribution by π(θ) and the sampling distribution by f(x|θ); then the joint pdf (pmf) of X and θ is given by

f(x, θ) = f(x|θ) π(θ)

the marginal distribution of X is

m(x) = ∫ f(x|θ) π(θ) dθ

and finally the posterior distribution is the conditional distribution of θ given the sample x and is given by

π(θ|x) = f(x|θ) π(θ) / m(x)

We can also write this in terms of the likelihood function:

π(θ|x1,..,xn) = L(θ|x1,..,xn) π(θ) / m(x)
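
As a small illustration (not part of the original notes), here is how this prior-to-posterior update can be carried out numerically on a grid when m(x) is not available in closed form; the Beta(1,1) prior and the binomial likelihood below are just placeholders:

theta <- seq(0.001, 0.999, length.out = 1000)   # grid of parameter values
dx    <- theta[2] - theta[1]
prior <- dbeta(theta, 1, 1)                     # prior pi(theta), here Beta(1,1)
like  <- dbinom(3, size = 10, prob = theta)     # sampling distribution f(x|theta) at the observed data
post  <- prior * like
post  <- post / sum(post * dx)                  # normalize: approximates pi(theta|x)
sum(theta * post * dx)                          # posterior mean, here about 0.33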

Example: You want to see whether it is really true that coins come up heads and tails with probability 1/2. You take a coin from your pocket and flip it 10 times. It comes up heads 3 times. As a frequentist we would now use the sample mean as an estimate of the true probability of heads, p, and find p̂ = 0.3.

A Bayesian analysis would proceed as follows: let X1,..,Xn be iid Ber(p). Then Y = X1+..+Xn is Bin(n,p). Now we need a prior on p. Of course p is a probability, so it has values on [0,1]. One distribution on [0,1] we know is the Beta, so we will use a Beta(α, β) as our prior. Remember, this is a perfectly subjective choice, and anybody can use their own. The joint distribution of Y and p is given by

f(y, p) = (n choose y) p^y (1-p)^(n-y) · Γ(α+β)/(Γ(α)Γ(β)) · p^(α-1) (1-p)^(β-1),   y = 0,..,n, 0 < p < 1

and the marginal distribution of Y is

m(y) = (n choose y) · Γ(α+β)/(Γ(α)Γ(β)) · Γ(y+α)Γ(n-y+β)/Γ(n+α+β)

which is known as the beta-binomial distribution.

Note that (Y, p) is a random vector where one component is continuous (p) and the other is discrete (Y). So here we are combining a pdf with a pmf. It turns out that this is ok.

The posterior distribution of p given y is then

π(p|y) ∝ p^(y+α-1) (1-p)^(n-y+β-1)

that is, p|y ~ Beta(y+α, n-y+β).

Of course we still need to "extract" some information about the parameter p from the posterior distribution. Once the sampling distribution and the prior are chosen, the posterior distribution is fixed (even though it may not be easy or even possible to find it analytically) but how we proceed now is completely open and there are in general many choices. If we want to estimate p a natural estimator is the mean of the posterior distribution, given here by

p̂B = (y+α)/(α+β+n)

This can be written as

p̂B = (α+β)/(α+β+n) · α/(α+β) + n/(α+β+n) · y/n

and we see that the posterior mean is a linear combination of the prior mean α/(α+β) and the sample mean y/n.

How about our problem with the 3 heads in the 10 flips? Well, we have to completely specify the prior distribution, that is we have to choose α and β. The choice depends again on our belief. For example, if we feel strongly that this coin is just like any other coin and therefore really should be a fair coin, we should choose them so that the prior puts almost all its weight at around 1/2. For example with α = β = 100 we get E[p] = 0.5 and V[p] ≈ 0.0012. Then p̂B = (3+100)/(100+100+10) = 0.4905 is our estimate for the probability of heads. Clearly for such a strong prior the actual sample almost does not matter. For example for y = 0 we would have found p̂B = 0.476 and for y = 10 it would be p̂B = 0.524.

Maybe we have never even heard the word "coin" and have no idea what one looks like, let alone what the probability of "heads" might be. Then we could choose α = β = 1, that is the uniform distribution, as our prior. Really this would indicate our complete lack of knowledge regarding p. (This is called an uninformative prior.) Now we find p̂B = (3+1)/(1+1+10) = 1/3 ≈ 0.33, close to the sample mean. (The posterior mode, (y+α-1)/(n+α+β-2) = 3/10, recovers the sample mean exactly here.)
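
A quick numerical check of these estimates (added here for illustration; it just evaluates the posterior mean formula from above):

post.mean <- function(y, n, alpha, beta) (y + alpha) / (alpha + beta + n)
post.mean(3, 10, 100, 100)    # 0.4905, strong prior at 1/2
post.mean(0, 10, 100, 100)    # 0.476
post.mean(10, 10, 100, 100)   # 0.524
post.mean(3, 10, 1, 1)        # 0.333, uniform prior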

In bayescoin with which==1 we study the effect of the sample size on the estimate of p.

In bayescoin with which==2 we study the effect of alpha=beta on the estimate of p. A larger alpha means a prior more concentrated around 1/2.
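
The bayescoin script itself is not reproduced here; a rough stand-in for what it illustrates, assuming we hold the observed proportion of heads fixed at 0.3, is:

post.mean <- function(y, n, alpha, beta) (y + alpha) / (alpha + beta + n)
# effect of the sample size: the posterior mean approaches the sample mean 0.3
for (n in c(10, 100, 1000, 10000))
  cat("n =", n, " posterior mean =", post.mean(0.3 * n, n, 1, 1), "\n")
# effect of alpha = beta: the posterior mean is pulled toward the prior mean 1/2
for (a in c(1, 10, 100, 1000))
  cat("alpha = beta =", a, " posterior mean =", post.mean(3, 10, a, a), "\n")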

Example: say X ~ Bin(n, p), p known. Again, a Bayesian analysis begins with a prior on n. Now n = 1, 2, .. and so a prior is any sequence a1, a2, .. with ai ≥ 0 and Σ ai = 1.

If we want to find an estimate for n we can use for example the mode, that is the n which has the largest posterior probability.

Here are some specific examples: say we observe x = 217 and we know p = 0.37. Also

a) we know only that n ≤ 750, so we choose ai = 1/750 if 1 ≤ i ≤ 750, 0 otherwise,

bayes.bin.n(217, 0.37, rep(1/750, 750))

b) we know n is most likely 500 with a standard deviation of 50,
bayes.bin.n(217, 0.37, dnorm(1:750, 500, 50))

c) we know that n ≤ 750 and that n is a multiple of 50,
bayes.bin.n(217, 0.37, ifelse(c(1:750) %% 50, 0, 1))

d) we know this was one of the four experiments we did, with n = 510, 525, 550 or 575
a = rep(0, 750)
a[c(510, 525, 550, 575)] = 1
bayes.bin.n(217, 0.37, a)
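
The function bayes.bin.n is a course script and is not reproduced here; a minimal sketch of the calculation it presumably performs (posterior over n on the grid 1, 2, .., length(prior), returning the posterior mode) could look like this:

bayes.bin.n.sketch <- function(x, p, prior) {
  n.values <- seq_along(prior)                   # prior[i] is the prior weight of n = i
  like <- dbinom(x, size = n.values, prob = p)   # likelihood; 0 whenever n < x
  post <- like * prior
  post <- post / sum(post)                       # normalize (prior need not sum to 1)
  list(n = n.values, posterior = post, mode = n.values[which.max(post)])
}
bayes.bin.n.sketch(217, 0.37, rep(1/750, 750))$mode   # posterior mode of n in example a)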

The Big Question: Bayesian or Frequentist?

Should you be a Bayesian?
Bayesian Statistics has a lot of good features. To begin with, it answers the right question, P(Hypothesis | Data). There are others as well:

Decision Theory

There is a branch of mathematics concerned with decision making. It is conceptually a very useful and important one:

Should you buy a new car, or keep the old one for another year?
Should you invest your money in the stock market or buy fixed interest bonds?
Should the government lower the taxes or instead use the taxes for direct investments?

In decision theory one starts out by choosing a loss function, that is a function that assigns a value (maybe in terms of money) to every possible action and every possible outcome.

Example: You are offered the following game: you can either take $10 (let's call this action a), or you can flip a coin (action b). If the coin comes up heads you win $50, if it comes up tails you lose $10. So there are two possible actions: take the $10 or flip the coin, and three possible outcomes: you win $10, win $50 or lose $10. We need a value for each combination. One obvious answer is this one:

L(a) = 10, L(b, "heads") = 50, L(b, "tails") = -10

But there are other possibilities. Say you are in a bar. You already had food and drinks and your tab is $27. Now you notice that you only have $8 in your pocket (and no credit card etc.). Now if you win or lose $10 it doesn't matter, either way you can't pay your bill, and it will be very embarrassing when it comes to paying. But if you win $50, you are fine. Now your loss function might be:

L(a) = 0, L(b, "heads") = 1000, L(b, "tails") = 0
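
As a small added illustration (not in the original notes), one way to compare the two actions is by expected value under each valuation, assuming a fair coin and treating larger values as better:

p.heads <- 0.5                                   # assume a fair coin
# first valuation: dollars won or lost
ev.take <- 10
ev.flip <- p.heads * 50 + (1 - p.heads) * (-10)  # expected value of flipping: 20
c(ev.take, ev.flip)
# second valuation: only winning $50 (and paying the bar tab) matters
ev.take2 <- 0
ev.flip2 <- p.heads * 1000 + (1 - p.heads) * 0   # 500
c(ev.take2, ev.flip2)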

The next piece in decision theory is the decision function. The idea is this: let's carry out an experiment, and depending on the outcome of the experiment we choose an action.

Should you invest your money in the stock market or buy fixed interest bonds?
Let's do this: we wait until tomorrow. If the Dow Jones goes up, we invest in the stock market, otherwise in bonds.

In decision theory a decision rule is called inadmissible if there is another rule that is better no matter what the outcome of the experiment. Obviously it makes no sense to pick an inadmissible rule.

So what's the connection to Bayesian Statistics? First there are Bayesian decision rules, which combine prior knowledge with the outcome of the experiment:

based on the movement of the Dow Jones in the last year, I have a certain probability that it will go up over the next year.

Now there is a famous theorem (the complete class theorem) that says that all admissible rules are Bayesian decision rules for some prior.

Optimality

Obviously when we do something it would be nice to do it in an optimal (best) way. It turns out that in Bayesian statistics it is often possible to show that a certain method is best, better or at least as good as any other.

Why be a Frequentist?
Because you hate priors

or, better to say, you don't like the subjectivity introduced by priors. In Bayesian statistics it is entirely possible that two scientists who have the same data available and use the same method for analysis come to different conclusions, because they have different priors.

Because Frequentist methods work

For most of the history of Statistics, that is from about 1900 to about 1960, there was (essentially) only Frequentist Statistics. In this time (and since) many methods have been developed that worked very well in practice. Many of those turn out to also be Bayesian methods when the right prior is used, but not all!

Example: one of the most useful modern methods, called the Bootstrap, is a purely Frequentist method with no Bayesian theory. (Actually there is something called the Bayesian bootstrap, but it is not the same as the classical bootstrap.)

Example: A standard technique in regression is to study the residuals. This, though, violates the likelihood principle and is therefore not allowed under the Bayesian paradigm. Actually, most Bayesians study the residuals anyway.

Simplicity

Even for the easiest problems ("estimate the mean GPA of students at the Colegio") a Bayesian analysis always seems to be complicated (choose a prior and a loss function, calculate the posterior, extract the estimate from the posterior, try to do all of this optimally). Frequentist solutions are often quick and easy.

So? Be Both!
