3rd Sem Stats ASSIGNMENT 3 and 4
Uploaded by Sswetha Sara

ASSIGNMENT 3

COMPARISON OF ESTIMATORS USING MEAN SQUARE ERROR CRITERION


Q1)
CODE:
p=seq(0.1,1,by=0.1);p
q=1-p
MSE1=(2*p*q)+(p*p)
MSE2=(p*q)/2
MSE3=(5*p*q)/9
MSE4=(13*p*q)/25
plot(p,MSE1,col=1,lty=1,type="l",xlab="p",ylab="MSE",ylim=c(0,1),main="MSE CURVES")
lines(p,MSE2,col=2,lty=2)
lines(p,MSE3,col=3,lty=3)
lines(p,MSE4,col=4,lty=4)
legend("topright",legend=c("MSE1","MSE2","MSE3","MSE4"),col=1:4,lty=1:4)
PLOT:

RESULT:
On comparing the graph, the MSE2 curve lies below the other three for every p, so the corresponding estimator is the best of the four.
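The graphical conclusion can be double-checked numerically. A minimal sketch, reusing the four MSE formulas above on the interior grid (p < 1, to avoid the ties at p = 1 where q = 0):

```r
p <- seq(0.1, 0.9, by = 0.1); q <- 1 - p
MSE1 <- (2*p*q) + p^2
MSE2 <- (p*q)/2
MSE3 <- (5*p*q)/9
MSE4 <- (13*p*q)/25
# MSE2, MSE3 and MSE4 are all multiples of p*q; MSE2 has the smallest
# coefficient (1/2 < 13/25 < 5/9), and MSE1 adds a positive p^2 term,
# so MSE2 lies strictly below the others at every interior p
all(MSE2 < pmin(MSE1, MSE3, MSE4))
```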
Q2)
CODE:
lam=seq(0.1,1,by=0.1);lam
MSE1=lam/15
MSE2=((15*lam)+(lam^2))/196
plot(lam,MSE1,col=1,lty=1,type="l",xlab="lambda",ylab="MSE",ylim=c(0,0.1),main="MSE CURVES")
lines(lam,MSE2,col=2,lty=2)
legend("topright",legend=c("MSE1","MSE2"),col=1:2,lty=1:2)
PLOT:

RESULT:
On comparing the graph, the MSE1 curve lies below MSE2 for every lambda, so the corresponding estimator is the better one.
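The same conclusion follows algebraically: lam/15 = 196*lam/2940, while MSE2 = (225*lam + 15*lam^2)/2940, so MSE1 < MSE2 for every lam > 0. A quick numerical check on the grid used above:

```r
lam <- seq(0.1, 1, by = 0.1)
MSE1 <- lam/15
MSE2 <- ((15*lam) + lam^2)/196
# 196*lam < 225*lam + 15*lam^2 for lam > 0, so MSE1 is uniformly smaller
all(MSE1 < MSE2)
```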
Q3)
CODE:
mu=seq(-3,3,by=1);mu
n=10
MSE1=(n+((mu)^2))/((n+1)^2)
MSE2=(n+(4*(mu)^2))/((n+2)^2)
MSE3=rep(1/n,length(mu))   #constant MSE, one value per grid point of mu
plot(mu,MSE1,col=1,lty=1,type="l",xlab="mu",ylab="MSE",ylim=c(0,0.4),main="MSE CURVES")
lines(mu,MSE2,col=2,lty=2)
lines(mu,MSE3,col=3,lty=3)
legend("topright",legend=c("MSE1","MSE2","MSE3"),col=1:3,lty=1:3)
PLOT:

RESULT:
On comparing the graph, MSE1 stays small over the whole range of mu (and it shrinks further as n grows); compared with MSE2 and MSE3 it is the better estimator.
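As a numerical companion to the plot, a small sketch (reusing n = 10 and the same grid of mu) tabulates which curve is lowest at each grid point; note that the minimizing curve changes with mu, which is worth keeping in mind when reading the graph:

```r
n <- 10
mu <- seq(-3, 3, by = 1)
MSE1 <- (n + mu^2)/(n + 1)^2
MSE2 <- (n + 4*mu^2)/(n + 2)^2
MSE3 <- rep(1/n, length(mu))
# name of the lowest curve at each grid point of mu
winners <- c("MSE1", "MSE2", "MSE3")[apply(cbind(MSE1, MSE2, MSE3), 1, which.min)]
winners
```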
Q4)
CODE:
t=c(0,1,2,3,4,5,6,7,8,9)
MSE1=t^2
MSE2=7*(t^2)
MSE3=(3/8)*(t^2)
MSE4=(t^2)/10
plot(t,MSE1,col=1,lty=1,type="l",xlab="theta",ylab="MSE",main="MSE CURVES")
lines(t,MSE2,col=2,lty=2)
lines(t,MSE3,col=3,lty=3)
lines(t,MSE4,col=4,lty=4)
legend("topright",legend=c("MSE1","MSE2","MSE3","MSE4"),col=1:4,lty=1:4)
PLOT:
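Here the comparison can be read straight off the formulas: all four MSEs have the form c*t^2, so the one with the smallest coefficient (1/10 for MSE4) is lowest for every t except t = 0, where all four are zero. A minimal check on the nonzero grid points:

```r
t <- 1:9                          # t = 0 gives a four-way tie at zero
MSE1 <- t^2
MSE2 <- 7*t^2
MSE3 <- (3/8)*t^2
MSE4 <- t^2/10
# smallest coefficient of t^2 (1/10 < 3/8 < 1 < 7) wins for every t != 0
all(MSE4 < pmin(MSE1, MSE2, MSE3))
```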
Q5)
CODE:
p=seq(0.1,1,by=0.1);p
q=1-p
MSE1=p
MSE2=((2/3)*p*q)+(p^2)
MSE3=((7/9)*p*q)+(p^2)
MSE4=(3/4)*p*q
plot(p,MSE1,col=1,lty=1,type="l",xlab="p",ylab="MSE",ylim=c(0,1),main="MSE CURVES")
lines(p,MSE2,col=2,lty=2)
lines(p,MSE3,col=3,lty=3)
lines(p,MSE4,col=4,lty=4)
legend("topright",legend=c("MSE1","MSE2","MSE3","MSE4"),col=1:4,lty=1:4)
PLOT:

RESULT:
On comparing the graph, the MSE4 curve lies below the other three throughout, so the corresponding estimator is the best of the four.
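A numerical check of this conclusion, reusing the four formulas above: MSE2 and MSE3 add a positive p^2 term to a multiple of p*q, and MSE4 < MSE1 reduces to (3/4)(1-p) < 1, so on this grid MSE4 is strictly smallest.

```r
p <- seq(0.1, 1, by = 0.1); q <- 1 - p
MSE1 <- p
MSE2 <- ((2/3)*p*q) + p^2
MSE3 <- ((7/9)*p*q) + p^2
MSE4 <- (3/4)*p*q
# MSE4 < MSE2 requires p > 1/13, which holds for every p on this grid
all(MSE4 < pmin(MSE1, MSE2, MSE3))
```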
ASSIGNMENT 4
ESTIMATION OF PARAMETERS BY MAXIMUM LIKELIHOOD METHOD AND
METHOD OF MOMENTS
Q3)
CODE:
x=c(594,689,585,1034,720,929,737,1026,791,836,952,898,1043,806,577,994)
n=16
#MLE_mu=sample mean
#MLE_var=sample variance
mu=sum(x)/n;mu
var=((1/n)*(sum(x^2)))-((mu)^2);var
OUTPUT:
> x=c(594,689,585,1034,720,929,737,1026,791,836,952,898,1043,806,577,994)
> n=16
> #MLE_mu=sample mean
> #MLE_var=sample variance
> mu=sum(x)/n;mu
[1] 825.6875
> var=((1/n)*(sum(x^2)))-((mu)^2);var
[1] 25237.59
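The MLE of the variance divides by n, not n-1, so it can also be obtained from R's built-in var() by rescaling; a short cross-check of the hand computation above:

```r
x <- c(594,689,585,1034,720,929,737,1026,791,836,952,898,1043,806,577,994)
n <- length(x)
mu_hat  <- mean(x)                 # MLE of mu = sample mean
var_hat <- (n - 1)/n * var(x)      # MLE of sigma^2 (n in the denominator)
# agrees with the "mean of squares minus square of mean" form used above
all.equal(var_hat, mean(x^2) - mu_hat^2)
```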

Q4)
CODE:
x=c(28,32,29,23,28,21,24,22,14,19,25,20,15,26,35,23,12,13,36,32)
n=20
#MoM_a=sample mean-(sqrt(3)*sd)
#MoM_b=sample mean+(sqrt(3)*sd)
mu=sum(x)/n;mu
sd=sqrt(((1/n)*(sum(x^2)))-((mu)^2));sd
a=mu-(sqrt(3)*sd);a
b=mu+(sqrt(3)*sd);b
OUTPUT:
> x=c(28,32,29,23,28,21,24,22,14,19,25,20,15,26,35,23,12,13,36,32)
> n=20
> #MoM_a=sample mean-(sqrt(3)*sd)
> #MoM_b=sample mean+(sqrt(3)*sd)
> mu=sum(x)/n;mu
[1] 23.85
> sd=sqrt(((1/n)*(sum(x^2)))-((mu)^2));sd
[1] 6.915743
> a=mu-(sqrt(3)*sd);a
[1] 11.87158
> b=mu+(sqrt(3)*sd);b
[1] 35.82842
>
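These are the moment estimators of the Uniform(a, b) endpoints: a Uniform(a, b) distribution has mean (a+b)/2 and variance (b-a)^2/12, and solving those two equations gives a = mean - sqrt(3)*sd, b = mean + sqrt(3)*sd. A sketch verifying that the fitted endpoints reproduce the first two sample moments:

```r
x  <- c(28,32,29,23,28,21,24,22,14,19,25,20,15,26,35,23,12,13,36,32)
m  <- mean(x)
s2 <- mean(x^2) - m^2              # plug-in (n-denominator) variance
a  <- m - sqrt(3*s2)               # sqrt(3)*sd == sqrt(3*s2)
b  <- m + sqrt(3*s2)
# the fitted Uniform(a, b) matches the sample mean and variance exactly
all.equal((a + b)/2, m)
all.equal((b - a)^2/12, s2)
```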
Q8)
CODE:
x=c(29.74,37.09,33.94,25.89,21.54,31.01,28.71,29.52,22.89,30.15,33.25,22.80,20.26,26.06,20.96,20.50,36.19,26.13,15.77,36.69)
n=20
#MLE_mu=sample mean
#MLE_var=sample variance
mu=sum(x)/n;mu
var=((1/n)*(sum(x^2)))-((mu)^2);var
OUTPUT:
> x=c(29.74,37.09,33.94,25.89,21.54,31.01,28.71,29.52,22.89,30.15,33.25,22.80,20.26,26.06,20.96,20.50,36.19,26.13,15.77,36.69)
> n=20
> #MLE_mu=sample mean
> #MLE_var=sample variance
> mu=sum(x)/n;mu
[1] 27.4545
> var=((1/n)*(sum(x^2)))-((mu)^2);var
[1] 36.1764
>
