Chapter 5: A Brief Review of Probability Theory (Problem Solutions)
Problem 5.2
Problem 5.3
b. Use the tree diagram of Fig. 5.2, except that we now look at outcomes that give 4 of a kind. Since the tree diagram is for a particular denomination, we multiply by 13. The non-matching card may be drawn in any of the three middle positions, first, or last, and each ordering has the same probability, so
$$P(\text{4 of a kind}) = 13\left[3\,\frac{4\cdot 3\cdot 2\cdot 1\cdot 48}{52\cdot 51\cdot 50\cdot 49\cdot 48} + \frac{48\cdot 4\cdot 3\cdot 2\cdot 1}{52\cdot 51\cdot 50\cdot 49\cdot 48} + \frac{4\cdot 3\cdot 2\cdot 1\cdot 48}{52\cdot 51\cdot 50\cdot 49\cdot 48}\right] = 5\cdot 13\,\frac{4\cdot 3\cdot 2\cdot 1\cdot 48}{52\cdot 51\cdot 50\cdot 49\cdot 48} = 0.0002401$$
c. The first card can be anything. Given a particular suit on the first card, the probability of a suit match on the second card is 12/51; on the third card it is 11/50; on the fourth card it is 10/49; on the fifth and last card it is 9/48. Therefore,
$$P(\text{all same suit}) = 1\cdot\frac{12}{51}\cdot\frac{11}{50}\cdot\frac{10}{49}\cdot\frac{9}{48} = 0.001981$$
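Both results are easy to confirm numerically. The following MATLAB fragment (a sketch of mine, not part of the original solution) recomputes the two probabilities by counting ordered draws:

% Ordered-draw check of parts (b) and (c)
p_four_kind = 13 * 5 * (4*3*2*1*48) / (52*51*50*49*48)   % 5 possible positions for the odd card
p_same_suit = (12*11*10*9) / (51*50*49*48)               % first card fixes the suit

These print 2.401e-04 and 1.981e-03, matching the values above.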
Problem 5.4
Problem 5.5
a. The result is
$$P(AB) = 1 - P(\text{one or more links broken}) = \left(1 - q^2\right)^2\left(1 - q\right)$$
Problem 5.6
Using Bayes' rule,
$$P(A|B) = \frac{P(B|A)\,P(A)}{P(B)}$$
where, by total probability,
$$P(B) = P(B|A)\,P(A) + P\left(B|\bar{A}\right)P\left(\bar{A}\right) = 0.62$$
Therefore
$$P(A|B) = \frac{(0.95)(0.45)}{0.62} = 0.6895$$
Similarly,
$$P\left(A|\bar{B}\right) = \frac{P\left(\bar{B}|A\right)P(A)}{P\left(\bar{B}\right)}$$
with $P\left(\bar{B}\right) = 1 - 0.62 = 0.38$, so that
$$P\left(A|\bar{B}\right) = \frac{(0.05)(0.45)}{0.38} = 0.0592$$
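A quick numerical check of both conditional probabilities, using only the numbers quoted above (a sketch; not part of the original solution):

P_B_given_A = 0.95;  P_A = 0.45;           % given
P_B = 0.62;  P_Bbar = 1 - P_B;             % from total probability
P_A_given_B    = P_B_given_A*P_A/P_B            % 0.6895
P_A_given_Bbar = (1 - P_B_given_A)*P_A/P_Bbar   % 0.0592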
Problem 5.7
Problem 5.8
First, make a table showing outcomes giving the two values of the random variable:

Outcomes (H = head, T = tail)              X    P(X = x_i)
TTH, THT, HTT, HHH (odd # of heads)        0    4/8 = 1/2
HHT, HTH, THH, TTT (even # of heads)       1    4/8 = 1/2

The cdf is zero for x < 0, jumps by 1/2 at x = 0, jumps another 1/2 at x = 1, and remains at 1 out to +infinity. The pdf consists of an impulse of weight 1/2 at x = 0 and an impulse of weight 1/2 at x = 1. It is zero elsewhere.
Problem 5.9
See the tables below for the results.

(a)
Outcomes                                            X_1 = spots up    P(X_1 = x_i)
(1,1)                                               2                 1/36
(1,2), (2,1)                                        3                 2/36
(1,3), (3,1), (2,2)                                 4                 3/36
(1,4), (4,1), (2,3), (3,2)                          5                 4/36
(1,5), (5,1), (2,4), (4,2), (3,3)                   6                 5/36
(1,6), (6,1), (2,5), (5,2), (3,4), (4,3)            7                 6/36
(2,6), (6,2), (3,5), (5,3), (4,4)                   8                 5/36
(3,6), (6,3), (4,5), (5,4)                          9                 4/36
(4,6), (6,4), (5,5)                                 10                3/36
(5,6), (6,5)                                        11                2/36
(6,6)                                               12                1/36

(b)
Outcomes                                                              X_2    P(X_2 = x_i)
(1,1), (1,3), (3,1), (2,2), (1,5), (5,1), (2,4), (4,2), (3,3),
(2,6), (6,2), (3,5), (5,3), (4,4), (4,6), (6,4), (5,5), (6,6)         1      18/36 = 1/2
(1,2), (2,1), (1,4), (4,1), (2,3), (3,2), (1,6), (6,1), (2,5),
(5,2), (3,4), (4,3), (3,6), (6,3), (4,5), (5,4), (5,6), (6,5)         0      18/36 = 1/2
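Both tables can be generated by enumerating the 36 equally likely outcomes; a short MATLAB sketch (mine, not from the original solution):

[d1, d2] = meshgrid(1:6, 1:6);          % all 36 ordered outcomes
X1 = d1(:) + d2(:);                     % total spots up
for s = 2:12
    fprintf('X1 = %2d: %d/36\n', s, sum(X1 == s));
end
p_even = sum(mod(X1,2) == 0)/36         % P(X2 = 1): even total, equals 1/2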
Problem 5.10
$$f_X(x) = \frac{dF_X(x)}{dx} = 4\left(4.8225\times 10^{-5}\right)x^3\,u(x)\,u(12-x) = 1.929\times 10^{-4}\,x^3\,u(x)\,u(12-x)$$
The graph of this pdf is 0 for $x < 0$, the cubic $1.929\times 10^{-4}x^3$ for $0 \le x \le 12$, and 0 for $x > 12$.
$$P(X > 5) = 1 - F_X(5) = 1 - 4.8225\times 10^{-5}\,(5)^4 = 1 - 0.0301 = 0.9699$$
Problem 5.11
a. $A = \alpha$, where the condition $\int_0^\infty A e^{-\alpha x}\,dx = 1$ must be satisfied;

b. $B = \beta$, where the condition $\int_{-\infty}^0 B e^{\beta x}\,dx = 1$ must be satisfied;

c. $C = \gamma e^{\gamma}$, where the condition $\int_1^\infty C e^{-\gamma x}\,dx = 1$ must be satisfied;

d. $D = 1/\tau$, where the condition $\int_0^\tau D\,dx = 1$ must be satisfied.
Problem 5.12
a. With proper choice of A, the two separate factors are marginal pdfs. Thus X and Y are statistically independent.

b. Note that the pdf is the volume between a plane which intersects the x and y coordinate axes one unit out and the $f_{XY}$ coordinate axis at C. First find C from the integral
$$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x,y)\,dx\,dy = 1 \quad\text{or}\quad \int_0^1\!\!\int_0^{1-y} C(1-x-y)\,dx\,dy = 1$$
which gives $C = 6$. The marginal pdf for Y is then
$$f_Y(y) = \int_0^{1-y} 6(1-x-y)\,dx = \begin{cases} 3(1-y)^2, & 0\le y\le 1\\ 0, & \text{otherwise}\end{cases}$$
and, by symmetry, $f_X(x) = 3(1-x)^2$, $0\le x\le 1$. Since the joint pdf is not equal to the product of the two marginal pdfs, X and Y are not statistically independent.
Problem 5.13
a. $\int\!\!\int f_{XY}(x,y)\,dx\,dy = 1$ gives $C = 1/32$;

b. $f_{XY}(1, 1.5) = \dfrac{1 + 1.5}{32} = 0.0781$;

The conditional pdf is
$$f_{X|Y}(x|y) = \frac{f_{XY}(x,y)}{f_Y(y)} = \frac{\frac{1}{32}(1+xy)}{\frac{1}{8}(1+2y)} = \frac{1+xy}{4(1+2y)},\quad 0\le x\le 4,\ 0\le y\le 2$$
so that
$$f_{X|Y}(x|1) = \frac{1+x}{4(1+2)} = \frac{1+x}{12},\quad 0\le x\le 4$$
Problem 5.14
b. The cdf is
$$F_X(x) = \int_{-\infty}^{x}\frac{\alpha}{z^2}\,u(z-\alpha)\,dz = \begin{cases} 0, & x < \alpha\\ 1 - \alpha/x, & x \ge \alpha\end{cases}$$
$$P(X \ge 10) = 1 - P(X < 10) = 1 - F_X(10) = \begin{cases} 1, & \alpha \ge 10\\ \alpha/10, & \alpha < 10\end{cases}$$
Problem 5.15
Thus A = 1.
c. Yes, because the joint pdf factors into the product of the marginal pdfs.
Problem 5.16
The result is
$$f_Y(y) = \left[f_X(x)\big|_{x=\sqrt{y}} + f_X(x)\big|_{x=-\sqrt{y}}\right]\frac{1}{2\sqrt{y}} = \begin{cases}\dfrac{\exp\left(-y/2\sigma^2\right)}{\sqrt{2\pi\sigma^2 y}}, & y \ge 0\\[1ex] 0, & y < 0\end{cases}$$
Problem 5.17
First note that
$$P(Y = 0) = P(X \le 0) = 1/2$$
For $y > 0$, transformation of variables gives
$$f_Y(y) = f_X(x)\left|\frac{dg^{-1}(y)}{dy}\right|_{x=g^{-1}(y)}$$
Problem 5.18

a. Requiring $\int_2^\infty A e^{-bx}\,dx = 1$ gives $\frac{A}{b}e^{-2b} = 1$. Thus $A = be^{2b}$.

b. The mean is
$$E[X] = be^{2b}\int_2^\infty x\,e^{-bx}\,dx = be^{2b}\left(\frac{2}{b} + \frac{1}{b^2}\right)e^{-2b} = 2 + \frac{1}{b}$$
where integration by parts was used.
d. The variance of X is
$$\sigma_X^2 = E\left[X^2\right] - \{E[X]\}^2 = \left(\frac{2}{b^2} + \frac{4}{b} + 4\right) - \left(2 + \frac{1}{b}\right)^2 = \frac{1}{b^2}$$
Problem 5.19
a. $E\left[X^2\right] = \int_0^2 x^2\,\frac{dx}{2} = \frac{1}{2}\left[\frac{x^3}{3}\right]_0^2 = \frac{4}{3}$; $\quad E[X] = \int_0^2 x\,\frac{dx}{2} = \frac{1}{2}\left[\frac{x^2}{2}\right]_0^2 = 1$; $\quad \frac{4}{3} > 1^2$, so it is true in this special case.

b. $E\left[X^2\right] = \int_0^4 x^2\,\frac{dx}{4} = \frac{1}{4}\left[\frac{x^3}{3}\right]_0^4 = \frac{16}{3}$; $\quad E[X] = \int_0^4 x\,\frac{dx}{4} = \frac{1}{4}\left[\frac{x^2}{2}\right]_0^4 = 2$; $\quad \frac{16}{3} > 2^2 = 4$, so it is also true in this special case.

c. Use the fact that $E\left\{[X - E(X)]^2\right\} \ge 0$ (equal to 0 only if $X = E(X)$ with probability one). Expanding, we have $E\left[X^2\right] - \{E[X]\}^2 \ge 0$, with strict inequality unless X equals its mean with probability one.
Problem 5.20
a. $A = \dfrac{b}{1 - e^{-bB}}$;

b. The cdf is 0 for $x < 0$ and 1 for $x > B$. For $0 \le x \le B$, the result is $\frac{A}{b}\left(1 - e^{-bx}\right)$;

c. The mean is
$$E[X] = \frac{1}{b}\left(1 - \frac{bB\,e^{-bB}}{1 - e^{-bB}}\right)$$

d. The mean-square is
$$E\left[X^2\right] = \frac{2A}{b^3}\left[1 - (1+bB)\,e^{-bB}\right] - \frac{AB^2}{b}\,e^{-bB}$$
where A must be substituted from part (a).

e. For the variance, subtract the result of part (c) squared from the result of part (d).
Problem 5.21
a. The mean and variance of the Rayleigh random variable can be obtained by using tabulated definite integrals. The mean is
$$E[R] = \int_0^\infty \frac{r^2}{\sigma^2}\exp\left(-\frac{r^2}{2\sigma^2}\right)dr = \frac{1}{\sigma^2}\,\frac{1}{4\left(1/2\sigma^2\right)}\sqrt{\frac{\pi}{1/2\sigma^2}} = \sigma\sqrt{\frac{\pi}{2}}$$
by using the definite integral for $\int_0^\infty x^{2n}\exp\left(-ax^2\right)dx$ given in Appendix G. The mean square is
$$E\left[R^2\right] = \int_0^\infty \frac{r^3}{\sigma^2}\exp\left(-\frac{r^2}{2\sigma^2}\right)dr;\quad \text{let } u = \frac{r^2}{2\sigma^2},\ du = \frac{r}{\sigma^2}\,dr$$
$$= 2\sigma^2\int_0^\infty u\exp(-u)\,du = 2\sigma^2\left[\left.-u\exp(-u)\right|_0^\infty + \int_0^\infty \exp(-u)\,du\right]\ \text{(integration by parts)} = 2\sigma^2$$
Thus, the variance is
$$\mathrm{var}[R] = E\left[R^2\right] - E^2[R] = 2\sigma^2 - \frac{\pi\sigma^2}{2}$$
which is the same as given in Table 5.4.
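These moments are easy to verify by simulation; a minimal MATLAB sketch (mine, with sigma = 2 chosen arbitrarily):

sigma = 2;  N = 1e6;
r = sigma*sqrt(randn(N,1).^2 + randn(N,1).^2);    % Rayleigh samples as the envelope of two Gaussians
[mean(r), sigma*sqrt(pi/2)]                       % sample mean vs. theoretical mean
[var(r),  (2 - pi/2)*sigma^2]                     % sample variance vs. theoretical variance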
b. The mean and second moment of the single-sided exponential pdf are easily found in
terms of tabulated integrals or by integration by parts. The mean is
$$E[X] = \int_0^\infty \alpha x\exp(-\alpha x)\,dx = \frac{1}{\alpha}\int_0^\infty u\exp(-u)\,du = \frac{1}{\alpha}$$
Similarly, $E\left[X^2\right] = 2/\alpha^2$, so that $\mathrm{var}[X] = 1/\alpha^2$.
c. The mean of the hyperbolic pdf is zero by virtue of the evenness of the pdf. For the variance of the hyperbolic pdf, consider
$$E\left[X^2\right] = \int_{-\infty}^{\infty}\frac{x^2(m-1)h^{m-1}}{2\left(|x|+h\right)^m}\,dx = \int_0^{\infty}\frac{x^2(m-1)h^{m-1}}{(x+h)^m}\,dx$$
where the second integral follows by evenness of the integrand in the first integral, and the absolute value on x is unnecessary because the integration is for positive x. Integrate by parts twice to obtain
$$E\left[X^2\right] = (m-1)h^{m-1}\left\{\left[\frac{x^2(x+h)^{-m+1}}{-m+1}\right]_0^\infty + \int_0^\infty\frac{2x(x+h)^{-m+1}}{m-1}\,dx\right\}$$
The first term is zero if $m \ge 3$. Therefore
$$E\left[X^2\right] = 2h^{m-1}\int_0^\infty x(x+h)^{-m+1}\,dx = \frac{2h^2}{(m-2)(m-3)},\quad m \ge 4$$
Differentiate the sum $\sum_{k=0}^{\infty} q^k = \frac{1}{1-q}$, $|q| < 1$, to get the sum
$$\sum_{k=1}^{\infty} k q^{k-1} = \frac{1}{(1-q)^2}$$
Problem 5.22
$$E[X] = \int_{-\infty}^{\infty} x\left\{\frac{1}{2}\delta(x-5) + \frac{1}{8}\left[u(x-4) - u(x-8)\right]\right\}dx = \frac{1}{2}(5) + \frac{1}{8}\int_4^8 x\,dx = \frac{1}{2}(5) + \frac{1}{8}\,\frac{8^2 - 4^2}{2} = \frac{11}{2}$$
$$E\left[X^2\right] = \int_{-\infty}^{\infty} x^2\left\{\frac{1}{2}\delta(x-5) + \frac{1}{8}\left[u(x-4) - u(x-8)\right]\right\}dx = \frac{1}{2}(25) + \frac{1}{8}\,\frac{8^3 - 4^3}{3} = \frac{187}{6}$$
$$\sigma_X^2 = E\left[X^2\right] - E^2[X] = \frac{187}{6} - \frac{121}{4} = \frac{11}{12}$$
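Since the continuous part of the pdf is just a rectangle, the moments can be checked with one-line numerical integrations (a sketch of mine, not part of the original solution):

EX   = 0.5*5  + integral(@(x) x.*(1/8),   4, 8)    % 11/2
EX2  = 0.5*25 + integral(@(x) x.^2*(1/8), 4, 8)    % 187/6
varX = EX2 - EX^2                                  % 11/12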
Problem 5.23
$$E\left[X^{2n}\right] = \int_{-\infty}^{\infty} x^{2n}\,\frac{e^{-x^2/2\sigma^2}}{\sqrt{2\pi\sigma^2}}\,dx = 2\int_0^{\infty}\left(\sqrt{2}\,\sigma y\right)^{2n}\frac{e^{-y^2}}{\sqrt{2\pi\sigma^2}}\,\sqrt{2}\,\sigma\,dy = \frac{2^{n+1}\sigma^{2n}}{\sqrt{\pi}}\int_0^{\infty} y^{2n}e^{-y^2}\,dy$$
where $y = x/\left(\sqrt{2}\,\sigma\right)$. Using a table of definite integrals, the given result is obtained.
Problem 5.24
The conditional pdf $f_{X|Y}(x|y)$ is given by
$$f_{X|Y}(x|y) = \frac{f_{XY}(x,y)}{f_Y(y)} = \frac{\dfrac{1}{2\pi\sigma^2\sqrt{1-\rho^2}}\exp\left[-\dfrac{x^2 - 2\rho xy + y^2}{2\sigma^2\left(1-\rho^2\right)}\right]}{\dfrac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-y^2/2\sigma^2\right)}$$
$$= \frac{1}{\sqrt{2\pi\sigma^2}\sqrt{1-\rho^2}}\exp\left[-\frac{x^2 - 2\rho xy + y^2}{2\sigma^2\left(1-\rho^2\right)} + \frac{y^2}{2\sigma^2}\right]$$
$$= \frac{1}{\sqrt{2\pi\sigma^2}\sqrt{1-\rho^2}}\exp\left[-\frac{x^2 - 2\rho xy + y^2 - \left(1-\rho^2\right)y^2}{2\sigma^2\left(1-\rho^2\right)}\right]$$
$$= \frac{1}{\sqrt{2\pi\sigma^2}\sqrt{1-\rho^2}}\exp\left[-\frac{x^2 - 2\rho xy + \rho^2 y^2}{2\sigma^2\left(1-\rho^2\right)}\right] = \frac{1}{\sqrt{2\pi\sigma^2\left(1-\rho^2\right)}}\exp\left[-\frac{(x-\rho y)^2}{2\sigma^2\left(1-\rho^2\right)}\right]$$
Problem 5.25
Regardless of the correlation coefficient, the mean of Z is
$$E[Z] = E[3X - 4Y] = 3E(X) - 4E(Y)$$
The variance of Z is
$$\mathrm{var}(Z) = E\left\{[Z - E(Z)]^2\right\} = E\left\{[3X - 4Y - 3E(X) + 4E(Y)]^2\right\} = E\left\{\left[3\left(X - \bar{X}\right) - 4\left(Y - \bar{Y}\right)\right]^2\right\}$$
$$= E\left[9\left(X-\bar{X}\right)^2 - 24\left(X-\bar{X}\right)\left(Y-\bar{Y}\right) + 16\left(Y-\bar{Y}\right)^2\right] = 9\sigma_X^2 + 16\sigma_Y^2 - 24\rho\sigma_X\sigma_Y$$
Putting in numbers, the results for the variance of Z are: (a) 148; (b) 122.6; (c) 59.1; (d) 21.0.
Problem 5.26
Divide the joint Gaussian pdf of two random variables by the Gaussian pdf of Y, collect all exponential terms in a common exponent, and complete the square of this exponent, which will be quadratic in x and y. The desired result is a Gaussian pdf with
$$E\{X|Y\} = m_X + \rho\,\frac{\sigma_X}{\sigma_Y}\,(Y - m_Y) \quad\text{and}\quad \mathrm{var}(X|Y) = \sigma_X^2\left(1 - \rho^2\right)$$
The derivation follows closely that of Problem 5.24, except for the presence of the nonzero means $m_X$ and $m_Y$.
Problem 5.27
a. From Table 5.4,
$$E[X] = 0;\quad E\left[X^2\right] = 1/32 = \mathrm{var}[X]$$

b. By transformation of random variables, the pdf of Y is
$$f_Y(y) = f_X(x)\left|\frac{dx}{dy}\right|_{x=(y-4)/5} = \frac{4}{5}\,e^{-8|y-4|/5}$$

c. Using the transformation from X to Y, it follows that
$$E[Y] = E[4 + 5X] = 4 + 5E[X] = 4$$
$$E\left[Y^2\right] = E\left[(4 + 5X)^2\right] = E\left[16 + 40X + 25X^2\right] = 16 + 40E[X] + 25E\left[X^2\right] = 16 + \frac{25}{32} = \frac{537}{32} = 16.78$$
$$\sigma_Y^2 = E\left[Y^2\right] - E^2[Y] = 16.78 - 4^2 = \frac{25}{32} = 0.78$$
Problem 5.28
Convolve the two component pdfs. A sketch of the two component pdfs making up the integrand shows that there is no overlap until $z > 1$, and the overlap ends when $z > 7$. The result, either obtained graphically or analytically, is the trapezoid
$$f_Z(z) = \begin{cases} 0, & z < 1\\ (z-1)/8, & 1 \le z < 3\\ 1/4, & 3 \le z < 5\\ (7-z)/8, & 5 \le z \le 7\\ 0, & z > 7\end{cases}$$
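The trapezoidal shape can be confirmed by convolving the two component densities numerically. The sketch below assumes, for illustration only, that the components are uniform on [0, 2] and on [1, 5] (one pair consistent with the trapezoid above):

dz = 0.01;  z1 = 0:dz:2;  z2 = 1:dz:5;
f1 = 0.5*ones(size(z1));  f2 = 0.25*ones(size(z2));   % the two uniform densities
fZ = conv(f1, f2)*dz;                                  % numerical convolution
z  = (z1(1)+z2(1)) : dz : (z1(end)+z2(end));
plot(z, fZ(1:length(z))), xlabel('z'), ylabel('f_Z(z)')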
Problem 5.29
$$E[X] = \left.(-j)\frac{\partial M_X(jv)}{\partial v}\right|_{v=0} = \frac{1}{a}$$
and
$$E\left[X^2\right] = \left.(-j)^2\frac{\partial^2 M_X(jv)}{\partial v^2}\right|_{v=0} = \frac{2}{a^2}$$
respectively.
Problem 5.30
a. $P(5\text{ or }6\text{ heads in 10 trials}) = \displaystyle\sum_{k=5}^{6}\binom{10}{k}\left(\frac{1}{2}\right)^{10} = \frac{\frac{10!}{5!\,5!} + \frac{10!}{6!\,4!}}{1024} = \frac{462}{1024} = 0.4512$
Problem 5.31
The required distributions are
$$P(k) = \binom{n}{k} p^k (1-p)^{n-k}\quad\text{(binomial)}$$
$$P(k) = \frac{e^{-(k-np)^2/[2np(1-p)]}}{\sqrt{2\pi np(1-p)}}\quad\text{(Laplace)}$$
$$P(k) = \frac{(np)^k}{k!}\,e^{-np}\quad\text{(Poisson)}$$
Comparison tables are computed from these three expressions for the values of n and p specified in the problem.
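A short MATLAB script for generating such a comparison table is sketched below; the values n = 100 and p = 0.01 are assumptions of mine, not necessarily the values specified in the problem:

n = 100;  p = 0.01;  k = 0:5;
Pbinom   = arrayfun(@(kk) nchoosek(n,kk), k) .* p.^k .* (1-p).^(n-k);
Plaplace = exp(-(k - n*p).^2/(2*n*p*(1-p))) / sqrt(2*pi*n*p*(1-p));
Ppoisson = (n*p).^k .* exp(-n*p) ./ factorial(k);
disp([k' Pbinom' Plaplace' Ppoisson'])     % one row per k: exact and the two approximations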
Problem 5.32
c. The answers are the reciprocals of the numbers found in (a) and (b), or $2.79\times 10^{-6}$.
Problem 5.33
a. The probability of exactly one error in $10^5$ digits, by the binomial distribution, is
$$P\left(1\text{ error in }10^5\right) = \binom{10^5}{1}\left(10^{-5}\right)\left(1 - 10^{-5}\right)^{99{,}999} = 10^5\left(10^{-5}\right)(0.3679) = 0.3679$$
By the Poisson approximation,
$$P\left(2\text{ errors in }10^5\right) = \frac{\left(10^5\times 10^{-5}\right)^2}{2!}\exp\left(-10^5\times 10^{-5}\right) = 0.1839$$
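The binomial and Poisson values can be compared directly in MATLAB (a sketch; not part of the original solution):

n = 1e5;  p = 1e-5;  lambda = n*p;
P1_binom   = n*p*(1-p)^(n-1)                        % 0.3679
P2_poisson = lambda^2*exp(-lambda)/factorial(2)     % 0.1839
P2_binom   = nchoosek(n,2)*p^2*(1-p)^(n-2)          % essentially the same value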
Problem 5.34
a. The desired probability is
$$P(\text{fewer than 3 heads in 20 coins tossed}) = \sum_{k=0}^{2}\binom{20}{k}\left(\frac{1}{2}\right)^{20} = (1 + 20 + 190)\,2^{-20} = 2.0123\times 10^{-4}$$
Problem 5.35
a. The marginal pdf for X is
$$f_X(x) = \frac{1}{\sqrt{2\pi(4)}}\exp\left[-\frac{(x-1)^2}{2(4)}\right] = \frac{1}{\sqrt{8\pi}}\exp\left[-\frac{(x-1)^2}{8}\right]$$
The marginal pdf for Y is of the same form except for Y and y in place of X and x, respectively.
b. The joint pdf is
$$f_{XY}(x,y) = \frac{1}{2\pi(2)(2)\sqrt{1-0.5^2}}\exp\left\{-\frac{\left(\frac{x-1}{2}\right)^2 - 2(0.5)\left(\frac{x-1}{2}\right)\left(\frac{y-1}{2}\right) + \left(\frac{y-1}{2}\right)^2}{2\left(1-0.5^2\right)}\right\}$$
$$= \frac{1}{8\pi\sqrt{0.75}}\exp\left\{-\frac{(x-1)^2 - (x-1)(y-1) + (y-1)^2}{4\cdot 2\left(1-0.5^2\right)}\right\}$$
$$= \frac{1}{8\pi\sqrt{0.75}}\exp\left\{-\frac{x^2 - x - xy + y^2 - y + 1}{6}\right\}$$
The conditional pdf of X given Y is
$$f_{X|Y}(x|y) = \frac{f_{XY}(x,y)}{f_Y(y)} = \frac{\frac{1}{8\pi\sqrt{0.75}}\exp\left\{-\frac{(x-1)^2 - (x-1)(y-1) + (y-1)^2}{6}\right\}}{\frac{1}{\sqrt{8\pi}}\exp\left[-\frac{(y-1)^2}{8}\right]}$$
$$= \frac{1}{\sqrt{6\pi}}\exp\left\{-\frac{(x-1)^2 - (x-1)(y-1) + (y-1)^2}{6} + \frac{(y-1)^2}{8}\right\}$$
$$= \frac{1}{\sqrt{6\pi}}\exp\left\{-\frac{(x-1)^2 - (x-1)(y-1)}{6} - \frac{(y-1)^2}{24}\right\}$$
$$= \frac{1}{\sqrt{6\pi}}\exp\left\{-\frac{\left[x - 1 - 0.5(y-1)\right]^2}{6}\right\} = \frac{1}{\sqrt{6\pi}}\exp\left\{-\frac{\left(x - 0.5y - 0.5\right)^2}{6}\right\}$$
Therefore
$$E[X|Y] = 1 + 0.5(Y-1)\quad\text{and}\quad \mathrm{var}[X|Y] = 4\left(1 - 0.5^2\right) = 3$$
Problem 5.36
a. $K = 1/\pi$;

b. $E[X]$ is not defined, but one could argue that it is zero from the oddness of the integrand for this expectation. If one writes down the integral for the second moment, it clearly does not converge (the integrand approaches a constant as $|x|\to\infty$).

d. Compare the form of the characteristic function with the answer given in (c) and do a suitable redefinition of variables.
Problem 5.37
But
$$E\left\{e^{jvX_i^2}\right\} = \int_{-\infty}^{\infty} e^{jvx^2}\,\frac{e^{-x^2/2\sigma^2}}{\sqrt{2\pi\sigma^2}}\,dx = \int_{-\infty}^{\infty}\frac{e^{-\left(1/2\sigma^2 - jv\right)x^2}}{\sqrt{2\pi\sigma^2}}\,dx = \left(1 - j2v\sigma^2\right)^{-1/2}$$
which follows by using the definite integral defined at the beginning of Problem 4.35. Thus
$$M_Y(jv) = \left(1 - j2v\sigma^2\right)^{-N/2}$$

b. The given pdf follows easily by letting $\alpha = 2\sigma^2$ in the given Fourier transform pair.

c. The mean of Y is
$$E[Y] = N\sigma^2$$
by taking the sum of the expectations of the separate terms of the sum defining Y. The mean-square value of $X_i^2$ is
$$E\left[\left(X_i^2\right)^2\right] = (-j)^2\left.\frac{d^2\left(1 - j2v\sigma^2\right)^{-1/2}}{dv^2}\right|_{v=0} = 3\sigma^4$$
Thus,
$$\mathrm{var}\left[X_i^2\right] = 3\sigma^4 - \sigma^4 = 2\sigma^4$$
Since the terms in the sum defining Y are independent, $\mathrm{var}[Y] = 2N\sigma^4$. The mean and variance of Y can be put into the definition of a Gaussian pdf to get the desired approximation. Note the extreme difference between exact and approximation due to the central limit theorem not being valid for N = 2, 4.
(Figure: exact chi-square pdf, sigma^2 = 1, compared with its Gaussian approximation for N = 2, 4, 8, and 16; the agreement improves as N increases.)
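The comparison plots can be reproduced with a few lines of MATLAB. This sketch is mine and assumes the chi-square density quoted in part (b) takes the usual gamma form:

sigma = 1;  y = 0.01:0.01:30;
for N = [2 4 8 16]
    fY  = y.^(N/2-1).*exp(-y/(2*sigma^2)) / ((2*sigma^2)^(N/2)*gamma(N/2));   % exact chi-square pdf
    fYg = exp(-(y - N*sigma^2).^2/(4*N*sigma^4)) / sqrt(4*pi*N*sigma^4);      % Gaussian approximation
    figure, plot(y, fY, y, fYg, '--'), xlabel('y'), ylabel('f_Y(y)')
    legend('chi-square', 'Gauss approx'), title(['N = ' num2str(N)])
end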
Problem 5.38
This is a matter of plotting
$$Q(x) = \int_x^\infty \frac{\exp\left(-t^2/2\right)}{\sqrt{2\pi}}\,dt \quad\text{and}\quad Q_a(x) = \frac{\exp\left(-x^2/2\right)}{\sqrt{2\pi}\,x}$$
on the same set of axes.
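A possible MATLAB sketch for the plot (the Q-function is written in terms of erfc, since MATLAB has no built-in Q):

x  = 0.1:0.01:6;
Q  = 0.5*erfc(x/sqrt(2));              % exact Q-function
Qa = exp(-x.^2/2)./(sqrt(2*pi)*x);     % asymptotic approximation
semilogy(x, Q, x, Qa, '--'), xlabel('x'), legend('Q(x)', 'Q_a(x)')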
Problem 5.39
By definition, the cdf of a Gaussian random variable is
$$F_X(x) = \int_{-\infty}^{x}\frac{\exp\left[-\frac{(u-m)^2}{2\sigma^2}\right]}{\sqrt{2\pi\sigma^2}}\,du = 1 - \int_{x}^{\infty}\frac{\exp\left[-\frac{(u-m)^2}{2\sigma^2}\right]}{\sqrt{2\pi\sigma^2}}\,du$$
Change variables to
$$v = \frac{u-m}{\sigma}$$
to get
$$F_X(x) = 1 - \int_{(x-m)/\sigma}^{\infty}\frac{\exp\left(-v^2/2\right)}{\sqrt{2\pi}}\,dv = 1 - Q\left(\frac{x-m}{\sigma}\right)$$
A plot may easily be obtained by a MATLAB program. It is suggested that you use the MATLAB function $\mathrm{erfc}(x) = \frac{2}{\sqrt{\pi}}\int_x^\infty \exp\left(-t^2\right)dt$.
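For example, with m and sigma as input parameters, the cdf can be plotted as follows (a sketch; the particular m and sigma values are mine):

m = 0;  sigma = 1;  x = m-4*sigma : 0.01 : m+4*sigma;
Q  = @(z) 0.5*erfc(z/sqrt(2));         % Q-function via erfc
FX = 1 - Q((x - m)/sigma);
plot(x, FX), xlabel('x'), ylabel('F_X(x)')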
Problem 5.40
The following relationships involving the Q-function will prove useful:
$$\int_a^b\frac{\exp\left(-u^2/2\right)}{\sqrt{2\pi}}\,du = Q(a) - Q(b),\ b > a;\qquad Q(x) = 1 - Q(|x|),\ x < 0;\qquad Q(0) = 0.5$$

a. The probability is
$$P(|X| \le 15) = \int_{-15}^{15}\frac{e^{-(x-10)^2/50}}{\sqrt{50\pi}}\,dx = \int_{-5}^{1}\frac{\exp\left(-u^2/2\right)}{\sqrt{2\pi}}\,du,\quad u = \frac{x-10}{\sqrt{25}}$$
$$= Q(-5) - Q(1) = 1 - Q(5) - Q(1) = 0.8413$$

b. This probability is
$$P(10 < X \le 20) = \int_{10}^{20}\frac{e^{-(x-10)^2/50}}{\sqrt{50\pi}}\,dx = \int_{0}^{2}\frac{\exp\left(-u^2/2\right)}{\sqrt{2\pi}}\,du,\quad u = \frac{x-10}{\sqrt{25}}$$
$$= Q(0) - Q(2) = 0.5 - Q(2) = 0.4772$$

d. The result is
$$P(20 < X \le 30) = \int_{20}^{30}\frac{e^{-(x-10)^2/50}}{\sqrt{50\pi}}\,dx = \int_{2}^{4}\frac{\exp\left(-u^2/2\right)}{\sqrt{2\pi}}\,du,\quad u = \frac{x-10}{\sqrt{25}}$$
$$= Q(2) - Q(4) = 0.0227$$
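The three numerical answers follow from one line each once Q is defined through erfc (a sketch, not part of the original solution):

Q  = @(z) 0.5*erfc(z/sqrt(2));
Pa = (1 - Q(5)) - Q(1)      % 0.8413
Pb = Q(0) - Q(2)            % 0.4772
Pd = Q(2) - Q(4)            % 0.0227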
Problem 5.41
Consider the product of integrals
$$I(x) = \int_0^\infty\frac{\exp\left(-u^2/2\right)}{\sqrt{2\pi}}\,du\;\int_x^\infty\frac{\exp\left(-v^2/2\right)}{\sqrt{2\pi}}\,dv = \frac{1}{2}\int_x^\infty\frac{\exp\left(-v^2/2\right)}{\sqrt{2\pi}}\,dv = \frac{1}{2}\,Q(x)$$
where the 1/2 comes from the fact that the first integral is the integral of the right half of a Gaussian pdf. Now write the product of the two integrals as an iterated integral to get
$$I(x) = \frac{1}{2\pi}\int_0^\infty\!\!\int_x^\infty\exp\left(-\frac{u^2 + v^2}{2}\right)du\,dv$$
Transform the integral to polar coordinates via $u = r\cos\theta$, $v = r\sin\theta$, and $du\,dv = r\,dr\,d\theta$ to get
$$I(x) = \frac{1}{2}\,Q(x) = \frac{1}{2\pi}\int_0^{\pi/2}\int_{x/\sin\theta}^\infty\exp\left(-r^2/2\right)r\,dr\,d\theta = \frac{1}{2\pi}\int_0^{\pi/2}\exp\left(-\frac{x^2}{2\sin^2\theta}\right)d\theta$$
Multiplying through by 2 gives the answer given in the problem statement.
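The identity is easy to verify numerically for a few values of x; a minimal MATLAB sketch:

Q      = @(z) 0.5*erfc(z/sqrt(2));
Qcraig = @(z) integral(@(th) exp(-z^2./(2*sin(th).^2))/pi, 0, pi/2);
[Q(1) Qcraig(1); Q(3) Qcraig(3)]      % the two columns agree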
Problem 5.42
a. Observe that
$$\sigma^2 = \int_{-\infty}^{\infty}(x-m)^2 f_X(x)\,dx \ge \int_{|x-m|>k\sigma}(x-m)^2 f_X(x)\,dx \ge k^2\sigma^2\int_{|x-m|>k\sigma}f_X(x)\,dx$$
$$= k^2\sigma^2\,P\{|X-m| > k\sigma\} = k^2\sigma^2\left\{1 - P[|X-m| \le k\sigma]\right\}$$
or
$$P[|X-m| < k\sigma] \ge 1 - \frac{1}{k^2}$$
b. Note that for this random variable $m_X = 0$ and $\sigma_X^2 = 2^2/12 = 1/3$. The actual probability is
$$P[|X - m_X| < k\sigma_X] = P\left[|X| < k/\sqrt{3}\right] = \begin{cases}\displaystyle\int_{-k/\sqrt{3}}^{k/\sqrt{3}}\frac{dx}{2} = \frac{k}{\sqrt{3}}, & k < \sqrt{3}\\[1ex] 1, & k \ge \sqrt{3}\end{cases}$$
Note that in all cases the Chebyshev bound is satisfied.
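The comparison can be tabulated directly (a sketch; the k values are examples of my choosing):

k      = [1 1.5 2 3];
bound  = max(1 - 1./k.^2, 0);            % Chebyshev lower bound on P(|X| < k*sigma)
actual = min(k/sqrt(3), 1);              % uniform case worked out above
[k; bound; actual]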
Problem 5.43
a. The mean is 0 by the even symmetry of the pdf. Thus, the variance is
$$\sigma^2 = \mathrm{var}[X] = E\left[X^2\right]\ \text{(because the mean is 0)} = \int_{-\infty}^{\infty}x^2\,\frac{a}{2}\exp\left(-a|x|\right)dx = 2\,\frac{a}{2}\int_0^\infty x^2\exp(-ax)\,dx$$
$$= a\int_0^\infty\frac{y^2}{a^2}\exp(-y)\,\frac{dy}{a} = \frac{1}{a^2}\int_0^\infty y^2 e^{-y}\,dy = \frac{2}{a^2}$$
Thus, in terms of the variance ($\sigma = \sqrt{2}/a$), the pdf is
$$f_X(x) = \frac{1}{\sqrt{2}\,\sigma}\exp\left(-\sqrt{2}\,|x|/\sigma\right)$$

b. The desired probability is
$$P[|X| > k\sigma] = P[X < -k\sigma] + P[X > k\sigma],\quad k = 1, 2, 3$$
$$= \int_{-\infty}^{-k\sigma}\frac{1}{\sqrt{2}\,\sigma}\exp\left(\sqrt{2}\,x/\sigma\right)dx + \int_{k\sigma}^{\infty}\frac{1}{\sqrt{2}\,\sigma}\exp\left(-\sqrt{2}\,x/\sigma\right)dx$$
$$= \int_{\sqrt{2}\,k}^{\infty}\exp(-u)\,du\ \text{(by evenness of the integrand and } u = \sqrt{2}\,x/\sigma\text{)} = \exp\left(-\sqrt{2}\,k\right)$$
Problem 5.44
The desired probability is
$$P[|X| > k\sigma] = P[X < -k\sigma] + P[X > k\sigma],\quad k = 1, 2, 3$$
$$= \int_{-\infty}^{-k\sigma}\frac{\exp\left(-x^2/2\sigma^2\right)}{\sqrt{2\pi\sigma^2}}\,dx + \int_{k\sigma}^{\infty}\frac{\exp\left(-x^2/2\sigma^2\right)}{\sqrt{2\pi\sigma^2}}\,dx = 2Q(k)$$
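Putting Problems 5.42 through 5.44 together, the tail probabilities for k = 1, 2, 3 can be compared in a few lines (a sketch of mine; the Laplacian expression uses the form derived in Problem 5.43):

k = 1:3;
tail_gauss   = erfc(k/sqrt(2))           % 2Q(k)
tail_laplace = exp(-sqrt(2)*k)
cheb_bound   = 1./k.^2                   % Chebyshev upper bound on the same tail probability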
Problem 5.45
Since both X and Y are Gaussian and the transformation is linear, Z is Gaussian. Thus, all we need are the mean and variance of Z. Since the means of X and Y are 0, so is the mean of Z. Therefore, its variance is
$$\sigma_Z^2 = E\left[Z^2\right] = E\left[(X + 2Y)^2\right] = E\left[X^2 + 4XY + 4Y^2\right] = E\left[X^2\right] + 4E[XY] + 4E\left[Y^2\right]$$
$$= \sigma_X^2 + 4\rho\sigma_X\sigma_Y + 4\sigma_Y^2\ \text{(because the means are 0)} = 3 + 4\sqrt{3}\sqrt{4}\,(-0.4) + 4(4) = 19 - 3.2\sqrt{3} = 13.457$$
The pdf of Z is
$$f_Z(z) = \frac{\exp\left(-z^2/26.915\right)}{\sqrt{26.915\pi}}$$
Problem 5.46
a. $f_X(x) = \dfrac{\exp\left[-(x-5)^2/2\right]}{\sqrt{2\pi}}$; $\quad f_Y(y) = \dfrac{\exp\left[-(y-3)^2/4\right]}{\sqrt{4\pi}}$
Problem 5.47
Since both X and Y are Gaussian and the transformation is linear, Z is Gaussian. Thus, all we need are the mean and variance of Z. The mean of Z is
$$m_Z = E[3X + Y] = 3m_X + m_Y = (3)(1) + 2 = 5$$
Its variance is
$$\sigma_Z^2 = E\left[(Z - m_Z)^2\right] = E\left[(3X + Y - 3m_X - m_Y)^2\right] = E\left[\left(3(X - m_X) + (Y - m_Y)\right)^2\right]$$
$$= 9E\left[(X - m_X)^2\right] + 6E\left[(X - m_X)(Y - m_Y)\right] + E\left[(Y - m_Y)^2\right] = 9\sigma_X^2 + 6\rho\sigma_X\sigma_Y + \sigma_Y^2$$
$$= 9(3) + 6\sqrt{3}\sqrt{2}\,(0.2) + 2 = 29 + 1.2\sqrt{6} = 31.939$$
The pdf of Z is
$$f_Z(z) = \frac{\exp\left[-(z-5)^2/63.879\right]}{\sqrt{63.879\pi}}$$
Problem 5.48
a. $f_X(x) = \dfrac{\exp\left[-(x-4)^2/6\right]}{\sqrt{6\pi}}$; $\quad f_Y(y) = \dfrac{\exp\left[-(y-2)^2/10\right]}{\sqrt{10\pi}}$

e. The pdf of $Z_1$ is
$$f_{Z_1}(z_1) = \frac{\exp\left[-(z_1 - 14)^2/64\right]}{\sqrt{64\pi}}$$

f. The pdf of $Z_2$ is
$$f_{Z_2}(z_2) = \frac{\exp\left[-(z_2 - 10)^2/64\right]}{\sqrt{64\pi}}$$
Problem 5.49
>> ce5_1
Enter the parameter of the exponential pdf: f(x) = a*exp(-a*x)*u(x) => 2
Enter number of exponential random variables to be generated => 5000
No. of random variables generated
5000
(Figure: theoretical pdf and histogram for the 5000 computer-generated exponential random variables; exponential pdf with a = 2 overlaid on the histogram points.)
>> ce5_2
Enter the mean of the Gaussian random variables => 2
Enter the standard deviation of the Gaussian random variables => 3
Enter number of Gaussian random variable pairs to be generated => 5000
Covariance matrix of X and Y vectors:
9.0928 0.2543
0.2543 9.0777
Current plot held
(Figure: theoretical pdfs and histograms for 5000 computer-generated independent Gaussian RV pairs; Gaussian pdfs with mu = 2, sigma = 3 overlaid on the histograms of the x and y samples.)
>> ce5_3
Enter the mean of the Gaussian random variables => 0
Enter the standard deviation of the Gaussian random variables => 2
Enter correlation coefficient between adjacent Gaussian random variables => 0.8
Enter number of Gaussian random variables to be generated => 250
Current plot held
(Figure: theoretical pdf, mu = 0, sigma = 2, and histogram for the 250 computer-generated Gaussian RVs, together with a plot of 50 successive sample values showing the correlation of 0.8 between adjacent samples.)
>> ce5_4
Enter number of Gaussian random variables to be generated => 1000
Enter number of uniform random variables to be summed to generate one GRV => 10
Estimated variance of the Gaussian RVs:
0.9091
Current plot held
(Figure: theoretical Gaussian pdf, mu = 0, sigma = 1, and histogram for 1000 Gaussian RVs, each generated as the sum of 10 uniform random variables.)