
Chapter 1

Multiple Random Variables:[?, ?, ?]

1.1 Bivariate-cdf and pdf:


Introduction

• When one measurement is made on each observation, univariate analysis is applied.

• If more than one measurement is made on each observation, multivariate analysis is applied.

• The two measurements will be called X and Y . Since X and Y are obtained for each observation,
the data for one observation is the pair (X, Y ).

Some examples:

• Height (X) and weight (Y ) are measured for each individual in a sample.

• Temperature (X) and precipitation (Y ) are measured on a given day at a set of weather stations.

• The distribution of X and the distribution of Y can be considered individually using univariate
methods. That is, we can analyze

X1 , X2 , ..., Xn

Y1 , Y2 , ..., Yn

using CDFs, densities, quantile functions, etc. Any property that describes the behavior of the Xi
values alone or of the Yi values alone is called a marginal property.

• For example, the ECDF FX (t) of X, the quantile function QY (p) of Y , the sample standard deviation
σY of Y , and the sample mean X̄ of X are all marginal properties.

Consider continuous random variables X and Y ; their joint cumulative distribution function
(cdf) is defined as:
FXY (x, y) = P {(X ≤ x, Y ≤ y)}
The marginal cdf can be obtained from the joint distribution as:

FX (x) = P (X ≤ x, Y ≤ ∞) = FXY (x, ∞)


FY (y) = P (X ≤ ∞, Y ≤ y) = FXY (∞, y)


Properties of the Bivariate Cumulative Distribution Function (Bivariate cdf)


1. As x → ∞ and y → ∞, the bivariate cdf is

FXY (∞, ∞) = P {X ≤ ∞, Y ≤ ∞} = 1

2. The range of cdf is


0 ≤ FXY (x, y) ≤ 1

3. Impossible events have probability 0:

FXY (−∞, −∞) = P {X ≤ −∞, Y ≤ −∞} = 0


FXY (−∞, y) = P {∅ ∩ (Y ≤ y)} = P (∅) = 0
FXY (x, −∞) = 0

4. Marginal cdfs are

FXY (∞, y) = P {S ∩ (Y ≤ y)} = FY (y)

FXY (x, ∞) = P {S ∩ (X ≤ x)} = FX (x)

Independent random variables :


Two random variables X and Y are said to be independent if

FXY (x, y) = FX (x)FY (y)    for all x and y

1.1.1 Bivariate Probability Density Function (Bivariate PDF)


The bivariate probability density function (bivariate pdf) is defined as the derivative of the bivariate cdf and is
expressed as

fXY (x, y) = ∂²FXY (x, y)/∂x∂y          (1.1)

The inverse relation of (1.1) is

FXY (x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} fXY (u, v) du dv          (1.2)

Properties of the Bivariate Probability Density Function (Bivariate pdf)


1. The total volume under the bivariate pdf is 1, i.e.,

FXY (∞, ∞) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} fXY (x, y) dx dy = 1

2. Since FXY (x, y) is a non-decreasing function of both x and y, the pdf is non-negative:

fXY (x, y) ≥ 0

3. P {x1 < X ≤ x2 , y1 < Y ≤ y2 } = ∫_{y1}^{y2} ∫_{x1}^{x2} fXY (x, y) dx dy

4. Marginal pdfs are

fX (x) = ∫_{−∞}^{∞} fXY (x, y) dy

fY (y) = ∫_{−∞}^{∞} fXY (x, y) dx


Independent random variables :


Two random variables X and Y are said to be independent if

fXY (x, y) = fX (x)fY (y)    for all x and y

which follows by differentiating FXY (x, y) = FX (x)FY (y):

∂²FXY (x, y)/∂x∂y = [∂FX (x)/∂x][∂FY (y)/∂y] = fX (x)fY (y)
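As a quick numerical illustration of this factorization test, the sketch below (Python with NumPy/SciPy, which these notes do not otherwise assume) compares a joint pdf with the product of its marginals at a few points; the joint density (x + y)/8 on (0, 2) × (0, 2) used here is the one derived in Problem 3.17 below.

import numpy as np
from scipy.integrate import quad

def f_xy(x, y):
    # joint pdf from Problem 3.17: (x + y)/8 on the square (0, 2) x (0, 2)
    return (x + y) / 8.0 if (0 < x < 2 and 0 < y < 2) else 0.0

def f_x(x):
    # marginal of X: integrate the joint pdf over y
    return quad(lambda y: f_xy(x, y), 0, 2)[0]

def f_y(y):
    # marginal of Y: integrate the joint pdf over x
    return quad(lambda x: f_xy(x, y), 0, 2)[0]

for x0, y0 in [(0.5, 0.5), (1.0, 1.5), (1.8, 0.2)]:
    joint, product = f_xy(x0, y0), f_x(x0) * f_y(y0)
    print(x0, y0, joint, product, np.isclose(joint, product))
# The joint and the product disagree, so this X and Y are not independent.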


3.17. The joint pdf of a bivariate r.v X, Y is given by



fXY (x, y) = k(x + y),   0 < x < 2, 0 < y < 2
           = 0,          otherwise

where k is a constant

[a.] Find the value of k

[b.] Find the marginal pdf ’s of X and Y

[c.] Are X and Y independent ? [?]

Solution:
a. It is given that fXY (x, y) = k(x + y) is a joint pdf, then

∫_{−∞}^{∞} ∫_{−∞}^{∞} fXY (x, y) dx dy = 1

∫_0^2 ∫_0^2 k(x + y) dx dy = k ∫_0^2 [x²/2 + xy]_0^2 dy = k ∫_0^2 (2 + 2y) dy = k [2y + y²]_0^2 = 8k

1 = 8k,  so  k = 1/8

fXY (x, y) = (1/8)(x + y)    0 < x < 2,  0 < y < 2

b. The marginal pdfs are

fX (x) = k ∫_0^2 (x + y) dy = k [xy + y²/2]_0^2 = k (2x + 2) = (1/4)(x + 1)    0 < x < 2

fY (y) = k ∫_0^2 (x + y) dx = k [x²/2 + xy]_0^2 = k (2y + 2) = (1/4)(y + 1)    0 < y < 2

c. Checking independence:

fX (x) fY (y) = (1/4)(x + 1) × (1/4)(y + 1) = (1/16)(x + 1)(y + 1)

fXY (x, y) = (1/8)(x + y) ≠ fX (x) fY (y)

Hence X and Y are not independent.

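A short numerical cross-check of Problem 3.17 (a sketch assuming SciPy is available): the double integral confirms that k = 1/8 normalizes the joint pdf, and the single integral reproduces the marginal fX (x) = (x + 1)/4 at a test point.

from scipy.integrate import dblquad, quad

k = 1 / 8
# total probability: integrate k(x + y) over the square (0, 2) x (0, 2)
total = dblquad(lambda y, x: k * (x + y), 0, 2, 0, 2)[0]
print(total)                     # ~1.0, so k = 1/8 is the normalizing constant

x0 = 1.2
fx_numeric = quad(lambda y: k * (x0 + y), 0, 2)[0]
print(fx_numeric, (x0 + 1) / 4)  # both ~0.55, matching f_X(x) = (x + 1)/4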

3.18. The joint pdf of a bivariate r.v X, Y is By symmetry


given by
fY (y) = 2y    0 < y < 1
fXY (x, y) = kxy,   0 < x < 1, 0 < y < 1
           = 0,     otherwise
where k is a constant
fX (x)fY (y) = 2x × 2y
[a.] Find the value of k.
= 4xy
[b.] Are X and Y independent ? fXY (x, y) = 4xy
[c.] Find P (X + Y < 1). [?] fXY (x, y) = fX (x)fY (y)

Solution: Hence X and Y are independent


y c. P (X + Y < 1)
The details of the limits are as shown in Figure 1.1
(0,1) (b) By taking line BC. Considering y varies from 0
to 1 and x is a variable its lower limit is 0 and its
upper limit is
x x1 = 0, y1 = 1, x2 = 1, y2 = 0
(0,0) (1,0)
(a)
y2 − y1
y − y1 = (x − x1 )
y x2 − x1
0−1
B y−1 = (x − 0)
(0,1) 1−0
y − 1 = −x
x = 1−y
C x
(0,0) A (1,0)
(b) Z 1 Z 1−y Z 1 Z 1−y 
kxy dxdy = ky xydx dy
Figure 1.1 0 0 0 0
Z 1 2
x
a. The value of k = ky [ ]1−y dy
It is given that fxy (x, y) = kxy is joint pdf, then 0 2 0
Z 1
Z −∞ Z −∞ 1
= 4y (1 − y)2 dy
f (x, y) kxydxdy = 1 0 2
−∞ −∞ Z 1
= 2 y(1 − 2y + y 2 ) dy
Z 1Z 1 Z 1 Z 1 
0
kxy dxdy = k xydx dy Z 1
0 0 0 0 = 2 (y − 2y 2 + y 3 ) dy
1
x2 0
Z
= k [ y]10 dy 
y2 y3 y4
1
0 2 = 2 −2 + ]
Z 1
y 2 3 4 0
1 = k dy 1
0 2 =
y2 1 6
= k[ ]10 = k
4 4
k = 4

fXY (x, y) = 4xy    0 < x < 1    0 < y < 1


b. Are X and Y independent ?
Z 1
fX (x) = k xy dy
0
y2 1
= k[x ]
2 0
1 x
= k[x ] = 4[ ]
2 2
= 2x 0 < x < 1


2. The joint pdf fXY (x, y) = c a constant, when 0 < x < 3 and 0 < y < 3, and is 0 otherwise

[a.] What is the value of the constant c?

[b.] What are the pdf for X and Y ?

[c.] What is FXY (x, y) when 0 < x < 3 and 0 < y < 3?

[d.] What are FXY (x, ∞) and Fxy (∞, y)?

[e.] Are X and Y independent ? [?]

Solution: d.
Z xZ 3
a. What is the value of the constant c?
It is given that fXY (x, y) = c is joint pdf, then FX (x) = FXY (x, ∞) = c dudv
0 0
Z x Z 3 
Z −∞
= c du dv
f (x, y) dxdy = 1 0 0
−∞ Z x
3Z 3 3 Z 3 = c [y]30 dv
Z Z 
c dxdy = c 1dx dy 0
0 0 0 0
Z x
Z 3 = 3c dv = 3c [v]x0
= c [x]30 dy 0
0 1 x
Z 3 = 3 x = 0<x<3
9 3
1 = c 3 dy = 3c[y]30
0
1 = 9c Z 3Z y
1 FY (y) = FXY (∞, y) = c dudv
c =
9 Z 3 Z y  0 0
1
fXY (x, y) = = c , du dv
9 0 0
Z 3
b. What are the pdf for X and Y ? = c [y]y0 dv
0
Z 3 Z 3
fX (x) = c 1 dy = yc dv = yc [v]30
0 0
= c[y]30 =c×3 1 y
= 3 y = 0<y<3
1 9 3
= 0<x<3
3 e.
From the above equations it is observed that
Z 3
1 1 1
fY (y) = c 1 dx fX (x)fY (y) = × =
0 3 3 9
= c[y]30 = c × 3 1
fXY (x, y) =
1 9
= 0<y<3 fXY (x, y) = fX (x)fY (y)
3
c. Therefore X and Y are independent. Similarly it is
Z xZ y observed that
FXY (x, y) = c dudv
0
Z x Z y 0
 FX (x)FY (y) = FXY (x, y)
= c du dv
Z0 x 0
= c [u]y0 dv
0
Z x
= cy dv = cy [v]x0
0
1
= xy 0 < x < 3, 0 < y < 3,
9


3. The joint pdf fxy (x, y) = c a constant, when 0 < x < 3 and 0 < y < 4, and is 0 otherwise

[a.] What is the value of the constant c?

[b.] What are the pdf for X and Y ?

[c.] What is Fxy (x, y) when 0 < x < 3 and 0 < y < 4?

[d.] What are Fxy (x, ∞) and Fxy (∞, y)?

[e.] Are X and Y independent ? [?]

Solution: d.
a. Z xZ 4
It is given that fxy (x, y) = c is joint pdf, then FX (x) = FXY (x, ∞) = c dudv
Z −∞
f (x, y) dxdy = 1 Z x Z 4  0 0

−∞ = c du dv
0 0
Z 4Z 3 Z 4 Z 3  Z x
c dxdy = c 1dx dy = c [y]40 dv
0 0 0 0 0
Z x
Z 4
= c [x]30 dy = 4c dv = 4c [v]x0
0
0
Z 4 1 x
= 4 x = 0<x<3
1 = c 3 dy = 3c[y]40 12 3
0
1 = 12c
1
c =
12 Z 3Z y
b. FY (y) = FXY (∞, y) = c dudv
Z 4 0 0
Z 3 Z y 
fX (x) = c 1 dy
0 = c , du dv
0 0
= c[y]40 =c×4 Z 3
1 = c [y]y0 dv
= 0<x<3 0
3 Z 3
Z 3 = yc dv = yc [v]30
fY (y) = c 1 dx 0
0 1 y
= 3 y = 0<y<4
= c[y]30 =c×3 12 4
1
= 0<y<4
4 From the above equations it is observed that
c.
Z xZ y
FXY (x, y) = c dudv fX (x)fY (y) = fXY (x, y)
0 0
Z x Z y 
= c du dv
0 0 Therefore X and Y are independent. Similarly it is
Z x
observed that
= c [u]y0 dv
0
Z x
= cy dv = cy [v]x0 FX (x)FY (y) = FXY (x, y)
0
1
= xy 0 < x < 3, 0 < y < 4,
12


4. The joint pdf fxy (x, y) = c a constant, when 0 < x < 2 and 0 < y < 3, and is 0 otherwise

[a.] What is the value of the constant c?

[b.] What are the pdf for X and Y ?

[c.] What is Fxy (x, y) when 0 < x < 2 and 0 < y < 3?

[d.] What are Fxy (x, ∞) and Fxy (∞, y)?

[e.] Are X and Y independent ? [?]

Solution: d.
a. Z xZ 3
It is given that fxy (x, y) = c is joint pdf, then FX (x) = FXY (x, ∞) = c dudv
Z −∞
f (x, y) dxdy = 1 Z x Z 3  0 0

−∞ = c du dv
0 0
Z 3Z 2 Z 3 Z 2  Z x
c dxdy = c 1dx dy = c [y]30 dv
0 0 0 0 0
Z x
Z 3
= c [x]20 dy = 4c dv = 3c [v]x0
0
0
Z 3 1 x
= 4 x = 0<x<2
1 = c 2 dy = 2c[y]30 6 2
0
1 = 6c
1
c =
6 Z 2Z y
b. FY (y) = FXY (∞, y) = c dudv
Z 3 0 0
Z 2 Z y 
fX (x) = c 1 dy
0 = c du dv
0 0
= c[y]30 =c×3 Z 2
1 = c [y]y0 dv
= 0<x<2 0
2 Z 2
Z 2 = yc dv = yc [v]20
fY (y) = c 1 dx 0
0 1 y
= 3 y = 0<y<3
= c[y]20 =c×2 6 3
1
= 0<y<3
3 From the above equations it is observed that
c.
Z xZ y
FXY (x, y) = c dudv fX (x)fY (y) = fXY (x, y)
0 0
Z x Z y 
= c du dv
0 0 Therefore X and Y are independent. Similarly it is
Z x
observed that
= c [u]y0 dv
0
Z x
= cy dv = cy [v]x0 FX (x)FY (y) = FXY (x, y)
0
1
= xy 0 < x < 2, 0 < y < 3,
6


5. A bivariate pdf for the discrete random variables X and Y is

0.2δ(x)δ(y) + 0.3δ(x − 1)δ(y) + 0.3δ(x)δ(y − 1) + cδ(x − 1)δ(y − 1)

[a.] What is the value of the constant c?


[b.] What are the pdf for X and Y ?
[c.] What is FXY (x, y) when 0 < x < 1 and 0 < y < 1?
[d.] What are FXY (x, ∞) and FXY (∞, y)?
[e.] Are X and Y independent ? [?]

Solution:

fXY (x, y) = 0.2δ(x)δ(y) + 0.3δ(x − 1)δ(y) + 0.3δ(x)δ(y − 1) + cδ(x − 1)δ(y − 1)

a.
It is given that the given function is bivariate pdf then,

1 = 0.2 + 0.3 + 0.3 + c


c = 1 − 0.8
c = 0.2

Hence given function is

0.2δ(x)δ(y) + 0.3δ(x − 1)δ(y) + 0.3δ(x)δ(y − 1) + 0.2δ(x − 1)δ(y − 1)

b.

fX (x) = 0.2δ(x) + 0.3δ(x − 1) + 0.3δ(x) + 0.2δ(x − 1)


= 0.5δ(x) + 0.5δ(x − 1)

fY (y) = 0.2δ(y) + 0.3δ(y) + 0.3δ(y − 1) + 0.2δ(y − 1)


= 0.5δ(y) + 0.5δ(y − 1)

c.

FXY (x, y) = 0.2 0 < x < 1 and 0 < y < 1

d.

f (x, y) = 0.2δ(x)δ(y) + 0.3δ(x − 1)δ(y) + 0.3δ(x)δ(y − 1) + 0.2δ(x − 1)δ(y − 1)


FX (x) = 0.5u(x) + 0.5u(x − 1)
FY (y) = 0.5u(y) + 0.5u(y − 1)

e.

fX (x)fY (y) = [0.5δ(x) + 0.5δ(x − 1)][0.5δ(y) + 0.5δ(y − 1)]


= 0.25δ(x)δ(y) + 0.25δ(x − 1)δ(y) + 0.25δ(x)δ(y − 1) + 0.25δ(x − 1)δ(y − 1)

From the above equations it is observed that

fX (x)fY (y) ≠ fXY (x, y)

Therefore X and Y are not independent.


6. A bivariate pdf for the discrete random variables X and Y is

0.3δ(x)δ(y) + 0.2δ(x − 1)δ(y) + 0.3δ(x)δ(y − 1) + cδ(x − 1)δ(y − 1)

[a.] What is the value of the constant c?


[b.] What are the pdf for X and Y ?
[c.] What is FXY (x, y) when 0 < x < 1 and 0 < y < 1?
[d.] What are FXY (x, ∞) and FXY (∞, y)?
[e.] Are X and Y independent ? [?]

Solution:

fXY (x, y) = 0.3δ(x)δ(y) + 0.2δ(x − 1)δ(y) + 0.3δ(x)δ(y − 1) + cδ(x − 1)δ(y − 1)

a.
It is given that the given function is bivariate pdf then,

1 = 0.3 + 0.2 + 0.3 + c


c = 1 − 0.8
c = 0.2

Hence given function is

fXY (x, y) = 0.3δ(x)δ(y) + 0.2δ(x − 1)δ(y) + 0.3δ(x)δ(y − 1) + 0.2δ(x − 1)δ(y − 1)

b.

fX (x) = 0.3δ(x) + 0.2δ(x − 1) + 0.3δ(x) + 0.2δ(x − 1)


= 0.6δ(x) + 0.4δ(x − 1)

fY (y) = 0.3δ(y) + 0.2δ(y) + 0.3δ(y − 1) + 0.2δ(y − 1)


= 0.5δ(y) + 0.5δ(y − 1)

c.

FXY (x, y) = 0.3 0 < x < 1 and 0 < y < 1

d.

f (x, y) = 0.3δ(x)δ(y) + 0.2δ(x − 1)δ(y) + 0.3δ(x)δ(y − 1) + 0.2δ(x − 1)δ(y − 1)


FX (x) = 0.6u(x) + 0.4u(x − 1)
FY (y) = 0.5u(y) + 0.5u(y − 1)

e.

fX (x)fY (y) = [0.6δ(x) + 0.4δ(x − 1)][0.5δ(y) + 0.5δ(y − 1)]


= 0.3δ(x)δ(y) + 0.2δ(x − 1)δ(y) + 0.3δ(x)δ(y − 1) + 0.2δ(x − 1)δ(y − 1)

From the above equations it is observed that

fX (x)fY (y) = fXY (x, y)

Therefore X and Y are independent.


7. A bivariate pdf for the discrete random variables X and Y is

0.2δ(x)δ(y) + 0.3δ(x − 1)δ(y) + 0.2δ(x)δ(y − 1) + cδ(x − 1)δ(y − 1)

[a.] What is the value of the constant c?

[b.] What are the pdf for X and Y ?

[c.] What is FXY (x, y) when 0 < x < 1 and 0 < y < 1?

[d.] What are FXY (x, ∞) and FXY (∞, y)?

[e.] Are X and Y independent ? [?]

Solution:

fXY (x, y) = 0.2δ(x)δ(y) + 0.3δ(x − 1)δ(y) + 0.2δ(x)δ(y − 1) + cδ(x − 1)δ(y − 1)

a.
It is given that the given function is bivariate pdf then,

1 = 0.2 + 0.3 + 0.2 + c


c = 1 − 0.7 = 0.3

Hence given function is

fXY (x, y) = 0.2δ(x)δ(y) + 0.3δ(x − 1)δ(y) + 0.2δ(x)δ(y − 1) + 0.3δ(x − 1)δ(y − 1)

b.

fX (x) = 0.2δ(x) + 0.3δ(x − 1) + 0.2δ(x) + 0.3δ(x − 1)


= 0.4δ(x) + 0.6δ(x − 1)

fY (y) = 0.2δ(y) + 0.3δ(y) + 0.2δ(y − 1) + 0.3δ(y − 1)


= 0.5δ(y) + 0.5δ(y − 1)

c.

FXY (x, y) = 0.2 0 < x < 1 and 0 < y < 1

d.

f (x, y) = 0.2δ(x)δ(y) + 0.3δ(x − 1)δ(y) + 0.2δ(x)δ(y − 1) + 0.3δ(x − 1)δ(y − 1)


FX (x) = 0.4u(x) + 0.6u(x − 1)
FY (y) = 0.5u(y) + 0.5u(y − 1)

e.

fX (x)fY (y) = [0.4δ(x) + 0.6δ(x − 1)][0.5δ(y) + 0.5δ(y − 1)]


= 0.2δ(x)δ(y) + 0.3δ(x − 1)δ(y) + 0.2δ(x)δ(y − 1) + 0.3δ(x − 1)δ(y − 1)

From the above equations it is observed that

fX (x)fY (y) = fXY (x, y)

Therefore X and Y are independent.


Example 3.5. Given: a bivariate pdf for the continuous random variables X and Y is

fXY (x, y) = [1/(1.4283π)] exp[−(x² − 1.4xy + y²)/1.02]    − ∞ < x, y < ∞
[a.] What are the pdf for X and Y ?
R∞ R∞
[b.] FXY (∞, ∞) = −∞ −∞ fXY (x, y)dxdy = 1

[c.] Are X and Y independent ? [?]

Solution:
a) The pdf for X and Y
a = 1, b = 1.4, c = 1

x2 − 1.4xy + y 2 = y 2 − 2 × 0.7xy + 0.49x2 + 0.51x2


= (y − 0.7x)2 + 0.51x2

(y − 0.7x)2 + 0.51x2 )
 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.4283π 1.02

Z ∞
fX (x) = fXY (xy)dy
−∞
Z ∞
(y − 0.7x)2
 
1 −0.5x2
= e exp − dy
1.4283π −∞ 1.02

u y − 0.7x
√ = √
2 1.02
r
1.02
u = y − 0.7x
2
r
1.02
du = dy
2
Also
Z ∞
1 z2
√ e− 2 dz = 1

Z−∞
∞ √
z2
e− 2 dz = 2π
−∞


r  2
Z
1 2 1.02 u
fX (x) = e−0.5x exp − du
1.4283π −∞ 2 2
1.02 √
r
1 −0.5x2
= e 2π
1.4283π 2
1 2
= √ e−0.5x

1 −x2 /2
= √ e

x2 − 1.4xy + y 2 = x2 − 2 × 0.7xy + 0.49y 2 + 0.51y 2


= (x − 0.7y)2 + 0.51y 2


a) The pdf for Y

((x − 0.7y)2 + 0.51y 2 )


 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.4283π 1.02

Z ∞
fY (y) = fXY (xy)dy
−∞
Z ∞
(x − 0.7y)2
 
1 −0.5y 2
= e exp − dx
1.4283π −∞ 1.02

u x − 0.7y
√ = √
2 1.02
r
1.02
u = x − 0.7y
2
r
1.02
du = dx
2
Also
Z ∞
1 z2
√ e− 2 dz = 1

Z−∞
∞ √
z2
e− 2 dz = 2π
−∞


r
u2
Z  
1 2 1.02
fY (y) = e−0.5y exp − du
1.4283π −∞ 2 2
1.02 √
r
1 −0.5y 2
= e 2π
1.4283π 2
1 2
= √ e−0.5y

1 −y2 /2
= √ e

b. R∞ R∞
FXY (∞, ∞) = −∞ −∞ fXY (x, y)dxdy = 1

Z ∞ Z ∞
fXY (xy)dxdy =
−∞ −∞
Z ∞ Z ∞ 
= fX (x) fY (y)dy dx
−∞
Z ∞ Z−∞
∞ 
1 −y2 /2
= fX (x) √ e dy dx

Z−∞

−∞

= fX (x)dx
Z−∞

1 2
= √ e−x /2 dx
−∞ 2π
= 1

c. Are X and Y independent


(x2 − 1.4xy + y 2 )
 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.4283π 1.02

  
1 2 1 2
fX (x)fY (y) = √ e−x /2 √ e−y /2
2π 2π
1 − x2 +y2
= √ e 2

fXY (x, y) ≠ fX (x)fY (y)

Bivariate random variables X and Y are not independent

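The conclusions of Example 3.5 can be re-checked numerically; the sketch below (assuming NumPy/SciPy) integrates the joint Gaussian pdf over one variable to recover the standard normal marginal and then tests the factorization at an arbitrary point (x0, y0) chosen here only for illustration.

import numpy as np
from scipy.integrate import quad

def f_xy(x, y):
    # joint pdf of Example 3.5
    return np.exp(-(x**2 - 1.4 * x * y + y**2) / 1.02) / (1.4283 * np.pi)

def std_normal(t):
    return np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)

x0, y0 = 0.8, -0.3
fx0 = quad(lambda y: f_xy(x0, y), -np.inf, np.inf)[0]
fy0 = quad(lambda x: f_xy(x, y0), -np.inf, np.inf)[0]
print(fx0, std_normal(x0))      # the marginal of X matches N(0, 1)
print(f_xy(x0, y0), fx0 * fy0)  # joint != product, so X and Y are dependent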
8. Given: a bivariate pdf for the continuous random variables X and Y is

fXY (x, y) = [1/(1.4283π)] exp[−(x² + 1.4xy + y²)/1.02]    − ∞ < x, y < ∞

[a.] What are the pdf for X and Y ?


R∞ R∞
[b.] FXY (∞, ∞) −∞ −∞ fXY (x, y)dxdy = 1

[c.] Are X and Y independent ? [?]

Solution: a = 1, b = 1.4, c = 1
a.

x2 + 1.4xy + y 2 = y 2 + 2 × 0.7xy + (0.7x)2 + 0.51x2


= (y + 0.7x)2 + 0.51x2

(y + 0.7x)2 + 0.51x2 )
 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.4283π 1.02

Z ∞
fX (x) = fXY (xy)dy
−∞
Z ∞
(y + 0.7x)2
 
1 −0.5x2
= e exp − dy
1.4283π −∞ 1.02

u y + 0.7x
√ = √
2 1.02
r
1.02
u = y + 0.7x
2
r
1.02
du = dy
2
Also
Z ∞
1 z2
√ e− 2 dz = 1

Z−∞
∞ √
z2
e− 2 dz = 2π
−∞



r  2
Z
1 2 1.02 u
fX (x) = e−0.5x exp − du
1.4283π 2
−∞ 2
1.02 √
r
1 −0.5x2
= e 2π
1.4283π 2
1 2
= √ e−0.5x

1 −x2 /2
= √ e

x2 + 1.4xy + y 2 = x2 + 2 × 0.7xy + (0.7y)2 + 0.51y 2


= (x + 0.7y)2 + 0.51y 2

(x + 0.7y)2 + 0.51y 2 )
 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.4283π 1.02

Z ∞
fY (y) = fXY (xy)dy
−∞
Z ∞
(x + 0.7y)2
 
1 −0.5y 2
= e exp − dx
1.4283π −∞ 1.02

u x + 0.7y
√ = √
2 1.02
r
1.02
u = x + 0.7y
2
r
1.02
du = dx
2
Also
Z ∞
1 z2
√ e− 2 dz = 1

Z−∞
∞ √
z2
e− 2 dz = 2π
−∞


r
u2
Z  
1 2 1.02
fY (y) = e−0.5y exp − du
1.4283π −∞ 2 2
1.02 √
r
1 −0.5y 2
= e 2π
1.4283π 2
1 2
= √ e−0.5y

1 2
= √ e−y /2


b.
Z ∞ Z ∞
fXY (xy)dxdy =
−∞ −∞
Z ∞ Z ∞ 
= fX (x) fY (y)dy dx
−∞
Z ∞ Z−∞
∞ 
1 −y2 /2
= fX (x) √ e dy dx

Z−∞

−∞

= fX (x)dx
Z−∞

1 2
= √ e−x /2 dx
−∞ 2π
= 1

c.
(x2 + 1.4xy + y 2 )
 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.4283π 1.02

  
1 2 1 2
fX (x)fY (y) = √ e−x /2 √ e−y /2
2π 2π
1 − x2 +y2
= √ e 2

fXY (x, y) ≠ fX (x)fY (y)

Bivariate random variables X and Y are not independent

9. Given: a bivariate pdf for the continuous random variables X and Y is

fXY (x, y) = [1/(1.9079π)] exp[−(x² − 0.6xy + y²)/1.82]    − ∞ < x, y < ∞

[a.] What are the pdf for X and Y ?


R∞ R∞
[b.] FXY (∞, ∞) −∞ −∞ fXY (x, y)dxdy = 1

[c.] Are X and Y independent ? [?]

Solution: a = 1, b = 0.6, c = 1
a.

x2 − 0.6xy + y 2 = y 2 − 2 × 0.3xy + (0.3x)2 + 0.91x2


= (y − 0.3x)2 + 0.91x2

(y − 0.3x)2 + 0.91x2 )
 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.9079π 1.82

Z ∞
fX (x) = fXY (xy)dy
−∞
Z ∞
(y − 0.3x)2
 
1 2
= e−0.5x exp − dy
1.9079π −∞ 1.82


u y − 0.3x
√ = √
2 1.82
r
1.82
u = y − 0.3x
2
r
1.82
du = dy
2
Also
Z ∞
1 z2
√ e− 2 dz = 1

Z−∞
∞ √
z2
e− 2 dz = 2π
−∞


r  2
Z
1 2 1.82 u
fX (x) = e−0.5x exp − du
1.9079π 2
−∞ 2
1.82 √
r
1 −0.5x2
= e 2π
1.9079π 2
1 2
= √ e−0.5x

1 −x2 /2
= √ e

x2 − 0.6xy + y 2 = x2 − 2 × 0.3xy + (0.3y)2 + 0.91y 2


= (x − 0.3y)2 + 0.91y 2

(x − 0.3y)2 + 0.91y 2 )
 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.9079π 1.82

Z ∞
fY (y) = fXY (xy)dy
−∞
Z ∞
(x − 0.3y)2
 
1 −0.5y 2
= e exp − dx
1.9079π −∞ 1.82

u x − 0.3y
√ = √
2 1.92
r
1.02
u = x − 0.3y
2
r
1.92
du = dx
2
Also
Z ∞
1 z2
√ e− 2 dz = 1

Z−∞
∞ √
z2
e− 2 dz = 2π
−∞



r
u2
Z  
1 2 1.92
fY (y) = e−0.5y exp − du
1.9079π 2
−∞ 2
1.92 √
r
1 −0.5y 2
= e 2π
1.9079π 2
1 2
= √ e−0.5y

1 −y2 /2
= √ e

b.
Z ∞ Z ∞
fXY (xy)dxdy =
−∞ −∞
Z ∞ Z ∞ 
= fX (x) fY (y)dy dx
−∞
Z ∞ Z−∞
∞ 
1 2
= fX (x) √ e−y /2 dy dx

Z−∞

−∞

= fX (x)dx
Z−∞

1 2
= √ e−x /2 dx
−∞ 2π
= 1

c.
(x2 − 0.6xy + y 2 )
 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.9079π 1.82

  
1 2 1 2
fX (x)fY (y) = √ e−x /2 √ e−y /2
2π 2π
1 x2 +y 2
= √ e− 2

fXY (x, y) ≠ fX (x)fY (y)

Bivariate random variables X and Y are not independent

10. Given: a bivariate pdf for the continuous random variables X and Y is

fXY (x, y) = [1/(1.7321π)] exp[−(x² + 1.0xy + y²)/1.5]    − ∞ < x, y < ∞

[a.] What are the pdf for X and Y ?


R∞ R∞
[b.] FXY (∞, ∞) −∞ −∞ fXY (x, y)dxdy = 1

[c.] Are X and Y independent ? [?]

Solution:
a.

x2 + 1.0xy + y 2 = y 2 + 2 × 0.5xy + (0.5x)2 + 0.75x2


= (y + 0.5x)2 + 0.75x2


(y + 0.5x)2 + 0.75x2 )
 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.7321π 1.50

Z ∞
fX (x) = fXY (xy)dy
−∞
Z ∞
(y + 0.5x)2
 
1 −0.5x2
= e exp − dy
1.7321π −∞ 1.50

u y + 0.5x
√ = √
2 1.50
r
1.50
u = y + 0.5x
2
r
1.50
du = dy
2
Also
Z ∞
1 z2
√ e− 2 dz = 1

Z−∞
∞ √
z2
e− 2 dz = 2π
−∞


r  2
Z
1 2 1.50 u
fX (x) = e−0.5x exp − du
1.7321π −∞ 2 2
1.50 √
r
1 −0.5x2
= e 2π
1.7321π 2
1 2
= √ e−0.5x

1 −x2 /2
= √ e

x2 + 1.0xy + y 2 = x2 + 2 × 0.5xy + (0.5y)2 + 0.75y 2


= (x + 0.5y)2 + 0.75y 2

(x + 0.5y)2 + 0.75y 2 )
 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.7321π 1.50

Z ∞
fX (x) = fXY (xy)dy
−∞
Z ∞
(x + 0.5y)2
 
1 −0.5y 2
= e exp − dy
1.7321π −∞ 1.50

u x + 0.5y
√ = √
2 1.50
r
1.50
u = x + 0.5y
2
r
1.50
du = dx
2


Also
Z ∞
1 z2
√ e− 2 dz = 1

Z−∞
∞ √
z2
e− 2 dz = 2π
−∞


r  2
Z
1 2 1.50 u
fX (x) = e−0.5x exp − du
1.7321π −∞ 2 2
1.50 √
r
1 −0.5x2
= e 2π
1.7321π 2
1 2
= √ e−0.5x

1 −x2 /2
= √ e

b.
Z ∞ Z ∞
fXY (xy)dxdy =
−∞ −∞
Z ∞ Z ∞ 
= fX (x) fY (y)dy dx
−∞
Z ∞ Z−∞
∞ 
1 2
= fX (x) √ e−y /2 dy dx

Z−∞

−∞

= fX (x)dx
Z−∞

1 2
= √ e−x /2 dx
−∞ 2π
= 1

c.
(x2 + 1.0xy + y 2 )
 
1
fXY (xy) = exp − − ∞ < x, y < ∞
1.7321π 1.50

  
1 2 1 2
fX (x)fY (y) = √ e−x /2 √ e−y /2
2π 2π
1 − x2 +y2
= √ e 2

fXY (x, y) ≠ fX (x)fY (y)

Bivariate random variables X and Y are not independent

11 As shown in Figure is a region in the x, y plane where the bivariate pdf fXY (xy) = c.
Elsewhere the pdf is 0.

[a.] What value must c have?

[b.] Evaluate FXY (1, 1)

[c.] Find the pdfs fX (x) and fY (y).

[d.] Are X and Y independent ? [?]


y Figure 1.2
Integration Limits
(-2, 2 ) (2, 2 )
A B By taking line CB. Considering x varies from -2 to 2
and y is a variable; its upper limit is 2 and its lower limit is the line through
x1 = −2, y1 = −2, x2 = 2, y2 = 2
x
y2 − y1
y − y1 = (x − x1 )
x2 − x1
2 − (−2)
y − (−2) = (x − (−2))
C 2 − (−2)
(-2, -2 ) y+2 = x+2
y = x

Solution:
a.
Z 2 Z 2
FXY (2, 2) = c dydx
−2 x
Z 2 Z 2 
1 = c dy dx
−2 x
2 2 2
x2
Z Z 
1 = c [y]2x dx
=c [2 − x]dx = c 2x −
−2 −2 2 −2
c 2 c
4x − x2 −2 = [4 × 2 − (2)2 ] − [4 × (−2) − (−2)2 ]
  
=
2 2
c c
= [[8 − 4] − [−8 − 4]] = [4 + 12]
2 2
1 = 8c
1
c =
8

b.

y Figure 1.3
Integration Limits
(-2,2 ) (2,2 )
A By taking line CD. Considering x varies from -2 to 2
B
and y is a variable its upper limit is 1 and its lower
1 D lower limit is
(1,1 )
x1 = −2, y1 = −2, x2 = 1, y2 = 1
x
1 1 − y1
y − y1 = (x − x1 )
1 − x1
1 − (−2)
y − (−2) = (x − (−2))
C 1 − (−2)
(-2,-2 ) y+2 = x+2
y = x


Z 1 Z 1
FXY (1, 1) = c dydx
−2 x
Z 1 Z 1 
= c dy dx
−2 x
1 1 1 1
x2
Z Z  Z
= c [y]1x dx
=c [1 − x]dx = c x− dx
−2 −2 −2 2 −2
c 1 c
2x − x2 −2 = [2 × 1 − (1)2 ] − [2 × (−2) − (−2)2 ]

=
2 2
c c
= [[2 − 1] − [−4 − 4]] = [1 + 8]
2 2
= 9c
9
=
16
Z y
c. fY (y) = c dx
Z 2 −2
fX (x) = c dy = c [x]y−2 = c [y + 2]
x
= c [y]2x = c [2 − x] 1
= [[y + 2]
1 8
= [2 − x]
8

1
fXY (x, y) =
8
1 1
fX (x)fY (y) = [2 − x] [y + 2]
8 8
fXY (x, y) ≠ fX (x)fY (y), therefore X and Y are not independent

12 As shown in Figure is a region in the x, y plane where the bivariate pdf fXY (xy) = c.
Elsewhere the pdf is 0.

[a.] What value must c have?

[b.] Evaluate FXY (1, 1)

[c.] Find the pdfs fX (x) and fY (y).

[d.] Are X and Y independent ? [?]

y Figure 1.4
Integration Limits
(-2, 2 ) (2, 2 )
A By taking line BC. Considering x varies from -2 to 2
B
and y is a variable; its upper limit is 2 and its lower limit is the line through
x1 = −2, y1 = 2, x2 = 2, y2 = −2
x
y2 − y1
y − y1 = (x − x1 )
x2 − x1
−2 − 2
y−2 = (x − (−2))
C 2+2
(2, -2 ) y − 2 = −x − 2
y = −x

Solution:


a.
Z 2 Z 2
FXY (2, 2) = c dydx
−2 −x
Z 2 Z 2 
1 = c dy dx
−2 −x
2 2 2
x2
Z Z 
1 = c [y]2−x dx
=c [2 + x]dx = c 2x −
−2 −2 2 −2
c 2 c
4x + x2 −2 = [4 × 2 − (2)2 ] − [4 × (−2) − (−2)2 ]
  
=
2 2
c c
= [[8 − 4] − [−8 − 4]] = [4 + 12]
2 2
1 = 8c
1
c =
8
b.

y Integration Limits
By taking line DF. Considering x varies from -1 to 1
(-2, 2 ) (2, 2 ) and y is a variable its upper limit is 1 and its lower
A
B lower limit is
D (1, 1 ) E x1 = −1, y1 = 1, x2 = 1, y2 = −1
(-1, 1 )
x y2 − y1
y − y1 = (x − x1 )
x2 − x1
F −1 − 1
(1, -1 )
y−1 = (x − (−1))
1+1
C y − 1 = −x − 1
(2, -2 )
y = −x

Z 1 Z 1
FXY (1, 1) = c dydx
−1 −x
Z 1 Z 1 
= c dy dx
−1 −x
1 1 1
x2
Z Z 
= c [y]1−x dx
=c [1 + x]dx = c x + dx
−1 −1 2 −1
c 1 c
2x + x2 −1 = [2 × 1 + (1)1 ] − [2 × (−1) + (−1)2 ]

=
2 2
c c
= [[2 + 1] − [−2 + 1]] = [3 + 1]
2 2
1
= 2c = 2
8
1
=
4
c.
Z 2
fX (x) = c dy
−x
= c [y]2−x = c [2 + x]
1
= [2 + x]
8


Z 2
fY (y) = c dx
−y
= c [x]2−y = c [2 + y]
1
= [2 + y]
8

1 1
fX (x)fY (y) = [2 + x] [2 + y]
8 8
e.

fX (x)fY (y) ≠ fXY (x, y)

Therefore X and Y are not independent.

13 As shown in Figure is a region in the x, y plane where the bivariate pdf fXY (xy) = c.
Elsewhere the pdf is 0.

[a.] What value must c have?

[b.] Evaluate FXY (1, 1)

[c.] Find the pdfs fX (x) and fY (y).

[d.] Are X and Y independent ? [?]

y Integration Limits
By taking line BC. Considering x varies from -2 to 2
(2, 2 ) and y is a variable its lower limit is -2 and its upper
C limit is
x1 = −2, y1 = −2, x2 = 2, y2 = 2
y2 − y1
x y − y1 = (x − x1 )
x2 − x1
2 − (−2)
y − (−2) = (x − (−2))
2 − (−2)
B A y+2 = x+2
(-2, -2 ) (2, -2 )
y = x

Solution:


a.
Z 2 Z x
FXY (2, 2) = c dydx
−2 −2
Z 2 Z x 
1 = c dy dx
−2 −2
2 2 2
x2
Z Z 
1 = c [y]x−2 dx
=c [x + 2]dx = c + 2x
−2 −2 2 −2
c 2 2 c 2 2

= x + 4x −2 = [(2) + 4 × 2] − [(−2) + 4 × (−2)]
2 2
c c
= [[4 + 8] − [4 − 8]] = [12 − 4]
2 2
1 = 8c
1
c =
8
b.
y Integration Limits
By taking line BD. Considering x varies from -2 to 1
(2, 2 ) and y is a variable its lower limit is -2 and its upper
C
limit is
(1, 1 )
x1 = −2, y1 = −2, x2 = 1, y2 = 1
D

x y2 − y1
y − y1 = (x − x1 )
x2 − x1
1 − (−2)
y − (−2) = (x − (−2))
1 − (−2)
B A
y+2 = x+2
(-2, -2 ) (2, -2 )
y = x

Z 1 Z x
FXY (1, 1) = c dydx
−2 −2
Z 1 Z x 
= c dy dx
−2 −2
1 1 1
x2
Z Z 
= c [y]x−2 dx
=c [x + 2]dx = c +x dx
−2 −2 2 −2
c 2 1 c
[(1)1 + 4 × 1] − [(−2)2 + 4 × (−2)+]

= x + 4x −2 =
2 2
c c
= [[1 + 4] − [4 − 8]] = [5 + 4]
2 2
9
=
16
c.
Z x
fX (x) = c dy
−2
= c [y]x−2
= c [x + 2]
1
= [x + 2]
8
Z 2
fY (y) = c dx
y
= c [x]2y = c [2 − y]
1
= [2 − y]
8


1 1
fX (x)fY (y) = [x + 2] [2 − y]
8 8
e.

fX (x)fY (y) ≠ fXY (x, y)

Therefore X and Y are not independent.

14 A bivariate random variable has the following cdf.

FXY (xy) = c(x + 1)2 (y + 1)2 (−1 < x < 4) and (−1 < y < 2)

outside of the given intervals, the bivariate cdf is as required by theory

[a.] What value must c have?

[b.] Find the bivariate pdf

[c.] Find the cdfs FX (x) and FY (y).

[d.] Evaluate P {(X ≤ 2) ∩ (Y ≤ 1)}

[e.] Are the bivariate random variables independent ? [?]

Solution:
a.

FXY (4, 2) = c(x + 1)2 (y + 1)2


1 = c(4 + 1)2 (2 + 1)2 = (25)(9)
1 = c225
1
c =
225
b. Bivariate pdf

∂2
c(x + 1)2 (y + 1)2 = c4(x + 1)(y + 1)
∂x∂y
4
= (x + 1)(y + 1)
225
c. The cdfs FX (x) and FY (y).

FX (x) = FXY (x, ∞) = FXY (x, 2)


= c(x + 1)2 (2 + 1)2
9
= (x + 1)2 (−1 < x < 4)
225

FY (y) = FXY (∞, y) = FXY (4, y)


= c(4 + 1)2 (y + 1)2
25
= (y + 1)2 (−1 < y < 2)
225


d. P {(X ≤ 2) ∩ (Y ≤ 1)}

P {(X ≤ 2) ∩ (Y ≤ 1)} = FXY (2, 1)


= c(x + 1)2 (y + 1)2
= c(2 + 1)2 (1 + 1)2
9×4
=
225
4
=
25
e.

FXY (xy) = c(x + 1)2 (y + 1)2


9 25
FX (x)FY (y) = (x + 1)2 (y + 1)2
225 225
1
= (x + 1)2 (y + 1)2
225
FX (x)FY (y) = FXY (xy)

Therefore X and Y are independent.

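A quick numerical sketch for Problem 14 (assuming SciPy): the pdf obtained above by differentiating the cdf, 4c(x + 1)(y + 1) with c = 1/225, should integrate to 1 over (−1, 4) × (−1, 2), and integrating it over (−1, 2) × (−1, 1) should reproduce P {(X ≤ 2) ∩ (Y ≤ 1)} = 4/25.

from scipy.integrate import dblquad

c = 1 / 225
pdf = lambda y, x: 4 * c * (x + 1) * (y + 1)   # bivariate pdf from part b

print(dblquad(pdf, -1, 4, -1, 2)[0])   # ~1.0, consistent with c = 1/225
print(dblquad(pdf, -1, 2, -1, 1)[0])   # ~0.16 = 4/25, matching F_XY(2, 1)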
15 A bivariate random variable has the following cdf.

FXY (xy) = c(x + 1)2 (y + 1)2 (−1 < x < 3) and (−1 < y < 4)

outside of the given intervals, the bivariate cdf is as required by theory


[a.] What value must c have?
[b.] Find the bivariate pdf
[c.] Find the cdfs FX (x) and FY (y).
[d.] Evaluate P {(X ≤ 2) ∩ (Y ≤ 1)}
[e.] Are the bivariate random variables independent ? [?]

Solution:
a.

FXY (3, 4) = c(x + 1)2 (y + 1)2


1 = c(3 + 1)2 (4 + 1)2 = (16)(25)
1 = c400
1
c =
400
b. Bivariate pdf

∂2
c(x + 1)2 (y + 1)2 = c4(x + 1)(y + 1)
∂x∂y
1
= (x + 1)(y + 1)
100
c. The cdfs FX (x) and FY (y).

FX (x) = FXY (x, ∞) = FXY (x, 4)


= c(x + 1)2 (4 + 1)2
25
= (x + 1)2 (−1 < x < 3)
400


FY (y) = FXY (∞, y) = FXY (3, y)


= c(3 + 1)2 (y + 1)2
16
= (y + 1)2 (−1 < y < 4)
400
d. P {(X ≤ 2) ∩ (Y ≤ 1)}

P {(X ≤ 2) ∩ (Y ≤ 1)} = FXY (2, 1)


= c(x + 1)2 (y + 1)2
= c(2 + 1)2 (1 + 1)2
9×4
=
400
9
=
100
e.

FXY (xy) = c(x + 1)2 (y + 1)2


25 16
FX (x)FY (y) = (x + 1)2 (y + 1)2
400 400
1
= (x + 1)2 (y + 1)2
400
FX (x)FY (y) = FXY (xy)

Therefore X and Y are independent.

16 A bivariate random variable has the following cdf.

FXY (xy) = c(x + 1)2 (y + 1)2 (−1 < x < 3) and (−1 < y < 2)

outside of the given intervals, the bivariate cdf is as required by theory


[a.] What value must c have?

[b.] Find the bivariate pdf

[c.] Find the cdfs FX (x) and FY (y).

[d.] Evaluate P {(X ≤ 2) ∩ (Y ≤ 1)}

[e.] Are the bivariate random variables independent ? [?]

Solution:
a.

FXY (3, 2) = c(x + 1)2 (y + 1)2


1 = c(3 + 1)2 (2 + 1)2 = (16)(9)
1 = c144
1
c =
144
b. Bivariate pdf

∂2
c(x + 1)2 (y + 1)2 = c4(x + 1)(y + 1)
∂x∂y
4
= (x + 1)(y + 1)
144


c. The cdfs FX (x) and FY (y).

FX (x) = FXY (x, ∞) = FXY (x, 2)


= c(x + 1)2 (2 + 1)2
9
= (x + 1)2 (−1 < x < 3)
144

FY (y) = FXY (∞, y) = FXY (3, y)


= c(3 + 1)2 (y + 1)2
16
= (y + 1)2 (−1 < y < 2)
144
d. P {(X ≤ 2) ∩ (Y ≤ 1)}

P {(X ≤ 2) ∩ (Y ≤ 1)} = FXY (2, 1)


= c(x + 1)2 (y + 1)2
= c(2 + 1)2 (1 + 1)2
9×4
=
144
36
=
144
e.

FXY (xy) = c(x + 1)2 (y + 1)2


9 16
FX (x)FY (y) = (x + 1)2 (y + 1)2
144 144
1
= (x + 1)2 (y + 1)2
144
FX (x)FY (y) = FXY (xy)

Therefore X and Y are independent.

Note: Entire material is taken from different text books or from the Internet (different
websites). Slightly it is modified from the original content. It is not for any commercial
purpose. It is used to teach students. Suggestions are always encouraged.


1.2 Bivariate-Expectations
The expectation operation for continuous random variables X and Y is defined as:

E[g(X, Y )] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) fXY (x, y) dx dy

where g(x, y) is an arbitrary function of two variables. If g(x, y) is a function of the single random variable X only, then

E[g(X)] = ∫_{−∞}^{∞} g(x) [ ∫_{−∞}^{∞} fXY (x, y) dy ] dx = ∫_{−∞}^{∞} g(x) fX (x) dx

The correlation of X and Y is the expected value of the product of X and Y :

E[XY ] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy fXY (x, y) dx dy

Expectation can be approximated by sample averaging, therefore

E[XY ] ≈ (1/n) Σ_{i=1}^{n} xi yi

Properties of correlation

1. Positive correlation: if the average product tends to be positive, i.e.,

(1/n) Σ_{i=1}^{n} xi yi > 0

2. Negative correlation: if the average product tends to be negative, i.e.,

(1/n) Σ_{i=1}^{n} xi yi < 0

3. Uncorrelated: if the average product tends to zero, i.e.,

(1/n) Σ_{i=1}^{n} xi yi = 0

then X and Y are said to be uncorrelated with each other.
If the bivariate random variables do not have means of 0, then the correlation is generalized to the covariance, denoted
Cov[XY ] and expressed as

Cov[XY ] = E[(X − µX )(Y − µY )]
         = E[XY − µX Y − µY X + µX µY ]
         = E[XY ] − µX E[Y ] − µY E[X] + µX µY
         = E[XY ] − µX µY

Uncorrelated X and Y : if

Cov[XY ] = 0

then X and Y are uncorrelated with each other, and

E[XY ] = µX µY

Orthogonal X and Y : if

E[XY ] = 0

then X and Y are orthogonal to each other, and

Cov[XY ] = −µX µY

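The sample-average versions of these definitions are easy to compute; the sketch below (synthetic data, assuming NumPy) estimates E[XY ], Cov[XY ] and the correlation coefficient from n paired observations.

import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(2.0, 1.5, size=n)
y = 0.8 * x + rng.normal(0.0, 1.0, size=n)   # a positively correlated pair

exy = np.mean(x * y)                  # sample estimate of E[XY]
cov = exy - x.mean() * y.mean()       # Cov[XY] = E[XY] - mu_X mu_Y
rho = cov / (x.std() * y.std())       # correlation coefficient
print(exy, cov, rho)                  # rho lies in [-1, 1], here about +0.77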

Correlated X and Y :
A correlation coefficient denoted ρXY is defined as

ρXY = Cov[XY ] / (σX σY )

To see that |ρXY | ≤ 1, note that

E[ ( (X − µX )/σX ± (Y − µY )/σY )² ] ≥ 0

E[ ( (X − µX )/σX )² ± 2 (X − µX )(Y − µY )/(σX σY ) + ( (Y − µY )/σY )² ] ≥ 0

1 ± 2ρXY + 1 ≥ 0

∓ρXY ≤ 1

|ρXY | ≤ 1

Consider a relation between X and Y is defined as

Y = aX + b

then

Cov[XY ] = E[(X − µX )(aX + b − aµX − b)]


= E[(X − µX )a(X − µX )]
= aE[(X − µX )2 ]
2
= aσX

The standard deviation of Y is

σY = √(a2 σX2 ) = |a| σX

and therefore

ρXY = aσX2 / (σX · |a| σX ) = ±1


17. The mean and variance of random b.


variable X are -2 and 3; the mean and
variance of Y are 3 and 5. The covariance
E[XY ] = CovXY + µX µY
Cov[XY ] = −0.8. What are the correlation
coefficient ρXY and the correlation E[XY ]? = 2.7111 + (−2)(3)
[?] = −3.2889
Solution:
a. Correlation coefficient ρXY is
19. The mean and variance of random
Cov[XY ] variable X are -2 and 3; the mean and
ρXY =
σX σY variance of Y are 3 and 5. The correlation
−0.8 E[XY ] = −8.7. What are the Cov[XY ] and the
= √
3×5 correlation coefficient ρXY ? [?]
= −0.2066
Solution:
b.
a. Correlation coefficient ρXY is
E[XY ] = Cov[XY ] + µX µY
= −0.8 + (−2)(3)
= −6.8 CovXY = E[XY ] − µX µY
= −8.7 − (−2)(3)
= −2.7
18. The mean and variance of random
variable X are -2 and 3; the mean and
variance of Y are 3 and 5. The correlation b.
coefficient ρXY = 0.7. What are the Cov[XY ]
and the correlation E[XY ]? [?] CovXY
ρXY =
σX σY
Solution:
−2.7
a. Correlation coefficient ρXY is = p
(3)(5)
covXY = ρXY σX σY = −0.6971

= 0.7 3 × 5
= 2.7111

20. X is random variable µX = 4 σX = 5. Y is a random variable, µY = 6 σY = 7. The


correlation coefficient is -0.7. If U = 3X + 2Y . What are the V ar[U ], Cov[U X], Cov[U Y ]? [?]
Solution:
a. V ar[U ]

CovXY = ρXY σX σY
= (−0.7)(5)(7)
= −24.5

σU2 = E[(U − µU )2 ]
= E[9(X − µX )2 + 12(X − µX )(Y − µY ) + 4(Y − µY )2 ]
= 9σX2 + 12Cov[XY ] + 4σY2
= 9 × (52 ) + 12(−24.5) + 4(72 )
= 225 − 294 + 196
= 127


b. Cov[U X]

Cov[U X] = E[(U − µU )(X − µX )]


= E[{3(X − µX ) + 2(Y − µY )}(X − µX )]
2
= 3σX + 2Cov[XY ]
= 3(52 ) + 2(−24.5) = 75 − 49
= 26

c. Cov[U Y ]

Cov[U Y ] = E[(U − µU )(Y − µY )]


= E[{3(X − µX ) + 2(Y − µY )}(Y − µY )]
= 3Cov[XY ] + 2σY2
= 3(−24.5) + 2(72 ) = −73.5 + 98
= 24.5

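Problem 20 can also be checked by simulation; the sketch below (assuming NumPy, and using a Gaussian pair purely for convenience, since only the first and second moments matter) draws X and Y with the stated means, variances and ρ = −0.7, forms U = 3X + 2Y , and compares the sample values with V ar[U ] = 127, Cov[U X] = 26 and Cov[U Y ] = 24.5.

import numpy as np

rng = np.random.default_rng(1)
rho, sx, sy = -0.7, 5.0, 7.0
cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
x, y = rng.multivariate_normal([4.0, 6.0], cov, size=200_000).T
u = 3 * x + 2 * y

print(u.var())              # ~127
print(np.cov(u, x)[0, 1])   # ~26
print(np.cov(u, y)[0, 1])   # ~24.5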
21. X is random variable µX = 4 σX = 5. Y is a random variable, µY = 6 σY = 7. The


correlation coefficient is 0.2. If U = 3X + 2Y . What are the V ar[U ], Cov[U X] and Cov[U Y ]?
[?]
Solution:
a. V ar[U ]

CovXY = ρXY σX σY
= (0.2)(5)(7)
= 7

σU2 = E[(U − µU )2 ]
= E[9(X − µX )2 + 12(X − µX )(Y − µY ) + 4(Y − µY )2 ]
= 9σX2 + 12Cov[XY ] + 4σY2
= 9 × (52 ) + 12(7) + 4(72 )
= 225 + 84 + 196
= 505

b. Cov[U X]

Cov[U X] = E[(U − µU )(X − µX )]


= E[{3(X − µX ) + 2(Y − µY )}(X − µX )]
2
= 3σX + 2Cov[XY ]
= 3(52 ) + 2(7) = 75 + 14
= 89

c. Cov[U Y ]

Cov[U Y ] = E[(U − µU )(Y − µY )]


= E[{3(X − µX ) + 2(Y − µY )}(Y − µY )]
= 3Cov[XY ] + 2σY2
= 3(7) + 2(72 ) = 21 + 98
= 119


22. X is random variable µX = 4 σX = 5. Y is a random variable, µY = 6 σY = 7. The


correlation coefficient is 0.7. If U = 3X + 2Y . What are the V ar[U ], Cov[U X] and Cov[U Y ]?
[?]
Solution:
a. V ar[U ]

CovXY = ρXY σX σY
= (0.7)(5)(7)
= 24.5

σU2 = E[(U − µU )2 ]
= E[9(X − µX )2 + 12(X − µX )(Y − µY ) + 4(Y − µY )2 ]
= 9σX2 + 12Cov[XY ] + 4σY2
= 9 × (52 ) + 12(24.5) + 4(72 )
= 225 + 294 + 196
= 715

b. Cov[U X]

Cov[U X] = E[(U − µU )(X − µX )]


= E[{3(X − µX ) + 2(Y − µY )}(X − µX )]
2
= 3σX + 2Cov[XY ]
= 3(52 ) + 2(24.5) = 75 + 49
= 124

c. Cov[U Y ]

Cov[U Y ] = E[(U − µU )(Y − µY )]


= E[{3(X − µX ) + 2(Y − µY )}(Y − µY )]
= 3Cov[XY ] + 2σY2
= 3(24.5) + 2(72 ) = 73.5 + 98
= 171.5

23. X and Y are correlated random variable with a correlation coefficient of ρ = 0.6 µX = 3
V ar[X] = 49, µY = 144 V ar[Y ] = 144. The random variables U and V are obtained using
U = X + cY and V = X − cY . What values can c have if U and V are uncorrelated? [?]
Solution:

Cov[U V ] = E[(U − µU )(V − µV )]


= E[((X − µX ) + c(Y − µY ))((X − µX ) − c(Y − µY ))]
2
= σX − c2 σY2

If Cov[U V ] = 0 then
2
σX − c2 σY2 = 0
σX
c = ±
σ
rY
49
= ±
144
= ±0.5833


24. X and Y are correlated random variable with a correlation coefficient of ρ = 0.7 µX = 5
V ar[X] = 36, µY = 16 V ar[Y ] = 150. The random variables U and V are obtained using
U = X + cY and V = X − cY . What values can c have if U and V are uncorrelated? [?]
Solution:

Cov[U V ] = E[(U − µU )(V − µV )]


= E[((X − µX ) + c(Y − µY ))((X − µX ) − c(Y − µY ))]
2
= σX − c2 σY2

If Cov[U V ] = 0 then
2
σX − c2 σY2 = 0
σX
c = ±
σ
rY
36
= ±
150
= ±0.4899

25. X and Y are correlated random variable with a correlation coefficient of ρ = 0.8 µX = 20
V ar[X] = 70, µY = 15 V ar[Y ] = 100. The random variables U and V are obtained using
U = X + cY and V = X − cY . What values can c have if U and V are uncorrelated? [?]
Solution:

Cov[U V ] = E[(U − µU )(V − µV )]


= E[((X − µX ) + c(Y − µY ))((X − µX ) − c(Y − µY ))]
2
= σX − c2 σY2

If Cov[U V ] = 0 then
2
σX − c2 σY2 = 0
σX
c = ±
σ
rY
70
= ±
100
= ±0.8367

Note: Entire material is taken from different text books or from the Internet (different
websites). Slightly it is modified from the original content. It is not for any commercial
purpose. It is used to teach students. Suggestions are always encouraged.


1.3 Bivariate Transformations


• Consider a bivariate random variables X and Y with known mean, variance and their covariance are
transformed to U and V with linear transformation is as follows.

U = aX + bY
V = cX + dY

Then the means of U and V are

µU = aµX + bµY
µV = cµX + dµY

The variance of U is

σU2 = E[(U − µU )2 ]
= E[(aX + bY − aµX − bµY )2 ]
= E[(a(X − µX ) + b(Y − µY ))2 ]
= E[a2 (X − µX )2 + 2ab(X − µX )(Y − µY ) + b2 (Y − µY )2 ]
= a2 σX2 + 2abCov[XY ] + b2 σY2

Similarly the variance of V is

σV2 = E[(V − µV )2 ]
= E[(cX + dY − cµX − dµY )2 ]
= E[(c(X − µX ) + d(Y − µY ))2 ]
= E[c2 (X − µX )2 + 2cd(X − µX )(Y − µY ) + d2 (Y − µY )2 ]
= c2 σX2 + 2cdCov[XY ] + d2 σY2

Cov[U V ] = acσX2 + (bc + ad)Cov[XY ] + bdσY2

For a rotational transformation through angle θ, let

U = cosθX − sinθY
V = sinθX + cosθY

The inverse of the rotational transformations is

X = cosθU + sinθV
Y = −sinθU + cosθV

Then the means of X and Y are

µX = cosθµU + sinθµV
µY = −sinθµU + cosθµV

σX2 = cos2 θσU2 + 2sinθcosθCov[U V ] + sin2 θσV2
σY2 = sin2 θσU2 − 2sinθcosθCov[U V ] + cos2 θσV2
Cov[XY ] = sinθcosθ[σV2 − σU2 ] + (cos2 θ − sin2 θ)Cov[U V ]

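The linear-transformation formulas above are easy to package as a small helper; the sketch below (plain Python, with names chosen here only for illustration) evaluates V ar[U ], V ar[V ] and Cov[U V ] from the moments of X and Y, and reproduces the answer of Problem 26 below.

def transformed_moments(var_x, var_y, cov_xy, a, b, c, d):
    # U = aX + bY, V = cX + dY
    var_u = a**2 * var_x + 2 * a * b * cov_xy + b**2 * var_y
    var_v = c**2 * var_x + 2 * c * d * cov_xy + d**2 * var_y
    cov_uv = a * c * var_x + (b * c + a * d) * cov_xy + b * d * var_y
    return var_u, var_v, cov_uv

# Problem 26: Var[X1] = 2, Var[X2] = 4, rho = 0.8, Y1 = 3X1 + 4X2, Y2 = -X1 + 2X2
cov12 = 0.8 * (2 ** 0.5) * (4 ** 0.5)
print(transformed_moments(2, 4, cov12, 3, 4, -1, 2))
# -> about (136.31, 8.95, 30.53), matching the worked answer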

26. The zero mean bivariate random variables X1 and X2 have the following variances:
V ar[X1 ] = 2 and V ar[X2 ] = 4. Their correlation coefficient is 0.8. Random variables Y1 and Y2
are obtained from

Y1 = 3X1 + 4X2
Y2 = −X1 + 2X2

Find values of V ar[Y1 ] and V ar[Y2 ] and Cov[Y1 Y2 ] [?]


Solution:

Cov[X1 X2 ] = ρX1X2 σX1 σX2



= (0.8) 2 × 4
= 2.2627

σY21 = a2 σX
2
1
+ 2abCov[X1 X2 ] + b2 σX
2
2

= (3)2 (2) + 2(3)(4)(2.2627) + (42 )4


= 136.3058

σY22 = c2 σX
2
1
+ 2cdCov[X1 X2 ] + d2 σX
2
2

= (−1)2 (2) + 2(−1)(2)(2.2627) + (2)2 4


= 8.9492

2 2
Cov[Y1 Y2 ] = acσX 1
+ (bc + ad)Cov[X1 X2 ] + bdσX 2

= (3)(−1)(2) + [(4)(−1) + (3)(2)](2.2627) + (4)(2)(4)


= 30.5254

27. The random variable X has a mean of 3.0 and variances: of 0.7. The random variable
Y has a mean of -3.0 and variance of 0.6. The covariances for X and Y is 0.4666. Given the
transformation

U = 10X + 6Y
V = 5X + 13Y

Calculate the values of V ar[U ] and V ar[V ] and Cov[U V ] [?]


Solution:
Given Cov[XY ] = 0.4666

σU2 = a2 σX
2
+ 2abCov[XY ] + b2 σY2
= (10)2 (0.7) + 2(10)(6)(0.4666) + (62 )(0.6)
= 147.5920

σV2 = a2 σX
2
+ 2abCov[XY ] + b2 σY2
= (5)2 (0.7) + 2(5)(13)(0.4666) + (132 )(0.6)
= 179.5580

2 2
Cov[U V ] = acσX 1
+ (bc + ad)Cov[X1 X2 ] + bdσX 2

= (10)(5)(0.7) + [(6)(5) + (10)(13)](0.4666) + (6)(13)(0.6)


= 156.4560


28. The random variables U and V are related to X and Y with

U = 2X − 3Y
V = −4X + 2Y
We know that µX = 13, µY = −7, σX2 = 5, σY2 = 6 and Cov[XY ] = 0. Calculate values for V ar[U ],
V ar[V ] and Cov[U V ] [?]
Solution:

σU2 = a2 σX
2
+ 2abCov[XY + b2 σY2
= (2)2 (5) + 0 + ((−3)2 )(6)
= 74

σV2 = a2 σX
2
+ 2abCov[XY + b2 σY2
= (−4)2 (5) + 0 + (22 )(6)
= 104

2 2
Cov[U V ] = acσX 1
+ (bc + ad)Cov[X1 X2 ] + bdσX 2

= (2)(−4)(5) + 0 + (−3)(2)(6)
= −76

29. It is required to have correlated bivariate random variables U and V such that µU =
0, µV = 0, σU2 = 7, σV2 = 20 and ρU V = 0.50. Specify uncorrelated random variables X and Y
and an angle θ, that when used in the transformation U = cosθX − sinθY , V = sinθX + cosθY
will produce the desired U and V . [?]
Solution:

µX = aµU + bµV = 0 + 0 = 0
µY = cµU + dµV = 0 + 0 = 0

Cov[U V ] = ρU V σU σV
p
= 0.5 (7)(20)
= 5.9161

2Cov[U V ]
tan2θ =
σU2 − σV2
2(5.9161)
= = −0.9101
7 − 20
2θ = tan−1 (−0.9101) = −42.3055
θ = −21.1537

cosθ = cos(−21.1537) = 0.9326


sinθ = sin(−21.1537) = −0.3609


σX2 = cos2 θσU2 + 2sinθcosθCov[U V ] + sin2 θσV2
    = (0.9326)2 (7) + 2(−0.3609)(0.9326)(5.9161) + (−0.3609)2 (20)
    = 6.0882 − 3.9824 + 2.6050
    = 4.7108

σY2 = sin2 θσU2 − 2sinθcosθCov[U V ] + cos2 θσV2
    = (−0.3609)2 (7) − 2(−0.3609)(0.9326)(5.9161) + (0.9326)2 (20)
    = 0.9117 + 3.9824 + 17.3948
    = 22.2889

(As a check, σX2 + σY2 = 27.0 = σU2 + σV2 .)

30. It is required to have correlated bivariate random variables U and V such that µU =
0, µV = 0, σU2 = 25, σV2 = 4 and ρU V = −0.50. Specify uncorrelated random variables X and Y
and an angle θ, that when used in the transformation U = cosθX − sinθY , V = sinθX + cosθY
will produce the desired U and V . [?]
Solution:

µX = aµU + bµV = 0 + 0 = 0
µY = cµU + dµV = 0 + 0 = 0

Cov[U V ] = ρU V σU σV
p
= −0.5 (25)(4)
= −5

2Cov[U V ]
tan2θ =
σU2 − σV2
2(−5)
= = −0.4762
25 − 4
2θ = tan−1 (−0.4762) = −25.4637
θ = −12.7319

cosθ = cos(−12.7319) = 0.9754


sinθ = sin(−12.7319) = −0.2204

2
σX = cos2 θσU2 + 2sinθcosθCov[U V ] + sin2 θσV2
= cos2 (−25.4637)(25) + 2(sin(−25.4637))cos(−25.4637))(−5) + (sin2 (−25.4637))(4)
= 0.9514(25) + 2(−0.2204)(0.9754)(−5) + (0.1302)(4)
= 23.7851 + 2.1497 + 0.1943
= 26.1292


σY2 = sin2 θσU2 − 2sinθcosθCov[U V ] + cos2 θσV2
    = (−0.2204)2 (25) − 2(−0.2204)(0.9754)(−5) + (0.9754)2 (4)
    = 1.2144 − 2.1497 + 3.8056
    = 2.8703

(As a check, σX2 + σY2 = 29.0 = σU2 + σV2 .)

31. It is required to have correlated bivariate random variables U and V such that µU =
0, µV = 0, σU2 = 7, σV2 = 1 and ρU V = 0.30. Specify uncorrelated random variables X and Y
and an angle θ, that when used in the transformation U = cosθX − sinθY , V = sinθX + cosθY
will produce the desired U and V . [?]
Solution:

µX = aµU + bµV = 0 + 0 = 0
µY = cµU + dµV = 0 + 0 = 0

Cov[U V ] = ρU V σU σV
p
= 0.3 (7)(1)
= 0.7937

2Cov[U V ]
tan2θ =
σU2 − σV2
2(0.7937)
= = 0.2645
7−1
2θ = tan−1 (0.2645) = 14.8154
θ = 7.4077

cosθ = cos(7.4077) = 0.9916


sinθ = sin(7.4077) = 0.1290

2
σX = cos2 θσU2 + 2sinθcosθCov[U V ] + sin2 θσV2
= (0.9916)2 (7) + 2(0.1290)(0.9916)(0.7937) + (0.1290)2 (1)
= 6.8828 + 0.2030 + 0.01644
= 7.1024

σY2 = sin2 θσU2 − 2sinθcosθCov[U V ] + cos2 θσV2


= (0.1290)2 (7) − 2(0.1290)(0.9916)(0.7937) + (0.9916)2 )(1)
= 0.1164 − 0.2030 + 0.9832
= 0.8966

Note: Entire material is taken from different text books or from the Internet (different
websites). Slightly it is modified from the original content. It is not for any commercial
purpose. It is used to teach students. Suggestions are always encouraged.


1.4 Sums of Two Independent Random Variables:


• Consider two independent random variables X and Y and a third random variable W related to them as

W = X +Y

Then the mean of W is

E[W ] = E[X + Y ]
µW = µ X + µY

The variance of W is
2
σW = E[(W − µW )2 ]
= E[(X + Y − µX − µY )2 ]
= E[((X − µX ) + (Y − µY ))2 ]
= E[(X − µX )2 + 2(X − µX )(Y − µY ) + (Y − µY )2 ]
= σX2 + 2Cov[XY ] + σY2
= σX2 + σY2

since X and Y are independent they are uncorrelated with each other, hence 2Cov[XY ] = 0.
If pdf of X and Y are known then the cdf of the random variable W is

FW (w) = P {X + Y ≤ w}

The cdf for the random variable W is

P {X + Y ≤ w} = P {(x, y) ∈ R} = ∫∫_R fXY (x, y) dx dy,   where R is the region x + y ≤ w
Z ∞ Z w−x 
= fXY (x, y)dy dx
−∞ −∞
Z ∞ Z w−x 
FW (w) = fXY (x, y)dy dx
−∞ −∞

The pdf for the random variable W is


Z ∞
fW (w) = fXY (x, w − x)dx
−∞

Assuming that X and Y are independent then


Z ∞
fW (w) = fX (x)fY (w − x)dx
Z−∞

= fY (y)fX (w − y)dy
−∞

The above equation is convolution hence it can be written as

fW (w) = fX (x) ∗ fY (y)

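The convolution relation can also be evaluated numerically by sampling both densities on a common grid; the sketch below (assuming NumPy; the particular pair of densities is chosen here only as an example) approximates fW for the sum of a Uniform(−1, 1) variable and an exponential variable with rate 3.

import numpy as np

dx = 0.001
x = np.arange(-10.0, 10.0, dx)
f_x = np.where((x > -1) & (x < 1), 0.5, 0.0)      # uniform on (-1, 1)
f_y = np.where(x > 0, 3 * np.exp(-3 * x), 0.0)    # exponential, rate 3

f_w = np.convolve(f_x, f_y) * dx                  # discrete convolution
w = 2 * x[0] + np.arange(len(f_w)) * dx           # support of the sum
print(f_w.sum() * dx)                             # ~1.0, so f_W is a valid pdf
print(w[np.argmax(f_w)])                          # peak near w = 1, as expected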

35. The random variable X is uniformly distributed between ±1. Two independent
realizations of X are added: Y = X1 + X2 . What is the pdf for Y ? [?]
Solution:

1 1 1
fX1 (x) = = =
b−a 1 − (−1) 2
1 1 1
fX2 (y) = = =
b−a 1 − (−1) 2

X 1 ( x) X 2 ( x)

-1 +1
x -1 +1
x

Case 1: −1 < (y + 1) < 1 ⇒ −2 < y < 0


X 1 ( x) X 2 ( y  x) Z ∞
fY (y) = fX (x)fX (y − x)dx
−∞
Z y+1
1 1
x x = × dx
-1 +1 y 1 y 1 −1 2 2
X 1 ( x) X 2 ( y  x)
1 y+1 1
= [x] = [y + 1 − (−1)]
4 −1 4
y+2
= −2<y <0
4
y 1 -1 y  1 +1
x

Case 2: −1 < (y − 1) < 1 ⇒ 0 < y < 2


Z ∞
X 1 ( x) X 2 ( y  x)
fY (y) = fX (x)fX (y − x)dx
−∞
Z 1
1 1
= × dx
x x y−1 2 2
-1 +1 y 1 y 1
1 1 1
= [x] = [1 − (y − 1)]
X 1 ( x) X 2 ( y  x) 4 y−1 4
2−y
= 0<y<2
4

-1 y  1 0 1 y 1
x

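A Monte Carlo sketch of Problem 35 (assuming NumPy): adding two independent Uniform(−1, 1) draws and histogramming the result should reproduce the triangular density (y + 2)/4 on (−2, 0) and (2 − y)/4 on (0, 2).

import numpy as np

rng = np.random.default_rng(2)
y = rng.uniform(-1, 1, 500_000) + rng.uniform(-1, 1, 500_000)

hist, edges = np.histogram(y, bins=40, range=(-2, 2), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
triangle = np.where(centers < 0, (centers + 2) / 4, (2 - centers) / 4)
print(np.max(np.abs(hist - triangle)))   # small, so the histogram matches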

36. X is a random variable uniformly distributed between 0 and 3. Y is a random variable


independent of X, uniformly distributed between +2 and -2. W = X + Y . What is the pdf
for W [?]
Solution:

1 1 1
fX (x) = = =
b−a 3−0 3
1 1 1
fY (y) = = =
b−a 2 − (−2) 4

X ( y) Y ( y)

y y
0 3 -2 +2

X ( y ) X (w  y)

y y
-3 0 w-3 w

Case 1: Width of the window 3-0=3, Lower


range=-2 upper range=-2+3=1 ⇒ −2 < w < Z ∞
1 fW (w) = fY (y)fX (w − y)dy
Y ( y) X (w  y) Z−∞
w
1 1
= × dy
−2 4 3
1 w 1
= [y]−2 = (w + 2)
12 12
y
w-3 -2 w 2 (w + 2)
= −2<w <1
12

Case 2: 1 < w < 2


Z ∞
fW (w) = fY (y)fX (w − y)dy
Y ( y) X (w  y) Z−∞
w
1 1
= × dy
w−3 4 3
1 w 1
= [y] = (w − (w − 3))
y 12 w−3 12
-2 w-3 w 2 1
= 1<w<2
4

Z ∞
Case 3: Width of the window 3-0=3, Lower fW (w) = fY (y)fX (w − y)dy
range=2 upper range=2+3=5 ⇒ 2 < w < 5 −∞
Z 2
Y ( y) X (w  y) 1 1
= × dy
w−3 4 3
1 −2 1
= [y] = (2 − (w − 3))
12 w−3 12
y 5−w
-2 w-3 2 w = 2<w<5
12


37. X is a random variable uniformly distributed between 0 and 3. Z is a random variable


independent of X, uniformly distributed between +1 and -1. U = X + Z . What is the pdf
for U [?] Solution:

1 1 1
fX (x) = = =
b−a 3−0 3
1 1 1
fZ (z) = = =
b−a 1 − (−1) 2

X ( z) Z (z)

z z
0 3 -1 +1

X ( z ) X (u  z )

z z
-3 0 u-3 u

Case 1: −1 < u < 1


Z ∞
fU (u) = fZ (z)fX (u − z)dz
Z ( z ) X (u  z )
Z−∞
u
1 1
= × dz
−1 2 3
1 u 1
= [z]−1 = (u + 1)
6 6
z (u + 1)
u-3 -1 u 1 = −1<u<1
6

Case 2:Width of the window=3-0=3, lower


range=1, upper range=-1+3=2 1 < u < 2 Z ∞
fU (u) = fZ (z)fX (u − z)dz
−∞
Z ( z ) X (u  z ) Z 1
1 1
= × dz
−1 2 3
1 1 1
= [z]−1 = (1 − (−1))
6 6
z
u-3 -1 1 u 1
= 1<u<2
3
Z ∞
Case 3:Width of the window=3-0=3, lower fU (u) = fZ (z)fX (u − z)dz
range=2, upper range=1+3=4 2 < u < 4 −∞
Z 1
Z ( z ) X (u  z ) 1 1
= × dz
u−3 2 3
1 1 1
= [z]u−3 = (1 − (u − 3))
6 6
z 4−u
-1 u-3 1 u = 2<u<4
6


38. Probability density function for two independent random variables X and Y are

fX (x) = ae−ax u(x)


fY (y) = (a3 /2)y 2 e−ay u(y)

where a=3. If W = X + Y what is fW (w) [?]


Solution:

Z ∞
fW (w) = fY (y)fX (w − y)dy
Z−∞w
= (a3 /2)y 2 e−ay ae−a(w−y) dy
0
a4 e−aw w 2 −ay ay
Z
= y e e dy
2 0
a4 e−aw w 2
Z
= y dy
2 0
 w
a4 e−aw y 3
=
2 3 0
a4 e−aw w3
=
2 3
3 4
= w3 e−3w
6
= 13.5w3 e−3w

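A quick check of Problem 38 (a sketch assuming SciPy): the result fW (w) = 13.5 w^3 e^(-3w) should integrate to 1 over (0, ∞) and should agree with a direct numerical evaluation of the convolution integral at a test point.

import numpy as np
from scipy.integrate import quad

f_w = lambda w: 13.5 * w**3 * np.exp(-3 * w)
print(quad(f_w, 0, np.inf)[0])                # ~1.0, so f_W is a valid pdf

w0 = 1.5
f_x = lambda x: 3 * np.exp(-3 * x)            # a e^{-a x} u(x), a = 3
f_y = lambda y: 13.5 * y**2 * np.exp(-3 * y)  # (a^3/2) y^2 e^{-a y} u(y)
conv = quad(lambda y: f_y(y) * f_x(w0 - y), 0, w0)[0]
print(conv, f_w(w0))                          # both ~0.506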
39. The pdf for an erlang random variable X of order two is

fX (x) = λ2 xe−λx x > 0

and is 0 otherwise. The random variable Y = X1 + X2 where X1 and X2 are independent


trials of X Find the pdf for Y [?]
Solution:

Z ∞
fW (w) = fY (y)fX (w − y)dy
Z−∞
y
= λ2 xe−λx λ2 (y − x)e−λ(y−x) dx
0
Z y
= λ 4
xe−λx (y − x)e−λ(y−x) dx
0
Z y
= λ 4
e−λx−λy+λx [xy − x2 ]dx
0
Z y
4 −λy
= λ e [xy − x2 ]dx
0 2 y
4 −λy x x3
= λ e y−
2 3 0
 2 3

4 −λy y y
= λ e −
2 3
 3
3y − 2y 3

= λ4 e−λy
6
4 3
λ y −λy
= e
6


40. Probability density function for two independent random variables Z and V are
fZ (z) = ae−az u(z)
fV (v) = a2 ve−av u(v)
where a = 1/3. If Y = Z + V what is fY (y)? [?]
Solution:

Z ∞
fW (w) = fY (y)fX (w − y)dy
Z−∞
y
= a2 ve−av ae−a(y−v) dv
0
Z y
3 −ay
= a e vdv
0
 2 y
3 −ay v
= a e
2 0
a3 2 −ay
= y e = 0.0185y 2 e−ay
2

41. Let the random variable U be uniformly distributed between ±5. Also let the pdf for
the random variable V be
fV (v) = 3e−3v u(v)
U and V are independent and W = U + V . What is the pdf for W [?]
Solution:
The random variable U is uniformly distributed between ±5, i.e., from −5 to +5; its pdf is

U ( x)

x
-5 +5

1 1 1
fU (u) = = =
b−a 5 − (−5) 10
Z ∞
fW (w) = fU (u)fV (w − u)du
−∞
fW (w) = 0 w < −5
Z w
1 −3(w−u)
= 3e du
−5 10
" #w
1 e−3(w−u)
= 3
10 3
−5
= (1/10)(1 − e−3(w+5) )    − 5 < w < 5
Z 5
1
= 3e−3(w−u) du − 5 < w < 5
10 −5
1 −3(w−5)
= [e − e−3(w+5) ] w > 5
10


42. It is given that fX (x) is uniformly distributed between ±3. Also

fY (y) = 7e−7y u(y)

W = X + Y where X and Y are independent. Find the pdf for W [?]


Solution:
The random variable X is uniformly distributed between ±3, i.e., from −3 to +3; its pdf is

X ( x)

x
-3 +3

1 1 1
fU (u) = = =
b−a 3 − (−3) 6

Z ∞
fW (w) = fX (x)fY (w − x)dx
−∞
fW (w) = 0 w < −3
Z w
1 −7(w−x)
= 7e dx
−3 6
" #w
1 e−7(w−x)
= 7
6 7
−3
= (1/6)(1 − e−7(w+3) )    − 3 < w < 3
1 3 −7(w−x)
Z
= 7e du − 3 < w < 3
6 −3
1 −7(w−3)
= [e − e−7(w+3) ] w > 3
6

43. The random variable X be uniformly distributed between ±0.5. The random variable Z
has the pdf

fZ (z) = e−z u(z)

Y = X + Z where X and Z are independent. Find the pdf for Y [?]


Solution:
The random variable X is uniformly distributed between ±0.5, i.e., from −0.5 to +0.5; its pdf is

X ( x)

x
-0.5 +0.5

1 1
fU (u) = = =1
b−a 0.5 − (−0.5)


Z ∞
fY (y) = fX (x)fZ (y − x)dx
−∞
fY (y) = 0 w < −0.5
Z y
= 1e−(y−x) dx
−0.5
" #w
e−(y−x)
=
1
−0.5
= 1 − e−(y+0.5)    − 0.5 < y < 0.5
Z 0.5
= e−(y−x) dx − 3 < w < 3
−0.5
−(y−0.5)
= e − e−(y+0.5) ] 0.5 < y

44. The random variable X has the pdf c(7 − x) for all x between 0 and 7 and is 0 otherwise.
The random variable Y is independent of X and is uniformly distributed between 0 and 7.
W = X + Y . Find the necessary value of c and then find fW (w) [?]
Solution:

1 1 1
fY (y) = = =
b−a 7−0 7

1 = (1/2)(7)(7c)

c = 2/49

Z ∞
fW (w) = fX (x)fY (w − x)dx
Z−∞
w
2 1
= (7 − x)dx
0 49 7
Z w
2
= (7 − x)dx
343 0
w
x2

2
= 7x −
343 2 0
w2
 
2
= 7w −
343 2
1
= (14w − w2 ) 0 < w < 7
343

For 7 < w < 14:
f_W(w) = \int_{w-7}^{7} \frac{2}{49}(7-x)\,\frac{1}{7}\,dx
       = \frac{2}{343}\int_{w-7}^{7} (7-x)\,dx
       = \frac{2}{343}\Big[7x - \frac{x^2}{2}\Big]_{w-7}^{7}
       = \frac{2}{343}\Big[\Big(7(7) - \frac{7^2}{2}\Big) - \Big(7(w-7) - \frac{(w-7)^2}{2}\Big)\Big]
       = \frac{w^2 - 28w + 196}{343}, \quad 7 < w < 14

f_W(w) = 0 otherwise.
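A hedged numerical sketch (mine, with an arbitrary grid step): discretising the two pdfs and convolving them numerically should reproduce the two-piece closed-form answer.

import numpy as np

dx = 0.001
x = np.arange(0.0, 7.0, dx)
fx = (2.0 / 49.0) * (7.0 - x)      # pdf of X on (0, 7)
fy = np.full_like(x, 1.0 / 7.0)    # uniform pdf of Y on (0, 7)

fw_num = np.convolve(fx, fy) * dx  # numerical convolution on the grid
w = np.arange(len(fw_num)) * dx

# Closed-form two-piece result derived above.
fw_exact = np.where(w < 7.0, (14.0 * w - w**2) / 343.0, (w**2 - 28.0 * w + 196.0) / 343.0)
print(np.max(np.abs(fw_num - fw_exact)))   # small discretisation error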

45. The random variable X has the pdf c(5 - x) for all x between 0 and 5 and is 0 otherwise. The random variable Y is independent of X and is uniformly distributed between 0 and 5. U = X + Y. Find the necessary value of c and then find f_U(u). [?]
Solution:

f_Y(y) = \frac{1}{b-a} = \frac{1}{5-0} = \frac{1}{5}

1 = \frac{1}{2}(5)(5c) \quad\Rightarrow\quad c = \frac{2}{25}

For 0 < u < 5:
f_U(u) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(u-x)\,dx
       = \int_0^u \frac{2}{25}(5-x)\,\frac{1}{5}\,dx
       = \frac{2}{125}\int_0^u (5-x)\,dx
       = \frac{2}{125}\Big[5x - \frac{x^2}{2}\Big]_0^u
       = \frac{2}{125}\Big(5u - \frac{u^2}{2}\Big)
       = \frac{1}{125}(10u - u^2), \quad 0 < u < 5

For 5 < u < 10:
f_U(u) = \int_{u-5}^{5} \frac{2}{25}(5-x)\,\frac{1}{5}\,dx
       = \frac{2}{125}\int_{u-5}^{5} (5-x)\,dx
       = \frac{2}{125}\Big[5x - \frac{x^2}{2}\Big]_{u-5}^{5}
       = \frac{2}{125}\Big[\Big(5(5) - \frac{5^2}{2}\Big) - \Big(5(u-5) - \frac{(u-5)^2}{2}\Big)\Big]
       = \frac{u^2 - 20u + 100}{125}, \quad 5 < u < 10

f_U(u) = 0 otherwise.

46. The random variable X has the pdf c(3 - x) for all x between 0 and 3 and is 0 otherwise. The random variable Y is independent of X and is uniformly distributed between 0 and 3. V = X + Y. Find the necessary value of c and then find f_V(v). [?]
Solution:

f_Y(y) = \frac{1}{b-a} = \frac{1}{3-0} = \frac{1}{3}

1 = \frac{1}{2}(3)(3c) \quad\Rightarrow\quad c = \frac{2}{9}

For 0 < v < 3:
f_V(v) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(v-x)\,dx
       = \int_0^v \frac{2}{9}(3-x)\,\frac{1}{3}\,dx
       = \frac{2}{27}\int_0^v (3-x)\,dx
       = \frac{2}{27}\Big[3x - \frac{x^2}{2}\Big]_0^v
       = \frac{2}{27}\Big(3v - \frac{v^2}{2}\Big)
       = \frac{1}{27}(6v - v^2), \quad 0 < v < 3

For 3 < v < 6:
f_V(v) = \int_{v-3}^{3} \frac{2}{9}(3-x)\,\frac{1}{3}\,dx
       = \frac{2}{27}\int_{v-3}^{3} (3-x)\,dx
       = \frac{2}{27}\Big[3x - \frac{x^2}{2}\Big]_{v-3}^{3}
       = \frac{2}{27}\Big[\Big(3(3) - \frac{3^2}{2}\Big) - \Big(3(v-3) - \frac{(v-3)^2}{2}\Big)\Big]
       = \frac{v^2 - 12v + 36}{27}, \quad 3 < v < 6

f_V(v) = 0 otherwise.

47. A discrete random variable Y has the pdf

f_Y(y) = 0.5\,\delta(y) + 0.5\,\delta(y-3)

U = Y_1 + Y_2, where the Y's are independent. What is the pdf for U? [?]
Solution:

f_U(u) = \int_{-\infty}^{\infty} f_Y(y)\, f_Y(u-y)\,dy
       = \int_{-\infty}^{\infty} [0.5\,\delta(y) + 0.5\,\delta(y-3)][0.5\,\delta(u-y) + 0.5\,\delta(u-y-3)]\,dy
       = \int_{-\infty}^{\infty} [0.25\,\delta(y)\delta(u-y) + 0.25\,\delta(y-3)\delta(u-y) + 0.25\,\delta(y)\delta(u-y-3) + 0.25\,\delta(y-3)\delta(u-y-3)]\,dy
       = 0.25\,\delta(u) + 0.5\,\delta(u-3) + 0.25\,\delta(u-6)
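The same answer follows from a discrete convolution of the two probability mass functions; this small sketch (my own illustration) encodes the impulse weights in a dictionary.

from collections import defaultdict

pmf_y = {0: 0.5, 3: 0.5}           # weights of the impulses in f_Y

pmf_u = defaultdict(float)
for y1, p1 in pmf_y.items():
    for y2, p2 in pmf_y.items():
        pmf_u[y1 + y2] += p1 * p2  # discrete convolution

print(dict(pmf_u))                  # {0: 0.25, 3: 0.5, 6: 0.25}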

48. A discrete random variable Z has the pdf

f_Z(z) = 0.3\,\delta(z-1) + 0.7\,\delta(z-2)

V = Z_1 + Z_2, where the Z's are independent. What is the pdf for V? [?]
Solution:

f_V(v) = \int_{-\infty}^{\infty} f_Z(z)\, f_Z(v-z)\,dz
       = \int_{-\infty}^{\infty} [0.3\,\delta(z-1) + 0.7\,\delta(z-2)][0.3\,\delta(v-z-1) + 0.7\,\delta(v-z-2)]\,dz
       = \int_{-\infty}^{\infty} [0.09\,\delta(z-1)\delta(v-z-1) + 0.21\,\delta(z-2)\delta(v-z-1) + 0.21\,\delta(z-1)\delta(v-z-2) + 0.49\,\delta(z-2)\delta(v-z-2)]\,dz
       = 0.09\,\delta(v-2) + 0.42\,\delta(v-3) + 0.49\,\delta(v-4)

49. A discrete random variable X has the pdf

f_X(x) = 0.6\,\delta(x-2) + 0.4\,\delta(x-1)

W = X_1 + X_2, where the X's are independent. What is the pdf for W? [?]
Solution:

f_W(w) = \int_{-\infty}^{\infty} f_X(x)\, f_X(w-x)\,dx
       = \int_{-\infty}^{\infty} [0.6\,\delta(x-2) + 0.4\,\delta(x-1)][0.6\,\delta(w-x-2) + 0.4\,\delta(w-x-1)]\,dx
       = \int_{-\infty}^{\infty} [0.16\,\delta(x-1)\delta(w-x-1) + 0.24\,\delta(x-2)\delta(w-x-1) + 0.24\,\delta(x-1)\delta(w-x-2) + 0.36\,\delta(x-2)\delta(w-x-2)]\,dx
       = 0.16\,\delta(w-2) + 0.48\,\delta(w-3) + 0.36\,\delta(w-4)

35. Let X and Y be independent uniform random variables over (0, 1). Find and sketch the pdf of Z = X + Y. [?]
Solution:

f_X(x) = \frac{1}{b-a} = \frac{1}{1-0} = 1, \quad 0 < x < 1
f_Y(y) = \frac{1}{b-a} = \frac{1}{1-0} = 1, \quad 0 < y < 1

(Sketch: f_X and f_Y are unit-height rectangles on (0, 1); the reversed and shifted factor f_X(z - y) is a unit-height rectangle on (z - 1, z).)

f_Z(z) = \int_{-\infty}^{\infty} f_Y(y)\, f_X(z-y)\,dy

Case 1: 0 < z < 1 (the rectangles overlap on (0, z))

f_Z(z) = \int_0^z (1)(1)\,dy = [y]_0^z = z, \quad 0 < z < 1

Case 2: 0 < z - 1 < 1, i.e. 1 < z < 2 (the rectangles overlap on (z - 1, 1))

f_Z(z) = \int_{z-1}^{1} (1)(1)\,dy = [y]_{z-1}^{1} = 1 - (z-1) = 2 - z, \quad 1 < z < 2

The pdf of Z = X + Y is therefore a triangle rising from 0 at z = 0 to 1 at z = 1 and falling back to 0 at z = 2.

Figure 1.5: sketch of pdf Z = X + Y

Note: The entire material is taken from different textbooks or from the Internet (different websites) and is slightly modified from the original content. It is not for any commercial purpose; it is used to teach students. Suggestions are always encouraged.

1.5 Sums of IID Random Variables


Consider the sum of n random variables that are mutually independent and share the same pdf f_X(x). Such random variables are called independent and identically distributed (IID), and their sum is

W = \sum_{i=1}^{n} X_i

Then the mean of each X_i is

E[X_i] = E[X] = \mu_X

The variance of each X_i is

Var[X_i] = Var[X] = \sigma_X^2

E[X_i^2] = E[X^2] = \mu_X^2 + \sigma_X^2

E[X_i X_j] =
\begin{cases}
E[X_i^2] = \mu_X^2 + \sigma_X^2 & j = i \\
E[X_i]E[X_j] = \mu_X^2 & j \neq i
\end{cases}

When n = 2:

W_2 = X_1 + X_2
E[W_2] = 2\mu_X

The variance of W_2 is

Var[W_2] = 2\sigma_X^2

When n = 3:

W_3 = X_1 + X_2 + X_3 = W_2 + X_3
E[W_3] = 3\mu_X

The variance of W_3 is

Var[W_3] = 3\sigma_X^2

Continuing in the same way,

W = W_{n-1} + X_n
\mu_W = n\mu_X

and the variance of W is

\sigma_W^2 = n\sigma_X^2
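A brief simulation sketch (illustrative only; the exponential summands, n = 10, the seed and the trial count are my own choices) confirming that the mean and variance of the sum scale as n\mu_X and n\sigma_X^2:

import numpy as np

rng = np.random.default_rng(2)
n, trials = 10, 200_000
x = rng.exponential(scale=2.0, size=(trials, n))   # mu_X = 2, sigma_X^2 = 4
w = x.sum(axis=1)

print(w.mean(), n * 2.0)   # both ~20
print(w.var(), n * 4.0)    # both ~40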

53. The random variable U has a mean of 0.3 and a variance of 1.5.
a) Find the mean and variance of Y if

Y = \frac{1}{53}\sum_{i=1}^{53} U_i

b) Find the mean and variance of Z if

Z = \sum_{i=1}^{53} U_i

In these two sums, the U_i's are IID. [?]
Solution:
a) The mean and variance of Y:

\mu_U = 0.3, \quad \sigma_U^2 = 1.5
\mu_Y = \mu_U = 0.3
\sigma_Y^2 = \frac{\sigma_U^2}{n} = \frac{1.5}{53} = 0.0283

b) The mean and variance of Z:

\mu_Z = n\mu_U = 53(0.3) = 15.9
\sigma_Z^2 = n\sigma_U^2 = 53(1.5) = 79.5

54. The random variable X is uniformly distributed between ±1.
a) Find the mean and variance of Y if

Y = \frac{1}{37}\sum_{i=1}^{37} X_i

b) Find the mean and variance of Z if

Z = \sum_{i=1}^{37} X_i

In these two sums, the X_i's are IID. [?]
Solution:
a) The mean and variance of Y:

\mu_X = 0, \quad \sigma_X^2 = \frac{2^2}{12} = 0.3333
\mu_Y = \mu_X = 0
\sigma_Y^2 = \frac{\sigma_X^2}{n} = \frac{0.3333}{37} = 0.009

b) The mean and variance of Z:

\mu_Z = n\mu_X = 0
\sigma_Z^2 = n\sigma_X^2 = 37(0.3333) = 12.3333

55. The random variable V has a mean of 1 and a variance of 4.

a) Find the mean and variance of Y if

Y = \frac{1}{87}\sum_{i=1}^{87} V_i

b) Find the mean and variance of Z if

Z = \sum_{i=1}^{87} V_i

In these two sums, the V_i's are IID. [?]
Solution:
a) The mean and variance of Y:

\mu_V = 1, \quad \sigma_V^2 = 4
\mu_Y = \mu_V = 1
\sigma_Y^2 = \frac{\sigma_V^2}{n} = \frac{4}{87} = 0.0460

b) The mean and variance of Z:

\mu_Z = n\mu_V = 87(1) = 87
\sigma_Z^2 = n\sigma_V^2 = 87(4) = 348

56. The random variable X has a mean of 12.6 and a variance of 2.1. The random variable Y is related to X by Y = 10(X - \mu_X). The random variable Z is as shown here:

Z = \sum_{i=1}^{100} Y_i

where the Y_i's are IID. What are \mu_Z and \sigma_Z^2? [?]
Solution:

\mu_X = 12.6, \quad \sigma_X^2 = 2.1
\mu_Y = 10(\mu_X - \mu_X) = 0
\sigma_Y^2 = 10^2\,\sigma_X^2 = 210
\mu_Z = 100\,\mu_Y = 0
\sigma_Z^2 = 100\,\sigma_Y^2 = 21000

57. The random variable X = 3 + V, where V is a Gaussian random variable with a mean of 0 and a variance of 30. Seventy-two independent realizations of X are averaged:

Y = \frac{1}{72}\sum_{i=1}^{72} X_i

What are the mean and variance of Y? [?]

Solution:

\mu_V = 0, \quad \sigma_V^2 = 30
\mu_X = 3 + \mu_V = 3
\sigma_X^2 = 1^2\,\sigma_V^2 = 30
\mu_Y = \mu_X = 3
\sigma_Y^2 = \frac{\sigma_X^2}{72} = \frac{30}{72} = 0.4167

58. X is a random variable with a variance of 1.8 and a mean of 14. Y = X - \mu_X. Z is as shown here:

Z = \frac{1}{100}\sum_{i=1}^{100} Y_i

where the Y_i's are IID. What are the mean and variance of Z? [?]
Solution:

\mu_X = 14, \quad \sigma_X^2 = 1.8
\mu_Y = \mu_X - \mu_X = 0
\sigma_Y^2 = 1^2\,\sigma_X^2 = 1.8
\mu_Z = \mu_Y = 0
\sigma_Z^2 = \frac{\sigma_Y^2}{100} = 0.0180

59. The random variable Z is uniformly distributed between 0 and 1. The random variable Y is obtained from Z as follows:

Y = 3Z + 5.5

One hundred independent realizations of Y are averaged:

U = \frac{1}{100}\sum_{i=1}^{100} Y_i

a) Estimate the probability P(U \le 7.1).
b) If 1000 independent calculations of U are performed, approximately how many of these calculated values for U would be less than 7.1? [?]
Solution:

\mu_Z = \frac{0+1}{2} = 0.5
\sigma_Z^2 = \frac{(b-a)^2}{12} = \frac{(1-0)^2}{12} = \frac{1}{12}
\mu_Y = 3\mu_Z + 5.5 = 3(0.5) + 5.5 = 7
\sigma_Y^2 = 3^2\,\sigma_Z^2 = \frac{9}{12}
\mu_U = \mu_Y = 7
\sigma_U = \sqrt{\frac{9}{1200}} = 0.0866

a) The probability P(U \le 7.1), treating U as approximately Gaussian:

P(U \le 7.1) = F_U(7.1) = \phi\Big(\frac{x-\mu}{\sigma}\Big)
             = \phi\Big(\frac{7.1-7}{0.0866}\Big)
             = \phi(1.1547) = 0.8759 \quad \text{(from the Z table)}

b)

P(U \le 7.1) \times 1000 \approx 876
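The same estimate can be reproduced in code; the sketch below (mine, not from the text) replaces the Z-table with the identity \Phi(z) = \tfrac{1}{2}(1 + \mathrm{erf}(z/\sqrt{2})) and adds a Monte Carlo check with an arbitrary seed and sample size.

import math
import numpy as np

mu_U, sigma_U = 7.0, math.sqrt(9.0 / 1200.0)
z = (7.1 - mu_U) / sigma_U
p = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
print(p, round(1000 * p))          # ~0.876 and ~876

# Monte Carlo check: average 100 realizations of Y = 3Z + 5.5 with Z ~ Uniform(0, 1).
rng = np.random.default_rng(3)
u = (3.0 * rng.uniform(0.0, 1.0, size=(100_000, 100)) + 5.5).mean(axis=1)
print((u <= 7.1).mean())           # close to the Gaussian estimate above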

60. The random variable Z is uniformly distributed between 0 and 1. The random variable Y is obtained from Z as follows:

Y = 3.5Z + 5.25

One hundred independent realizations of Y are averaged:

V = \frac{1}{100}\sum_{i=1}^{100} Y_i

a) Estimate the probability P(V \le 7.1).
b) If 1000 independent calculations of V are performed, approximately how many of these calculated values for V would be less than 7.1? [?]
Solution:

\mu_Z = \frac{0+1}{2} = 0.5
\sigma_Z^2 = \frac{(b-a)^2}{12} = \frac{(1-0)^2}{12} = \frac{1}{12}
\mu_Y = 3.5\mu_Z + 5.25 = 3.5(0.5) + 5.25 = 7
\sigma_Y^2 = (3.5)^2\,\sigma_Z^2 = \frac{(3.5)^2}{12}
\mu_V = \mu_Y = 7
\sigma_V = 3.5\sqrt{\frac{1}{1200}} = 0.1010

a) The probability P(V \le 7.1):

P(V \le 7.1) = F_V(7.1) = \phi\Big(\frac{7.1-7}{0.1010}\Big) = \phi(0.9900) = 0.8389 \quad \text{(from the Z table)}

b)

P(V \le 7.1) \times 1000 \approx 839

61. The random variable Z is uniformly distributed between 0 and 1. The random variable Y is obtained from Z as follows:

Y = 2.5Z + 5.75

One hundred independent realizations of Y are averaged:

W = \frac{1}{100}\sum_{i=1}^{100} Y_i

a) Estimate the probability P(W \le 7.1).
b) If 1000 independent calculations of W are performed, approximately how many of these calculated values for W would be less than 7.1? [?]
Solution:

\mu_Z = \frac{0+1}{2} = 0.5
\sigma_Z^2 = \frac{(b-a)^2}{12} = \frac{(1-0)^2}{12} = \frac{1}{12}
\mu_Y = 2.5\mu_Z + 5.75 = 2.5(0.5) + 5.75 = 7
\sigma_Y^2 = (2.5)^2\,\sigma_Z^2 = \frac{(2.5)^2}{12}
\mu_W = \mu_Y = 7
\sigma_W = 2.5\sqrt{\frac{1}{1200}} = 0.0722

a) The probability P(W \le 7.1):

P(W \le 7.1) = F_W(7.1) = \phi\Big(\frac{7.1-7}{0.0722}\Big) = \phi(1.3850) = 0.9170 \quad \text{(from the Z table)}

b)

P(W \le 7.1) \times 1000 \approx 917

1.6 Conditional Joint Probabilities


The cdf of a bivariate random variable conditioned on an event B is defined as

F_{XY}(x, y|B) = \frac{P\{(X \le x) \cap (Y \le y) \cap B\}}{P(B)}

The joint pdf conditioned by an event B is defined as

f_{XY}(x, y|B) = \frac{\partial^2}{\partial x\,\partial y} F_{XY}(x, y|B)

The event B is a set of bivariate observations (x, y) in the (x, y) plane, so

P(B) = \iint_B f_{XY}(x, y)\,dx\,dy

Hence the conditional joint pdf can be written as

f_{XY}(x, y|B) =
\begin{cases}
\dfrac{f_{XY}(x, y)}{P(B)} & (x, y) \in B \\
0 & \text{otherwise}
\end{cases}

The conditional (marginal) pdf of Y is

f_Y(y|B) = \int_{-\infty}^{\infty} f_{XY}(x, y|B)\,dx

For the conditioning event B = \{x < X \le x + dx\},

f_Y(y|B) = \int_x^{x+dx} \frac{f_{XY}(u, y)}{P(B)}\,du \approx \frac{f_{XY}(x, y)\,dx}{f_X(x)\,dx}

so that, in the limit dx \to 0,

f_Y(y|X = x) = f_Y(y|x) = \frac{f_{XY}(x, y)}{f_X(x)}

Similarly,

f_X(x|y) = \frac{f_{XY}(x, y)}{f_Y(y)}

The conditional cdfs are

F_Y(y|x) = \int_{-\infty}^{y} f_Y(u|x)\,du
F_X(x|y) = \int_{-\infty}^{x} f_X(u|y)\,du

The conditional expectations are

E[g_1(Y)|x] = \int_{-\infty}^{\infty} g_1(y) f_Y(y|x)\,dy
E[g_2(X)|y] = \int_{-\infty}^{\infty} g_2(x) f_X(x|y)\,dx

The conditional means and variances are

\mu_{Y|x} = \int_{-\infty}^{\infty} y\, f_Y(y|x)\,dy
\sigma_{Y|x}^2 = \int_{-\infty}^{\infty} (y - \mu_{Y|x})^2 f_Y(y|x)\,dy

\mu_{X|y} = \int_{-\infty}^{\infty} x\, f_X(x|y)\,dx
\sigma_{X|y}^2 = \int_{-\infty}^{\infty} (x - \mu_{X|y})^2 f_X(x|y)\,dx
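A minimal numerical sketch of these definitions (the joint pdf f_XY(x, y) = x + y on the unit square, the conditioning point x0 = 0.25 and the grid resolution are my own illustration choices):

import numpy as np

x0 = 0.25
y = np.linspace(0.0, 1.0, 2001)
dy = y[1] - y[0]

f_xy = x0 + y                                    # slice of f_XY(x, y) = x + y at x = x0
f_x = np.sum(f_xy) * dy                          # marginal f_X(x0) = x0 + 1/2, numerically
f_y_given_x = f_xy / f_x                         # f_Y(y|x0) = f_XY(x0, y) / f_X(x0)

mu = np.sum(y * f_y_given_x) * dy                # conditional mean
var = np.sum((y - mu) ** 2 * f_y_given_x) * dy   # conditional variance
print(f_x, mu, var)                              # ~0.75, ~0.611, ~0.071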

The detailed solutions are given in Exercise 11; refer to the previous results.
62. Refer to Figure 3.20 used in Exercise 11. Find, using (3.113), the pdf of Y conditioned by X = 1. Then verify that the conditional pdf satisfies (2.12). Finally, find the mean and the variance of Y conditioned by X = 1. [?]

Solution:
It is given that

f_{XY}(x, y) = \frac{1}{8}
f_X(x) = \frac{1}{8}(2 - x)

f_Y(y|x) = \frac{f_{XY}(x, y)}{f_X(x)} = \frac{1/8}{\frac{1}{8}(2 - x)} = \frac{1}{2 - x}, \quad -2 < x < 2

When X = 1:

f_Y(y|1) = \frac{1}{2 - x} = \frac{1}{2 - 1} = 1, \quad 1 < y < 2

Verification:

\int_{-\infty}^{\infty} f_Y(y|1)\,dy = \int_1^2 1\,dy = [y]_1^2 = 2 - 1 = 1

The conditional mean and variance are

\mu_{Y|x=1} = \int_{-\infty}^{\infty} y\, f_Y(y|1)\,dy = \int_1^2 y\,dy = \Big[\frac{y^2}{2}\Big]_1^2 = \frac{1}{2}(4 - 1) = \frac{3}{2}

\sigma_{Y|x=1}^2 = \int_{-\infty}^{\infty} (y - \mu_{Y|x=1})^2 f_Y(y|1)\,dy = E[Y^2|x=1] - \mu_{Y|x=1}^2

E[Y^2|x=1] = \int_1^2 y^2\,dy = \Big[\frac{y^3}{3}\Big]_1^2 = \frac{1}{3}(8 - 1) = \frac{7}{3}

\sigma_{Y|x=1}^2 = \frac{7}{3} - \Big(\frac{3}{2}\Big)^2 = \frac{28 - 27}{12} = \frac{1}{12}
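Since Y conditioned on X = 1 is uniform on (1, 2), a quick Monte Carlo check (my own, with arbitrary seed and sample size) of the conditional mean and variance:

import numpy as np

rng = np.random.default_rng(4)
y = rng.uniform(1.0, 2.0, size=1_000_000)
print(y.mean(), y.var())   # ~1.5 and ~0.0833 = 1/12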

The detailed solutions are given in Exercise 12; refer to the previous results.
63. Refer to Figure 3.21 used in Exercise 12. Find, using (3.113), the pdf of Y conditioned by X = 1. Then verify that the conditional pdf satisfies (2.12). Finally, find the mean and the variance of Y conditioned by X = 1. [?]

Solution:
It is given that

f_{XY}(x, y) = \frac{1}{8}
f_X(x) = \frac{1}{8}(2 + x)

f_Y(y|x) = \frac{f_{XY}(x, y)}{f_X(x)} = \frac{1/8}{\frac{1}{8}(2 + x)} = \frac{1}{2 + x}, \quad -2 < x < 2

When X = 1:

f_Y(y|1) = \frac{1}{2 + x} = \frac{1}{2 + 1} = \frac{1}{3}, \quad -1 < y < 2

Verification:

\int_{-\infty}^{\infty} f_Y(y|1)\,dy = \int_{-1}^{2} \frac{1}{3}\,dy = \frac{1}{3}[y]_{-1}^{2} = \frac{1}{3}(2 + 1) = 1

The conditional mean and variance are

\mu_{Y|x=1} = \int_{-\infty}^{\infty} y\, f_Y(y|1)\,dy = \int_{-1}^{2} \frac{1}{3}\,y\,dy = \frac{1}{3}\Big[\frac{y^2}{2}\Big]_{-1}^{2} = \frac{1}{3}\cdot\frac{1}{2}(4 - 1) = \frac{1}{2}

\sigma_{Y|x=1}^2 = E[Y^2|x=1] - \mu_{Y|x=1}^2

E[Y^2|x=1] = \int_{-1}^{2} \frac{1}{3}\,y^2\,dy = \frac{1}{3}\Big[\frac{y^3}{3}\Big]_{-1}^{2} = \frac{1}{9}(8 + 1) = 1

\sigma_{Y|x=1}^2 = 1 - \Big(\frac{1}{2}\Big)^2 = \frac{3}{4}

64. Refer to Figure 3.22 used in Exercise 13. Find, using (3.113), the pdf of Y conditioned by X = 1. Then verify that the conditional pdf satisfies (2.12). Finally, find the mean and the variance of Y conditioned by X = 1. [?]

Solution:
It is given that

f_{XY}(x, y) = \frac{1}{8}
f_X(x) = \frac{1}{8}(x + 2)

f_Y(y|x) = \frac{f_{XY}(x, y)}{f_X(x)} = \frac{1/8}{\frac{1}{8}(x + 2)} = \frac{1}{x + 2}, \quad -2 < x < 2

When X = 1:

f_Y(y|1) = \frac{1}{x + 2} = \frac{1}{1 + 2} = \frac{1}{3}, \quad -2 < y < 1

Verification:

\int_{-\infty}^{\infty} f_Y(y|1)\,dy = \int_{-2}^{1} \frac{1}{3}\,dy = \frac{1}{3}[y]_{-2}^{1} = \frac{1}{3}(1 + 2) = 1

The conditional mean and variance are

\mu_{Y|x=1} = \int_{-\infty}^{\infty} y\, f_Y(y|1)\,dy = \int_{-2}^{1} \frac{1}{3}\,y\,dy = \frac{1}{3}\Big[\frac{y^2}{2}\Big]_{-2}^{1} = \frac{1}{3}\cdot\frac{1}{2}(1 - 4) = -\frac{1}{2}

\sigma_{Y|x=1}^2 = E[Y^2|x=1] - \mu_{Y|x=1}^2

E[Y^2|x=1] = \int_{-2}^{1} \frac{1}{3}\,y^2\,dy = \frac{1}{3}\Big[\frac{y^3}{3}\Big]_{-2}^{1} = \frac{1}{9}(1 + 8) = 1

\sigma_{Y|x=1}^2 = 1 - \Big(-\frac{1}{2}\Big)^2 = \frac{3}{4}

65. Refer to the joint pdf f_{XY}(x, y) given in Exercise 9. Find, using (3.113), the pdf of Y conditioned by X = 2. Then verify that the conditional pdf satisfies (2.12). Finally, find the mean and the variance of Y conditioned by X = 2. [?]

Solution:
Using

f_Y(y|x) = \frac{f_{XY}(x, y)}{f_X(x)}

with the joint pdf of Exercise 9 evaluated at X = 2 gives, up to the normalizing constant,

f_Y(y|2) \propto \exp\Big[-\frac{4 - 1.2y + y^2}{1.82}\Big]

Completing the square in the exponent, 4 - 1.2y + y^2 = (y - 0.6)^2 + 3.64, and normalizing gives

f_Y(y|2) = \frac{1}{\sqrt{2\pi(0.91)}}\,\exp\Big[-\frac{(y - 0.6)^2}{2(0.91)}\Big]

so Y conditioned on X = 2 is Gaussian with conditional mean \mu_{Y|2} = 0.6 and conditional variance \sigma_{Y|2}^2 = 0.91, and the pdf integrates to 1 by inspection.

1.7 Selected Topics


1.7.1 Chi Square Random Variables
The chi-square random variable V of order r, for integer r \ge 1, is

V = \sum_{i=1}^{r} Z_i^2

where Z is the normalized (standard) Gaussian random variable with pdf

f_Z(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}, \quad -\infty < z < \infty

The relevant expectations of Z are

E[Z] = \mu_Z = 0
E[Z^2] = \sigma_Z^2 = 1

Consider a new random variable Y defined as Y = Z^2. The pdf of Y is

f_Y(y) =
\begin{cases}
\dfrac{1}{\sqrt{2\pi y}}\, e^{-y/2} & y > 0 \\
0 & y < 0
\end{cases}

The expectations of Y are

E[Y] = E[Z^2] = \mu_Y = 1
E[Y^2] = E[Z^4] = 3
\sigma_Y^2 = E[Y^2] - \mu_Y^2 = 2

The characteristic function of Y is

\phi_Y(j\omega) = (1 - j2\omega)^{-1/2}

When r = 1, V_1 = Y_1 = Y and

f_{V_1}(v) = f_Y(v)
E[V_1] = \mu_Y = 1
Var[V_1] = \sigma_Y^2 = 2
\phi_{V_1}(j\omega) = (1 - j2\omega)^{-1/2}

When r = 2, V_2 = Z_1^2 + Z_2^2 = Y_1 + Y_2 = V_1 + Y_2 and

E[V_2] = 2\mu_Y = 2
Var[V_2] = 2\sigma_Y^2 = 4
\phi_{V_2}(j\omega) = [\phi_{V_1}(j\omega)]^2 = (1 - j2\omega)^{-1}

The pdf of V_2 is

f_{V_2}(v) =
\begin{cases}
\dfrac{1}{2}\, e^{-v/2} & v > 0 \\
0 & v < 0
\end{cases}

When r = 3, V_3 = Z_1^2 + Z_2^2 + Z_3^2 = Y_1 + Y_2 + Y_3 = V_2 + Y_3 and

E[V_3] = E[V_2] + \mu_Y = 3
Var[V_3] = Var[V_2] + \sigma_Y^2 = 6
\phi_{V_3}(j\omega) = [\phi_Y(j\omega)]^3 = (1 - j2\omega)^{-3/2}

f_{V_3}(v) =
\begin{cases}
\sqrt{\dfrac{v}{2\pi}}\, e^{-v/2} & v > 0 \\
0 & v < 0
\end{cases}

Continuing in the same way, for general r \ge 1,

f_V(v) =
\begin{cases}
\dfrac{1}{\Gamma(r/2)\, 2^{r/2}}\, v^{r/2 - 1}\, e^{-v/2} & v > 0 \\
0 & v < 0
\end{cases}
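A hedged simulation sketch (the order r = 4, the seed and the sample size are my own choices) comparing the sum of r squared standard Gaussians with this pdf:

import math
import numpy as np

r = 4
rng = np.random.default_rng(5)
v = (rng.standard_normal(size=(300_000, r)) ** 2).sum(axis=1)

def chi2_pdf(x, r):
    # General chi-square pdf given above.
    return x ** (r / 2 - 1) * np.exp(-x / 2.0) / (math.gamma(r / 2) * 2 ** (r / 2))

print(v.mean(), v.var())   # ~r and ~2r
hist, edges = np.histogram(v, bins=80, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - chi2_pdf(centers, r))))   # small, up to binning and sampling error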

1.7.2 Student’s t Random Variables


The Student's t random variable T, for integer r \ge 1, is

T = \frac{Z}{\sqrt{V/r}}

where Z is a standard Gaussian random variable and V is a chi-square random variable of order r. The pdf of T follows from the joint pdf:

f_T(t) = \int_0^{\infty} f_{TV}(t, v)\,dv = \int_0^{\infty} f_T(t|v)\, f_V(v)\,dv

Given V = v,

T = \sqrt{\frac{r}{v}}\, Z

so that

f_T(t|v) = \sqrt{\frac{v}{r}}\, f_Z\Big(\sqrt{\frac{v}{r}}\, t\Big) = \sqrt{\frac{v}{2\pi r}}\, e^{-(t^2/r)(v/2)}

Substituting the chi-square pdf for f_V(v),

f_T(t) = \frac{1}{\sqrt{2\pi r}\,\Gamma(r/2)\, 2^{r/2}} \int_0^{\infty} v^{(r+1)/2 - 1}\, e^{-(1 + t^2/r)(v/2)}\,dv

Let

w = (1 + t^2/r)\,\frac{v}{2}

Then

f_T(t) = \frac{1}{\sqrt{2\pi r}\,\Gamma(r/2)\, 2^{r/2}} \Big(\frac{2}{1 + t^2/r}\Big)^{(r+1)/2} \int_0^{\infty} w^{(r+1)/2 - 1}\, e^{-w}\,dw

and since the remaining integral equals \Gamma\big(\frac{r+1}{2}\big),

f_T(t) = \frac{\Gamma\big(\frac{r+1}{2}\big)}{\sqrt{\pi r}\,\Gamma(r/2)\,(1 + t^2/r)^{(r+1)/2}}
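A simulation sketch (r = 5, the seed and the sample size are arbitrary choices of mine) that builds T = Z/\sqrt{V/r} from Gaussians and compares its histogram with the closed-form pdf:

import math
import numpy as np

r = 5
rng = np.random.default_rng(6)
z = rng.standard_normal(400_000)
v = (rng.standard_normal(size=(400_000, r)) ** 2).sum(axis=1)   # chi-square of order r
t = z / np.sqrt(v / r)

def t_pdf(t, r):
    c = math.gamma((r + 1) / 2) / (math.sqrt(math.pi * r) * math.gamma(r / 2))
    return c * (1.0 + t ** 2 / r) ** (-(r + 1) / 2)

hist, edges = np.histogram(t, bins=100, range=(-6.0, 6.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - t_pdf(centers, r))))   # small, up to binning, tail truncation and sampling error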

1.7.3 Cauchy Random Variables


Consider two independent zero-mean Gaussian random variables X and Y with the same variance \sigma^2, related by

W = a\,\frac{X}{Y}

f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-x^2/(2\sigma^2)}, \quad -\infty < x < \infty
f_Y(y) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-y^2/(2\sigma^2)}, \quad -\infty < y < \infty

Assume that the joint pdf f_{WY}(w, y) is known; then the pdf for the random variable W is

f_W(w) = \int_{-\infty}^{\infty} f_{WY}(w, y)\,dy = \int_{-\infty}^{\infty} f_W(w|y)\, f_Y(y)\,dy

Given Y = y, W = \frac{a}{y}\,X, so

f_W(w|y) =
\begin{cases}
\dfrac{y}{a}\, f_X(wy/a) & y \ge 0 \\
-\dfrac{y}{a}\, f_X(wy/a) & y < 0
\end{cases}

f_W(w) = -\int_{-\infty}^{0} \frac{y}{a}\, f_X(wy/a)\, f_Y(y)\,dy + \int_{0}^{\infty} \frac{y}{a}\, f_X(wy/a)\, f_Y(y)\,dy

By symmetry of the Gaussian pdfs the two integrals are equal, so

f_W(w) = \frac{1}{a\pi\sigma^2}\int_0^{\infty} \exp\big[-(1 + (w/a)^2)\, y^2/(2\sigma^2)\big]\, y\,dy

With the substitution v = (1 + (w/a)^2)\, y^2/(2\sigma^2) this evaluates to

f_W(w) = \frac{a}{\pi(w^2 + a^2)}, \quad -\infty < w < \infty, \quad a > 0

which is the Cauchy pdf. The cdf is

F_W(w) = \int_{-\infty}^{w} f_W(x)\,dx = \frac{1}{\pi}\tan^{-1}\Big(\frac{w}{a}\Big) + \frac{1}{2}, \quad -\infty < w < \infty, \quad a > 0

The characteristic function is

\phi_W(j\omega) = \exp(-a|\omega|)
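A short check (mine; a = 2, the seed and the sample size are arbitrary) that the scaled ratio of two independent zero-mean Gaussians follows this cdf:

import numpy as np

a = 2.0
rng = np.random.default_rng(7)
x = rng.standard_normal(500_000)
y = rng.standard_normal(500_000)
w = a * x / y

for w0 in (-5.0, -1.0, 0.0, 1.0, 5.0):
    empirical = (w <= w0).mean()
    exact = np.arctan(w0 / a) / np.pi + 0.5
    print(w0, empirical, exact)   # the last two columns agree closely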

1.7.4 Rayleigh Random Variables [?]


Consider two independent Gaussian random variables X and Y with zero mean and the same variance \sigma^2:

f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x}{\sigma}\right)^2}
f_Y(y) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{y}{\sigma}\right)^2}

f_{XY}(x, y) = f_X(x)\, f_Y(y) = \frac{1}{2\pi\sigma^2}\, e^{-\frac{x^2 + y^2}{2\sigma^2}}

Let

x = r\cos\theta, \quad y = r\sin\theta, \quad 0 \le r < \infty, \quad 0 \le \theta \le 2\pi

r = \sqrt{x^2 + y^2}, \qquad dx\,dy = r\,dr\,d\theta

f_{XY}(x, y)\,dx\,dy = P(r, \theta)\,dr\,d\theta

P(r, \theta) = \frac{r}{2\pi\sigma^2}\, e^{-\frac{r^2}{2\sigma^2}}

P(r) = \int_0^{2\pi} P(r, \theta)\,d\theta = \frac{r}{2\pi\sigma^2}\, e^{-\frac{r^2}{2\sigma^2}}\,[\theta]_0^{2\pi} = \frac{r}{\sigma^2}\, e^{-\frac{r^2}{2\sigma^2}}

f_R(r) =
\begin{cases}
\dfrac{r}{\sigma^2}\, e^{-\frac{r^2}{2\sigma^2}} & r \ge 0 \\
0 & \text{otherwise}
\end{cases}

With b = 2\sigma^2 this is commonly written as

f_R(r) =
\begin{cases}
\dfrac{2r}{b}\, e^{-\frac{r^2}{b}} & r \ge 0 \\
0 & \text{otherwise}
\end{cases}

and the corresponding cdf is

F_R(r) =
\begin{cases}
1 - e^{-\frac{r^2}{b}} & r \ge 0 \\
0 & \text{otherwise}
\end{cases}
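A simulation sketch (σ = 1.5, the seed, the sample size and the test points are my own illustration values) comparing the envelope \sqrt{X^2 + Y^2} of two IID zero-mean Gaussians with the Rayleigh cdf:

import numpy as np

sigma = 1.5
rng = np.random.default_rng(8)
x = sigma * rng.standard_normal(500_000)
y = sigma * rng.standard_normal(500_000)
r = np.sqrt(x**2 + y**2)

b = 2.0 * sigma**2
for r0 in (0.5, 1.0, 2.0, 4.0):
    print(r0, (r <= r0).mean(), 1.0 - np.exp(-r0**2 / b))   # empirical cdf vs F_R(r0)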

1.7.5 Central Limit Theorem


The Central Limit Theorem states that the sum of a large number of independent and identically distributed (IID) random variables with finite mean and variance approaches a Gaussian random variable.
Let X_1, X_2, X_3, \ldots, X_n be independent and identically distributed (IID) random variables; then their sum is

W = \sum_{i=1}^{n} X_i

For independent random variables X and Y, the pdf f_Z of Z = X + Y equals the convolution of f_X and f_Y. For Gaussian X and Y,

f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma_X}\, e^{-\frac{(x - \mu_X)^2}{2\sigma_X^2}}
f_Y(y) = \frac{1}{\sqrt{2\pi}\,\sigma_Y}\, e^{-\frac{(y - \mu_Y)^2}{2\sigma_Y^2}}

Taking Fourier transforms,

\mathcal{F}\{f_X\} = F_X(\omega) = \exp[-j\omega\mu_X]\exp\Big(-\frac{\sigma_X^2\omega^2}{2}\Big)
\mathcal{F}\{f_Y\} = F_Y(\omega) = \exp[-j\omega\mu_Y]\exp\Big(-\frac{\sigma_Y^2\omega^2}{2}\Big)

f_Z(z) = (f_X * f_Y)(z)
       = \mathcal{F}^{-1}\big\{\mathcal{F}\{f_X\}\cdot\mathcal{F}\{f_Y\}\big\}
       = \mathcal{F}^{-1}\Big\{\exp[-j\omega\mu_X]\exp\Big(-\frac{\sigma_X^2\omega^2}{2}\Big)\exp[-j\omega\mu_Y]\exp\Big(-\frac{\sigma_Y^2\omega^2}{2}\Big)\Big\}
       = \mathcal{F}^{-1}\Big\{\exp[-j\omega(\mu_X + \mu_Y)]\exp\Big(-\frac{(\sigma_X^2 + \sigma_Y^2)\omega^2}{2}\Big)\Big\}
       = N(z;\, \mu_X + \mu_Y,\, \sigma_X^2 + \sigma_Y^2)

so the sum of two independent Gaussians is again Gaussian, with the means and variances adding.
A random variable is Gaussian with parameters \mu and \sigma^2 (abbreviated N(\mu; \sigma^2)) if it is continuous with the Gaussian pdf above; in particular, a standard Gaussian Z (\mu = 0, \sigma^2 = 1) has p.d.f.

\phi(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}

Let Z_1, Z_2, Z_3, \ldots, Z_n be i.i.d. standard Gaussians; then their sum is

W = \sum_{i=1}^{n} Z_i

For n = 2, W = Z_1 + Z_2, and by the convolution result above (with \mu = 0 and \sigma^2 = 1 for each term),

f_W(w) = N(w;\, 0,\, 2) = \frac{1}{\sqrt{4\pi}}\, e^{-w^2/4}

Repeating the argument, the sum of n i.i.d. standard Gaussians is N(0; n).
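A minimal code illustration (the Uniform(0,1) summands, n = 30, the seed and the sample size are my choices): standardized sums of IID variables are compared against the standard Gaussian cdf.

import math
import numpy as np

n = 30
rng = np.random.default_rng(9)
w = rng.uniform(0.0, 1.0, size=(200_000, n)).sum(axis=1)
z = (w - n * 0.5) / math.sqrt(n / 12.0)          # standardize: mean n/2, variance n/12

for z0 in (-2.0, -1.0, 0.0, 1.0, 2.0):
    phi = 0.5 * (1.0 + math.erf(z0 / math.sqrt(2.0)))
    print(z0, (z <= z0).mean(), phi)             # empirical cdf vs Gaussian cdf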
