
Lecture #11: Distributions With Random Parameters

This document contains notes from Statistics 351 lecture 11. It discusses distributions with random parameters, including examples of a jointly distributed random vector (X,Y) with density f(x,y), and calculating related marginal densities and conditional densities. It also covers the law of total probability for continuous and discrete random variables X and Y, and provides an example of determining the unconditional distribution of X given X|M=m ~ Po(m) and M ~ Exp(1).



Statistics 351 (Fall 2015)

Prof. Michael Kozdron

October 2, 2015

Lecture #11: Distributions with Random Parameters


Example. Suppose that the random vector $(X, Y)'$ has joint density function
$$f_{X,Y}(x, y) = \begin{cases} e^{-y}, & \text{if } 0 < x < y < \infty, \\ 0, & \text{otherwise.} \end{cases}$$
(a) Determine fX (x), the marginal density function of X.
(b) Determine fY (y), the marginal density function of Y .
(c) Calculate fX|Y =y (x), the conditional density function of X given Y = y.
(d) Calculate fY |X=x (y), the conditional density function of Y given X = x.
Solution. For (a) we have, by definition, that
$$f_X(x) = \int_x^{\infty} e^{-y} \, dy = e^{-x}, \quad x > 0,$$
implying that $X \in \text{Exp}(1)$. For (b) we have, by definition, that
$$f_Y(y) = \int_0^y e^{-y} \, dx = y e^{-y}, \quad y > 0,$$
implying that $Y \in \Gamma(2, 1)$. Thus, for (c) we conclude that
$$f_{X|Y=y}(x) = \frac{f_{X,Y}(x, y)}{f_Y(y)} = \frac{e^{-y}}{y e^{-y}} = \frac{1}{y}, \quad 0 < x < y,$$
implying that $X \mid Y = y \in U(0, y)$. Finally, for (d) we find
$$f_{Y|X=x}(y) = \frac{f_{X,Y}(x, y)}{f_X(x)} = \frac{e^{-y}}{e^{-x}} = e^{x - y}, \quad 0 < x < y < \infty.$$
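As a quick numerical sanity check (a sketch, not part of the original notes), the marginals in (a) and (b) can be verified with a simple midpoint-rule quadrature; the helper names `f_joint` and `integrate_1d` are arbitrary:

```python
import math

def f_joint(x, y):
    # joint density e^{-y} on the region 0 < x < y < infinity
    return math.exp(-y) if 0 < x < y else 0.0

def integrate_1d(g, a, b, n=20000):
    # composite midpoint rule on [a, b]
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# marginal of X at x = 1: integrate over y; should equal e^{-1}
fx = integrate_1d(lambda y: f_joint(1.0, y), 1.0, 40.0)  # tail beyond 40 is negligible
print(abs(fx - math.exp(-1)) < 1e-4)  # True

# marginal of Y at y = 2: integrate over x in (0, y); should equal 2 e^{-2}
fy = integrate_1d(lambda x: f_joint(x, 2.0), 0.0, 2.0)
print(abs(fy - 2 * math.exp(-2)) < 1e-6)  # True
```

The upper limit 40 truncates the infinite range of integration; since $e^{-y}$ decays exponentially, the truncation error is far below the tolerance used.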

Example. Suppose that $(X, Y)'$ is a jointly distributed random vector with density function
$$f_{X,Y}(x, y) = \begin{cases} cxy, & \text{if } 0 < y < 1 \text{ and } 0 < x < y^2 < 1, \\ 0, & \text{otherwise,} \end{cases}$$
where the value of the normalizing constant $c$ is chosen so that $\displaystyle\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y) \, dx \, dy = 1$.

(a) Determine the value of c.


(b) Compute fX (x).

(c) Compute fY (y).


(d) Compute fY |X=x (y).
Solution. (a) We find
$$\int_0^1 \int_{\sqrt{x}}^1 xy \, dy \, dx = \int_0^1 \frac{1}{2} x y^2 \Big|_{y=\sqrt{x}}^{y=1} \, dx = \frac{1}{2} \int_0^1 x(1 - x) \, dx = \frac{1}{2} \left( \frac{1}{2} x^2 - \frac{1}{3} x^3 \right) \Big|_0^1 = \frac{1}{12}$$
so that $c = 12$.
(b) By definition,
$$f_X(x) = \int_{\sqrt{x}}^1 12xy \, dy = 6x y^2 \Big|_{y=\sqrt{x}}^{y=1} = 6x(1 - x)$$
provided that $0 < x < 1$.


(c) By definition,
$$f_Y(y) = \int_0^{y^2} 12xy \, dx = 6y x^2 \Big|_{x=0}^{x=y^2} = 6y^5$$
provided that $0 < y < 1$.
(d) By definition, if $0 < x < 1$ is fixed, then
$$f_{Y|X=x}(y) = \frac{f_{X,Y}(x, y)}{f_X(x)} = \frac{12xy}{6x(1 - x)} = \frac{2y}{1 - x}$$
provided that $\sqrt{x} < y < 1$.
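The value $c = 12$ and the marginals above can also be checked numerically (a sketch, not part of the original notes; the helper names are arbitrary):

```python
import math

def f_joint(x, y):
    # joint density 12xy on the region 0 < x < y^2 < 1 (using c = 12)
    return 12.0 * x * y if 0 < x < y * y < 1 else 0.0

def midpoint(g, a, b, n=10000):
    # composite midpoint rule on [a, b]
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# marginal of X at x = 0.25: integrate over y in (sqrt(x), 1); should be 6x(1 - x)
x = 0.25
fx = midpoint(lambda y: f_joint(x, y), math.sqrt(x), 1.0)
print(abs(fx - 6 * x * (1 - x)) < 1e-6)  # True

# marginal of Y at y = 0.5: integrate over x in (0, y^2); should be 6y^5
y = 0.5
fy = midpoint(lambda xx: f_joint(xx, y), 0.0, y * y)
print(abs(fy - 6 * y ** 5) < 1e-6)  # True

# f_X must integrate to 1 over (0, 1), confirming that c = 12 normalizes the density
total = midpoint(lambda xx: 6 * xx * (1 - xx), 0.0, 1.0)
print(abs(total - 1.0) < 1e-6)  # True
```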

Last class we introduced the law of total probability. It turns out that if $X$ and $Y$ are jointly distributed random variables, then we can generalize this result.

Suppose that $X$ is continuous. If $Y$ is also continuous, then
$$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y) \, dy = \int_{-\infty}^{\infty} f_{X|Y=y}(x) f_Y(y) \, dy.$$
However, if $Y$ is discrete, then
$$f_X(x) = \sum_y f_{X,Y}(x, y) = \sum_y f_{X|Y=y}(x) P\{Y = y\}.$$

On the other hand, suppose that $X$ is discrete. If $Y$ is continuous, then
$$P\{X = x\} = \int_{-\infty}^{\infty} P\{X = x \mid Y = y\} f_Y(y) \, dy.$$
However, if $Y$ is also discrete, then
$$P\{X = x\} = \sum_y P\{X = x \mid Y = y\} P\{Y = y\}.$$
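In the purely discrete case the sum can be carried out mechanically. As an illustration (an invented toy example, not from the lecture), take $Y$ to be a biased coin and let the conditional pmf of $X$ depend on the outcome of $Y$:

```python
from fractions import Fraction as F

# hypothetical pmfs, chosen only for illustration
pY = {0: F(7, 10), 1: F(3, 10)}
pX_given_Y = {
    0: {0: F(1, 2), 1: F(1, 2)},   # conditional pmf of X given Y = 0
    1: {0: F(1, 4), 1: F(3, 4)},   # conditional pmf of X given Y = 1
}

# law of total probability: P{X = x} = sum over y of P{X = x | Y = y} P{Y = y}
pX = {x: sum(pX_given_Y[y][x] * pY[y] for y in pY) for x in (0, 1)}
print(pX[0], pX[1])           # 17/40 23/40
print(sum(pX.values()) == 1)  # True
```

Using exact rational arithmetic makes it easy to confirm that the resulting unconditional pmf sums to one.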

Example. Suppose that $X \mid M = m \in \text{Po}(m)$ with $M \in \text{Exp}(1)$. Determine the (unconditional) distribution of $X$.

Solution. By the law of total probability, we find that for $k = 0, 1, 2, \ldots$,
$$P\{X = k\} = \int_0^{\infty} P\{X = k \mid M = x\} f_M(x) \, dx = \int_0^{\infty} \frac{e^{-x} x^k}{k!} \, e^{-x} \, dx = \frac{1}{k!} \int_0^{\infty} x^k e^{-2x} \, dx.$$
Making the substitution $u = 2x$, $du = 2 \, dx$ gives
$$\frac{1}{k!} \int_0^{\infty} x^k e^{-2x} \, dx = \frac{1}{k!} \int_0^{\infty} \left(\frac{u}{2}\right)^k e^{-u} \, \frac{du}{2} = \frac{1}{2^{k+1} k!} \int_0^{\infty} u^k e^{-u} \, du = \frac{\Gamma(k+1)}{2^{k+1} k!} = \frac{1}{2^{k+1}}$$
since $\Gamma(k + 1) = k!$. Hence, we see that $P\{X = k\} = 2^{-(k+1)}$, $k = 0, 1, 2, \ldots$, and so we conclude that $X \in \text{Ge}(1/2)$.
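The conclusion $X \in \text{Ge}(1/2)$ can be corroborated by simulation (a sketch, not part of the original notes): draw $M$ from Exp(1), then $X$ from Po($M$), and compare the empirical frequencies with $2^{-(k+1)}$.

```python
import math
import random

random.seed(1)

def poisson(m):
    # Knuth's multiplication method for sampling Po(m)
    L = math.exp(-m)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p < L:
            return k
        k += 1

def sample_X():
    # draw M ~ Exp(1), then X | M = m ~ Po(m)
    return poisson(random.expovariate(1.0))

N = 100000
counts = {}
for _ in range(N):
    k = sample_X()
    counts[k] = counts.get(k, 0) + 1

# empirical frequencies should be close to the geometric pmf P{X = k} = 2^{-(k+1)}
for k in range(4):
    print(k, counts.get(k, 0) / N, 2.0 ** -(k + 1))
```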
