Lecture-4 Probability
By Tafari Lemma
Lecture Outline
• Joint PMF
• Joint CDF
• Joint PDF
• Marginal Statistics
• Independence
• Conditional Distributions
• Correlation and Covariance
Multiple Random Variables
Example: Let X and Y denote the blood pressure and heart rate
of a randomly chosen ASTU student (during an exam).
Joint Probability Mass Functions
For discrete random variables X and Y, the joint pmf is
P_XY(x, y) = P(X = x, Y = y)
Example
Suppose you want to study the relation between the number of bars on your mobile and the quality of the call. You collect data over time and come up with the following stochastic model:
(rows: call quality Y; columns: number of bars X)

 Y \ X |   1      2      3      4
-------+---------------------------
   0   |   0      0      0      0
   1   |  0.10   0.05    0      0
   2   |  0.05   0.10   0.04   0.01
   3   |   0     0.05   0.15   0.15
   4   |   0      0     0.05   0.25
It is easy to check that this table describes a valid joint probability mass function: all entries are non-negative and they sum to 1.
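This check can be scripted. A minimal sketch in Python; the `joint` dictionary encodes the table with keys `(x, y)` = (bars, quality), an orientation assumed here, and missing keys mean probability 0:

```python
# Joint pmf from the table: key (x, y) = (number of bars, call quality).
# Zero cells are omitted; a missing key means probability 0.
joint = {
    (1, 1): 0.10, (1, 2): 0.05,
    (2, 1): 0.05, (2, 2): 0.10, (2, 3): 0.05,
    (3, 2): 0.04, (3, 3): 0.15, (3, 4): 0.05,
    (4, 2): 0.01, (4, 3): 0.15, (4, 4): 0.25,
}

# A valid joint pmf is non-negative and sums to 1 over all (x, y) pairs.
assert all(p >= 0 for p in joint.values())
assert abs(sum(joint.values()) - 1.0) < 1e-9
```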
Marginal Probability Distributions
Summing the joint pmf over all values of the other variable gives
P_X(x) = Σ_y P_XY(x, y),  P_Y(y) = Σ_x P_XY(x, y)
These are valid probability mass functions on their own, and are called the marginal p.m.f.'s of X and Y.
Example
 Y \ X |   1      2      3      4   | P_Y(y)
-------+----------------------------+-------
   0   |   0      0      0      0   |   0
   1   |  0.10   0.05    0      0   |  0.15
   2   |  0.05   0.10   0.04   0.01 |  0.20
   3   |   0     0.05   0.15   0.15 |  0.35
   4   |   0      0     0.05   0.25 |  0.30
-------+----------------------------+-------
 P_X(x)|  0.15   0.20   0.24   0.41 |  1.00
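The marginals are just row and column sums of the table. A sketch (same assumed `(x, y)` = (bars, quality) keying as before):

```python
from collections import defaultdict

# Joint pmf from the table; key (x, y) = (bars, quality), zero cells omitted.
joint = {
    (1, 1): 0.10, (1, 2): 0.05,
    (2, 1): 0.05, (2, 2): 0.10, (2, 3): 0.05,
    (3, 2): 0.04, (3, 3): 0.15, (3, 4): 0.05,
    (4, 2): 0.01, (4, 3): 0.15, (4, 4): 0.25,
}

px = defaultdict(float)   # marginal of X: sum over y (column sums)
py = defaultdict(float)   # marginal of Y: sum over x (row sums)
for (x, y), p in joint.items():
    px[x] += p
    py[y] += p

# Match the marginal row/column of the table above.
assert abs(px[1] - 0.15) < 1e-9 and abs(px[4] - 0.41) < 1e-9
assert abs(py[3] - 0.35) < 1e-9 and abs(py[4] - 0.30) < 1e-9
assert abs(sum(px.values()) - 1.0) < 1e-9
```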
Conditional Probability Distribution
What information does one variable carry about the other? For P_X(x) > 0, the conditional pmf of Y given X = x is
P_{Y|X}(y|x) = P_XY(x, y) / P_X(x)
Example
Using the joint pmf table above, condition for example on X = 2 (so P_X(2) = 0.20):
P_{Y|X}(1|2) = 0.05/0.20 = 0.25
P_{Y|X}(2|2) = 0.10/0.20 = 0.50
P_{Y|X}(3|2) = 0.05/0.20 = 0.25
P_{Y|X}(4|2) = 0/0.20 = 0
Properties
Note that the conditional probability mass function is itself a proper mass function; therefore
Σ_y P_{Y|X}(y|x) = 1 for every x with P_X(x) > 0.
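This normalization property can be verified on the table directly. A sketch; `conditional_y_given_x` is a hypothetical helper over the same assumed `(x, y)` keying:

```python
# Joint pmf from the bars/quality table; key (x, y), zero cells omitted.
joint = {
    (1, 1): 0.10, (1, 2): 0.05,
    (2, 1): 0.05, (2, 2): 0.10, (2, 3): 0.05,
    (3, 2): 0.04, (3, 3): 0.15, (3, 4): 0.05,
    (4, 2): 0.01, (4, 3): 0.15, (4, 4): 0.25,
}

def conditional_y_given_x(x0):
    """P_{Y|X}(y | x0) = P_XY(x0, y) / P_X(x0)."""
    px0 = sum(p for (x, _), p in joint.items() if x == x0)
    return {y: p / px0 for (x, y), p in joint.items() if x == x0}

# For every value of X, the conditional pmf of Y sums to 1.
for x0 in (1, 2, 3, 4):
    assert abs(sum(conditional_y_given_x(x0).values()) - 1.0) < 1e-9
```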
Independence of Random V.'s
There are situations where knowing the value of X tells you nothing about Y, and vice versa. This brings up the notion of independence of random variables.
Definition: X and Y are independent if
P_XY(x, y) = P_X(x) P_Y(y) for all x and y.
Example
Using the joint pmf table above: P_XY(1, 1) = 0.10, but P_X(1) P_Y(1) = 0.15 × 0.15 = 0.0225.
Since these are not equal, X and Y are not independent.
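One failing cell is enough to rule out independence, and this is easy to check mechanically (a sketch, same assumed table keying):

```python
# Joint pmf from the bars/quality table; key (x, y), zero cells omitted.
joint = {
    (1, 1): 0.10, (1, 2): 0.05,
    (2, 1): 0.05, (2, 2): 0.10, (2, 3): 0.05,
    (3, 2): 0.04, (3, 3): 0.15, (3, 4): 0.05,
    (4, 2): 0.01, (4, 3): 0.15, (4, 4): 0.25,
}
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in range(1, 5)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in range(1, 5)}

# Independence would require P_XY(x, y) == P_X(x) * P_Y(y) for every cell;
# the cell (1, 1) already fails: 0.10 vs 0.15 * 0.15 = 0.0225.
assert abs(joint[(1, 1)] - 0.10) < 1e-9
assert abs(px[1] * py[1] - 0.0225) < 1e-9
assert abs(joint[(1, 1)] - px[1] * py[1]) > 0.05  # clearly not equal
```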
Example
Let X and Y be discrete random variables with joint pmf
P_XY(x_i, y_j) = k(2x_i + y_j),  x_i = 1, 2;  y_j = 1, 2
where k is a constant.
a. Find k.
b. Find the marginal pmfs of X and Y.
c. Are X and Y independent?
Example
a. Σ_{x_i} Σ_{y_j} P_XY(x_i, y_j) = 1
Σ_{x_i=1}^{2} Σ_{y_j=1}^{2} k(2x_i + y_j) = 1
k[(2 + 1) + (2 + 2) + (4 + 1) + (4 + 2)] = 1
18k = 1
k = 1/18
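The normalization step above can be reproduced in a few lines (a sketch):

```python
# Sum of (2x + y) over x, y in {1, 2} gives the total weight, so 18k = 1.
total_weight = sum(2 * x + y for x in (1, 2) for y in (1, 2))
assert total_weight == 18          # matches 18k = 1 on the slide

k = 1 / total_weight               # k = 1/18
assert abs(sum(k * (2 * x + y) for x in (1, 2) for y in (1, 2)) - 1.0) < 1e-12
```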
Example
b. P_X(x_i) = P_XY(x_i, 1) + P_XY(x_i, 2) = (1/18)(2x_i + 1) + (1/18)(2x_i + 2)

P_X(x_i) = { (1/18)(4x_i + 3), x_i = 1, 2
           { 0,                otherwise
Example
P_Y(y_j) = (1/18)(2 + y_j) + (1/18)(4 + y_j)

P_Y(y_j) = { (1/18)(2y_j + 6), y_j = 1, 2
           { 0,                otherwise
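These marginals can be cross-checked against the joint pmf (a sketch; `p_x` and `p_y` are hypothetical helper names):

```python
k = 1 / 18  # from part (a)

def p_x(x):
    return (4 * x + 3) / 18  # P_X(x) = (1/18)(4x + 3), x = 1, 2

def p_y(y):
    return (2 * y + 6) / 18  # P_Y(y) = (1/18)(2y + 6), y = 1, 2

# Each marginal must equal the corresponding sum of the joint pmf k(2x + y),
# and each must sum to 1 over its support {1, 2}.
for x in (1, 2):
    assert abs(p_x(x) - sum(k * (2 * x + y) for y in (1, 2))) < 1e-12
for y in (1, 2):
    assert abs(p_y(y) - sum(k * (2 * x + y) for x in (1, 2))) < 1e-12
assert abs(p_x(1) + p_x(2) - 1.0) < 1e-12
assert abs(p_y(1) + p_y(2) - 1.0) < 1e-12
```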
Example
c. P_XY(x_i, y_j) ≠ P_X(x_i) P_Y(y_j)
⇒ X and Y are not independent.
The Joint Cumulative Distribution Function
F_XY(x, y) = P[X ≤ x and Y ≤ y] = P(X ≤ x, Y ≤ y)
where x and y are arbitrary real numbers.
Properties of the joint cdf F_XY(x, y):
i. 0 ≤ F_XY(x, y) ≤ 1
ii. lim_{x→∞, y→∞} F_XY(x, y) = F_XY(∞, ∞) = 1
The Joint Probability Density Function
f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y
▪ Thus, the joint cumulative distribution function (cdf) is given by:
F_XY(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_XY(u, v) du dv
Marginal Probability Distributions
For continuous random variables, the marginal pdfs are obtained by integrating out the other variable:
f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy,  f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx
Conditional Probability Distributions
For f_X(x) > 0, the conditional pdf of Y given X = x is
f_{Y|X}(y|x) = f_XY(x, y) / f_X(x)
Properties
Note that the conditional probability density function is itself a proper density function; therefore
∫_{−∞}^{∞} f_{Y|X}(y|x) dy = 1
Example
Example: Let X be the input to a communication channel and Y the output. The input to the channel is +1 volt or −1 volt with equal probability. The output of the channel is the input plus a noise voltage N that is uniformly distributed in the interval [−2, +2] volts. Find P[X = +1, Y ≤ 0].
Solution:
P[X = +1, Y ≤ y] = P[Y ≤ y | X = +1] P[X = +1],
where P[X = +1] = 1/2. When the input X = +1, the output Y is uniformly distributed in the interval [−1, 3]. Therefore,
P[Y ≤ y | X = +1] = (y + 1)/4  for −1 ≤ y ≤ 3.
Thus P[X = +1, Y ≤ 0] = (1/2) · (0 + 1)/4 = 1/8.
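The answer 1/8 can be sanity-checked by simulating the channel (a Monte Carlo sketch; the sample size and seed are arbitrary choices):

```python
import random

random.seed(0)
n = 200_000
hits = 0
for _ in range(n):
    x = random.choice((-1, 1))       # input: -1 or +1 volt, equally likely
    y = x + random.uniform(-2, 2)    # output = input + uniform noise on [-2, 2]
    if x == 1 and y <= 0:
        hits += 1

est = hits / n
# Analytically: P[X = +1, Y <= 0] = (1/2) * (0 + 1)/4 = 1/8 = 0.125.
assert abs(est - 0.125) < 0.01
```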
Example
Example: Let X be the input to a communication channel and let Y be the output. The input to the channel is +1 volt or −1 volt with equal probability. The output of the channel is the input plus a noise voltage N that is uniformly distributed in the interval [−2, +2] volts. Find the probability that Y is negative given that X is +1.
Solution:
If X = +1, then Y is uniformly distributed in the interval [−1, 3] and
f_Y(y|1) = 1/4,  −1 ≤ y ≤ 3.
Thus
P[Y < 0 | X = +1] = ∫_{−1}^{0} (1/4) dy = 1/4.
Independence of Random Variables
As with discrete random variables, there are situations where knowing the value of X tells you nothing about Y, and vice versa. This brings up again the notion of independence of random variables.
Definition: continuous X and Y are independent if
f_XY(x, y) = f_X(x) f_Y(y) for all x and y.
Examples on Two Random Variables
Example-1:
The joint pdf of two continuous random variables X and Y is given by:
f_XY(x, y) = { kxy, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
             { 0,   otherwise
where k is a constant.
a. Find the value of k.
b. Find the marginal pdfs of X and Y.
c. Are X and Y independent?
d. Find P(X + Y ≤ 1).
e. Find the conditional pdfs of X and Y.
Examples on Two Random Variables Cont’d……
a. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
∫_0^1 ∫_0^1 kxy dx dy = 1
k ∫_0^1 y [x²/2]_0^1 dy = 1
(k/2) ∫_0^1 y dy = (k/2) [y²/2]_0^1 = k/4 = 1
k = 4
Examples on Two Random Variables Cont’d……
b. f_X(x) = ∫_0^1 4xy dy = 4x [y²/2]_0^1 = 2x

f_X(x) = { 2x, 0 ≤ x ≤ 1
         { 0,  otherwise
Examples on Two Random Variables Cont’d……
f_Y(y) = ∫_0^1 4xy dx = 4y [x²/2]_0^1 = 2y

f_Y(y) = { 2y, 0 ≤ y ≤ 1
         { 0,  otherwise
Examples on Two Random Variables Cont’d……
c. f_XY(x, y) = f_X(x) f_Y(y)
⇒ X and Y are independent.
d. P(X + Y ≤ 1) = ∫_0^1 ∫_0^{1−y} 4xy dx dy = ∫_0^1 4y [x²/2]_0^{1−y} dy
= ∫_0^1 4y · (1/2)(1 − y)² dy = ∫_0^1 2(y − 2y² + y³) dy
= 2[y²/2 − 2y³/3 + y⁴/4]_0^1 = 1/6
P(X + Y ≤ 1) = 1/6
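The value 1/6 can be checked numerically with a midpoint-rule sum over the inner integral ∫_0^{1−y} 4xy dx = 2y(1 − y)² derived above (a sketch; the grid size is an arbitrary choice):

```python
# Numerically integrate g(y) = 2y(1 - y)^2 over [0, 1] by the midpoint rule;
# this is P(X + Y <= 1) after the inner x-integral has been done analytically.
n = 10_000
h = 1.0 / n
total = sum(2 * m * (1 - m) ** 2 * h
            for i in range(n)
            for m in [(i + 0.5) * h])   # m = midpoint of the i-th subinterval
assert abs(total - 1 / 6) < 1e-6
```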
Examples on Two Random Variables Cont’d……
e. f_{X|Y}(x|y) = f_XY(x, y)/f_Y(y) = 4xy/2y = 2x

f_{X|Y}(x|y) = { 2x, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
              { 0,  otherwise
Examples on Two Random Variables Cont’d……
f_{Y|X}(y|x) = f_XY(x, y)/f_X(x) = 4xy/2x = 2y

f_{Y|X}(y|x) = { 2y, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
              { 0,  otherwise
Examples on Two Random Variables Cont’d……
Example-2:
The joint pdf of two continuous random variables X and Y is
given by:
f_XY(x, y) = { k, 0 ≤ y ≤ x ≤ 1
             { 0, otherwise
where k is a constant.
a. Determine the value of k.
b. Find the marginal pdfs of X and Y.
c. Are X and Y independent?
d. Find P(0 ≤ X ≤ 1/2).
e. Find the conditional pdfs of X and Y.
Examples on Two Random Variables Cont’d……
a. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
∫_0^1 ∫_y^1 k dx dy = 1
∫_0^1 k [x]_y^1 dy = 1
k ∫_0^1 (1 − y) dy = k [y − y²/2]_0^1 = k/2 = 1
k = 2
Examples on Two Random Variables Cont’d……
b. f_X(x) = ∫_0^x 2 dy = [2y]_0^x = 2x

f_X(x) = { 2x, 0 ≤ x ≤ 1
         { 0,  otherwise
Examples on Two Random Variables Cont’d……
f_Y(y) = ∫_y^1 2 dx = [2x]_y^1 = 2(1 − y)

f_Y(y) = { 2(1 − y), 0 ≤ y ≤ 1
         { 0,        otherwise
Examples on Two Random Variables Cont’d……
c. f_XY(x, y) ≠ f_X(x) f_Y(y)
⇒ X and Y are not independent.
d. P(0 ≤ X ≤ 1/2) = ∫_0^{1/2} ∫_0^x f_XY(x, y) dy dx
= ∫_0^{1/2} ∫_0^x 2 dy dx = ∫_0^{1/2} [2y]_0^x dx
= ∫_0^{1/2} 2x dx = [x²]_0^{1/2} = 1/4
P(0 ≤ X ≤ 1/2) = 1/4
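Since the marginal f_X(x) = 2x was found in part (b), the answer 1/4 reduces to a one-dimensional integral, easy to check numerically (a sketch; the grid size is an arbitrary choice):

```python
# Midpoint-rule integration of f_X(x) = 2x over [0, 1/2];
# P(0 <= X <= 1/2) = integral of 2x dx from 0 to 1/2 = 1/4.
n = 10_000
h = 0.5 / n
total = sum(2 * ((i + 0.5) * h) * h for i in range(n))
assert abs(total - 0.25) < 1e-9   # midpoint rule is exact for linear integrands
```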
Examples on Two Random Variables Cont'd……
e. f_{X|Y}(x|y) = f_XY(x, y)/f_Y(y) = 2/[2(1 − y)] = 1/(1 − y),  y ≤ x ≤ 1
f_{Y|X}(y|x) = f_XY(x, y)/f_X(x) = 2/(2x) = 1/x,  0 ≤ y ≤ x
Covariance
Definition: The covariance of X and Y is
Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y]
Covariance
The covariance is a measure of the linear relationship between random variables: if one of the variables is easy to predict as a linear function of the other, then the covariance will be non-zero.
Correlation Coefficient
It is useful to normalize the covariance and define the correlation coefficient
ρ_XY = Cov(X, Y) / (σ_X σ_Y),  −1 ≤ ρ_XY ≤ 1
Example
 Y \ X |   1      2      3      4   | P_Y(y)
-------+----------------------------+-------
   0   |   0      0      0      0   |   0
   1   |  0.10   0.05    0      0   |  0.15
   2   |  0.05   0.10   0.04   0.01 |  0.20
   3   |   0     0.05   0.15   0.15 |  0.35
   4   |   0      0     0.05   0.25 |  0.30
-------+----------------------------+-------
 P_X(x)|  0.15   0.20   0.24   0.41 |  1.00
Cont'd
Using the joint pmf table above and its marginals:
E[X] = 1(0.15) + 2(0.20) + 3(0.24) + 4(0.41) = 2.91
E[Y] = 1(0.15) + 2(0.20) + 3(0.35) + 4(0.30) = 2.80
E[XY] = Σ_x Σ_y x y P_XY(x, y) = 9.07
Cov(X, Y) = E[XY] − E[X]E[Y] = 9.07 − (2.91)(2.80) = 0.922 > 0
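The whole computation can be reproduced from the table (a sketch; the `(x, y)` = (bars, quality) keying is an assumption carried over from earlier, though Cov(X, Y) itself is unaffected by swapping the roles of the two variables):

```python
import math

# Joint pmf from the bars/quality table; key (x, y), zero cells omitted.
joint = {
    (1, 1): 0.10, (1, 2): 0.05,
    (2, 1): 0.05, (2, 2): 0.10, (2, 3): 0.05,
    (3, 2): 0.04, (3, 3): 0.15, (3, 4): 0.05,
    (4, 2): 0.01, (4, 3): 0.15, (4, 4): 0.25,
}

def expect(f):
    """E[f(X, Y)] under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

ex  = expect(lambda x, y: x)       # E[X]  = 2.91
ey  = expect(lambda x, y: y)       # E[Y]  = 2.80
exy = expect(lambda x, y: x * y)   # E[XY] = 9.07
cov = exy - ex * ey                # Cov(X, Y) = 9.07 - 8.148 = 0.922

var_x = expect(lambda x, y: x * x) - ex ** 2
var_y = expect(lambda x, y: y * y) - ey ** 2
rho = cov / math.sqrt(var_x * var_y)   # correlation coefficient

assert abs(cov - 0.922) < 1e-9
assert 0 < rho < 1   # positive linear association, as the table suggests
```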
Proposition: If X and Y are independent, then E[XY] = E[X]E[Y], and hence Cov(X, Y) = 0 (X and Y are uncorrelated).
Correlation
Uncorrelated is NOT equivalent to independent.
It is important to note that the implication goes only in one direction: independence implies zero covariance, but zero covariance does not imply independence.
Thank You !!!