Random Variables
WPI
19-January-2011
2. Mathematical notation.
3. Course introduction.
4. Review of essential probability concepts.
Some Notation

{a, b, c}² is shorthand for the set {aa, ab, ac, ba, bb, bc, ca, cb, cc}.
R² is the two-dimensional real plane.
R³ is three-dimensional real space.
An element of a set: s ∈ S.
A subset: W ⊆ S.
The probability of an event A: Prob[A] ∈ [0, 1].
The joint probability of events A and B: Prob[A, B] ∈ [0, 1].
The probability of event A conditioned on event B: Prob[A | B] ∈ [0, 1].
[Figure: plot of a waveform y(t) versus time]
y_k = Σ_{ℓ=0}^{L−1} h_ℓ s_{k−ℓ} + w_k

In some scenarios, we may want to know both the bits that were sent and the
channel coefficients. This is a joint estimation and detection problem.
Why?
Consequences

To develop optimal decision rules or estimators, we need to quantify the
consequences of incorrect decisions or inaccurate estimates.

Simple Example

It is not known if a coin is fair (HT) or double headed (HH). We are given
one observation of the coin flip. Based on this observation, how do you
decide if the coin is HT or HH?

Observation   Rule 1   Rule 2   Rule 3   Rule 4
H             HH       HT       HH       HT
T             HH       HT       HT       HH

Suppose you have to pay $100 if you are wrong. Which decision rule is
optimum?
If the coin is HT (fair), the decision was wrong and you must pay
$100.
If the coin is HH (double headed), the decision was right and you pay
nothing.
If the coin is HT (fair), the decision was right and you pay nothing.
If the coin is HH (double headed), the decision was wrong and you
must pay $100.
Remarks

[Figure: diagram mapping the states of nature θ ∈ {HT, HH} through the likelihood p(y | θ) to the observations y]

Is Rule 3 optimal?
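One way to answer this question is to compute each rule's expected cost. The sketch below is a minimal Python enumeration, assuming (as the example implies) that the fair coin shows H with probability 1/2, the double-headed coin always shows H, and the prior probability π that the coin is HH is a free parameter, since the slides don't fix one.

```python
# Expected cost ($100 per wrong decision) of each decision rule as a
# function of the prior probability pi that the coin is double headed (HH).
COST = 100.0

# Each rule maps an observation ('H' or 'T') to a decision ('HT' or 'HH').
RULES = {
    "Rule 1": {"H": "HH", "T": "HH"},
    "Rule 2": {"H": "HT", "T": "HT"},
    "Rule 3": {"H": "HH", "T": "HT"},
    "Rule 4": {"H": "HT", "T": "HH"},
}

# Likelihood of each observation given the true state of nature.
LIKELIHOOD = {"HT": {"H": 0.5, "T": 0.5}, "HH": {"H": 1.0, "T": 0.0}}

def expected_cost(rule, pi):
    """Average cost of `rule` when Prob[coin is HH] = pi."""
    prior = {"HT": 1.0 - pi, "HH": pi}
    cost = 0.0
    for state in ("HT", "HH"):
        for obs in ("H", "T"):
            if rule[obs] != state:  # wrong decision -> pay $100
                cost += COST * prior[state] * LIKELIHOOD[state][obs]
    return cost

for name, rule in RULES.items():
    print(name, expected_cost(rule, 0.5))
```

With π = 1/2 the expected costs come out to $50, $50, $25, and $75 for Rules 1–4 respectively, so Rule 3 is indeed the cheapest under this assumed prior.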
4/36 + 6/36 + 4/36 + 2/36 + 2/36 = 18/36 = 1/2
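If, as the counts 2, 2, 4, 4, 6 out of 36 suggest, this sum is tallying the two-dice outcomes whose total is odd, a brute-force enumeration confirms the 1/2 (this interpretation is an assumption; only the fractions survive in the source):

```python
# Sanity check by brute force: probability that two fair dice give an odd
# total. Odd sums 3, 5, 7, 9, 11 occur in 2, 4, 6, 4, 2 of the 36 outcomes.
from fractions import Fraction

outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
odd_total = [o for o in outcomes if (o[0] + o[1]) % 2 == 1]
prob = Fraction(len(odd_total), len(outcomes))
print(prob)  # 1/2
```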
∫ p_X(x) dx = 1.

Prob[a < X ≤ b] = ∫_a^b p_X(x) dx = F_X(b) − F_X(a).

Example: Let X be the Dow Jones average at the close on Friday.

[Figure: sketch of the PDF p_X(x) versus x, centered near x = 11500]
Definition

The variance of the random variable X is defined as

var[X] = ∫ (x − E[X])² p_X(x) dx.

The standard deviation of X is defined as √var[X].
p_X(x) = { 1/(b − a)   a ≤ x ≤ b
         { 0           otherwise

What is Prob[X = 3]?
What is Prob[X < 2]?
What is Prob[X > 1]?
What is E[X]?
What is var[X]?
D. Richard Brown III
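The questions above can be answered in closed form for any a < b. A short sketch, using the illustrative endpoints a = 0 and b = 4 (the slide leaves a and b symbolic):

```python
# Worked answers for X ~ Uniform(a, b), with illustrative a = 0, b = 4.
a, b = 0.0, 4.0

def prob_interval(lo, hi):
    """Prob[lo < X <= hi] for X ~ Uniform(a, b): overlap length / (b - a)."""
    lo, hi = max(lo, a), min(hi, b)
    return max(hi - lo, 0.0) / (b - a)

prob_point = 0.0                  # Prob[X = 3]: a single point has zero probability
prob_lt_2  = prob_interval(a, 2)  # Prob[X < 2]
prob_gt_1  = prob_interval(1, b)  # Prob[X > 1]
mean       = (a + b) / 2          # E[X]
variance   = (b - a) ** 2 / 12    # var[X]
print(prob_point, prob_lt_2, prob_gt_1, mean, variance)
```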
Prob[X = s_i] = 1/n   and   p_X(x) = (1/n) (δ(x − s_1) + · · · + δ(x − s_n))
If X and Y have the same PDF, then they have the same CDF, the
same mean, and the same variance.
If X and Y have the same mean and/or variance, they might have the
same PDF/CDF but not necessarily.
Joint Events

Suppose you have two events A and B. We can define a new event

C = "both A and B occur"

and we can write

Prob[C] = Prob[A ∩ B] = Prob[A, B]

[Figure: Venn diagram of events A and B with overlap A ∩ B]
Prob[A | B] = Prob[A, B] / Prob[B] = Prob[A ∩ B] / Prob[B]

[Figure: Venn diagram of events A and B]
Prob[A] = Σ_{i=1}^{n} Prob[A | B_i] Prob[B_i].
Independence of Events

Two events are independent if and only if their joint probability is equal to
the product of their individual probabilities, i.e.,

Prob[A, B] = Prob[A] Prob[B]

Lots of events can be assumed to be independent. For example, suppose
you flip a coin twice with A = "the first coin flip is heads", B = "the
second coin flip is heads", and C = "both coin flips are heads".

Are A and B independent?
Are A and C independent?

Note that when events A and B are independent,

Prob[A | B] = Prob[A, B] / Prob[B] = Prob[A] Prob[B] / Prob[B] = Prob[A].
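The two coin-flip questions above can be settled by enumerating the four equally likely outcomes; a minimal sketch:

```python
# A = first flip heads, B = second flip heads, C = both flips heads.
from itertools import product
from fractions import Fraction

space = list(product("HT", repeat=2))  # 4 equally likely outcomes

def prob(event):
    """Exact probability of an event over the uniform sample space."""
    return Fraction(sum(1 for o in space if event(o)), len(space))

A = lambda o: o[0] == "H"
B = lambda o: o[1] == "H"
C = lambda o: o[0] == "H" and o[1] == "H"

# A and B are independent: Prob[A, B] = Prob[A] Prob[B] = 1/4.
print(prob(lambda o: A(o) and B(o)) == prob(A) * prob(B))  # True
# A and C are not: Prob[A, C] = 1/4 but Prob[A] Prob[C] = 1/8.
print(prob(lambda o: A(o) and C(o)) == prob(A) * prob(C))  # False
```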
1. What is the probability that the first ball you select from the urn is
black?
2. Given that the first ball that you select from the urn is black, what is
the probability that the second ball you select from the urn is also
black?
3. What is the probability that both balls you select are black?
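The slides don't state the urn's contents here, so the sketch below keeps the number of black balls b and white balls w as parameters and applies conditional probability and the chain rule directly:

```python
# Drawing two balls without replacement from an urn with b black, w white.
from fractions import Fraction

def urn_probs(b, w):
    n = b + w
    first_black = Fraction(b, n)                   # question 1
    second_given_first = Fraction(b - 1, n - 1)    # question 2
    both_black = first_black * second_given_first  # question 3: chain rule
    return first_black, second_given_first, both_black

print(urn_probs(3, 2))  # e.g. 3 black, 2 white -> (3/5, 1/2, 3/10)
```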
2. If the ball removed from the urn is white, what is the probability that
the ball remaining in the urn is white?
p_X,Y(x, y) = ∂²F_X,Y(x, y) / ∂x ∂y

If you know the joint distribution, you can get the marginal distributions:

F_X(x) = F_X,Y(x, ∞)              F_Y(y) = F_X,Y(∞, y)

p_X(x) = ∫ p_X,Y(x, y) dy          p_Y(y) = ∫ p_X,Y(x, y) dx

Marginals are not enough to specify the joint distribution, except in special cases.
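For discrete random variables the marginalization integrals become sums over the joint PMF table; a small sketch with a made-up 2×3 joint table:

```python
# Marginals from a discrete joint PMF: sum out the other variable.
import numpy as np

p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])  # rows: x values, columns: y values
assert np.isclose(p_xy.sum(), 1.0)     # a valid joint PMF sums to 1

p_x = p_xy.sum(axis=1)  # "integrate out" y
p_y = p_xy.sum(axis=0)  # "integrate out" x
print(p_x, p_y)
```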
Joint Statistics

Note that the means and variances are defined as usual for X and Y.
When we have a joint distribution, we have two new statistical quantities:

Definition

The correlation of the random variables X and Y is defined as

E[XY] = ∫∫ x y p_X,Y(x, y) dx dy.

Definition

The covariance of the random variables X and Y is defined as

cov[X, Y] = ∫∫ (x − E[X])(y − E[Y]) p_X,Y(x, y) dx dy.
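For a discrete joint PMF the double integrals become double sums. The sketch below uses illustrative support points and the same made-up joint table, and also checks the identity cov[X, Y] = E[XY] − E[X] E[Y]:

```python
# Correlation E[XY] and covariance for a discrete joint PMF.
import numpy as np

x = np.array([0.0, 1.0])        # support of X (illustrative)
y = np.array([-1.0, 0.0, 1.0])  # support of Y (illustrative)
p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])  # rows index x, columns index y

ex  = (x * p_xy.sum(axis=1)).sum()           # E[X]
ey  = (y * p_xy.sum(axis=0)).sum()           # E[Y]
exy = (np.outer(x, y) * p_xy).sum()          # correlation E[XY]
cov = (np.outer(x - ex, y - ey) * p_xy).sum()  # covariance
print(exy, cov, exy - ex * ey)               # cov[X,Y] = E[XY] - E[X]E[Y]
```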
Conditional Distributions

If X and Y are both discrete random variables (both are drawn from finite sets)
with Prob[X = x] > 0, then

Prob[Y = y | X = x] = Prob[X = x, Y = y] / Prob[X = x].

If X is a continuous random variable and Y is discrete, then

Prob[Y = y | X = x] = p_X,Y(x, y) / p_X(x)

where p_X,Y(x, y) := lim_{h→0} Prob[x − h < X ≤ x, Y = y] / h is the joint
PDF-probability of the random variable X and the event Y = y.

If X and Y are both continuous random variables, then the conditional PDF of Y
given X = x is

p_Y(y | X = x) = p_Y(y | x) = p_X,Y(x, y) / p_X(x).
Conditional Statistics

Definition

The conditional mean of a random variable Y given X = x is defined as

E[Y | x] = ∫ y p_Y(y | x) dy.

The definition is identical to the regular mean except that we use the
conditional PDF.

Definition

The conditional variance of a random variable Y given X = x is defined as

var[Y | x] = ∫ (y − E[Y | x])² p_Y(y | x) dy.
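In the discrete case the conditional PMF is just a renormalized row of the joint table, and the conditional mean and variance follow the definitions above with sums in place of integrals. A sketch with an illustrative joint table:

```python
# Conditional mean and variance of Y given X = x for a discrete joint PMF.
import numpy as np

y = np.array([-1.0, 0.0, 1.0])          # support of Y (illustrative)
p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])   # rows index x, columns index y

x_row = 0                                        # condition on the first x value
p_y_given_x = p_xy[x_row] / p_xy[x_row].sum()    # p_Y(y | x): renormalized row
cond_mean = (y * p_y_given_x).sum()              # E[Y | x]
cond_var = ((y - cond_mean) ** 2 * p_y_given_x).sum()  # var[Y | x]
print(cond_mean, cond_var)
```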
p_Y(y | x) = p_X,Y(x, y) / p_X(x) = p_X(x) p_Y(y) / p_X(x) = p_Y(y)

and

p_X(x | y) = p_X,Y(x, y) / p_Y(y) = p_X(x) p_Y(y) / p_Y(y) = p_X(x).
i.e.,

    ⎡ E[(X_1 − X̄_1)(X_1 − X̄_1)]   · · ·   E[(X_1 − X̄_1)(X_k − X̄_k)] ⎤
P = ⎢             ⋮                  ⋱                 ⋮             ⎥
    ⎣ E[(X_k − X̄_k)(X_1 − X̄_1)]   · · ·   E[(X_k − X̄_k)(X_k − X̄_k)] ⎦
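In practice the covariance matrix P is usually estimated from data; numpy's np.cov computes exactly these entrywise E[(X_i − X̄_i)(X_j − X̄_j)] terms (with sample-mean estimates). Random data here are purely illustrative:

```python
# Sample covariance matrix for k = 3 variables.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.standard_normal((3, 1000))  # one row per variable, 1000 samples each
P = np.cov(samples)

print(P.shape)               # (3, 3)
print(np.allclose(P, P.T))   # True: a covariance matrix is symmetric
```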
Infer if the coin is fair or unfair after observing one or more flips.
[Figure: diagram mapping the states of nature θ ∈ {HT, HH} through the likelihood p(y | θ) to the observations y]
The Kalman filter uses a mathematical technique that removes noise from a
series of data. From incomplete information, it can optimally estimate and
control the state of a changing, complex system over time. Applications
include target tracking by radar, global positioning systems, hydrological
modeling, atmospheric observations, and time-series analyses in
econometrics.

(paraphrased from http://www.nae.edu/)
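To make the description above concrete, here is a minimal scalar Kalman filter sketch for the simplest case, a random-walk state observed in noise; the model parameters (q, r, the initial guesses, and the test signal) are illustrative, not from the slides, and the full filter generalizes to vector states and matrix covariances.

```python
# Scalar Kalman filter for x_k = x_{k-1} + process noise, y_k = x_k + noise.
import numpy as np

def kalman_1d(measurements, q=1e-3, r=0.1, x0=0.0, p0=1.0):
    """Return filtered state estimates for a scalar random-walk model."""
    x, p = x0, p0
    estimates = []
    for y in measurements:
        p = p + q            # predict: variance grows by process noise q
        k = p / (p + r)      # Kalman gain: weight on the new measurement
        x = x + k * (y - x)  # update with the measurement residual
        p = (1 - k) * p      # posterior variance shrinks after the update
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(1)
truth = 2.0
ys = truth + rng.normal(0, 0.3, size=200)  # noisy observations of a constant
est = kalman_1d(ys)
print(est[-1])  # final estimate, close to the true value 2.0
```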