
Lecture 4: Discrete Random Variables (continued)

random vectors
multiple r.v.'s
joint distributions
conditional distributions
conditional expectation

Joint distributions

We need a notion of the joint (combined) probability distribution.

Def: The joint pmf $p_{X,Y}: \mathbb{R}^2 \to [0,1]$ between two r.v.'s $X$ and $Y$ is
$$p_{X,Y}(x, y) = \mathbb{P}(X = x \text{ and } Y = y).$$

Def: The marginal pmf of $X$ from the joint $p_{X,Y}$ is defined as the probability distribution of $X$ when $Y$ is not considered:
$$p_X(x) = \sum_y p_{X,Y}(x, y).$$
Likewise, one can compute the marginal for $Y$ by summing out $x$.
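As a small illustration (not from the lecture, with a made-up joint pmf), a Python sketch that stores $p_{X,Y}$ as a dictionary and sums out one variable to obtain a marginal:

```python
# A minimal sketch: represent a joint pmf of two discrete r.v.'s as a dict
# keyed by (x, y) and sum out the other variable to get a marginal.
from collections import defaultdict

# hypothetical joint pmf p_{X,Y}(x, y)
p_xy = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def marginal(p_joint, axis=0):
    """Sum the joint pmf over the other variable: axis=0 gives p_X, axis=1 gives p_Y."""
    p = defaultdict(float)
    for (x, y), prob in p_joint.items():
        p[(x, y)[axis]] += prob
    return dict(p)

print(marginal(p_xy, axis=0))  # p_X: {0: 0.3, 1: 0.7}
print(marginal(p_xy, axis=1))  # p_Y: {0: 0.4, 1: 0.6}
```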

Multiple random variables can be collected and formed into a vector:
$$X = (X_1, \ldots, X_n): \Omega \to \mathbb{R}^n, \quad n \in \mathbb{N}.$$
e.g. tossing a coin $n$ times: set $X_i = 1$ if the $i$-th toss is H and $X_i = 0$ if the $i$-th toss is T. The vector $X$ describes the total result of the composite experiment.

Def: The joint pmf $p_X: \mathbb{R}^n \to [0,1]$ of a random vector $X = (X_1, \ldots, X_n)$ is
$$p_X(x_1, \ldots, x_n) = \mathbb{P}(X_1 = x_1, \ldots, X_n = x_n).$$
Likewise, the joint cdf $F_X: \mathbb{R}^n \to [0,1]$ is defined as
$$F_X(x_1, \ldots, x_n) = \mathbb{P}(X_1 \le x_1, \ldots, X_n \le x_n).$$
Expectations with random vectors: for a function $g$ of multiple r.v.'s, the expectation is taken over the joint pmf:
$$\mathbb{E}[g(X, Y)] = \sum_x \sum_y g(x, y)\, p_{X,Y}(x, y).$$

The linearity property extends to linear combinations of r.v.'s: take $g(z, w) = az + bw$. Then
$$\mathbb{E}[aX + bY] = \sum_x \sum_y (ax + by)\, p_{X,Y}(x, y)
= a \sum_x \sum_y x\, p_{X,Y}(x, y) + b \sum_x \sum_y y\, p_{X,Y}(x, y)
= a\, \mathbb{E}[X] + b\, \mathbb{E}[Y], \qquad a, b \in \mathbb{R}.$$

Likewise for more than two r.v.'s.
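A quick numerical check (toy joint pmf, not from the notes) of $\mathbb{E}[aX + bY] = a\,\mathbb{E}[X] + b\,\mathbb{E}[Y]$:

```python
# A small sketch checking linearity of expectation on a hypothetical joint pmf.
p_xy = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
a, b = 2.0, -3.0

E_aXbY = sum(p * (a * x + b * y) for (x, y), p in p_xy.items())
E_X = sum(p * x for (x, y), p in p_xy.items())
E_Y = sum(p * y for (x, y), p in p_xy.items())

print(E_aXbY, a * E_X + b * E_Y)  # both equal 2*0.7 - 3*0.6 = -0.4
```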

Recall that the variance is a measure of the variability of one r.v. $X$.

We can define a new way to measure the joint variability among multiple r.v.'s.

Def: The covariance of $X$ and $Y$ is defined as
$$\mathrm{Cov}(X, Y) = \mathbb{E}\big[(X - \mathbb{E}[X])(Y - \mathbb{E}[Y])\big],$$
and the correlation (coefficient) as
$$\rho(X, Y) = \frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)\, \mathrm{Var}(Y)}}.$$

Note that $\mathrm{Cov}(X, X) = \mathrm{Var}(X)$ and $\rho(X, Y) \in [-1, 1]$ always.


The sign of the covariance tends to show a linear relationship:
- if larger values of $X$ correspond to larger values of $Y$ and smaller values of $X$ correspond to smaller values of $Y$, and vice versa, then $\mathrm{Cov}(X, Y) > 0$;
- if larger values of $X$ correspond to smaller values of $Y$ and smaller values of $X$ correspond to larger values of $Y$, and vice versa, then $\mathrm{Cov}(X, Y) < 0$.

The magnitude of the covariance reflects the strength of the relationship between $X$ and $Y$.

Def: $X$ and $Y$ are uncorrelated if $\mathrm{Cov}(X, Y) = 0$.
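As an illustration (toy numbers, not from the lecture), a Python sketch computing covariance and correlation from a joint pmf, using $\mathrm{Cov}(X,Y) = \mathbb{E}[XY] - \mathbb{E}[X]\mathbb{E}[Y]$:

```python
# A minimal sketch: covariance and correlation of X and Y from a hypothetical joint pmf.
import math

p_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

E = lambda f: sum(p * f(x, y) for (x, y), p in p_xy.items())
EX, EY = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - EX * EY
var_x = E(lambda x, y: x * x) - EX ** 2
var_y = E(lambda x, y: y * y) - EY ** 2
rho = cov / math.sqrt(var_x * var_y)

print(cov, rho)  # positive here: larger X tends to go with larger Y
```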

For random vectors $X = (X_1, \ldots, X_n)$ with $n \ge 2$, the covariance and correlation are usually represented as a matrix:
$$\mathrm{Cov}(X) = \begin{pmatrix}
\mathrm{Var}(X_1) & \mathrm{Cov}(X_1, X_2) & \cdots & \mathrm{Cov}(X_1, X_n) \\
\mathrm{Cov}(X_2, X_1) & \mathrm{Var}(X_2) & \cdots & \mathrm{Cov}(X_2, X_n) \\
\vdots & \vdots & \ddots & \vdots \\
\mathrm{Cov}(X_n, X_1) & \mathrm{Cov}(X_n, X_2) & \cdots & \mathrm{Var}(X_n)
\end{pmatrix}.$$

There are several important properties of the covariance:
1) $\mathrm{Cov}(X, c) = 0$ for any constant $c \in \mathbb{R}$;
2) $\mathrm{Cov}(X, Y) = \mathrm{Cov}(Y, X)$;
3) $\mathrm{Cov}(aX, bY) = ab\, \mathrm{Cov}(X, Y)$ for all $a, b \in \mathbb{R}$.

By property 2), covariance matrices are symmetric, i.e. $M = M^T$.
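A small sketch (with assumed random data, not from the notes) estimating a covariance matrix from samples of a random vector and checking the symmetry $M = M^T$:

```python
# Estimate a covariance matrix with numpy and check property 2): M = M^T.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.integers(0, 6, size=(1000, 3))  # 1000 draws of a 3-dim random vector

M = np.cov(samples, rowvar=False)  # rows are observations, columns are X_1, X_2, X_3
print(M)
print(np.allclose(M, M.T))  # True: covariance matrices are symmetric
```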

Lemma: For r.v.'s $X$ and $Y$,
$$\mathrm{Cov}(X, Y) = \mathbb{E}[XY] - \mathbb{E}[X]\, \mathbb{E}[Y].$$
(proof: exercise for you)
The covariance can also be used to obtain a useful formula about the sum of two r.v.'s:

Lemma: $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\, \mathrm{Cov}(X, Y)$.
proof: Using the properties of variance and covariance, and for simplicity of notation letting $\tilde{X} = X - \mathbb{E}[X]$ and $\tilde{Y} = Y - \mathbb{E}[Y]$,
$$\mathrm{Var}(X + Y) = \mathbb{E}\big[(\tilde{X} + \tilde{Y})^2\big]
= \mathbb{E}[\tilde{X}^2] + \mathbb{E}[\tilde{Y}^2] + 2\, \mathbb{E}[\tilde{X}\tilde{Y}]
= \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\, \mathrm{Cov}(X, Y),$$
which is the desired result.
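A quick numerical verification (same kind of toy joint pmf as above, assumed for illustration) of $\mathrm{Var}(X+Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X,Y)$:

```python
# Check the variance-of-a-sum formula on a hypothetical joint pmf.
p_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

E = lambda f: sum(p * f(x, y) for (x, y), p in p_xy.items())
EX, EY = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - EX) ** 2)
var_y = E(lambda x, y: (y - EY) ** 2)
cov = E(lambda x, y: (x - EX) * (y - EY))
var_sum = E(lambda x, y: (x + y - EX - EY) ** 2)

print(var_sum, var_x + var_y + 2 * cov)  # both 0.69
```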
Recall the notion of conditional probability $\mathbb{P}(B \mid A)$ for two events $A, B \in \mathcal{F}$. This notion can be extended more generally to r.v.'s.

Def: For two r.v.'s $X, Y$ on $(\Omega, \mathcal{F}, \mathbb{P})$, the conditional distribution (mass function, in the discrete case) of $Y$ given $X = x$, for $x$ s.t. $\mathbb{P}(X = x) > 0$, is $p_{Y|X}$:
$$p_{Y|X}(y \mid x) = \mathbb{P}(Y = y \mid X = x). \tag{1}$$

Note that we can write $p_{Y|X}(y \mid x) = \dfrac{p_{X,Y}(x, y)}{p_X(x)}$ by the definition of conditional probability.

The usual properties of a pmf follow for the conditional mass as well. In particular, $\sum_y p_{Y|X}(y \mid x) = 1$ for all $x \in \mathbb{R}$ where $\mathbb{P}(X = x) > 0$.
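As a sketch (toy pmf, assumed for illustration): computing $p_{Y|X}(y \mid x) = p_{X,Y}(x,y)/p_X(x)$ and checking that each conditional pmf sums to 1:

```python
# Build the conditional pmf p_{Y|X} from a hypothetical joint pmf and verify normalization.
p_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

p_x = {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p

p_y_given_x = {(y, x): p / p_x[x] for (x, y), p in p_xy.items()}
for x in p_x:
    total = sum(p for (y, xx), p in p_y_given_x.items() if xx == x)
    print(x, total)  # each conditional pmf sums to 1
```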

Def: The expected value of the conditional distribution $p_{Y|X}$ is the conditional expectation of $Y$ given $X$, written $\mathbb{E}[Y \mid X]$:
$$\mathbb{E}[Y \mid X = x] = \sum_y y\, p_{Y|X}(y \mid x). \tag{2}$$

Remark: While we defined the conditional distribution (1) and the conditional expectation (2) as given another r.v., the same notation is often used when conditioning on a fixed event $A \in \mathcal{F}$. That is,
$$p_{Y|A}(y) = \mathbb{P}(Y = y \mid A), \qquad \mathbb{E}[Y \mid A] = \sum_y y\, p_{Y|A}(y).$$

Many of the usual expectation properties we've seen before also hold for the conditional expectation. For example:
For any function $g: \mathbb{R} \to \mathbb{R}$, $\mathbb{E}[g(Y) \mid A] = \sum_y g(y)\, p_{Y|A}(y)$.
Law of Total Expectation: for a partition $\{A_i\}$ of $\Omega$,
$$\mathbb{E}[Y] = \sum_i \mathbb{E}[Y \mid A_i]\, \mathbb{P}(A_i).$$

When conditioning on a fixed event $A$, $\mathbb{E}[Y \mid A]$ is a fixed value. But in (2) we see that $\mathbb{E}[Y \mid X]$ is actually another r.v., not a fixed value. What happens if we try to compute the mean of this r.v.?

Tower property: $\mathbb{E}\big[\mathbb{E}[Y \mid X]\big] = \mathbb{E}[Y]$.

proof: simply
$$\mathbb{E}\big[\mathbb{E}[Y \mid X]\big] = \sum_x \mathbb{E}[Y \mid X = x]\, p_X(x)
= \sum_x \sum_y y\, p_{Y|X}(y \mid x)\, p_X(x)
= \sum_y y \sum_x p_{X,Y}(x, y)
= \sum_y y\, p_Y(y) = \mathbb{E}[Y].$$

A useful consequence of the tower property is that
$$\mathbb{E}[Y\, g(X)] = \mathbb{E}\big[\mathbb{E}[Y \mid X]\, g(X)\big]$$
for any function $g: \mathbb{R} \to \mathbb{R}$.

One can now relate back to the law of total expectation as
$$\mathbb{E}[Y] = \sum_x \mathbb{E}[Y \mid X = x]\, \mathbb{P}(X = x).$$
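A sketch (same kind of toy pmf as above, assumed for illustration) verifying the tower property $\mathbb{E}[\mathbb{E}[Y \mid X]] = \mathbb{E}[Y]$, which is the law of total expectation with the partition $A_x = \{X = x\}$:

```python
# Verify E[E[Y|X]] = E[Y] on a hypothetical joint pmf.
p_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

p_x = {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p

def E_Y_given(x):
    """Conditional expectation E[Y | X = x] computed from the joint pmf."""
    return sum(p * y for (xx, y), p in p_xy.items() if xx == x) / p_x[x]

E_Y = sum(p * y for (x, y), p in p_xy.items())
tower = sum(p_x[x] * E_Y_given(x) for x in p_x)
print(E_Y, tower)  # both 0.6
```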
Def: The conditional variance of $Y$ given $X$ is
$$\mathrm{Var}(Y \mid X) = \mathbb{E}\big[(Y - \mathbb{E}[Y \mid X])^2 \mid X\big].$$
It essentially determines how much variability is left after conditioning on (knowing) $X$ and using $\mathbb{E}[Y \mid X]$ as the center.

Lemma (law of total variance): $\mathrm{Var}(Y) = \mathbb{E}\big[\mathrm{Var}(Y \mid X)\big] + \mathrm{Var}\big(\mathbb{E}[Y \mid X]\big)$.
(proof: exercise for you)

Independence

With a way to describe multiple r.v.'s, we are now ready to talk about their independence.

Def: Independence of an r.v. $X: \Omega \to \mathbb{R}$ from an event $A \in \mathcal{F}$: this is similar to the concept of two events being independent of each other; the events $\{X = x\}$ and $A$ must be independent events for all $x \in \mathbb{R}$:

$$\mathbb{P}(\{X = x\} \cap A) = \mathbb{P}(X = x)\, \mathbb{P}(A) \quad \text{for all } x \in \mathbb{R}.$$
Alternatively, in terms of conditional distributions, $p_{X|A}(x) = p_X(x)$ for all $x \in \mathbb{R}$.

Independence of two r.v.'s $X: \Omega \to \mathbb{R}$ and $Y: \Omega \to \mathbb{R}$:

Lemma: Discrete r.v.'s $X$ and $Y$ are independent if and only if
$$p_{X,Y}(x, y) = p_X(x)\, p_Y(y) \quad \text{for all } x, y \in \mathbb{R},$$
i.e. the events $\{X = x\}$ and $\{Y = y\}$ must be independent events for all $x, y \in \mathbb{R}$.
Alternatively, in terms of conditional distributions, $X$ and $Y$ are independent if the following holds:
$$p_{Y|X}(y \mid x) = p_Y(y) \quad \text{for all } x, y \text{ with } p_X(x) > 0.$$
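A minimal sketch (toy pmfs, not from the lecture) checking whether $p_{X,Y}(x,y) = p_X(x)\,p_Y(y)$ holds for every pair $(x, y)$:

```python
# Check independence of two discrete r.v.'s by testing factorization of the joint pmf.
def is_independent(p_xy, tol=1e-12):
    xs = {x for x, _ in p_xy}
    ys = {y for _, y in p_xy}
    p_x = {x: sum(p_xy.get((x, y), 0.0) for y in ys) for x in xs}
    p_y = {y: sum(p_xy.get((x, y), 0.0) for x in xs) for y in ys}
    return all(abs(p_xy.get((x, y), 0.0) - p_x[x] * p_y[y]) < tol for x in xs for y in ys)

print(is_independent({(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}))  # True
print(is_independent({(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}))      # False
```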
Lemma: If $X, Y$ are independent r.v.'s, then $\mathbb{E}[XY] = \mathbb{E}[X]\, \mathbb{E}[Y]$.

proof:
$$\mathbb{E}[XY] = \sum_x \sum_y xy\, p_{X,Y}(x, y)
= \sum_x \sum_y xy\, p_X(x)\, p_Y(y)
= \Big(\sum_x x\, p_X(x)\Big)\Big(\sum_y y\, p_Y(y)\Big) = \mathbb{E}[X]\, \mathbb{E}[Y].$$

More generally, for any functions $g: \mathbb{R} \to \mathbb{R}$ and $h: \mathbb{R} \to \mathbb{R}$, independence of $X, Y$ implies
$$\mathbb{E}[g(X)\, h(Y)] = \mathbb{E}[g(X)]\, \mathbb{E}[h(Y)].$$

Lemma: If $X, Y$ are independent r.v.'s, then
$$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y).$$
(proof: exercise for you)
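A sketch (assumed marginals: a fair coin and a fair die) that builds an independent joint pmf and checks both lemmas, $\mathbb{E}[XY] = \mathbb{E}[X]\mathbb{E}[Y]$ and $\mathrm{Var}(X+Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$:

```python
# Build an independent joint pmf as the product of two marginals and verify the lemmas.
p_x = {0: 0.5, 1: 0.5}                                    # e.g. a fair coin
p_y = {1: 1/6, 2: 1/6, 3: 1/6, 4: 1/6, 5: 1/6, 6: 1/6}    # e.g. a fair die
p_xy = {(x, y): px * py for x, px in p_x.items() for y, py in p_y.items()}

E = lambda f: sum(p * f(x, y) for (x, y), p in p_xy.items())
EX, EY, EXY = E(lambda x, y: x), E(lambda x, y: y), E(lambda x, y: x * y)
var = lambda f, m: E(lambda x, y: (f(x, y) - m) ** 2)

print(EXY, EX * EY)                                        # both 1.75
print(var(lambda x, y: x + y, EX + EY),
      var(lambda x, y: x, EX) + var(lambda x, y: y, EY))   # both ~3.1667
```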
Mutual independence of multiple r.v.'s $X_1, \ldots, X_n$ follows similar formulas:
$$p_{X_1, \ldots, X_n}(x_1, \ldots, x_n) = p_{X_1}(x_1) \cdots p_{X_n}(x_n), \tag{3}$$
$$\mathrm{Var}(X_1 + \cdots + X_n) = \mathrm{Var}(X_1) + \cdots + \mathrm{Var}(X_n).$$

Def: A collection of r.v.'s $X_1, \ldots, X_n$ are said to be independent and identically distributed (i.i.d.) if
1) each $X_i$ has the same probability distribution as all the other $X_j$, and
2) all the $X_i$ are mutually independent.

e.g. a sequence of coin tosses is i.i.d., a sequence of die rolls is i.i.d., etc.

For i.i.d. $X_1, \ldots, X_n$, equation (3) becomes
$$p_{X_1, \ldots, X_n}(x_1, \ldots, x_n) = \prod_{i=1}^n p_X(x_i),$$
$$\mathrm{Var}(X_1 + \cdots + X_n) = n\, \mathrm{Var}(X),$$
where $X$ has the same distribution as each $X_i$ and was defined so that we can remove the subscript. Sometimes it is conventional to write $X \sim X_i$.
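A short simulation sketch (assumed parameters: fair die rolls) illustrating that for i.i.d. $X_1, \ldots, X_n$ the variance of the sum is approximately $n\,\mathrm{Var}(X)$:

```python
# Simulate n i.i.d. die rolls per trial and compare Var(sum) with n * Var(X).
import numpy as np

rng = np.random.default_rng(1)
n, trials = 10, 200_000
rolls = rng.integers(1, 7, size=(trials, n))   # each row: n i.i.d. fair die rolls
sums = rolls.sum(axis=1)

var_single = rolls.var()           # ~ Var(X) = 35/12 ≈ 2.9167
print(sums.var(), n * var_single)  # both ≈ 29.17
```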
