Lecture 3 Notes (24.2 Update)

The document discusses multiple random variables X and Y with joint and marginal probability mass/density functions. It defines that the joint PMF/PDF of X and Y is the product of their marginal PMF/PDFs if and only if X and Y are independent. It also discusses transformations of random variables, conditional distributions and expectations, and the properties of sums of independent Poisson random variables.

MULTIPLE RANDOM VARIABLES

Let X, Y be two discrete random variables with pmfs $p_X(x)$ and $p_Y(y)$ respectively. The JOINT PROBABILITY MASS FUNCTION of X and Y is defined:

$p_{X,Y}(x,y) = P(X = x, Y = y) = P(\{s : X(s) = x\} \cap \{s : Y(s) = y\}), \quad \forall (x,y) \in X(S) \times Y(S)$

From the joint pmf $p_{X,Y}(x,y)$ we derive the MARGINAL PROBABILITY MASS FUNCTION $p_X(x)$ of X:

$p_X(x) = P(X = x) = \sum_{y \in Y(S)} P(X = x, Y = y) = \sum_{y :\, p_{X,Y}(x,y) > 0} p_{X,Y}(x,y)$

Similarly, $p_Y(y) = \sum_{x} p_{X,Y}(x,y)$.
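A minimal sketch, not part of the notes, of how the marginal pmfs follow from the joint pmf by summing over the other variable; it assumes Python with a hypothetical joint pmf stored as a dictionary.

from collections import defaultdict

# Hypothetical joint pmf of two dependent binary r.v.s, keyed by (x, y).
p_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

p_x, p_y = defaultdict(float), defaultdict(float)
for (x, y), p in p_xy.items():
    p_x[x] += p   # p_X(x) = sum over y of p_{X,Y}(x, y)
    p_y[y] += p   # p_Y(y) = sum over x of p_{X,Y}(x, y)

print(dict(p_x))  # approximately {0: 0.5, 1: 0.5}
print(dict(p_y))  # approximately {0: 0.4, 1: 0.6}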

Given X, Y continuous r.v.s, the JOINT PROBABILITY DENSITY FUNCTION of X, Y is the function $f_{X,Y}(x,y)$ such that:

$P(X \in A, Y \in B) = \int_A \int_B f_{X,Y}(x,y)\, dx\, dy$

The MARGINAL DENSITY FUNCTIONS of X and Y are computed:

$f_X(x) = \int_{-\infty}^{+\infty} f_{X,Y}(x,y)\, dy \qquad f_Y(y) = \int_{-\infty}^{+\infty} f_{X,Y}(x,y)\, dx$
For both continuous and discrete r.v.s the JOINT CUMULATIVE DISTRIBUTION FUNCTION $F_{X,Y}(x,y)$ is defined:

$F_{X,Y}(x,y) = P(X \le x, Y \le y)$

The r.v.s X, Y are INDEPENDENT if

$P(X \in A, Y \in B) = P(X \in A)\, P(Y \in B) \quad \forall A, B \subseteq \mathbb{R}$

RESULT In terms of pmfs/pdfs, X and Y are independent iff

$p_{X,Y}(x,y) = p_X(x)\, p_Y(y)$ (discrete case) and $f_{X,Y}(x,y) = f_X(x)\, f_Y(y)$ (continuous case)

RESULT If X and Y are independent, then

$E[XY] = E[X]\, E[Y]$

TRANSFORMATIONS OF R.V.S ($E[\cdot]$ and $\mathrm{Var}(\cdot)$)

Given a generic transformation $g : \mathbb{R} \to \mathbb{R}$:

$E[g(X)] = \sum_{x \in X(S)} g(x)\, p_X(x)$ (discrete)

$E[g(X)] = \int g(x)\, f_X(x)\, dx$ (continuous)
RESULT For linear transformations we have:

$E[a + bX] = a + b\, E[X]$

$\mathrm{Var}(a + bX) = b^2\, \mathrm{Var}(X)$

RESULT Given the r.v.s X, Y:

$E[X + Y] = E[X] + E[Y]$

$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$ if X, Y are independent
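A quick numerical check of these linearity results, not part of the notes; it is a sketch assuming NumPy, with the distributions and constants chosen only for illustration.

import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)       # E[X] = 2, Var(X) = 4
y = rng.normal(loc=1.0, scale=3.0, size=1_000_000)   # independent of X, Var(Y) = 9

a, b = 5.0, -2.0
print(np.mean(a + b * x), a + b * 2.0)   # E[a + bX] = a + b E[X]
print(np.var(a + b * x), b**2 * 4.0)     # Var(a + bX) = b^2 Var(X)
print(np.var(x + y), 4.0 + 9.0)          # independence: Var(X + Y) = Var(X) + Var(Y)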
CONDITIONAL DISTRIBUTIONS AND EXPECTATIONS

For X, Y discrete r.v.s, the CONDITIONAL DISTRIBUTION of X given Y = y is defined, provided $p_Y(y) > 0$:

$p_{X|Y}(x|y) = P(X = x \mid Y = y) = \dfrac{P(X = x, Y = y)}{P(Y = y)} = \dfrac{p_{X,Y}(x,y)}{p_Y(y)}$

The CONDITIONAL EXPECTATION of X given Y = y is defined:

$E[X \mid Y = y] = \sum_{x \in X(S)} x\, P(X = x \mid Y = y) = \sum_{x \in X(S)} x\, p_{X|Y}(x|y)$

~ the best guess for X knowing that Y = y
If X and Y are continuous r.v.s, the CONDITIONAL PROBABILITY DENSITY FUNCTION of X given Y = y is defined:

$f_{X|Y}(x|y) = \dfrac{f_{X,Y}(x,y)}{f_Y(y)}$

The CONDITIONAL EXPECTATION of X given Y = y is:

$E[X \mid Y = y] = \int x\, f_{X|Y}(x|y)\, dx$
EXAMPLE Consider the r.v. X representing the score of a die with 6 sides, and Y defined:

Y = 1 if X is even, Y = 0 if X is odd.

The conditional expectations for $y \in \{0, 1\}$ are:

$E[X \mid Y = 1] = 4$

$E[X \mid Y = 0] = 3$

Note that for fixed y, X | Y = y is a random variable:

X | Y = 1 takes values in $\{2, 4, 6\}$ and has pmf $p_{X|Y=1}(2) = p_{X|Y=1}(4) = p_{X|Y=1}(6) = 1/3$.

In general $E[X \mid Y]$ is also a random variable:

$E[X \mid Y] \in \{3, 4\}$ with $P(E[X \mid Y] = 3) = 1/2$ and $P(E[X \mid Y] = 4) = 1/2$.

If we set Y = y, then $E[X \mid Y = y] = f(y)$ is a value instead.
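A small simulation sketch of this example, not in the notes, assuming NumPy and a fair 6-sided die; it shows that $E[X \mid Y]$ takes the values 3 and 4 depending on Y.

import numpy as np

rng = np.random.default_rng(1)
x = rng.integers(1, 7, size=1_000_000)    # fair die rolls
y = (x % 2 == 0).astype(int)              # Y = 1 if X is even, 0 if X is odd

print(x[y == 1].mean())   # approximately E[X | Y = 1] = 4
print(x[y == 0].mean())   # approximately E[X | Y = 0] = 3
# E[X | Y] equals 3 or 4, each with probability 1/2.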
RECALL If $X \sim \mathrm{Poisson}(\lambda)$, then $p(i) = P(X = i) = e^{-\lambda}\, \dfrac{\lambda^i}{i!}$.
RESULT If $X \sim \mathrm{Poisson}(\lambda_X)$, $Y \sim \mathrm{Poisson}(\lambda_Y)$ and X, Y are independent, then $X + Y \sim \mathrm{Poisson}(\lambda_X + \lambda_Y)$.

PROOF

$p_{X+Y}(n) = P(X + Y = n)$

$= P(X = 0, Y = n) + P(X = 1, Y = n-1) + P(X = 2, Y = n-2) + \dots + P(X = n, Y = 0)$

$= \sum_{i=0}^{n} P(X = i, Y = n-i) \overset{\text{independence}}{=} \sum_{i=0}^{n} P(X = i)\, P(Y = n-i)$

$= \sum_{i=0}^{n} e^{-\lambda_X}\, \dfrac{\lambda_X^i}{i!}\; e^{-\lambda_Y}\, \dfrac{\lambda_Y^{\,n-i}}{(n-i)!}$

$= \dfrac{e^{-(\lambda_X + \lambda_Y)}}{n!} \sum_{i=0}^{n} \dfrac{n!}{i!\,(n-i)!}\, \lambda_X^i\, \lambda_Y^{\,n-i} = \dfrac{e^{-(\lambda_X + \lambda_Y)}\,(\lambda_X + \lambda_Y)^n}{n!}$ (binomial theorem)

which is the pmf of $\mathrm{Poisson}(\lambda_X + \lambda_Y)$ evaluated at n, hence $X + Y \sim \mathrm{Poisson}(\lambda_X + \lambda_Y)$.
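A simulation sketch supporting this result, not part of the notes; it assumes NumPy and SciPy, with rates $\lambda_X = 2$ and $\lambda_Y = 3$ chosen arbitrarily, and compares the empirical pmf of X + Y with the Poisson(5) pmf.

import numpy as np
from scipy.stats import poisson

lam_x, lam_y, n_sim = 2.0, 3.0, 1_000_000
rng = np.random.default_rng(2)
s = rng.poisson(lam_x, n_sim) + rng.poisson(lam_y, n_sim)   # X + Y

for n in range(6):
    # empirical P(X + Y = n) vs Poisson(lam_x + lam_y) pmf at n
    print(n, (s == n).mean(), poisson.pmf(n, lam_x + lam_y))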
EXAMPLE Two call centres receive an average of $\lambda_1$ and $\lambda_2$ calls per day, respectively and independently.

X = calls of the 1st call centre
Y = calls of the 2nd call centre

$X \sim \mathrm{Poisson}(\lambda_1)$, $Y \sim \mathrm{Poisson}(\lambda_2)$, X, Y independent.

On a given day n calls are received in the two call centres in total, i.e. X + Y = n. What is the conditional distribution of the calls in centre one on such a day? I.e. X | X + Y = n.

SOLUTION

$P(X = h \mid X + Y = n) = \dfrac{P(X = h, X + Y = n)}{P(X + Y = n)} = \dfrac{P(X = h, Y = n - h)}{P(X + Y = n)} = \dfrac{P(X = h)\, P(Y = n - h)}{P(X + Y = n)}$

$= \dfrac{e^{-\lambda_1}\, \frac{\lambda_1^h}{h!}\; e^{-\lambda_2}\, \frac{\lambda_2^{\,n-h}}{(n-h)!}}{e^{-(\lambda_1 + \lambda_2)}\, \frac{(\lambda_1 + \lambda_2)^n}{n!}} = \dfrac{n!}{h!\,(n-h)!}\, \dfrac{\lambda_1^h\, \lambda_2^{\,n-h}}{(\lambda_1 + \lambda_2)^n} = \binom{n}{h}\, p^h\, (1-p)^{\,n-h}$

with $p = \dfrac{\lambda_1}{\lambda_1 + \lambda_2}$ and $1 - p = \dfrac{\lambda_2}{\lambda_1 + \lambda_2}$.

So $X \mid X + Y = n \sim \mathrm{Binomial}(n, p)$ with $p = \dfrac{\lambda_1}{\lambda_1 + \lambda_2}$.
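A quick empirical check of this conclusion, not from the notes; the sketch assumes NumPy and SciPy with illustrative rates $\lambda_1 = 2$, $\lambda_2 = 3$ and total n = 5.

import numpy as np
from scipy.stats import binom

lam1, lam2, n = 2.0, 3.0, 5
rng = np.random.default_rng(3)
x = rng.poisson(lam1, 2_000_000)
y = rng.poisson(lam2, 2_000_000)
x_given_sum = x[x + y == n]            # keep only days with X + Y = n

p = lam1 / (lam1 + lam2)
for h in range(n + 1):
    # empirical P(X = h | X + Y = n) vs Binomial(n, p) pmf at h
    print(h, (x_given_sum == h).mean(), binom.pmf(h, n, p))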
PROPERTIES OF CONDITIONAL EXPECTATION

Recall that $E[X \mid Y]$ is a r.v., while $E[X \mid Y = y] \in \mathbb{R}$.

① (Independence) If X and Y are independent:
$E[X \mid Y] = E[X]$

② (Measurability) For every $f : \mathbb{R} \to \mathbb{R}$:
$E[f(Y) \mid Y] = f(Y)$, and more generally $E[f(Y)\, X \mid Y] = f(Y)\, E[X \mid Y]$

③ (Law of total expectation)
$E[X] = E[\, E[X \mid Y] \,]$
(outer expectation w.r.t. Y, inner expectation w.r.t. X | Y)
PROOF of the Law of Total Expectation (Optional)

Assume X, Y discrete and X, Y ≥ 0.

$E[X] = \sum_i i\, P(X = i) = \sum_i i \sum_j P(X = i, Y = j) \overset{\text{law of total prob.}}{=} \sum_i i \sum_j P(Y = j)\, P(X = i \mid Y = j)$

$= \sum_j P(Y = j) \sum_i i\, P(X = i \mid Y = j) = \sum_j P(Y = j)\, E[X \mid Y = j] = E[\, E[X \mid Y] \,]$
EXAMPLE $Y \sim \mathrm{Be}(0.5)$. Define X as a 6-sided die if Y = 0, and as a 10-sided die if Y = 1, i.e.:

if Y = 0: $p_X(1) = p_X(2) = \dots = p_X(6) = 1/6$
if Y = 1: $p_X(1) = p_X(2) = \dots = p_X(10) = 1/10$

$E[X] = ?$

$E[X \mid Y = 0] = 3.5$, $E[X \mid Y = 1] = 5.5$

$E[X] = E[\, E[X \mid Y] \,] = p_Y(0)\, E[X \mid Y = 0] + p_Y(1)\, E[X \mid Y = 1] = 0.5 \cdot 3.5 + 0.5 \cdot 5.5 = 4.5$
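A simulation sketch of this example, not in the notes, assuming NumPy: Y ~ Be(0.5) selects a 6-sided or 10-sided die, and the sample mean of X approaches the value 4.5 given by the law of total expectation.

import numpy as np

rng = np.random.default_rng(4)
n_sim = 1_000_000
y = rng.integers(0, 2, size=n_sim)             # Y ~ Be(0.5)
x = np.where(y == 0,
             rng.integers(1, 7, size=n_sim),   # 6-sided die when Y = 0
             rng.integers(1, 11, size=n_sim))  # 10-sided die when Y = 1
print(x.mean())                                # approximately 4.5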
The CONDITIONAL VARIANCE of X given Y is defined:

$\mathrm{Var}(X \mid Y) = E[\,(X - E[X \mid Y])^2 \mid Y\,]$

e.g. $\mathrm{Var}(X \mid Y = y)$ is the "residual variability" of X knowing that Y = y.
LAW OF TOTAL VARIANCE

$\mathrm{Var}(X) = E[\mathrm{Var}(X \mid Y)] + \mathrm{Var}(E[X \mid Y])$

PROOF

By the definition of variance and the law of total expectation:

$\mathrm{Var}(X) = E[(X - E[X])^2] = E[\, E[(X - E[X])^2 \mid Y] \,]$

Inside the conditional expectation write $X - E[X] = (X - E[X \mid Y]) + (E[X \mid Y] - E[X])$ and expand the square into three terms:

$E[(X - E[X \mid Y])^2 \mid Y] = \mathrm{Var}(X \mid Y)$, which contributes $E[\mathrm{Var}(X \mid Y)]$;

$E[(E[X \mid Y] - E[X])^2 \mid Y] = (E[X \mid Y] - E[X])^2$, since $E[X \mid Y] - E[X]$ is constant given Y (measurability), which contributes $\mathrm{Var}(E[X \mid Y])$;

$E[\,2 (X - E[X \mid Y])(E[X \mid Y] - E[X]) \mid Y\,] = 2 (E[X \mid Y] - E[X])\, E[X - E[X \mid Y] \mid Y] = 2 (E[X \mid Y] - E[X])(E[X \mid Y] - E[X \mid Y]) = 0$.

Taking the outer expectation of the three terms gives $\mathrm{Var}(X) = E[\mathrm{Var}(X \mid Y)] + \mathrm{Var}(E[X \mid Y])$.

EXAMPLE ($\mathrm{Var}(X) = \mathrm{Var}(E[X \mid Y]) + E[\mathrm{Var}(X \mid Y)]$)

Consider two independent r.v.s Y, Z such that $Y \sim N(0, \sigma_Y^2)$, $Z \sim N(0, \sigma_Z^2)$, and $X = Y + Z$.

$\mathrm{Var}(X) = \sigma_Y^2 + \sigma_Z^2$

$E[X \mid Y] = E[Y + Z \mid Y] = E[Y \mid Y] + E[Z \mid Y] = Y + E[Z] = Y$ (measurability, independence)

$\mathrm{Var}(E[X \mid Y]) = \mathrm{Var}(Y) = \sigma_Y^2$

$\mathrm{Var}(X \mid Y) = \sigma_Z^2 \;\Rightarrow\; E[\mathrm{Var}(X \mid Y)] = \sigma_Z^2$
[Diagram: samples of X grouped by the value of Y. Within each group the distribution of X | Y is very concentrated (low within-group variability, small $E[\mathrm{Var}(X \mid Y)]$), while the group means are very spread out (high across-groups variability, large $\mathrm{Var}(E[X \mid Y])$).]
EXAMPLE (cash machine)

A bank needs to load a cash machine with enough money. On average $\lambda = 350$ people withdraw money each day. Each person i withdraws independently $X_i$ euro, where:

$p_{X_i}(50) = 0.3$, $p_{X_i}(100) = 0.5$, $p_{X_i}(200) = 0.2$

What are the expectation and variance of the total amount withdrawn in a day?

SOLUTION

N := # customers in a day, $N \sim \mathrm{Poisson}(\lambda)$
T := total amount withdrawn, $T = \sum_{i=1}^{N} X_i$

$E[X_i] = 50 \cdot 0.3 + 100 \cdot 0.5 + 200 \cdot 0.2 = 105$ and $\mathrm{Var}(X_i) = E[X_i^2] - E[X_i]^2 = 13750 - 11025 = 2725$

By the law of total expectation and the definition of T:

$E[T] = E[\, E[T \mid N] \,] = E\big[\, E[\textstyle\sum_{i=1}^{N} X_i \mid N] \,\big] = E\big[\textstyle\sum_{i=1}^{N} E[X_i]\big] = E[N\, E[X_i]] = E[N]\, E[X_i] = 350 \cdot 105 = 36750$

(using the independence of N and the $X_i$)

$\mathrm{Var}(T) = E[\mathrm{Var}(T \mid N)] + \mathrm{Var}(E[T \mid N]) = 953750 + 3858750 = 4812500$

$E[\mathrm{Var}(T \mid N)] = E\big[\textstyle\sum_{i=1}^{N} \mathrm{Var}(X_i)\big] = E[N\, \mathrm{Var}(X_i)] = E[N]\, \mathrm{Var}(X_i) = 350 \cdot 2725 = 953750$

$\mathrm{Var}(E[T \mid N]) = \mathrm{Var}(N\, E[X_i]) = \mathrm{Var}(N)\, E[X_i]^2 = 350 \cdot 105^2 = 3858750$
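A sketch checking these numbers, not part of the notes; it assumes NumPy, the Poisson rate and withdrawal distribution above, and uses both the closed-form identities and a small simulation.

import numpy as np

lam = 350
values = np.array([50, 100, 200])
probs = np.array([0.3, 0.5, 0.2])
mean_x = (values * probs).sum()                 # E[X_i] = 105
var_x = (values**2 * probs).sum() - mean_x**2   # Var(X_i) = 2725

print(lam * mean_x)                    # E[T] = 36750
print(lam * var_x + lam * mean_x**2)   # Var(T) = 4812500

# Small simulation of T = sum of N withdrawals, N ~ Poisson(350).
rng = np.random.default_rng(6)
n = rng.poisson(lam, 20_000)
t = np.array([rng.choice(values, size=k, p=probs).sum() for k in n])
print(t.mean(), t.var())               # close to the values above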
