MIT 6.041SC (Fall 2013), Lecture 7: Multiple Random Variables

This document summarizes key concepts from a lecture on multiple random variables:
• Joint PMFs, conditioning, and independence of random variables
• Expectations, and expectations of functions of random variables
• The binomial distribution and its mean and variance
• An example problem: the number of people who pick their own hat from a box of randomly mixed hats

LECTURE 7

• Readings: Finish Chapter 2

Lecture outline

• Multiple random variables
  – Joint PMF
  – Conditioning
  – Independence
• More on expectations
• Binomial distribution revisited
• A hat problem

Review

p_X(x) = P(X = x)

p_{X,Y}(x, y) = P(X = x, Y = y)

p_{X|Y}(x | y) = P(X = x | Y = y)

p_X(x) = \sum_y p_{X,Y}(x, y)

p_{X,Y}(x, y) = p_X(x) \, p_{Y|X}(y | x)
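These review relations can be checked mechanically. Below is a minimal Python sketch, assuming a toy joint PMF stored as a dict; the numeric values are hypothetical, chosen only so the entries sum to 1.

from collections import defaultdict

# Hypothetical joint PMF p_{X,Y}(x, y), stored as {(x, y): probability}
p_XY = {
    (1, 1): 0.10, (1, 2): 0.20,
    (2, 1): 0.30, (2, 2): 0.40,
}

# Marginal: p_X(x) = sum_y p_{X,Y}(x, y)
p_X = defaultdict(float)
for (x, y), p in p_XY.items():
    p_X[x] += p

# Conditional: p_{Y|X}(y | x) = p_{X,Y}(x, y) / p_X(x)
p_Y_given_X = {(y, x): p / p_X[x] for (x, y), p in p_XY.items()}

# Multiplication rule check: p_{X,Y}(x, y) = p_X(x) * p_{Y|X}(y | x)
for (x, y), p in p_XY.items():
    assert abs(p - p_X[x] * p_Y_given_X[(y, x)]) < 1e-12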

Independent random variables

p_{X,Y,Z}(x, y, z) = p_X(x) \, p_{Y|X}(y | x) \, p_{Z|X,Y}(z | x, y)

• Random variables X, Y, Z are independent if:
  p_{X,Y,Z}(x, y, z) = p_X(x) · p_Y(y) · p_Z(z)  for all x, y, z

[Figure: joint PMF table p_{X,Y}(x, y) for x, y ∈ {1, 2, 3, 4}, entries in 20ths. Row y = 4: 1/20, 2/20, 2/20; row y = 3: 2/20, 4/20, 1/20, 2/20; row y = 2: 1/20, 3/20, 1/20; row y = 1: 1/20; remaining cells are 0.]

• Independent?

• What if we condition on X ≤ 2 and Y ≥ 3?

Expectations

E[X] = \sum_x x \, p_X(x)

E[g(X, Y)] = \sum_x \sum_y g(x, y) \, p_{X,Y}(x, y)

• In general: E[g(X, Y)] ≠ g(E[X], E[Y])

• E[αX + β] = α E[X] + β

• E[X + Y + Z] = E[X] + E[Y] + E[Z]

• If X, Y are independent:
  – E[XY] = E[X] E[Y]
  – E[g(X)h(Y)] = E[g(X)] · E[h(Y)]
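The independence test and the conditioning step can both be made concrete in a few lines. This is a minimal Python sketch; the joint PMF values are hypothetical placeholders (the exact cell positions of the slide's table are not fully recoverable from this copy).

from collections import defaultdict

# Hypothetical joint PMF; cells absent from the dict have probability 0
p_XY = {
    (1, 3): 0.10, (1, 4): 0.15,
    (2, 3): 0.20, (2, 4): 0.05,
    (3, 3): 0.25, (3, 4): 0.25,
}

# Marginals p_X and p_Y
p_X, p_Y = defaultdict(float), defaultdict(float)
for (x, y), p in p_XY.items():
    p_X[x] += p
    p_Y[y] += p

# X, Y independent iff p_{X,Y}(x, y) = p_X(x) p_Y(y) for all x, y
# (every (x, y) pair with nonzero marginals appears in the dict above)
independent = all(abs(p - p_X[x] * p_Y[y]) < 1e-12
                  for (x, y), p in p_XY.items())

# Conditioning on A = {X <= 2, Y >= 3}: keep cells in A, renormalize by P(A)
A = {(x, y): p for (x, y), p in p_XY.items() if x <= 2 and y >= 3}
P_A = sum(A.values())
p_XY_given_A = {xy: p / P_A for xy, p in A.items()}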

Variances

• Var(aX) = a^2 Var(X)

• Var(X + a) = Var(X)

• Let Z = X + Y. If X, Y are independent:
  Var(X + Y) = Var(X) + Var(Y)

• Examples:
  – If X = Y: Var(X + Y) = Var(2X) = 4 Var(X)
  – If X = −Y: Var(X + Y) = Var(0) = 0
  – If X, Y indep., and Z = X − 3Y:
    Var(Z) = Var(X) + 9 Var(Y)

Binomial mean and variance

• X = # of successes in n independent trials
  – probability of success p

E[X] = \sum_{k=0}^{n} k \binom{n}{k} p^k (1 − p)^{n−k}

• X_i = 1 if success in trial i, 0 otherwise

• E[X_i] = 1 · p + 0 · (1 − p) = p

• E[X] = E[X_1 + · · · + X_n] = np

• Var(X_i) = E[X_i^2] − (E[X_i])^2 = p − p^2 = p(1 − p)

• Var(X) = \sum_i Var(X_i) = np(1 − p), since the X_i are independent
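The closed forms E[X] = np and Var(X) = np(1 − p) can be verified numerically against the defining sums. A minimal sketch; the parameters n = 10 and p = 0.3 are arbitrary choices for illustration.

from math import comb

# Binomial PMF computed from the definition
n, p = 10, 0.3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# Mean and variance from the defining sums
mean = sum(k * pmf[k] for k in range(n + 1))
var = sum((k - mean) ** 2 * pmf[k] for k in range(n + 1))

assert abs(mean - n * p) < 1e-9            # E[X] = np
assert abs(var - n * p * (1 - p)) < 1e-9   # Var(X) = np(1 - p)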

The hat problem

• n people throw their hats in a box and then pick one at random.
  – X: number of people who get their own hat
  – Find E[X]

• X_i = 1 if i selects own hat, 0 otherwise

• X = X_1 + X_2 + · · · + X_n

• P(X_i = 1) = 1/n

• E[X_i] = 1/n

• Are the X_i independent? No: e.g., if X_1 = · · · = X_{n−1} = 1, then X_n = 1 as well.

• E[X] = n · (1/n) = 1

Variance in the hat problem

• Var(X) = E[X^2] − (E[X])^2 = E[X^2] − 1

X^2 = \sum_i X_i^2 + \sum_{i,j: i ≠ j} X_i X_j

• E[X_i^2] = P(X_i = 1) = 1/n, since X_i^2 = X_i

P(X_1 X_2 = 1) = P(X_1 = 1) · P(X_2 = 1 | X_1 = 1)
               = (1/n) · (1/(n − 1))

• E[X^2] = n · (1/n) + n(n − 1) · (1/(n(n − 1))) = 2

• Var(X) = 2 − 1 = 1
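The striking conclusion that E[X] = 1 and Var(X) = 1 for every n is easy to check by simulation. A minimal Monte Carlo sketch; n = 10 and the trial count are arbitrary choices, and both estimates should come out close to 1 for any n.

import random

# Each trial: shuffle n hats and count fixed points (people who get
# their own hat back)
n, num_trials = 10, 100_000
samples = []
for _ in range(num_trials):
    hats = list(range(n))
    random.shuffle(hats)
    samples.append(sum(i == h for i, h in enumerate(hats)))

# Sample mean and variance of X; both should be close to 1
mean = sum(samples) / num_trials
var = sum((x - mean) ** 2 for x in samples) / num_trials
print(mean, var)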

MIT OpenCourseWare
http://ocw.mit.edu

6.041SC Probabilistic Systems Analysis and Applied Probability


Fall 2013

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
