
LECTURE 5: Discrete random variables: probability mass functions and expectations

• Random variables: the idea and the definition


- Discrete: take values in finite or countable set

• Probability mass function (PMF)

• Random variable examples


- Bernoulli
- Uniform
- Binomial
- Geometric

• Expectation (mean) and its properties


- The expected value rule
- Linearity
Random variables: the idea and the formalism

• A random variable ("r.v.") associates a value (a number) to every possible outcome

• Mathematically: a function from the sample space Ω to the real numbers

• It can take discrete or continuous values

• Notation: random variable X; numerical value x

• We can have several random variables defined on the same sample space

• A function of one or several random variables is also a random variable

  - meaning of X + Y: the r.v. that assigns the value X(ω) + Y(ω) to every outcome ω
Probability mass function (PMF) of a discrete r.v. X

• It is the "probability law" or "probability distribution" of X

• If we fix some x, then "X = x" is an event

    p_X(x) = P(X = x) = P({ω ∈ Ω s.t. X(ω) = x})

• Properties:  p_X(x) ≥ 0;   Σ_x p_X(x) = 1
PMF calculation

• Two rolls of a tetrahedral die; X = First roll, Y = Second roll; Z = X + Y

• Let every possible outcome have probability 1/16

• To find p_Z(z), repeat for all z:
  - collect all possible outcomes for which Z is equal to z
  - add their probabilities

[Figure: the 16 equally likely outcomes (X, Y) on a 4 × 4 grid, and the resulting PMF p_Z(z) plotted for z = 2, ..., 8]
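The two-step recipe above (collect the outcomes with Z = z, add their probabilities) can be sketched in Python; this is a small illustration added here, not part of the original slides:

```python
from fractions import Fraction
from itertools import product

# Two rolls of a fair tetrahedral die: 16 equally likely outcomes (x, y),
# each with probability 1/16. Build the PMF of Z = X + Y by collecting,
# for each value z, the outcomes with x + y == z and adding their probabilities.
p_Z = {}
for x, y in product(range(1, 5), repeat=2):
    z = x + y
    p_Z[z] = p_Z.get(z, Fraction(0)) + Fraction(1, 16)

print(p_Z[4])             # 3/16: outcomes (1,3), (2,2), (3,1)
print(sum(p_Z.values()))  # 1, as every PMF must satisfy
```

Exact fractions are used instead of floats so the normalization check Σ_z p_Z(z) = 1 holds exactly.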
The simplest random variable: Bernoulli with parameter p ∈ [0, 1]

    X = 1, w.p. p
        0, w.p. 1 − p

• Models a trial that results in success/failure, Heads/Tails, etc.

• Indicator r.v. of an event A:  I_A = 1 if A occurs, and I_A = 0 otherwise


Binomial random variable; parameters: positive integer n; p ∈ [0, 1]

• Experiment: n independent tosses of a coin with P(Heads) = p

• Sample space: set of sequences of H and T, of length n

• Random variable X: number of Heads observed

• Model of: number of successes in a given number of independent trials

    p_X(k) = (n choose k) p^k (1 − p)^(n−k),   k = 0, 1, ..., n

[Figure: tree diagram of the n = 3 toss sequences HHH, ..., TTT, with branch probabilities p and 1 − p]
[Figure: binomial PMFs p_X(x) for n = 3, 10, 100, with p = 0.5 (top row) and p = 0.2 or p = 0.1 (bottom row)]
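The binomial PMF formula above is easy to evaluate directly; here is a short Python check (added for illustration, not from the slides):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): choose which k of the n
    independent tosses come up Heads; each such sequence has
    probability p**k * (1 - p)**(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# n = 3 fair tosses: p_X(2) counts the sequences HHT, HTH, THH.
print(binomial_pmf(2, 3, 0.5))   # 0.375 = 3 * (1/2)**3

# The PMF sums to 1 over k = 0, ..., n (up to float roundoff).
print(sum(binomial_pmf(k, 10, 0.3) for k in range(11)))  # approximately 1.0
```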
Geometric random variable; parameter p: 0 < p ≤ 1

• Experiment: infinitely many independent tosses of a coin; P(Heads) = p

• Sample space: set of infinite sequences of H and T

• Random variable X: number of tosses until the first Heads

• Model of: waiting times; number of trials until a success

    p_X(k) = (1 − p)^(k−1) p,   k = 1, 2, ...

[Figure: the PMF p_X(k) for p = 1/3, decaying geometrically in k; the event "no Heads ever" has zero probability]
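A quick numerical sketch of the geometric PMF (an illustration added here, not from the slides), including a check that the probabilities sum to 1, consistent with "no Heads ever" having probability zero:

```python
def geometric_pmf(k, p):
    """P(X = k) for the number of tosses until the first Heads:
    k - 1 Tails followed by one Heads."""
    return (1 - p) ** (k - 1) * p

p = 1 / 3
# P(no Heads in the first m tosses) = (1 - p)**m -> 0 as m grows,
# so the PMF sums to 1; a long partial sum is already very close.
partial = sum(geometric_pmf(k, p) for k in range(1, 200))
print(partial)   # approximately 1.0
```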
Expectation/mean of a random variable

• Motivation: Play a game 1000 times.
  Random gain at each play described by:

    X = 1, w.p. 2/10
        2, w.p. 5/10
        4, w.p. 3/10

• "Average" gain: about 1 · 200 + 2 · 500 + 4 · 300 over 1000 plays,
  i.e., 1 · (2/10) + 2 · (5/10) + 4 · (3/10) = 2.4

• Definition:  E[X] = Σ_x x p_X(x)

• Interpretation: average in a large number of independent repetitions of the experiment

• Caution: If we have an infinite sum, it needs to be well-defined.
  We assume Σ_x |x| p_X(x) < ∞
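The definition is a one-line computation; this Python snippet (added for illustration) evaluates E[X] for the gain described in the motivating game:

```python
# PMF of the gain X in the game above.
pmf = {1: 0.2, 2: 0.5, 4: 0.3}

# Definition: E[X] = sum over x of x * p_X(x).
E_X = sum(x * px for x, px in pmf.items())
print(E_X)   # 2.4
```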
Expectation of a Bernoulli r.v.

    X = 1, w.p. p
        0, w.p. 1 − p

    E[X] = 1 · p + 0 · (1 − p) = p

If X is the indicator of an event A, X = I_A:

    E[I_A] = 1 · P(A) + 0 · (1 − P(A)) = P(A)
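The identity E[I_A] = P(A) can be checked on a concrete finite model. The event "doubles on two rolls of a fair tetrahedral die" used below is an illustrative choice, not taken from the slides:

```python
from fractions import Fraction
from itertools import product

# A = "doubles" in two rolls of a fair tetrahedral die; all 16 outcomes
# equally likely. I_A maps each outcome to 1 if A occurs, else 0.
outcomes = list(product(range(1, 5), repeat=2))
indicator = {w: 1 if w[0] == w[1] else 0 for w in outcomes}

# E[I_A] = 1 * P(A) + 0 * P(not A) = P(A)
E_IA = Fraction(sum(indicator.values()), len(outcomes))
P_A = Fraction(len([w for w in outcomes if w[0] == w[1]]), len(outcomes))
print(E_IA, P_A)   # 1/4 1/4
```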


Expectation of a uniform r.v.

• Uniform on 0, 1, ..., n

    p_X(x) = 1/(n + 1),   x = 0, 1, ..., n

[Figure: flat PMF of height 1/(n + 1) over the values 0, 1, ..., n]

    E[X] = (0 + 1 + ... + n) · 1/(n + 1) = n/2
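As a quick check of E[X] = n/2 (a small illustration added here, using exact fractions):

```python
from fractions import Fraction

def uniform_mean(n):
    """E[X] for X uniform on {0, 1, ..., n}: each value has
    probability 1/(n + 1), so E[X] = (0 + 1 + ... + n)/(n + 1)."""
    return sum(Fraction(x, n + 1) for x in range(n + 1))

print(uniform_mean(4))   # 2, which equals 4/2
print(uniform_mean(7))   # 7/2
```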
Expectation as a population average

• n students

• Weight of ith student: x_i

• Experiment: pick a student at random, all equally likely

• Random variable X: weight of selected student
  - assume the x_i are distinct

    p_X(x_i) = 1/n

    E[X] = Σ_i x_i · (1/n) = (x_1 + ... + x_n)/n
Elementary properties of expectations

• Definition:  E[X] = Σ_x x p_X(x)

• If X ≥ 0, then E[X] ≥ 0

• If a ≤ X ≤ b, then a ≤ E[X] ≤ b

• If c is a constant, E[c] = c


The expected value rule, for calculating E[g(X)]

• Let X be a r.v. and let Y = g(X)

• Averaging over y:  E[Y] = Σ_y y p_Y(y)

• Averaging over x:  E[g(X)] = Σ_x g(x) p_X(x)

• Proof idea: group the terms of Σ_x g(x) p_X(x) according to the value y = g(x);
  for each y, the inner sum Σ_{x : g(x) = y} p_X(x) equals p_Y(y)

• Caution: In general, E[g(X)] ≠ g(E[X])
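Both ways of averaging, and the caution about g(E[X]), can be demonstrated on a small example. The PMF below is an illustrative choice, not taken from the slides:

```python
# PMF of X, symmetric around 0, chosen for illustration.
pmf_X = {-2: 0.25, -1: 0.25, 1: 0.25, 2: 0.25}

def g(x):
    return x * x   # Y = g(X) = X**2

# Averaging over x, via the expected value rule:
E_gX = sum(g(x) * px for x, px in pmf_X.items())

# Averaging over y, by first building the PMF of Y:
pmf_Y = {}
for x, px in pmf_X.items():
    pmf_Y[g(x)] = pmf_Y.get(g(x), 0.0) + px
E_Y = sum(y * py for y, py in pmf_Y.items())

E_X = sum(x * px for x, px in pmf_X.items())
print(E_gX, E_Y)   # 2.5 2.5 -- the two ways of averaging agree
print(g(E_X))      # 0.0 -- but g(E[X]) differs from E[g(X)] here
```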


Linearity of expectation: E[aX + b] = aE[X] + b

• Intuitive

• Derivation, based on the expected value rule, with g(x) = ax + b:

    E[aX + b] = Σ_x (ax + b) p_X(x) = a Σ_x x p_X(x) + b Σ_x p_X(x) = a E[X] + b
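Linearity can be verified numerically; this sketch reuses the gain PMF from the earlier motivating game, with constants a and b chosen arbitrarily for illustration:

```python
# PMF of the gain X from the earlier game.
pmf_X = {1: 0.2, 2: 0.5, 4: 0.3}
a, b = 3, 7   # arbitrary constants, chosen for illustration

E_X = sum(x * px for x, px in pmf_X.items())

# Expected value rule applied to g(x) = a*x + b:
E_aXb = sum((a * x + b) * px for x, px in pmf_X.items())

print(E_aXb)         # approximately 14.2
print(a * E_X + b)   # the same value: a*E[X] + b
```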

MIT OpenCourseWare
https://ocw.mit.edu

Resource: Introduction to Probability


John Tsitsiklis and Patrick Jaillet

The following may not correspond to a particular course on MIT OpenCourseWare, but has been provided by the author as an individual learning resource.

For information about citing these materials or our Terms of Use, visit: https://ocw.mit.edu/terms.
