
Discrete Random Variables; Expectation

18.05 Spring 2014

https://en.wikipedia.org/wiki/Bean_machine#/media/File:Quincunx_(Galton_Box)_-_Galton_1889_diagram.png
http://www.youtube.com/watch?v=9xUBhhM4vbM
Reading Review
Random variable X assigns a number to each outcome:

X : Ω → R

“X = a” denotes the event {ω | X(ω) = a}.

Probability mass function (pmf) of X is given by

p(a) = P(X = a).

Cumulative distribution function (cdf) of X is given by

F(a) = P(X ≤ a).
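As a quick illustration (a minimal sketch, not part of the original slides; the numbers are taken from the figure on the next slide), a pmf table and its running-sum cdf in Python:

    # Sketch: tabulating a pmf and the cdf obtained from it.
    from itertools import accumulate

    values = [1, 3, 5, 7]             # possible values of X
    pmf = [0.5, 0.25, 0.15, 0.1]      # p(a) = P(X = a); entries sum to 1

    cdf = list(accumulate(pmf))       # F(a) = P(X <= a) is the running sum

    for a, p, F in zip(values, pmf, cdf):
        print(f"a = {a}: p(a) = {p:.2f}, F(a) = {F:.2f}")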

CDF and PMF
[Figure: graph of the cdf F(a), a step function rising through .5, .75, .9, 1, and graph of the pmf p(a), with spikes of height .5, .25, .15, .1, for X taking values 1, 3, 5, 7.]
Concept Question: cdf and pmf

Let X be a random variable with:
values of X:  1    3    5    7
cdf F(a):     0.5  0.75 0.9  1

1. What is P(X ≤ 3)?


(a) 0.15 (b) 0.25 (c) 0.5 (d) 0.75

2. What is P(X = 3)?
(a) 0.15 (b) 0.25 (c) 0.5 (d) 0.75

Deluge of discrete distributions
Bernoulli(p) = 1 (success) with probability p,
               0 (failure) with probability 1 − p.

In more neutral language:

Bernoulli(p) = 1 (heads) with probability p,
               0 (tails) with probability 1 − p.

Binomial(n, p) = # of successes in n independent Bernoulli(p) trials.

Geometric(p) = # of tails before the first heads in a sequence of independent Bernoulli(p) trials.

(The neutral language avoids confusion about whether we want the number of successes before the first failure or vice versa.)
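A minimal Python sketch (the helper names bernoulli, binomial, geometric are our own, not from the slides) that builds each distribution directly from raw Bernoulli(p) trials, mirroring the definitions above:

    # Sketch: sampling each distribution from raw Bernoulli trials.
    import random

    def bernoulli(p):
        # 1 (heads) with probability p, 0 (tails) with probability 1 - p
        return 1 if random.random() < p else 0

    def binomial(n, p):
        # number of successes in n independent Bernoulli(p) trials
        return sum(bernoulli(p) for _ in range(n))

    def geometric(p):
        # number of tails before the first heads
        tails = 0
        while bernoulli(p) == 0:
            tails += 1
        return tails

    print(binomial(12, 0.25), geometric(0.25))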

Concept Question

1. Let X ∼ binom(n, p) and Y ∼ binom(m, p) be independent. Then X + Y follows:
(a) binom(n + m, p)   (b) binom(nm, p)
(c) binom(n + m, 2p)  (d) other

2. Let X ∼ binom(n, p) and Z ∼ binom(n, q) be independent. Then X + Z follows:
(a) binom(n, p + q)   (b) binom(n, pq)
(c) binom(2n, p + q)  (d) other
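Before committing to an answer, a quick empirical check is possible; the sketch below (with arbitrary choices of n, m, p, not from the slides) simulates X + Y many times and prints its empirical pmf for comparison with the candidates:

    # Sketch: empirically inspect the distribution of X + Y for question 1.
    import random
    from collections import Counter

    def binom_sample(n, p):
        # number of successes in n Bernoulli(p) trials
        return sum(random.random() < p for _ in range(n))

    n, m, p, trials = 5, 7, 0.3, 100_000
    counts = Counter(binom_sample(n, p) + binom_sample(m, p)
                     for _ in range(trials))
    for k in sorted(counts):
        print(k, counts[k] / trials)   # compare against the candidate pmfs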

Board Question: Find the pmf

X = # of successes before the second failure of a sequence of independent Bernoulli(p) trials.

Describe the pmf of X.

Hint: this requires some counting.
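A simulation can sanity-check whatever formula the counting produces; this sketch (the helper sample_X and the choice p = 0.6 are ours, not from the slides) estimates the pmf empirically:

    # Sketch: simulate X to sanity-check a proposed pmf formula.
    import random
    from collections import Counter

    def sample_X(p):
        # count successes until the second failure occurs
        successes, failures = 0, 0
        while failures < 2:
            if random.random() < p:
                successes += 1
            else:
                failures += 1
        return successes

    p, trials = 0.6, 100_000
    freq = Counter(sample_X(p) for _ in range(trials))
    for k in sorted(freq)[:8]:
        print(k, freq[k] / trials)   # compare with your formula for P(X = k)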

Dice simulation: geometric(1/4)

Roll the 4-sided die repeatedly until you roll a 1.

Click in X = # of rolls BEFORE the 1.

(If X is 9 or more click 9.)

Example: If you roll (3, 4, 2, 3, 1) then click in 4.

Example: If you roll (1) then click 0.
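For reference, a Python sketch of the same experiment (the helper name rolls_before_one is ours):

    # Sketch: the in-class dice experiment, X ~ geometric(1/4).
    import random

    def rolls_before_one():
        # roll a fair 4-sided die until a 1 appears; count rolls before it
        count = 0
        while random.randint(1, 4) != 1:
            count += 1
        return count

    samples = [rolls_before_one() for _ in range(10_000)]
    print(sum(samples) / len(samples))   # sample mean; compare to (1 - p)/p = 3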

Fiction

Gambler’s fallacy: [roulette] if black comes up several times in a row, then the next spin is more likely to be red.

Hot hand: NBA players get ‘hot’.

Fact

P(red) remains the same. The roulette wheel has no memory. (Monte Carlo, 1913.)

The data show that a player who has made 5 shots in a row is no more likely than usual to make the next shot. (Currently, there seems to be some disagreement about this.)

Amnesia

Show that Geometric(p) is memoryless, i.e.,

P(X = n + k | X ≥ n) = P(X = k).

Explain why we call this memoryless.
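For reference, a sketch of the computation, using P(X = k) = (1 − p)^k p (k tails, then heads) and P(X ≥ n) = (1 − p)^n (the first n trials are all tails), and noting that for k ≥ 0 the event {X = n + k} lies inside {X ≥ n}:

P(X = n + k | X ≥ n) = P(X = n + k) / P(X ≥ n)
                     = (1 − p)^(n+k) p / (1 − p)^n
                     = (1 − p)^k p
                     = P(X = k).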

Expected Value
Suppose X is a random variable that takes values x1, x2, . . . , xn.
The expected value of X is defined by

E(X) = p(x1)x1 + p(x2)x2 + . . . + p(xn)xn = ∑_{i=1}^{n} p(xi)xi

It is a weighted average.
It is a measure of central tendency.

Properties of E(X)
E(X + Y) = E(X) + E(Y) (linearity I)
E(aX + b) = aE(X) + b (linearity II)
E(h(X)) = ∑_{i} h(xi)p(xi)

Examples
Example 1. Find E(X).
1. X:   3   4   5   6
2. pmf: 1/4 1/2 1/8 1/8
3. E(X) = 3/4 + 4/2 + 5/8 + 6/8 = 33/8

Example 2. Suppose X ∼ Bernoulli(p). Find E(X).
1. X:   0      1
2. pmf: 1 − p  p
3. E(X) = (1 − p) · 0 + p · 1 = p.

Example 3. Suppose X ∼ Binomial(12, .25). Find E(X).
X = X1 + X2 + . . . + X12, where Xi ∼ Bernoulli(.25). Therefore

E(X) = E(X1) + E(X2) + . . . + E(X12) = 12 · (.25) = 3

In general, if X ∼ Binomial(n, p) then E(X) = np.
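The examples above translate directly into a short Python check (the helper name expectation is our own):

    # Sketch: E(X) from a pmf table, plus an empirical check of E(X) = np.
    import random

    def expectation(values, pmf):
        # E(X) = p(x1)x1 + ... + p(xn)xn
        return sum(p * x for x, p in zip(values, pmf))

    print(expectation([3, 4, 5, 6], [1/4, 1/2, 1/8, 1/8]))   # 33/8 = 4.125

    trials = 100_000
    mean = sum(sum(random.random() < 0.25 for _ in range(12))
               for _ in range(trials)) / trials
    print(mean)   # Binomial(12, .25): sample mean should be close to np = 3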

Board Question: Interpreting Expectation

(a) Would you accept a gamble that offers a 10% chance to win $95 and a 90% chance of losing $5?

(b) Would you pay $5 to participate in a lottery that offers a 10% chance to win $100 and a 90% chance to win nothing?

• Find the expected value of your change in assets in each case.
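For reference, the arithmetic (not worked on the slide):

(a) E = 0.1 · 95 + 0.9 · (−5) = 9.5 − 4.5 = $5
(b) E = 0.1 · (100 − 5) + 0.9 · (0 − 5) = 9.5 − 4.5 = $5

The expected change in assets is $5 in both cases; the two gambles are in fact the same distribution framed differently.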

Board Question

Suppose (hypothetically!) that everyone at your table got up, ran around the room, and sat back down randomly (i.e., all seating arrangements are equally likely).

What is the expected value of the number of people sitting in their original seat?

(We will explore this with simulations in Friday Studio.)
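A Python sketch of such a simulation (the helper fixed_points and the choices n = 8 and 100,000 trials are ours):

    # Sketch: simulate random reseating and count fixed points.
    import random

    def fixed_points(n):
        # shuffle seats 0..n-1 and count people back in their original seat
        seats = list(range(n))
        random.shuffle(seats)
        return sum(i == s for i, s in enumerate(seats))

    n, trials = 8, 100_000
    avg = sum(fixed_points(n) for _ in range(trials)) / trials
    print(avg)   # compare the sample mean with your answer from the board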

MIT OpenCourseWare
https://ocw.mit.edu

18.05 Introduction to Probability and Statistics


Spring 2014

For information about citing these materials or our Terms of Use, visit https://ocw.mit.edu/terms.
