Intro To Discrete Random Variables

This document provides an overview of probability and statistics concepts related to random variables. It defines discrete random variables and their probability mass functions and cumulative distribution functions. It discusses how to calculate the expected value and variance of random variables. Several examples are provided to illustrate these concepts and how to solve probability problems involving discrete random variables.


Probability and Statistics (MATH F113)

Pradeep Boggarapu

Department of Mathematics
BITS PILANI K K Birla Goa Campus, Goa

January 25, 2024

Random Variables. Discrete Random Variables

Outline

1 Definitions of random variable and discrete random variable.
2 Density function and cumulative distribution of a RV.
3 Expectation and distribution parameters (variance, standard
deviation, moments and moment generating function).

Random Variables

Definition 0.1 (Random Variable).


A random variable is a real-valued function defined on a sample
space S. We use uppercase letters to denote a random variable
and lowercase letters to denote the numerical values taken by
the random variable (rv).

Example 1. Suppose that our experiment consists of tossing 3 fair
coins. If we let Y denote the number of heads that appear, then Y
is a random variable taking one of the values 0, 1, 2, and 3.

Examples for RV

Example 2. Consider the experiment of throwing two dice. Let X
denote the sum of the numbers shown by the dice. Then X is a
random variable which takes the values 2, 3, 4, . . . , 12.

Notation: For any I ⊂ R, we define
P[X ∈ I] := P[{s ∈ S : X(s) ∈ I}], the probability that
the rv X takes a value in I ⊂ R.

Example 3. Three balls are to be randomly selected without
replacement from an urn containing 20 balls numbered 1 through 20.
If we bet that at least one of the balls that are drawn has a number
as large as or larger than 17, what is the probability that we win
the bet?
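A quick way to settle Example 3 is to count the complement: we lose only when all three drawn balls are numbered 16 or lower. The sketch below is an illustrative check using only Python's standard library; it computes the exact answer and confirms it by brute-force enumeration.

```python
from math import comb
from itertools import combinations

# P(win) = 1 - P(all three balls are numbered 16 or lower)
p_win = 1 - comb(16, 3) / comb(20, 3)

# Brute-force check over all equally likely 3-ball draws from balls 1..20.
draws = list(combinations(range(1, 21), 3))
p_win_enum = sum(1 for d in draws if max(d) >= 17) / len(draws)

print(p_win, p_win_enum)   # both approximately 0.5088 (= 29/57)
```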
Discrete random variable

Definition 0.2 (Discrete random variable).


A random variable is discrete if it can assume at most a finite or
countably infinite number of possible values.

The random variables in the examples discussed so far are discrete
random variables.

Definition 0.3 (Probability density function or mass function).
The probability distribution or probability mass function (pmf) of a
discrete random variable is defined for every x by

p(x) = P(X = x) = P({s ∈ S : X (s) = x}),

where S is the sample space.

Probability density function

Remark 0.4.
A real-valued function p(x) is a probability mass function
for a discrete random variable if and only if
1 p(x) ≥ 0 for all x,
2 ∑_{all x} p(x) = 1.

Example 4. Write down the probability mass functions and verify
the above remark for the random variables defined in Example 1
and Example 2.
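Both pmfs can be tabulated by direct enumeration. The following sketch (an illustration, not part of the original slides) builds them with exact fractions and checks the two conditions of Remark 0.4.

```python
from fractions import Fraction
from itertools import product

# Example 1: Y = number of heads in 3 fair coin tosses.
pmf_Y = {}
for outcome in product('HT', repeat=3):
    y = outcome.count('H')
    pmf_Y[y] = pmf_Y.get(y, Fraction(0)) + Fraction(1, 8)

# Example 2: X = sum of the numbers shown by two fair dice.
pmf_X = {}
for d1, d2 in product(range(1, 7), repeat=2):
    pmf_X[d1 + d2] = pmf_X.get(d1 + d2, Fraction(0)) + Fraction(1, 36)

# Remark 0.4: every p(x) >= 0 and the probabilities sum to 1.
for pmf in (pmf_Y, pmf_X):
    assert all(p >= 0 for p in pmf.values())
    assert sum(pmf.values()) == 1

print(pmf_Y)   # probabilities 1/8, 3/8, 3/8, 1/8 for y = 0, 1, 2, 3
```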

Cumulative distribution function- Discrete

Definition 0.5 (Cumulative distribution function (cdf)).
Let X be a discrete random variable with density (pmf) p.
The cumulative distribution function (cdf) for X, denoted by F,
is defined by

F(x) = P[X ≤ x] = ∑_{a ≤ x} p(a)   for x real.

Example 5. Find the cdf for the random variables defined in
Example 1 and Example 2.
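The cdf is just a running sum of the pmf. A minimal sketch for the random variable of Example 1 (illustrative only):

```python
from fractions import Fraction

def cdf(pmf, x):
    # F(x) = P[X <= x]: sum of p(a) over all support points a <= x
    return sum(p for a, p in pmf.items() if a <= x)

# pmf of Y = number of heads in 3 fair coin tosses (from Example 1)
pmf_Y = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
print([cdf(pmf_Y, x) for x in range(4)])   # F(0..3) = 1/8, 1/2, 7/8, 1
```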

Problems

Example 6. Five distinct numbers are randomly distributed to
players numbered 1 through 5. Whenever two players compare their
numbers, the one with the higher number is declared the winner.
Initially, players 1 and 2 compare their numbers; the winner then
compares with player 3, and so on. Let X denote the number of
times player 1 is a winner. Find P(X = i) for i = 0, 1, 2, 3, 4.
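Only the relative order of the five numbers matters, so the pmf can be obtained by enumerating all 5! = 120 rankings. The sketch below is a hypothetical check of that reasoning.

```python
from fractions import Fraction
from itertools import permutations
from math import factorial

def wins_of_player1(numbers):
    # numbers[i] is the value held by player i+1; the larger value wins a comparison.
    wins, champion = 0, 0              # player 1 (index 0) is in the first comparison
    for challenger in range(1, 5):     # players 2, 3, 4, 5 challenge in turn
        if numbers[challenger] > numbers[champion]:
            champion = challenger
        if champion == 0:
            wins += 1                  # player 1 won this comparison
        else:
            break                      # player 1 has lost and never plays again
    return wins

counts = {i: 0 for i in range(5)}
for ranking in permutations(range(1, 6)):
    counts[wins_of_player1(ranking)] += 1

pmf = {i: Fraction(c, factorial(5)) for i, c in counts.items()}
print(pmf)   # probabilities 1/2, 1/6, 1/12, 1/20, 1/5 for i = 0, 1, 2, 3, 4
```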

Expectation of a random variable

Definition 0.6 (Expected value of a random variable X).
Let X be a discrete random variable with density function f(x).
The expectation or expected value of X, denoted by E[X], is
defined by

E[X] = ∑_{all x} x f(x).

Example 7. Find E[X] where X is the outcome when we roll a
fair die.
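Here the definition gives E[X] = (1 + 2 + · · · + 6)/6 = 7/2. A one-line check (illustrative only):

```python
from fractions import Fraction

print(sum(x * Fraction(1, 6) for x in range(1, 7)))   # 7/2
```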

Expectation of a random variable

Note that the expected value is also known as the mean, and sometimes
we use ‘µ’ to denote the expectation or expected value or mean.

Let X be a discrete random variable with density function (pmf)
f(x), and let H(X) be a real-valued function of X. Then H(X) is a
random variable and its expectation is given by

E[H(X)] = ∑_{all x} H(x) f(x).

Example 8. Three balls are randomly chosen from an urn containing
3 white, 3 red, and 5 black balls. Suppose that we win $1 for each
white ball selected and lose $1 for each red ball selected. Find the
expected value of our total winnings.
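Every 3-subset of the 11 balls is equally likely, so the expectation can be computed by direct enumeration; by the white/red symmetry it comes out to 0. A hypothetical sketch:

```python
from fractions import Fraction
from itertools import combinations

balls = ['W'] * 3 + ['R'] * 3 + ['B'] * 5            # 3 white, 3 red, 5 black
draws = list(combinations(range(len(balls)), 3))     # all equally likely 3-ball draws

expected = Fraction(0)
for draw in draws:
    colors = [balls[i] for i in draw]
    winnings = colors.count('W') - colors.count('R')  # +$1 per white, -$1 per red
    expected += Fraction(winnings, len(draws))

print(expected)   # 0, as the white/red symmetry suggests
```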

Problems

Example 9. A rv X has the following probability function:

x    : 0   1   2    3    4    5    6     7
p(x) : 0   k   2k   2k   3k   k²   2k²   7k² + k

(i) Find k, (ii) evaluate P(X < 6), P(X ≥ 6), and
P(0 < X < 5), (iii) if P(X ≤ a) > 1/2, find the minimum
value of a, (iv) determine the cumulative distribution
function of X, and (v) find E(X) and E(X² + 1).
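For part (i), the probabilities must sum to 1, which gives 10k² + 9k − 1 = 0, and hence k = 1/10 (the other root, k = −1, is inadmissible). The remaining parts follow from the table; the sketch below, offered as an illustrative check, evaluates them with exact fractions.

```python
from fractions import Fraction

k = Fraction(1, 10)                          # admissible root of 10k^2 + 9k - 1 = 0
pmf = {0: 0 * k, 1: k, 2: 2 * k, 3: 2 * k, 4: 3 * k,
       5: k ** 2, 6: 2 * k ** 2, 7: 7 * k ** 2 + k}
assert sum(pmf.values()) == 1

print(sum(p for x, p in pmf.items() if x < 6))        # P(X < 6)     = 81/100
print(sum(p for x, p in pmf.items() if x >= 6))       # P(X >= 6)    = 19/100
print(sum(p for x, p in pmf.items() if 0 < x < 5))    # P(0 < X < 5) = 4/5
print(min(a for a in pmf
          if sum(p for x, p in pmf.items() if x <= a) > Fraction(1, 2)))   # minimum a = 4
print(sum(x * p for x, p in pmf.items()))             # E(X)
print(sum((x ** 2 + 1) * p for x, p in pmf.items()))  # E(X^2 + 1)
```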

Problems

Example 10. Independent trials consisting of the flipping of a coin
having probability p of coming up heads are continually performed
until either a head occurs or a total of n flips is made. If we let X
denote the number of times the coin is flipped, then find P(X = x).
Example 11. A shipment of 8 similar microcomputers to a retail
outlet contains 3 that are defective. If a school makes a random
purchase of 2 of these computers, find the probability distribution for
the number of defectives.
Example 12. A couple decides to have 3 children. If none of the 3 is a
girl, they will try again; and if they still don’t get a girl, they will try
once more. If the random variable X denotes the number of children
the couple will have following this scheme, then what is the expected
value of X ?
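In Example 12 the only possible values of X are 3, 4 and 5: the couple stops at 3 children if at least one of them is a girl, has 4 if only the fourth child is the first girl, and has 5 if the first four are all boys. Assuming each child is independently a girl with probability 1/2, the mean is 51/16 (illustrative sketch):

```python
from fractions import Fraction

# P(X=3) = 1 - (1/2)^3 = 7/8, P(X=4) = (1/2)^3 * (1/2) = 1/16, P(X=5) = (1/2)^4 = 1/16
pmf = {3: Fraction(7, 8), 4: Fraction(1, 16), 5: Fraction(1, 16)}
assert sum(pmf.values()) == 1
print(sum(x * p for x, p in pmf.items()))   # 51/16
```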

Problems

Example 13. A man with n keys wants to open his door and tries the
keys independently and at random. Find the mean of the number of
trials required to open the door,
(i) if unsuccessful keys are not eliminated from further selection, and
(ii) if they are.
Example 14. A lot of 8 TV sets includes 3 that are defective. If 4 of
the sets are chosen at random for shipment to a hotel, how many
defective sets can they expect?
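Example 14 follows the hypergeometric pattern of Example 11: with 3 defective sets among 8 and a sample of 4, P(X = x) = C(3, x) C(5, 4 − x) / C(8, 4). A small illustrative sketch tabulates the pmf and the expected number of defectives:

```python
from fractions import Fraction
from math import comb

# Hypergeometric pmf: P(X = x) = C(3, x) C(5, 4 - x) / C(8, 4), x = 0, 1, 2, 3
pmf = {x: Fraction(comb(3, x) * comb(5, 4 - x), comb(8, 4)) for x in range(4)}
assert sum(pmf.values()) == 1
print(sum(x * p for x, p in pmf.items()))   # 3/2, matching 4 * (3/8)
```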

Variance and Standard deviation of random variable

Definition 0.7 (Variance and standard deviation).
Let X be a discrete random variable with mean µ.
1 The variance of X, denoted by Var[X] or σ², is defined by

Var[X] = σ² = E[(X − µ)²].

2 The standard deviation of X, denoted by σ, is defined by

σ = √Var[X].

Rules for expectations

Theorem 0.8.
Let X and Y be two discrete random variables and c be
any real number.
1 E [c] = c
2 E [cX ] = cE [X ]
3 E [X + Y ] = E [X ] + E [Y ].

Corollary 0.9.

Var[X] = E[X²] − (E[X])².
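As a concrete check of Definition 0.7 and Corollary 0.9, take X to be the outcome of a fair die (Example 7). The sketch below (illustrative only) computes Var[X] both from the definition and as E[X²] − (E[X])²; the two agree at 35/12.

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}        # fair die
mu = sum(x * p for x, p in pmf.items())               # E[X] = 7/2
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())
var_alt = sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2
assert var_def == var_alt == Fraction(35, 12)
print(mu, var_def)   # 7/2 35/12
```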

Independence of Random Variables

Let’s consider two discrete random variables, X and Y . They are said
to be independent if the probability of any particular value taken by X
is not influenced by any value taken by Y , and vice versa.
Mathematically, for independent random variables X and Y :

P(X = x and Y = y ) = P(X = x)P(Y = y ).

Example 15. Consider two six-sided dice, and let X be the outcome
of the first die and Y be the outcome of the second die. The events
of rolling the first die and rolling the second die are independent,
assuming the dice are fair and not biased. Then it can be easily seen
that X and Y are independent random variables.
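Example 15 can be verified directly from the definition: under the 36 equally likely outcomes, the joint probability of every pair (x, y) factors into the two marginals. A minimal sketch:

```python
from fractions import Fraction
from itertools import product

joint = {(x, y): Fraction(1, 36) for x, y in product(range(1, 7), repeat=2)}
marginal = {x: Fraction(1, 6) for x in range(1, 7)}   # same for X and for Y

# Independence: P(X = x and Y = y) = P(X = x) P(Y = y) for every pair (x, y).
assert all(joint[x, y] == marginal[x] * marginal[y] for x, y in joint)
print("X and Y are independent")
```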

Rules for variance

Theorem 0.10.
Let X and Y be two discrete random variables and c be
any real number.
1 Var[c] = 0
2 Var[cX] = c² Var[X]
3 Var[X + Y] = Var[X] + Var[Y], provided X and Y
are independent random variables.

Problem

Example 16. A fair die is tossed. Let X be the random variable
denoting ‘twice the number appearing’ and let Y be the random
variable that takes the value 1 or 3 according as an odd or an even
number appears. Then find the pmf, expectation and variance for
the random variables X, Y, Z = X + Y and W = X · Y. Also,
verify the following:
1 E[Z] = E[X] + E[Y].
2 Var[Z] ≠ Var[X] + Var[Y].
3 E[W] ≠ E[X] · E[Y].
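Since X, Y, Z and W are all functions of the same die toss, their moments can be obtained by summing over the six faces. The sketch below (illustrative, not the slides' worked solution) confirms the three claims; in particular E[Z] = 9, Var[Z] = 44/3 ≠ Var[X] + Var[Y] = 38/3, and E[W] = 15 ≠ E[X]E[Y] = 14.

```python
from fractions import Fraction

faces = range(1, 7)
prob = Fraction(1, 6)

X = lambda n: 2 * n                       # twice the number appearing
Y = lambda n: 1 if n % 2 == 1 else 3      # 1 for an odd face, 3 for an even face
Z = lambda n: X(n) + Y(n)
W = lambda n: X(n) * Y(n)

def E(g):
    return sum(g(n) * prob for n in faces)

def Var(g):
    m = E(g)
    return E(lambda n: (g(n) - m) ** 2)

print(E(Z) == E(X) + E(Y))                # True
print(Var(Z) != Var(X) + Var(Y))          # True: X and Y come from the same toss
print(E(W) != E(X) * E(Y))                # True
```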

Moments and moment generating function (mgf)

Definition 0.11 (Moments and mgf).


Let X be a discrete random variable with density function f(x).
1 The kth moment of X is defined as E[X^k].
2 The kth central moment of X is defined as E[(X − µ)^k].
3 The moment generating function for X is denoted by m_X(t) and
is defined by

m_X(t) = E[e^{tX}],

provided this expectation is finite for all real numbers t in some
open interval (−h, h).

Moments and moment generating function (mgf)

Example 17. Two balls are randomly chosen from an urn containing
2 white, 2 red, and 4 black balls. Suppose that we win Rs. 1 for each
white ball selected and lose Rs. 1 for each red ball selected. If we let
X denote our total winnings from the experiment, then find the first
and second moments of X and the mgf of X.
Example 18. Let X be a rv with pmf

p(x) = (1/9)(8/9)^x for x = 0, 1, 2, . . . , and p(x) = 0 otherwise.

What is the mgf of the rv X?

ANS: m_X(t) = 1/(9 − 8e^t), if t < ln(9/8).
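The closed form in Example 18 comes from the geometric series: m_X(t) = ∑_{x ≥ 0} e^{tx} (1/9)(8/9)^x = (1/9) / (1 − (8/9)e^t) = 1/(9 − 8e^t), valid when (8/9)e^t < 1, i.e. t < ln(9/8). A numerical sanity check (illustrative):

```python
from math import exp

def mgf_partial(t, terms=2000):
    # direct partial sum of E[e^{tX}] = sum over x of e^{tx} * (1/9) * (8/9)^x
    return sum(exp(t * x) * (1 / 9) * (8 / 9) ** x for x in range(terms))

t = 0.05                                    # any t below ln(9/8) ~ 0.1178
closed_form = 1 / (9 - 8 * exp(t))
print(abs(mgf_partial(t) - closed_form) < 1e-9)   # True
```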

Moments and moment generating function (mgf)

Theorem 0.12 (moments using mgf).


If m_X(t) is the moment generating function for a random
variable X, then the kth moment of X is given by

E[X^k] = (d^k/dt^k) m_X(t) |_{t=0}.

Example 19. Let M(t) be the mgf of the rv X.
If M(t) = a_0 + a_1 t + a_2 t² + · · · + a_n t^n + · · · is the Taylor
series expansion of M(t) about t = 0, then E(X^n) = n! a_n
for all natural numbers n.
Examples

Example 20. Let the random variable X have moment generating
function M_X(t) = (1 − t)^{−2} for t < 1. What is the third moment
of X about the origin? ANS: 24.

Example 21. What is the 479th moment of X about the origin, if
the mgf of X is 1/(1 + t)?
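Both answers can be read off Theorem 0.12 by differentiating the mgf, or from the Taylor coefficients as in Example 19: (1 − t)^{−2} = ∑ (n + 1) t^n gives E[X³] = 3! · 4 = 24, and 1/(1 + t) = ∑ (−1)^n t^n gives E[X^479] = 479! · (−1)^479 = −479!. Assuming SymPy is available, a symbolic check looks like this (illustrative sketch):

```python
import sympy as sp

t = sp.symbols('t')

# Example 20: M(t) = (1 - t)^(-2); the third moment is M'''(0).
print(sp.diff((1 - t) ** -2, t, 3).subs(t, 0))            # 24

# Example 21: M(t) = 1/(1 + t); the 479th moment is the 479th derivative at 0.
m479 = sp.diff(1 / (1 + t), t, 479).subs(t, 0)
print(m479 == -sp.factorial(479))                         # True
```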

Properties of mgf

Theorem 1.
Let X be a rv with mgf M_X(t). If a and b are any two
real constants, then
1. M_{X+a}(t) = e^{at} M_X(t)
2. M_{bX}(t) = M_X(bt)
3. The mgf of the sum of a number of independent
random variables is equal to the product of their
respective mgfs, i.e.
M_{X_1 + X_2 + ··· + X_n}(t) = M_{X_1}(t) · · · M_{X_n}(t).
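These properties are easy to confirm numerically for any concrete discrete distribution. The sketch below uses a small made-up pmf (purely illustrative) to check the shift rule and the independent-sum rule:

```python
from math import exp
from itertools import product

pmf = {0: 0.2, 1: 0.5, 2: 0.3}            # hypothetical pmf for the check

def mgf(pmf, t):
    return sum(p * exp(t * x) for x, p in pmf.items())

t, a = 0.3, 2.0

# Rule 1: M_{X+a}(t) = e^{at} M_X(t)
shifted = {x + a: p for x, p in pmf.items()}
print(abs(mgf(shifted, t) - exp(a * t) * mgf(pmf, t)) < 1e-12)   # True

# Rule 3 for two independent copies X1, X2 with this pmf:
sum_pmf = {}
for (x1, p1), (x2, p2) in product(pmf.items(), repeat=2):
    sum_pmf[x1 + x2] = sum_pmf.get(x1 + x2, 0.0) + p1 * p2
print(abs(mgf(sum_pmf, t) - mgf(pmf, t) ** 2) < 1e-12)           # True
```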

Properties of mgf

Theorem 2 (Uniqueness).
The mgf of a distribution, if it exists, uniquely determines
the distribution. This implies that corresponding to a
given probability distribution, there is only one mgf
(provided it exists) and corresponding to a given mgf,
there is only one probability distribution. Hence
M_X(t) = M_Y(t) implies that X and Y are identically
distributed.

Properties of Expectation

If X and Y are random variables, then E(X + Y) = E(X) + E(Y),
provided all the expectations exist.

In general, E(X_1 + · · · + X_n) = ∑_{i=1}^{n} E(X_i), if all the
expectations exist.

If X and Y are independent random variables, then
E(XY) = E(X)E(Y).

If X is a rv and a and b are constants, then E(aX + b) = aE(X) + b,
provided all the expectations exist.

If E(X^r) exists, then E(X^s) exists for all 1 ≤ s ≤ r. That is, if the
moment of a specified order exists, then all the lower order moments
automatically exist. However, the converse is not true.
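As with the mgf rules, the linearity properties above can be checked on any concrete pmf. A short illustrative check with a fair die and arbitrary constants a and b:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}        # fair die again

def E(g):
    return sum(g(x) * p for x, p in pmf.items())

a, b = 3, -2
print(E(lambda x: a * x + b) == a * E(lambda x: x) + b)   # True: E(aX + b) = aE(X) + b
```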

Thank you for your attention

