ST 1210-Discrete Random Variables

The document discusses discrete random variables, defining them as uncertain quantities that can take countable values, and introduces concepts such as Probability Mass Function (PMF) and Cumulative Distribution Function (CDF). It further explores multiple random variables, independence, the Principle of Inclusion and Exclusion, and the Ballot Theorem, alongside the Bernoulli and Binomial distributions. Additionally, it covers the Poisson distribution, its characteristics, and provides examples for better understanding.


DISCRETE RANDOM VARIABLES

• A random variable is an uncertain quantity whose value depends on chance.
• A random variable may be discrete or continuous.
• We usually denote a random variable by X.
• X is discrete if it takes only a countable number of values.
• For example, the number shown on tossing a die, or the number of heads in a sequence of coin tosses.
• The probability distribution of a random variable is a rule that assigns probabilities to the different values of the random variable.
• For a discrete random variable taking values x1, x2, x3, x4, …, the function pX(xi) = P(X = xi) is called the Probability Mass Function (PMF).
DISCRETE PROBABILITY DISTRIBUTION
Cumulative Distribution Function

• The Cumulative Distribution Function (CDF) of a discrete random variable X, denoted FX(x), is the function defined by
FX(a) = P(X ≤ a) for -∞ < a < ∞
• Properties of the CDF
(i) P(a < X ≤ b) = FX(b) − FX(a)
(ii) FX(b) ≥ FX(a) if b > a (FX is non-decreasing)
(iii) FX(-∞) = 0
(iv) FX(∞) = 1
(v) 0 ≤ FX(a) ≤ 1
Cumulative distribution function

• Example: For the maximum of two independent throws of a fair die, the CDF is F(a) = a^2/36 and the PMF is p(a) = F(a) − F(a − 1) = (2a − 1)/36, for a = 1, 2, …, 6, as given in the table below.
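The table values can be reproduced from the CDF alone; a minimal sketch in Python using exact fractions (the names `pmf` and `cdf` are just for illustration):

```python
from fractions import Fraction

# Maximum of two independent fair-die throws:
# P(max <= a) = (a/6)^2, so F(a) = a^2/36 and p(a) = F(a) - F(a-1).
def cdf(a):
    return Fraction(a * a, 36)

def pmf(a):
    return cdf(a) - cdf(a - 1)  # equals (2a - 1)/36

for a in range(1, 7):
    print(a, pmf(a), cdf(a))
```

The probabilities 1/36, 3/36, …, 11/36 sum to 1, as any PMF must.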
Cumulative distribution function

• Example: The cumulative distribution function of a discrete random variable X is FX(b) = b^2/64. Find the following probabilities:
(a) P(X ≤ 2) (b) P(X > 4) (c) P(2 < X ≤ 4)
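The three probabilities follow directly from the CDF properties above; a quick check in Python with exact fractions (the function name `F` is just for illustration):

```python
from fractions import Fraction

def F(b):
    return Fraction(b * b, 64)  # given CDF

p_le_2 = F(2)        # (a) P(X <= 2) = 4/64 = 1/16
p_gt_4 = 1 - F(4)    # (b) P(X > 4) = 1 - 16/64 = 3/4
p_2_4 = F(4) - F(2)  # (c) P(2 < X <= 4) = 12/64 = 3/16
print(p_le_2, p_gt_4, p_2_4)
```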
Multiple random variables
 Multiple random variables may be defined on the
same sample space, and their relations can be studied.
 If X and Y are random variables, then the pair (X, Y) is a random vector. Its distribution is called the joint distribution of X and Y.
 The joint CDF of X and Y is denoted by FX,Y(a, b) and is defined by
FX,Y(a, b) = P(X ≤ a, Y ≤ b) for -∞ < a < ∞, -∞ < b < ∞
Multiple random variables
 The individual distributions of X and Y are then called the marginal distributions.
 If the joint CDF of X and Y is known, we can find the corresponding marginal CDFs FX(a) and FY(b).
 The marginal CDFs of X and Y are given by
FX(a) = FX,Y(a, ∞) and FY(b) = FX,Y(∞, b)
 The joint probability mass function of a discrete random vector (X, Y) is the function defined by
pX,Y(a, b) = P(X = a, Y = b) for -∞ < a < ∞, -∞ < b < ∞
Independent random variables

 Random variables X and Y are independent if every event involving only X is independent of every event involving only Y, that is,
pX,Y(a, b) = P({X = a} ∩ {Y = b}) = P({X = a})P({Y = b}) = pX(a)pY(b)
and FX,Y(a, b) = FX(a)FY(b)
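The factorization pX,Y(a, b) = pX(a)pY(b) can be checked numerically. A sketch for two independent fair coin flips (a hypothetical example, not from the slides):

```python
from fractions import Fraction
from itertools import product

# Joint PMF of two independent fair coin flips (1 = head):
# every pair (a, b) has probability 1/4.
joint = {(a, b): Fraction(1, 4) for a, b in product([0, 1], repeat=2)}

# Marginals: sum the joint PMF over the other coordinate.
pX = {a: sum(p for (i, j), p in joint.items() if i == a) for a in [0, 1]}
pY = {b: sum(p for (i, j), p in joint.items() if j == b) for b in [0, 1]}

# Independence holds iff the joint PMF factorizes everywhere.
independent = all(joint[a, b] == pX[a] * pY[b] for a, b in joint)
print(independent)  # True
```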
The Principle of Inclusion and Exclusion

 The Principle of Inclusion and Exclusion gives a formula for the size of the union of n finite sets.
 Usually the universe is finite too.
 It is a generalization of the familiar formulas
|A ∪ B| = |A| + |B| − |A ∩ B| and
|A ∪ B ∪ C| = |A| + |B| + |C| − |A ∩ B| − |A ∩ C| − |B ∩ C| + |A ∩ B ∩ C|.
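The three-set formula is easy to verify on concrete sets; a sketch with arbitrarily chosen example sets:

```python
# Verify |A ∪ B ∪ C| = |A| + |B| + |C| - |A∩B| - |A∩C| - |B∩C| + |A∩B∩C|
A, B, C = {1, 2, 3, 4}, {3, 4, 5}, {4, 5, 6, 7}

lhs = len(A | B | C)
rhs = (len(A) + len(B) + len(C)
       - len(A & B) - len(A & C) - len(B & C)
       + len(A & B & C))
print(lhs, rhs)  # both 7
```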
The Ballot Theorem

 The Ballot Problem: Suppose that in an election, candidate A receives a votes and candidate B receives b votes, where a ≥ kb for some positive integer k. Compute the number of ways the ballots can be ordered so that A maintains more than k times as many votes as B throughout the counting of the ballots.
 The solution to the ballot problem is
((a − kb)/(a + b)) × C(a + b, a)
The Ballot Theorem

 Let us call a permutation of the ballots good if A stays ahead of B by more than a factor of k throughout the counting of the ballots, and bad otherwise.
 The total number of distinct permutations of the a + b ballots is C(a + b, a).
 The probability that A stays ahead of B by more than a factor of k throughout the counting of the ballots is therefore (a − kb)/(a + b).
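A brute-force check of this probability for small cases, assuming the (a − kb)/(a + b) formula stated above (feasible only for tiny a and b, since it enumerates all distinct orderings):

```python
from fractions import Fraction
from itertools import permutations

def good_fraction(a, b, k):
    """Fraction of distinct ballot orderings in which A's count
    strictly exceeds k times B's count after every ballot."""
    ballots = 'A' * a + 'B' * b
    good = total = 0
    for order in set(permutations(ballots)):  # distinct orderings
        total += 1
        va = vb = 0
        ok = True
        for v in order:
            if v == 'A':
                va += 1
            else:
                vb += 1
            if va <= k * vb:   # A not more than k times ahead
                ok = False
                break
        good += ok
    return Fraction(good, total)

a, b, k = 4, 1, 2
print(good_fraction(a, b, k), Fraction(a - k * b, a + b))  # both 2/5
```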
The Ballot Theorem: A Brief History

 In 1887 Joseph Bertrand introduced the ballot problem for the case k = 1 with an inductive proof.
 Shortly after Bertrand, Émile Barbier stated the ballot problem for arbitrary k and gave its solution, but without any proof.
 Very shortly after Barbier, Désiré André produced a short combinatorial proof of the ballot theorem for k = 1.
 In 1923 Aeppli announced that he had the first proof of the ballot theorem for k ≥ 1.
Expected Value and Variance of a Discrete Random Variable
DISCRETE PROBABILITY DISTRIBUTION

• From the distribution table below, find the value of k, calculate the mean and variance, and draw the graph of the distribution.
Bernoulli distribution

 A random variable with two possible values, 0 and 1, is called a Bernoulli variable, and its distribution is the Bernoulli distribution.
 Ber(p) is a Bernoulli distribution with parameter p, the probability of success, where 0 ≤ p ≤ 1, and
p(1) = P(X = 1) = p
p(0) = P(X = 0) = 1 − p
E[X] = p, Var(X) = p(1 − p)
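The mean and variance formulas follow directly from the two-point PMF; a quick check with p = 1/3 (an arbitrary example value):

```python
from fractions import Fraction

# Ber(p): compute E[X] and Var(X) directly from the PMF.
p = Fraction(1, 3)
pmf = {0: 1 - p, 1: p}

mean = sum(x * q for x, q in pmf.items())
var = sum((x - mean) ** 2 * q for x, q in pmf.items())
print(mean, var)  # 1/3 and 2/9, matching p and p(1 - p)
```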
THE BINOMIAL DISTRIBUTION

• Let a random experiment be performed repeatedly, and let the occurrence of an event in a trial be called a success and its non-occurrence a failure.
• Consider a set of n independent trials (n being finite), in which the probability p of success in any trial is constant for each trial. Then q = 1 − p is the probability of failure in any trial.
• The random variable that counts the number of successes in n independent trials, with the same probability of success in each trial, is called a Binomial Random Variable.
THE BINOMIAL DISTRIBUTION

• Conditions for a Binomial Random Variable
1. There are only two mutually exclusive outcomes in each trial, i.e. S = {success, failure}
2. In repeated trials of the experiment, the probabilities of occurrence of these outcomes remain constant
3. The outcomes of the trials are independent of one another
4. The number of trials n is finite
• The probability distribution of a Binomial Random Variable is called the Binomial Distribution
BINOMIAL PROBABILITY FUNCTION

• To describe the distribution of a Binomial random variable we need two parameters, n and p. We write X ~ B(n, p) to indicate that X is Binomially distributed with n independent trials and probability of success p in each trial. The letter B stands for Binomial.
BINOMIAL PROBABILITY FUNCTION

• Now we know that there are nCx ways of getting x successes out of n trials.
• We also observe that each of these nCx possibilities has probability p^x (1 − p)^(n−x) of occurrence, corresponding to x successes and (n − x) failures. Therefore,
• P(X = x) = nCx p^x (1 − p)^(n−x) for x = 0, 1, 2, …, n
• This equation is the Binomial probability formula.
BINOMIAL PROBABILITY FUNCTION

• If we denote the probability of failure by q, where q = 1 − p, the Binomial probability formula is
• P(X = x) = nCx p^x q^(n−x) for x = 0, 1, 2, …, n
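The formula translates directly into code; a minimal sketch (the name `binom_pmf` is just for illustration), checked against X ~ B(5, ½), which is used in the example below:

```python
from math import comb

# Binomial PMF: P(X = x) = C(n, x) * p^x * q^(n-x), with q = 1 - p.
def binom_pmf(x, n, p):
    q = 1 - p
    return comb(n, x) * p ** x * q ** (n - x)

# Sanity checks with X ~ B(5, 0.5).
total = sum(binom_pmf(x, 5, 0.5) for x in range(6))
print(binom_pmf(0, 5, 0.5), total)  # 1/32 = 0.03125, and 1.0
```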
BINOMIAL PROBABILITY FUNCTION

3. Skewness
• To bring out the skewness of a Binomial distribution we can calculate the coefficient of skewness, γ1.
• If p is the probability of success and q the probability of failure, the coefficient of skewness of a Binomial distribution is given by
γ1 = (q − p) / √(npq)
BINOMIAL PROBABILITY FUNCTION

• From the formula above, we note that:
 The Binomial distribution is skewed to the right, i.e. has positive skewness, when γ1 > 0, which is so when p < q
 The Binomial distribution is skewed to the left, i.e. has negative skewness, when γ1 < 0, which is so when p > q
 The Binomial distribution is symmetrical, i.e. has no skewness, when γ1 = 0, which is so when p = q
 Thus, n being the same, the degree of skewness in a Binomial distribution tends to vanish as p approaches ½, i.e. as p → ½
BINOMIAL DISTRIBUTION

• Example: Assuming the probability of a male birth is ½, find the probability distribution of the number of boys out of 5 births.
(a) Find the probability that a family of 5 children has
(i) at least one boy
(ii) at most 3 boys
(b) Out of 960 families with 5 children each, find the expected number of families with (i) and (ii) above.
BINOMIAL DISTRIBUTION

• X ~ B(5, ½)
• P(X = x) = nCx p^x q^(n−x), x = 0, 1, 2, 3, 4, 5
BINOMIAL DISTRIBUTION

(a) The required probabilities are
(i) P(X ≥ 1) = 1 − P(X = 0) = 1 − 1/32 = 31/32
(ii) P(X ≤ 3) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3)
= 1/32 + 5/32 + 10/32 + 10/32 = 26/32
BINOMIAL DISTRIBUTION

(b) Out of 960 families with 5 children, the expected number of families with
(i) at least one boy = 960 × P(X ≥ 1) = 960 × 31/32 = 930
(ii) at most 3 boys = 960 × P(X ≤ 3) = 960 × 26/32 = 720
BINOMIAL DISTRIBUTION

• Example: A coin is thrown ten times. Find the probability of getting
(a) at least seven heads
(b) at least nine heads
(c) at most eight heads
(d) at least two heads
(e) at most ten heads
(f) less than seven heads
• Example: Teams A and B play a game in which their chances of winning are in the ratio 3 : 2. Find their chances of winning at least three games out of the five games played.
BINOMIAL DISTRIBUTION

Example: Comment on the following: "The mean of a binomial distribution is 3 and the variance is 4."
Example: The mean and variance of a binomial distribution are 4 and 4/3 respectively. Find P(X ≥ 1) and E(X^2).
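One way to work the second example, as a sketch: from np = 4 and npq = 4/3 we get q = 1/3, p = 2/3, n = 6, then P(X ≥ 1) = 1 − q^n and E(X^2) = Var(X) + (E[X])^2.

```python
from fractions import Fraction

# Given: mean np = 4, variance npq = 4/3.
mean, var = Fraction(4), Fraction(4, 3)
q = var / mean          # q = (4/3)/4 = 1/3
p = 1 - q               # p = 2/3
n = int(mean / p)       # n = 6

p_at_least_1 = 1 - q ** n   # 1 - (1/3)^6 = 728/729
e_x2 = var + mean ** 2      # E[X^2] = Var(X) + (E[X])^2 = 52/3
print(n, p_at_least_1, e_x2)
```

(The first example is impossible: for a binomial, variance npq = np·q < np = mean, so the variance can never exceed the mean.)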
THE POISSON DISTRIBUTION

A Poisson process describes a very large population of independent events, where each has a very small probability of occurring, and the average number of occurrences in a given range is roughly stable.
THE POISSON DISTRIBUTION

• A Poisson distribution is valid under the following conditions:
1. The number of trials n is infinitely large, i.e. n → ∞
2. The constant probability of success p for each trial is infinitely small, i.e. p → 0 (obviously q → 1)
3. np = μ is finite
• We can derive the Poisson probability rule from the Binomial probability rule under the above conditions.
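The limiting behaviour is easy to see numerically: holding np = μ fixed while n grows, the binomial probabilities approach e^(−μ) μ^x / x!. A sketch with μ = 2 and x = 3 (arbitrary example values):

```python
from math import comb, exp, factorial

# Binomial(n, p) with np = mu fixed approaches Poisson(mu) as n grows.
mu, x = 2.0, 3
poisson = exp(-mu) * mu ** x / factorial(x)  # ~0.1804

for n in (10, 100, 10_000):
    p = mu / n
    binom = comb(n, x) * p ** x * (1 - p) ** (n - x)
    print(n, binom, abs(binom - poisson))  # gap shrinks as n grows
```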
THE POISSON DISTRIBUTION

• A Poisson distribution may be expected in situations where the chance of occurrence of any event is small, and we are interested in the occurrence of the event rather than in its non-occurrence.
• For example, the number of road accidents, or the number of deaths in a flood, from snakebite, or from a rare disease, etc.
• In these situations we know about the occurrences of an event, although its probability is very small, but we do not know how many times it does not occur.
Characteristics of a Poisson Distribution

• Expected Value or Mean
The expected value or mean of a Poisson distribution with parameter μ is μ itself, i.e. mean = μ
• Variance
The variance, denoted σ^2, of a Poisson distribution is σ^2 = μ
• Skewness
The coefficient of skewness, γ1, of a Poisson distribution is the reciprocal of the square root of the mean μ, i.e.
γ1 = μ^(-1/2) = 1/√μ
Characteristics of a Poisson Distribution

• Evaluating γ1 = μ^(-1/2), we note that a Poisson distribution is always skewed to the right, i.e. has positive skewness.
• The degree of skewness in a Poisson distribution decreases as the value of μ increases.
Poisson Distribution

• Example:
At a parking place, the average number of car arrivals during a specified period of 15 minutes is 2. If the arrival process is well described by a Poisson process, find the probability that during a given period of 15 minutes
(a) no car will arrive
(b) at least two cars will arrive
(c) at most three cars will arrive
(d) between 1 and 3 cars will arrive, inclusive
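With μ = 2 and P(X = x) = e^(−μ) μ^x / x!, the four answers come out to roughly 0.1353, 0.5940, 0.8571, and 0.7218; a sketch of the computation:

```python
from math import exp, factorial

# Poisson(mu = 2) car arrivals in a 15-minute window.
mu = 2.0
def pois(x):
    return exp(-mu) * mu ** x / factorial(x)

p_none = pois(0)                              # (a) e^-2 ~ 0.1353
p_at_least_2 = 1 - pois(0) - pois(1)          # (b) 1 - 3e^-2 ~ 0.5940
p_at_most_3 = sum(pois(x) for x in range(4))  # (c) ~0.8571
p_1_to_3 = sum(pois(x) for x in range(1, 4))  # (d) ~0.7218
print(p_none, p_at_least_2, p_at_most_3, p_1_to_3)
```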
Poisson Distribution

• Example: Six coins are tossed 6,400 times. Using the Poisson distribution, find the probability of getting six heads r times.
• Example: The mean and variance of a binomial distribution are 2 and 1.5 respectively. Find the probability of (a) 2 successes (b) at least 2 successes (c) at most 2 successes.
