BS UNIT 2 Note # 3
A discrete random variable is a random variable that can assume only distinct values.
If we toss a coin twice, the number of heads obtained could be 0, 1 or 2. The probabilities
of these values occurring are as follows.
The variable being considered is 'the number of heads obtained in two tosses' and it
can be denoted by X. It can only take the exact values 0, 1 or 2 and so is called a discrete
variable.
The probabilities can be written P(X = 0) = 0.25, P(X = 1) = 0.5, P(X = 2) = 0.25.
If the sum of the probabilities is 1, the variable is said to be random.
In this example
P(X = 0) + P(X = 1) + P(X = 2) = 1
so X is a random variable.
The probability distribution is often written
x           0      1      2
P(X = x)    0.25   0.5    0.25
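A quick way to verify such a table is to enumerate the four equally likely outcomes of two tosses directly; a minimal Python sketch (illustrative only, not part of the original notes):

```python
# Enumerate the four equally likely outcomes of two coin tosses and tally heads
from itertools import product
from collections import Counter

outcomes = list(product("HT", repeat=2))           # HH, HT, TH, TT
heads = Counter(o.count("H") for o in outcomes)    # number of heads per outcome

for x in sorted(heads):
    print(f"P(X={x}) = {heads[x] / len(outcomes)}")
# P(X=0) = 0.25, P(X=1) = 0.5, P(X=2) = 0.25
```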
We usually denote a random variable (r.v.) by a capital letter (X, Y, Z, etc.) and the
particular value it takes by a small letter (x, y, r, etc.).
Example
x           0       1       2       3
P(X = x)    0.125   0.375   0.375   0.125
Expectation, E (X)
Experimental Approach
Suppose we throw an unbiased die 120 times and record the results in a frequency
distribution:
Score, x        1     2     3     4     5     6
Frequency, f    15    22    23    19    23    18
fx              15    44    69    76    115   108

Mean = Σfx / Σf = (15 + 44 + 69 + 76 + 115 + 108) / 120 = 427/120 ≈ 3.558
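The same calculation in Python, using the frequencies recorded above (a sketch; the variable names are ours, not from the notes):

```python
# Observed mean of the frequency distribution: x_bar = sum(f * x) / sum(f)
scores = [1, 2, 3, 4, 5, 6]
freqs  = [15, 22, 23, 19, 23, 18]

mean = sum(x * f for x, f in zip(scores, freqs)) / sum(freqs)
print(round(mean, 3))   # 3.558
```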
Theoretical Approach
The probability distribution for the random variable X, where X is 'the number on the
die', is shown:
Score, x    1     2     3     4     5     6
P(X = x)    1/6   1/6   1/6   1/6   1/6   1/6
We can obtain a value for the ‘expected’ mean by multiplying each score by its
corresponding probability and summing.
Expected mean = 1 × (1/6) + 2 × (1/6) + 3 × (1/6) + 4 × (1/6) + 5 × (1/6) + 6 × (1/6)
              = 21/6
              = 3.5
If we have a statistical experiment, a practical approach results in a frequency
distribution and a mean value, while a theoretical approach results in a probability
distribution and an expectation,
E(X) = Σ x P(X = x), the sum being taken over all x.
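A minimal Python sketch of this formula, applied to the fair-die distribution above (illustrative only; the names used are ours):

```python
# E(X) = sum over all x of x * P(X = x), computed for the fair die
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}       # P(X = x) = 1/6 for x = 1..6
expectation = sum(x * p for x, p in pmf.items())     # sum of x * P(X = x)
print(expectation)                                   # 7/2, i.e. 3.5
```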
Variance, Var(X)
Experimental Approach
s² = Σ f(x − x̄)² / Σf = Σ fx² / Σf − (x̄)²
Theoretical Approach
For a discrete random variable X with E(X) = μ, the variance is defined as
Var(X) = E[(X − μ)²]
Alternatively,
Var(X) = E[(X − μ)²]
       = E(X² − 2μX + μ²)
       = E(X²) − 2μE(X) + E(μ²)
       = E(X²) − 2μ² + μ²
       = E(X²) − μ²
so Var(X) = E(X²) − [E(X)]²
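A small check of this identity in Python, again using the fair die from the previous section (illustrative only):

```python
# Var(X) = E(X^2) - [E(X)]^2, compared with the definition E[(X - mu)^2]
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}                 # fair die
mu   = sum(x * p for x, p in pmf.items())                      # E(X) = 7/2
e_x2 = sum(x**2 * p for x, p in pmf.items())                   # E(X^2) = 91/6

var_shortcut   = e_x2 - mu**2                                  # E(X^2) - [E(X)]^2
var_definition = sum((x - mu)**2 * p for x, p in pmf.items())  # E[(X - mu)^2]
print(var_shortcut, var_definition)                            # both print 35/12
```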
Example # 1
The following table contains the probability distribution of the number of traffic
accidents in a small city.
Solution
(a) E(X) = Σ x P(X = x) = 2
    E(X²) = Σ x² P(X = x) = 5.4
    Var(X) = E(X²) − [E(X)]²
           = 5.4 − (2)²
           = 1.4
    s.d.(X) = √Var(X)
            = √1.4
            ≈ 1.18
BINOMIAL DISTRIBUTION
James Bernoulli discovered the binomial distribution in the year 1700. Let a random
experiment be performed repeatedly, and let the occurrence of an event in a trial be called
a success and its non-occurrence a failure. Consider a set of n independent Bernoulli
trials (n being finite), in which the probability p of success in any trial is constant for
each trial. Then q = 1 − p is the probability of failure in any trial.
The probability of x successes, and consequently (n − x) failures, in n independent trials, in
a specified order (say) SSFSFFFS . . . FSF (where S represents success and F failure),
is given by the compound probability theorem by the expression:
P(SSFSFFFS . . . FSF) = P(S) P(S) P(F) P(S) P(F) P(F) P(F) P(S) . . . P(F) P(S) P(F)
                      = p · p · q · p · q · q · q · p · . . . · q · p · q
                      = {p · p · . . . · p} {q · q · . . . · q}
                         (x factors)        ((n − x) factors)
                      = p^x q^(n−x)
But x successes in n trials can occur in nCx ways, and the probability for each of these
ways is p^x q^(n−x). Hence the probability of x successes in n trials, in any order whatsoever, is
given by the addition theorem of probability by the expression:
nCx p^x q^(n−x)
The probability distribution of the number of successes so obtained is called the
binomial probability distribution, for the obvious reason that the probabilities of 0, 1,
2, . . . , n successes, viz.
q^n, nC1 q^(n−1) p, nC2 q^(n−2) p², . . . , p^n,
are the successive terms of the binomial expansion of (q + p)^n.
In general:
P(X = x) = nCx p^x q^(n−x),    x = 0, 1, 2, . . . , n
The two independent constants n and p in the distribution are known as the parameters of
the distribution.
The binomial distribution is a discrete distribution, as X can take only the integer values
0, 1, 2, . . . , n. Any variable which follows the binomial distribution is known as a binomial
variate.
We shall use the notation X ~ B(n, p) to denote that the random variable X follows a
binomial distribution with parameters n and p.
The binomial distribution arises under the following conditions:
(i) Each trial results in one of two mutually disjoint outcomes, termed success and
failure.
(ii) The number of trials n is finite.
(iii) The trials are independent of each other.
(iv) The probability of success p is constant for each trial.
Problems relating to tossing a coin, throwing a die, or drawing cards from a pack of
cards with replacement lead to the binomial probability distribution.
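As a sketch of how these probabilities can be computed, a small Python helper (the function name binomial_pmf and the chosen values n = 10, p = 0.05 are ours; the latter match Example # 2 below):

```python
# Binomial probability: P(X = x) = nCx * p^x * q^(n - x), with q = 1 - p
from math import comb

def binomial_pmf(x, n, p):
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

# The probabilities for x = 0, 1, ..., n sum to 1, as the expansion of (q + p)^n suggests
n, p = 10, 0.05
print(sum(binomial_pmf(x, n, p) for x in range(n + 1)))   # 1.0 (up to rounding)
```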
Expectation and Variance
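For X ~ B(n, p), the expectation and variance are E(X) = np and Var(X) = npq, where
q = 1 − p, so that s.d.(X) = √(npq). These results are used in part (b) of Example # 2 below.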
Example # 2
Suppose that the warranty records show the probability that a new computer needs a
warranty repair in the first 90 days is 0.05. If a sample of 10 new computers is selected,
(a) What is the probability that:
(b) What are the mean and standard deviation of the probability distribution?
Solution
(ii) P(X ≥ 2) = 1 − P(X = 0) − P(X = 1)
             = 1 − (0.599 + 0.315)
             = 1 − 0.914
             = 0.086
(iii) P(X > 2) = 1 − P(X = 0) − P(X = 1) − P(X = 2)
              = 1 − 0.599 − 0.315 − 0.075
              = 1 − 0.989
              = 0.011
(b) E(X) = np
         = 10 × 0.05
         = 0.5
    Var(X) = npq
           = 10 × 0.05 × 0.95
           = 0.475
    s.d.(X) = √Var(X)
            = √0.475
            ≈ 0.69
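A quick cross-check of these figures, reusing the binomial_pmf sketch from above (illustrative only; the exact value of P(X > 2) is about 0.0115, which the worked solution rounds to 0.011 because it uses the rounded terms 0.599, 0.315 and 0.075):

```python
# Cross-check of Example # 2: X ~ B(10, 0.05)
from math import comb, sqrt

def binomial_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.05
p0, p1, p2 = (binomial_pmf(x, n, p) for x in range(3))

print(round(p0, 3), round(p1, 3), round(p2, 3))    # 0.599 0.315 0.075
print(round(1 - p0 - p1, 3))                       # P(X >= 2) = 0.086
print(round(1 - p0 - p1 - p2, 4))                  # P(X > 2)  = 0.0115
print(n * p, round(n * p * (1 - p), 3), round(sqrt(n * p * (1 - p)), 2))   # 0.5 0.475 0.69
```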
POISSON DISTRIBUTION
Poisson Process
To better understand the Poisson process, suppose we examine the number of customers
arriving during the 12 noon to 1 p.m. lunch hour at a bank located in the central business
district of a large city. Any arrival of a customer is a discrete event at a particular point in
time over the continuous 1-hour interval. Over such an interval of time, there might be an
average of 180 arrivals. Now if we were to break up the 1-hour interval into 3,600
consecutive 1-second intervals:
The expected (or average) number of customers arriving in any 1-second interval would
be 0.05.
The probability of having more than one customer arriving in any 1-second interval
approaches 0.
The arrival of one customer in any 1-second interval has no effect on the arrival of any
other customer in any other 1-second interval.
The Poisson distribution has one parameter, called λ, which is the average or expected
number of successes per unit. Interestingly, the variance of a Poisson distribution is also
equal to λ, and the standard deviation is equal to √λ. Moreover, the number of successes x
of the Poisson random variable ranges from 0 to ∞.
The mathematical expression for the Poisson distribution for obtaining x successes, given
that λ successes are expected, is
P(X = x) = (e^(−λ) λ^x) / x!,    x = 0, 1, 2, 3, . . .
where
λ = mean number of occurrences (successes) in a particular interval of time
e = mathematical constant, approximately 2.718
x = number of occurrences (successes)
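A minimal Python version of this formula (the helper name poisson_pmf is ours, and the truncation point x = 50 in the check is arbitrary):

```python
# Poisson probability: P(X = x) = e^(-lam) * lam^x / x!
from math import exp, factorial

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

# The probabilities over x = 0, 1, 2, ... sum to 1 (here truncated at x = 50)
lam = 3
print(sum(poisson_pmf(x, lam) for x in range(51)))   # ~1.0
```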
Example # 3
If, on average, three customers arrive per minute at the bank during the lunch hour, what
is the probability that in a given minute exactly two customers will arrive? Also, what is
the chance that more than two customers will arrive in a given minute?
Solution
Mean (λ) = 3
P(X = 2) = (e^(−3) × 3²) / 2! = 0.2240
P(X > 2) = 1 − P(X = 0) − P(X = 1) − P(X = 2)
         = 1 − 0.0498 − 0.1494 − 0.2240
         = 1 − 0.4232
         = 0.5768
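The same calculation, reusing the poisson_pmf sketch from above (illustrative only):

```python
# Cross-check of Example # 3 with lambda = 3 arrivals per minute
from math import exp, factorial

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

lam = 3
print(round(poisson_pmf(2, lam), 4))                               # 0.224 (the 0.2240 above)
p_more_than_two = 1 - sum(poisson_pmf(x, lam) for x in range(3))
print(round(p_more_than_two, 4))                                   # P(X > 2) = 0.5768
```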