
Unit IV Probability Theory and Distribution

Galileo (1564-1642), an Italian mathematician, was the first to attempt a quantitative measure of
probability while dealing with some problems related to the theory of dice in gambling. But the first
foundation of the mathematical theory of probability was laid in the mid-seventeenth century by two
French mathematicians, B. Pascal (1623-1662) and P. Fermat (1601-1665).

Definitions of Various Terms


1. Trial and Event:- Consider an experiment which, though repeated under essentially identical
conditions, does not give unique results but may result in any one of several possible
outcomes. The experiment is known as a trial and the outcomes are known as events or cases.
For example:
i) Throwing of a die is a trial, and getting 1 (or 2 or 3, ... or 6) is an event.
ii) Tossing of a coin is a trial, and getting a head (H) or tail (T) is an event.
iii) Drawing two cards from a pack of well-shuffled cards is a trial, and getting a king and a
queen are events.
2. Exhaustive Events:- The total number of possible outcomes in any trial is known as the exhaustive
events or exhaustive cases. For example:
i) In tossing of a coin there are two exhaustive cases, viz., head and tail (the possibility of the
coin standing on an edge being ignored).
ii) In throwing of a die there are six exhaustive cases, since any one of the 6 faces 1, 2, ..., 6 may
come uppermost.
3. Favorable Events or Cases:- The number of cases favorable to an event in a trial is the number of
outcomes which entail the happening of the event. For example:
i) In drawing a card from a pack of cards, the number of cases favorable to drawing an ace is 4,
for drawing a spade is 13, and for drawing a red card is 26.
ii) In throwing of two dice, the number of cases favorable to getting the sum 5 is 4, viz., (1,4), (4,1),
(2,3), (3,2).
4. Mutually exclusive events:- Events are said to be mutually exclusive or incompatible if the
happening of any one of them precludes the happening of all the others, i.e., if no two or more of
them can happen simultaneously in the same trial. For example:
i) In throwing a die all the 6 faces numbered 1 to 6 are mutually exclusive, since if any one of these
faces comes, the possibility of the others, in the same trial, is ruled out.
ii) Similarly, in tossing a coin the events head and tail are mutually exclusive.
5. Equally likely events:- Outcomes of a trial are said to be equally likely if, taking into consideration
all the relevant evidence, there is no reason to expect one in preference to the others. For
example:

i) In tossing an unbiased or uniform coin, head and tail are equally likely events.
ii) In throwing an unbiased die, all the six faces are equally likely to come.
6. Independent events:- Several events are said to be independent if the happening (or
non-happening) of an event is not affected by supplementary knowledge concerning the
occurrence of any number of the remaining events. For example:
i) In tossing an unbiased coin, the event of getting a head in the first toss is independent of getting
a head in the second, third and subsequent tosses.
ii) If we draw a card from a pack of well-shuffled cards and replace it before drawing the second
card, the result of the second draw is independent of the first draw. However, if the first card
drawn is not replaced, then the second draw is dependent on the first draw.
7. Mathematical or Classical Probability:- If a trial results in n exhaustive, mutually exclusive and
equally likely cases, and m of them are favorable to the happening of an event A, then the
probability 'p' of the happening of A is given by

p = P(A) = (Favorable number of cases)/(Exhaustive number of cases) = m/n

Remarks:-
i. It is clear that if m cases out of n exhaustive cases favor the happening of an event A, then n − m
cases favor that the event will not happen. Thus the probability that the event A will not
happen, denoted by q or P(Ā), is given by

q = P(Ā) = (n − m)/n = 1 − m/n = 1 − P(A)  ⇒  p + q = 1

ii. The probability 'p' of the happening of an event is also known as the probability of success, and the
probability 'q' of the non-happening of the event as the probability of failure.
iii. If A and B are two events, then the probability of the happening of A or B (i.e., at least one of the two
events) is denoted by P(A+B).
iv. If A and B are two events, then the probability of the happening of A and B (i.e., both events
occur) is denoted by P(AB).
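Since classical probability is just a ratio of counts, it can be checked by direct enumeration. The following Python sketch (the function name and helper are illustrative, not part of the text) counts favorable cases among equally likely outcomes, using the two-dice sum from the examples above:

```python
from fractions import Fraction

def classical_probability(outcomes, is_favorable):
    """P(A) = favorable cases / exhaustive cases, for equally likely outcomes."""
    favorable = sum(1 for o in outcomes if is_favorable(o))
    return Fraction(favorable, len(outcomes))

# Two dice: cases favorable to the sum 5 are (1,4), (4,1), (2,3), (3,2).
two_dice = [(a, b) for a in range(1, 7) for b in range(1, 7)]
print(classical_probability(two_dice, lambda o: o[0] + o[1] == 5))  # 1/9 (= 4/36)
```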

Eg:- What is the chance that a leap year selected at random will contain 53 Sundays?
Solution. In a leap year (which consists of 366 days) there are 52 complete weeks and 2 days over. The
following are the possible combinations for these two 'over' days:
(i) Sunday and Monday, (ii) Monday and Tuesday, (iii) Tuesday and Wednesday, (iv) Wednesday and
Thursday, (v) Thursday and Friday, (vi) Friday and Saturday, and (vii) Saturday and Sunday.
In order that a leap year selected at random should contain 53 Sundays, one of the two 'over' days must
be Sunday. Since out of the above 7 possibilities, 2, viz. (i) and (vii), are favorable to this event,

Required probability = 2/7

Eg:- A bag contains 3 red, 6 white and 7 blue balls. What is the probability that two balls drawn are
white and blue?
Solution. Total number of balls = 3 + 6 + 7 = 16.
Now, out of 16 balls, 2 can be drawn in C(16,2) ways.

Exhaustive number of cases = C(16,2) = (16×15)/2 = 120

Out of 6 white balls, 1 ball can be drawn in C(6,1) ways, and out of 7 blue balls, 1 ball can be drawn in C(7,1)
ways. Since each of the former cases can be associated with each of the latter cases, the total number of
favorable cases is C(6,1) × C(7,1) = 6×7 = 42.

Required probability = 42/120 = 7/20 = 0.35
Eg:- What is the probability of getting 9 cards of the same suit in one hand at a game of bridge?
Solution. One hand in a game of bridge consists of 13 cards.

Exhaustive number of cases = C(52,13)

The number of ways in which, in one hand, a particular player gets 9 cards of one suit is C(13,9), and the
number of ways in which the remaining 4 cards are of some other suit is C(39,4). Since there are 4 suits in
a pack of cards, the total number of favorable cases = 4 × C(13,9) × C(39,4).

Required probability = [4 × C(13,9) × C(39,4)] / C(52,13)
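Counting problems like the two above are easy to verify with Python's math.comb, which returns the binomial coefficient C(n, k); a minimal check:

```python
from math import comb  # comb(n, k) is the binomial coefficient C(n, k)

# Two balls from 3 red + 6 white + 7 blue: one white and one blue.
print(comb(6, 1) * comb(7, 1) / comb(16, 2))  # 42/120 = 0.35

# Bridge hand: 9 cards of one suit, the remaining 4 from the other 39 cards.
print(4 * comb(13, 9) * comb(39, 4) / comb(52, 13))  # ~0.00037
```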

Theorems of Probability
1. Statement:- If n events A1, A2, ..., An are mutually exclusive, then the probability of the happening
of at least one of the events is the sum of the probabilities of the individual events.
In symbols,
P(A1+A2+...+An) = P(A1) + P(A2) + ... + P(An)
2. Statement:- If two events A and B are not mutually exclusive, then
P(A+B) = P(A) + P(B) − P(AB)
where P(AB) denotes the probability of the simultaneous occurrence of the events A and B.
3. If A, B and C are any three events, not mutually exclusive, then
P(A+B+C) = P(A) + P(B) + P(C) − P(AB) − P(BC) − P(CA) + P(ABC)
4.​ P(φ) = 0

Eg:- A single die is thrown once. Find the probability of getting a 3 or 5.

Solution. In throwing a die the sample space is

S = {1, 2, 3, 4, 5, 6}, so n(S) = 6

P(1) = P(2) = ... = P(6) = 1/6

P(3 or 5) = P(3) + P(5) = 1/6 + 1/6 = 1/3

Conditional Probability:- Let A and B be two events in S. The probability of the happening of an event A
when the event B has already happened is called conditional probability and is denoted by P(A/B).

Theorem of Multiplication Law of Probability and Conditional Probability

Statement:- If A and B are two events, then

P(A/B) = P(A∩B)/P(B) = P(AB)/P(B),  P(B) > 0

Corollary 1:- Interchanging A and B in the above result we get

P(B/A) = P(A∩B)/P(A) = P(AB)/P(A),  P(A) > 0

Corollary 2:- If A and B are independent events, then

P(A∩B) = P(A)P(B)
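As a hedged illustration of the multiplication law, conditional probability can be computed by counting on a finite sample space; in this sketch (the events A and B are chosen for illustration only) the sum of two dice being 7 turns out to be independent of the first die's face:

```python
from fractions import Fraction
from itertools import product

dice = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes

def prob(event):
    return Fraction(sum(1 for o in dice if event(o)), len(dice))

A = lambda o: o[0] + o[1] == 7   # the sum is 7
B = lambda o: o[0] == 3          # the first die shows 3

p_AB = prob(lambda o: A(o) and B(o))
print(p_AB / prob(B))              # P(A/B) = P(AB)/P(B) = 1/6
print(p_AB == prob(A) * prob(B))   # True: here A and B are independent
```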

Eg:- A card is drawn from a well-shuffled pack of 52 playing cards. What is the probability that it is either a
spade or an ace?

Solution. The sample space of drawing a card from a well-shuffled pack of playing cards
consists of 52 sample points.

If A and B denote the events of drawing a 'spade card' and 'an ace' respectively, then A consists of 13
sample points and B consists of 4 sample points, so that

P(A) = 13/52 and P(B) = 4/52

The compound event A∩B consists of only one sample point, viz., the ace of spades, so that

P(A∩B) = 1/52

The probability that the card drawn is either a spade or an ace is given by

P(A∪B) = P(A) + P(B) − P(A∩B) = 13/52 + 4/52 − 1/52 = 16/52 = 4/13

Eg:- A box contains 6 red, 4 white and 5 black balls. A person draws 4 balls from the box at random. Find
the probability that among the balls drawn there is at least one ball of each color.

Solution. The required event E, that in a draw of 4 balls from the box at random there is at least one
ball of each color, can materialize in the following mutually disjoint ways:

(i) 1 Red, 1 White, 2 Black balls

(ii) 2 Red, 1 White, 1 Black balls

(iii) 1 Red, 2 White, 1 Black balls

Hence by the addition theorem of probability, the required probability is given by

P(E) = P(i) + P(ii) + P(iii)

= [C(6,1)×C(4,1)×C(5,2)]/C(15,4) + [C(6,2)×C(4,1)×C(5,1)]/C(15,4) + [C(6,1)×C(4,2)×C(5,1)]/C(15,4)

= [6×4×10 + 15×4×5 + 6×6×5]/C(15,4) = 720/1365 = 0.5275
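A quick check of this computation with math.comb (an illustrative verification, not part of the original solution):

```python
from math import comb

total = comb(15, 4)  # draws of 4 balls from 6 red + 4 white + 5 black
favorable = (comb(6, 1) * comb(4, 1) * comb(5, 2)    # 1 Red, 1 White, 2 Black
             + comb(6, 2) * comb(4, 1) * comb(5, 1)  # 2 Red, 1 White, 1 Black
             + comb(6, 1) * comb(4, 2) * comb(5, 1)) # 1 Red, 2 White, 1 Black
print(favorable, total, favorable / total)  # 720 1365 0.5274725274725275
```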
Bayes' Theorem:-
If an event E can occur only in combination with one of the n mutually exclusive events E1, E2, ..., En, then

P(Ek/E) = P(Ek) P(E/Ek) / ∑_{i=1}^{n} P(Ei) P(E/Ei)

Eg:- In 1989 there were three candidates for the position of principal: Mr. Chatterji, Mr. Ayangar and Dr.
Singh, whose chances of getting the appointment are in the proportion 4:2:3 respectively. The
probability that Mr. Chatterji, if selected, would introduce co-education in the college is 0.3. The
probabilities of Mr. Ayangar and Dr. Singh doing the same are respectively 0.5 and 0.8. What is the
probability that there was co-education in the college in 1990?

Solution. Let the events and probabilities be defined as follows:

A : introduction of co-education

E1 : Mr. Chatterji is selected as principal

E2 : Mr. Ayangar is selected as principal

E3 : Dr. Singh is selected as principal

P(E1) = 4/9, P(E2) = 2/9 and P(E3) = 3/9

P(A/E1) = 3/10, P(A/E2) = 5/10 and P(A/E3) = 8/10

P(A) = P[(A∩E1) ∪ (A∩E2) ∪ (A∩E3)] = P(A∩E1) + P(A∩E2) + P(A∩E3)

= P(E1)P(A/E1) + P(E2)P(A/E2) + P(E3)P(A/E3)

= (4/9 × 3/10) + (2/9 × 5/10) + (3/9 × 8/10) = 23/45
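The computation of P(A) above is an instance of the theorem of total probability; a small sketch with exact fractions (variable names are illustrative):

```python
from fractions import Fraction

priors = [Fraction(4, 9), Fraction(2, 9), Fraction(3, 9)]          # P(E1), P(E2), P(E3)
likelihoods = [Fraction(3, 10), Fraction(5, 10), Fraction(8, 10)]  # P(A/Ei)

# Total probability: P(A) = sum over i of P(Ei) * P(A/Ei)
print(sum(p * l for p, l in zip(priors, likelihoods)))  # 23/45
```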

Eg:- The contents of urns I, II and III are as follows:

1 white, 2 black and 3 red balls,

2 white, 1 black and 1 red balls, and

4 white, 5 black and 3 red balls.

One urn is chosen at random and two balls drawn. They happen to be white and red. What is the
probability that they come from urns I, II or III?

Solution. Let E1, E2 and E3 denote the events that urn I, II and III is chosen, respectively, and let A be
the event that the two balls taken from the selected urn are white and red. Then

P(E1) = P(E2) = P(E3) = 1/3

P(A/E1) = (1×3)/C(6,2) = 1/5, P(A/E2) = (2×1)/C(4,2) = 1/3 and P(A/E3) = (4×3)/C(12,2) = 2/11

Hence P(E2/A) = P(E2)P(A/E2) / ∑_{i=1}^{3} P(Ei)P(A/Ei)
= (1/3 × 1/3) / (1/3 × 1/5 + 1/3 × 1/3 + 1/3 × 2/11) = 55/118

P(E3/A) = P(E3)P(A/E3) / ∑_{i=1}^{3} P(Ei)P(A/Ei)
= (1/3 × 2/11) / (1/3 × 1/5 + 1/3 × 1/3 + 1/3 × 2/11) = 30/118

P(E1/A) = 1 − 55/118 − 30/118 = 33/118
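A minimal sketch of Bayes' theorem with exact fractions, applied to the urn data above (variable names are illustrative; note that 30/118 reduces to 15/59):

```python
from fractions import Fraction

priors = [Fraction(1, 3)] * 3                                    # P(E1) = P(E2) = P(E3)
likelihoods = [Fraction(1, 5), Fraction(1, 3), Fraction(2, 11)]  # P(A/Ei)

evidence = sum(p * l for p, l in zip(priors, likelihoods))       # P(A)
posteriors = [p * l / evidence for p, l in zip(priors, likelihoods)]
print(posteriors)  # [33/118, 55/118, 15/59], matching the worked answers
```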

Theoretical Probability Distribution


Random Variable:- A real variable X associated with the outcome of a random experiment is called a
random variable. For example, if E consists of two tosses of a coin, we may consider the random variable
X which is the number of heads (0, 1 or 2).

Outcome:    HH  HT  TH  TT

Value of X:  2   1   1   0

P(X=0) = P(TT) = 1/4

P(X=1) = P[(HT), (TH)] = 1/2

P(X=2) = P(HH) = 1/4

We have P(X=0) + P(X=1) + P(X=2) = 1

Hence X is a random variable taking the values 0, 1, 2. As the values of X depend on chance, it is also
called a chance variable, stochastic variable or simply a variate.

Types of Random Variable


1. Discrete Random Variable:- If a random variable takes at most a countable number of values, it
is called a discrete random variable. In other words, a real-valued function defined on a discrete
sample space is called a discrete random variable.
2.​ Continuous Random Variable:- A random variable X is said to be continuous if it can take all
possible values between certain limits. In other words, a random variable is said to be
continuous when its different values cannot be put in 1-1 correspondence with a set of positive
integers.

Discrete Probability Distribution

1. Probability Mass Function:- Suppose X is a one-dimensional discrete random variable taking
at most a countably infinite number of values x1, x2, .... With each possible outcome xi, we
associate a number pi = P(X = xi) = p(xi), called the probability of xi. The numbers p(xi),
i = 1, 2, ..., must satisfy the following conditions:

(i) p(xi) ≥ 0 for all i,  (ii) ∑_{i=1}^{∞} p(xi) = 1

This function p is called the probability mass function of the random variable X, and the set (xi, p(xi))
is called the probability distribution (p.d.) of the random variable (r.v.) X.

Remarks: The set of values which X takes is called the spectrum of the random variable.
2. Discrete Distribution Function. If X is a discrete random variable, then the function F(x) defined as

F(x) = P(X ≤ x) = ∑_{i: xi ≤ x} p(xi)

is called the discrete distribution function of X; it is also known as the cumulative distribution
function (CDF) of X. The domain of the distribution function is (−∞, ∞) and its range is [0, 1]
(a small sketch follows).
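A small illustrative sketch of conditions (i)-(ii) and the CDF, using the two-coin-toss distribution from above (the dictionary representation is an assumption for illustration):

```python
from fractions import Fraction

pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}  # two-coin-toss X

def cdf(x):
    """F(x) = P(X <= x): accumulate p(xi) over all xi <= x."""
    return sum(p for xi, p in pmf.items() if xi <= x)

assert sum(pmf.values()) == 1  # condition (ii): the total mass is 1
print(cdf(1))    # 3/4
print(cdf(-5))   # 0  (F rises from 0 on the left to 1 on the right)
```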

Eg:- An experiment consists of three independent tosses of a fair coin. Let

X = the number of heads,
Y = the number of head runs,
Z = the length of head runs,

a head run being defined as a consecutive occurrence of at least two heads, its length then being the number of
heads occurring together in the three tosses of the coin. Find the probability function of X and construct the
probability tables and draw their probability charts.

Solution
S.No. Elementary Event X Y Z
1 HHH 3 1 3
2 HHT 2 1 2
3 HTH 2 0 0
4 HTT 1 0 0
5 THH 2 1 2
6 THT 1 0 0
7 TTH 1 0 0
8 TTT 0 0 0

Here the sample space is

S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}

Obviously X is a r.v. which can take the values 0, 1, 2 and 3.

p(3) = P(HHH) = (1/2)³ = 1/8

p(2) = P[HHT ∪ HTH ∪ THH] = P(HHT) + P(HTH) + P(THH) = 1/8 + 1/8 + 1/8 = 3/8

Similarly p(1) = 3/8 and p(0) = 1/8 (the sketch below also tabulates Y and Z).
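The full tables for X, Y and Z can be generated by enumerating the 8 equally likely outcomes; a hedged sketch (the helper head_run is an illustrative implementation of the definition above):

```python
from itertools import product
from collections import Counter

def head_run(seq):
    """Length of a run of at least two consecutive heads (0 if none)."""
    best = run = 0
    for t in seq:
        run = run + 1 if t == 'H' else 0
        best = max(best, run)
    return best if best >= 2 else 0

outcomes = list(product('HT', repeat=3))                  # 8 equally likely
X = Counter(s.count('H') for s in outcomes)               # number of heads
Y = Counter(1 if head_run(s) else 0 for s in outcomes)    # number of head runs
Z = Counter(head_run(s) for s in outcomes)                # length of head run

for name, c in (('X', X), ('Y', Y), ('Z', Z)):
    print(name, {k: f'{v}/8' for k, v in sorted(c.items())})
# X: {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}, matching the table above
```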

Eg:- A random variable X has the following probability distribution:

X:    0   1   2   3   4   5   6    7
P(x): 0   k   2k  2k  3k  k²  2k²  7k²+k

(i)​ Find k
(ii)​ Evaluate P(X<6)
(iii)​ Determine the distribution function of X
Solution.
i) Since ∑_{x=0}^{7} p(x) = 1, we have

0 + k + 2k + 2k + 3k + k² + 2k² + 7k² + k = 1

10k² + 9k − 1 = 0

(10k − 1)(k + 1) = 0

k = 1/10 (k = −1 is rejected, since a probability cannot be negative)

ii) P(X < 6) = P(X=0) + P(X=1) + ... + P(X=5) = 1/10 + 2/10 + 2/10 + 3/10 + 1/100 = 81/100
iii) The distribution function of X is:

X    F(x) = P(X ≤ x)
0    0
1    k = 1/10
2    3k = 3/10
3    5k = 5/10
4    8k = 4/5
5    8k + k² = 81/100
6    8k + 3k² = 83/100
7    9k + 10k² = 1
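A quick exact-arithmetic check of parts (i) and (ii) (illustrative only):

```python
from fractions import Fraction

k = Fraction(1, 10)
pmf = [0, k, 2*k, 2*k, 3*k, k**2, 2*k**2, 7*k**2 + k]  # P(X=0), ..., P(X=7)

print(sum(pmf))      # 1, so k = 1/10 is consistent with condition (ii)
print(sum(pmf[:6]))  # 81/100 = P(X < 6)
```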

Continuous Probability Distribution


1. Probability Density Function:- A continuous random variable has probability zero of
assuming exactly any of its values. The function f(x) is said to be the probability density
function of a continuous random variable X if it satisfies:

i) f(x) ≥ 0, −∞ < x < ∞

ii) ∫_{−∞}^{∞} f(x) dx = 1

iii) P(a < X < b) = ∫_a^b f(x) dx

2. Continuous Distribution Function:- If X is a continuous random variable, then the
distribution function is defined as

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt,  −∞ < x < ∞

Eg:- The diameter of an electric cable, say X, is assumed to be a continuous random variable
with p.d.f. f(x) = 6x(1 − x), 0 ≤ x ≤ 1.
(i) Check that the above is a p.d.f.
(ii) Determine a number b such that P(X < b) = P(X > b).
Solution.
i) Obviously, for 0 ≤ x ≤ 1, f(x) ≥ 0.

Now ∫_0^1 f(x) dx = 6 ∫_0^1 x(1 − x) dx = 6 ∫_0^1 (x − x²) dx = 6 [x²/2 − x³/3]_0^1 = 6(1/2 − 1/3) = 1

ii) P(X < b) = P(X > b)

∫_0^b f(x) dx = ∫_b^1 f(x) dx

6 ∫_0^b (x − x²) dx = 6 ∫_b^1 (x − x²) dx

[x²/2 − x³/3]_0^b = [x²/2 − x³/3]_b^1

b²/2 − b³/3 = (1/2 − 1/3) − (b²/2 − b³/3)

3b² − 2b³ = 1 − 3b² + 2b³

4b³ − 6b² + 1 = 0

(2b − 1)(2b² − 2b − 1) = 0

Hence b = 1/2 is the only real value lying between 0 and 1.
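A symbolic check of both parts, assuming the sympy library is available (an assumption; any CAS or numeric quadrature would do):

```python
import sympy as sp

x, b = sp.symbols('x b', real=True)
f = 6 * x * (1 - x)

print(sp.integrate(f, (x, 0, 1)))  # 1, so f is a valid p.d.f. on [0, 1]

# Roots of 4b^3 - 6b^2 + 1 = 0; only b = 1/2 lies in (0, 1).
roots = sp.solve(sp.Eq(sp.integrate(f, (x, 0, b)), sp.integrate(f, (x, b, 1))), b)
print(roots)
```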

Theoretical Discrete Probability Distribution

We will discuss theoretical discrete distributions in which variables are distributed according to some
definite probability law which can be expressed mathematically. The present study will also enable us
to fit a mathematical model or a function of the form y = p(x) to the observed data.

We have already defined the distribution function. This section is based on univariate distributions like
the Binomial and Poisson distributions.

Binomial Distribution:- The binomial distribution was discovered by James Bernoulli (1654-1705).
Let a random experiment be performed repeatedly and let the occurrence of an event in a trial
be called a success and its non-occurrence a failure. Consider a set of n independent Bernoulli trials
(n being finite), in which the probability 'p' of success in any trial is constant for each trial. Then q = 1 −
p is the probability of failure in any trial. In n independent Bernoulli trials the probability that there
will be x successes and n − x failures is given by:

P(X = x) = C(n,x) p^x q^(n−x),  x = 0, 1, 2, ..., n

Remarks

1. ∑_{x=0}^{n} P(x) = ∑_{x=0}^{n} C(n,x) p^x q^(n−x) = (q + p)^n = 1
2. Let us suppose that n trials constitute an experiment. Then if this experiment is repeated N
times, the frequency function of the binomial distribution is given by
f(x) = N P(x) = N C(n,x) p^x q^(n−x)
and the expected frequencies of 0, 1, 2, ..., n successes are the successive terms of the
binomial expansion N(q+p)^n, q + p = 1.
3. The binomial distribution is important not only because of its wide applicability but because it
gives rise to many other probability distributions. Tables for p(x) are available for various
values of n and p.
4. Physical conditions for the Binomial Distribution. We get the binomial distribution under the
following experimental conditions:
(i) Each trial results in two mutually disjoint outcomes, termed success and failure.
(ii) The number of trials 'n' is finite.
(iii) The trials are independent of each other.
(iv) The probability of success 'p' is constant for each trial.
Problems relating to tossing of a coin, throwing of dice, or drawing cards from a pack
of cards with replacement lead to the binomial probability distribution, as sketched below.
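A minimal sketch of the binomial p.m.f. (the function name is illustrative); it also anticipates the next example, at least seven heads in ten tosses:

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) = C(n, x) * p^x * q^(n - x), with q = 1 - p."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# The probabilities sum to (q + p)^n = 1:
assert abs(sum(binomial_pmf(x, 10, 0.5) for x in range(11)) - 1) < 1e-12

# At least seven heads in ten tosses of a fair coin:
print(sum(binomial_pmf(x, 10, 0.5) for x in range(7, 11)))  # 0.171875 = 176/1024
```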

Eg:- Ten coins are thrown simultaneously. Find the probability of getting at least seven heads.

Solution. p = probability of getting a head = 1/2

q = probability of not getting a head = 1/2

The probability of getting x heads in a random throw of 10 coins is

P(x) = C(10,x) p^x q^(10−x)

The probability of getting at least seven heads is given by

P(X ≥ 7) = P(7) + P(8) + P(9) + P(10)

= (1/2)^10 [C(10,7) + C(10,8) + C(10,9) + C(10,10)]

= (120 + 45 + 10 + 1)/1024 = 176/1024

Eg:- A and B play a game in which their chances of winning are in the ratio 3 : 2. Find A's chance of
winning at least three games out of the five games played.

Solution. Let p be the probability that 'A' wins the game. Then we are given p = 3/5, so q = 1 − p = 2/5.

Hence by the binomial probability law, the probability that out of 5 games played, A wins 'r' games is given
by:

P(X = r) = p(r) = C(5,r) (3/5)^r (2/5)^(5−r);  r = 0, 1, 2, 3, 4, 5

The required probability that 'A' wins at least three games is given by:

P(X ≥ 3) = ∑_{r=3}^{5} C(5,r) (3/5)^r (2/5)^(5−r)

= (3³/5⁵) [C(5,3) × 2² + C(5,4) × 3 × 2 + C(5,5) × 3² × 1] = 2133/3125 ≈ 0.68
Constants of the Binomial Distribution (checked numerically in the sketch below):-

1. Mean = np
2. Variance = npq
3. Skewness: β₁ = μ₃²/μ₂³ = (q−p)²/(npq)
4. γ₁ = (1−2p)/√(npq)
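These constants can be verified directly from the p.m.f.; a sketch for the illustrative case n = 10, p = 0.3 (values chosen arbitrarily):

```python
from math import comb

n, p = 10, 0.3               # illustrative values, not from the text
q = 1 - p
pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

mean = sum(x * px for x, px in enumerate(pmf))
var = sum((x - mean)**2 * px for x, px in enumerate(pmf))
print(mean, n * p)      # both ~3.0  (Mean = np)
print(var, n * p * q)   # both ~2.1  (Variance = npq)
```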

Poisson Distribution (as a limiting case of the Binomial Distribution). The Poisson distribution was discovered by
the French mathematician and physicist Simeon Denis Poisson (1781-1840), who published it in 1837.
The Poisson distribution is a limiting case of the binomial distribution under the following conditions:
(i) n, the number of trials, is indefinitely large, i.e., n → ∞.
(ii) p, the constant probability of success for each trial, is indefinitely small, i.e., p → 0.
(iii) np = λ (say) is finite. Thus p = λ/n, q = 1 − λ/n, where λ is a positive real number.
The probability of x successes is then given by

P(X = x) = e^(−λ) λ^x / x!,  x = 0, 1, 2, ..., where λ is known as the parameter of the distribution.
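A minimal sketch of the Poisson p.m.f. (the function name is illustrative); its partial sums approach 1, as remark 1 below states:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) = e^(-lam) * lam^x / x!"""
    return exp(-lam) * lam**x / factorial(x)

print(sum(poisson_pmf(x, 1.5) for x in range(50)))  # ~1.0
print(poisson_pmf(0, 1.5))  # 0.2231..., used in the car-hire example below
```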

Remarks:-

1. It should be noted that

∑_{x=0}^{∞} P(X = x) = e^(−λ) ∑_{x=0}^{∞} λ^x/x! = e^(−λ) e^λ = 1

2. Mean = Variance = λ
3. The corresponding distribution function is:

F(x) = P(X ≤ x) = ∑_{r=0}^{x} p(r) = e^(−λ) ∑_{r=0}^{x} λ^r/r!;  x = 0, 1, 2, ...
4.​ Poisson distribution occurs when there are events which do not occur as outcomes of a definite
number of trials (unlike that in binomial) of an experiment but which occur at random points of
time and space wherein our interest lies only in the number of occurrences of the event, not in
its non-occurrences.
5. Following are some instances where the Poisson distribution may be successfully employed:

● Number of deaths from a disease (not in the form of an epidemic) such as heart attack or cancer, or
due to snake bite.
● Number of suicides reported in a particular city.
● Number of defective items in a packing manufactured by a good concern.
● Number of faulty blades in a packet of 100.
● Number of air accidents in some unit of time.
● Number of printing mistakes on each page of a book.
● Number of telephone calls received at a particular telephone exchange.
● Number of cars passing a crossing per minute during the busy hours of a day.
● Number of fragments received by a surface area 't' from a fragmenting atom bomb.
● The emission of radioactive (alpha) particles.
Eg:- A car hire firm has two cars which it hires out day by day. The number of demands for a car on each
day is distributed as a Poisson variate with mean 1.5. Calculate the proportion of days on which (i) neither
car is used, and (ii) some demand is refused.

Solution. Since the number of demands for a car on any day is a Poisson variate with mean 1.5, the
proportion of days on which there are x demands for a car is

P(X = x) = e^(−1.5) (1.5)^x / x!,  x = 0, 1, 2, ...

(i) The proportion of days on which neither car is used is given by

P(X = 0) = e^(−1.5) = 0.2231

(ii) Some demand is refused when more than two cars are demanded:

P(X > 2) = 1 − P(X ≤ 2) = 1 − e^(−1.5) (1 + 1.5 + 1.5²/2!) = 1 − 3.625 × 0.2231 = 0.1913

Eg:- A manufacturer of cotter pins knows that 5% of his product is defective. If he sells cotter pins in
boxes of 100 and guarantees that not more than 10 pins will be defective, what is the approximate
probability that a box will fail to meet the guaranteed quality?

Solution. We are given n = 100.

Let p = probability of a defective pin = 5% = 0.05

λ = mean number of defective pins in a box of 100

= np = 100 × 0.05 = 5

Since 'p' is small, we may use the Poisson distribution.

The probability that a box will fail to meet the guaranteed quality is

P(X > 10) = 1 − P(X ≤ 10) = 1 − ∑_{x=0}^{10} e^(−5) 5^x / x!
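Evaluating this sum numerically (an illustrative check, not part of the original solution):

```python
from math import exp, factorial

lam = 5  # np = 100 * 0.05
p_at_most_10 = sum(exp(-lam) * lam**x / factorial(x) for x in range(11))
print(1 - p_at_most_10)  # ~0.0137
```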

Normal Distribution:- The normal distribution was first discovered in 1733 by De Moivre, who obtained
this continuous distribution as a limiting case of the binomial distribution and applied it to problems
arising in games of chance.

The normal distribution has wide application in the analysis and evaluation of experimental data in
Science, Engineering and Medicine. The normal distribution is also known as the Gaussian distribution.

A continuous random variable X is said to have the normal distribution if its probability density function
is given by

f(x; μ, σ) = (1 / (σ√(2π))) e^(−(1/2)((x−μ)/σ)²),  −∞ < x < ∞, −∞ < μ < ∞, σ > 0

where μ is the mean and σ is the standard deviation (a numerical sketch follows the list of
characteristics below).

Chief Characteristics of the Normal Distribution and Normal Probability Curve

1. The curve is bell shaped and symmetrical about the line x = μ.
2. Mean, median and mode of the distribution coincide.
3. As x increases numerically, f(x) decreases rapidly, the maximum probability occurring at the
point x = μ and given by [p(x)]_max = 1/(σ√(2π)).
4. β₁ = 0 and β₂ = 3.
5. Since f(x), being a probability density, can never be negative, no portion of the curve lies below the x-axis.
6. A linear combination of independent normal variates is also a normal variate.
7. The x-axis is an asymptote to the curve.
8. The total area under the curve is 1.
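A small numerical sketch of the density and the standardization Z = (X − μ)/σ, using the error function from Python's math module (an implementation choice, not from the text):

```python
from math import erf, exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma)**2) / (sigma * sqrt(2 * pi))

def normal_cdf(x, mu, sigma):
    # P(X <= x) via the error function: Phi(z) = (1 + erf(z/sqrt(2))) / 2
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

print(normal_pdf(0, 0, 1))  # 0.3989... = 1/sqrt(2*pi), the maximum at x = mu
print(normal_cdf(0, 0, 1))  # 0.5, by symmetry about x = mu
```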

Eg:- The mean yield for a one-acre plot is 662 kilos with s.d. 32 kilos. Assuming normal distribution, how
many one-acre plots in a batch of 1,000 plots would you expect to have yield over 700 kilos?

Solution. If the r.v. X denotes the yield (in kilos) for a one-acre plot, then we are given that X ~ N(μ, σ²)
where μ = 662 and σ = 32.

The probability that a plot has a yield over 700 kilos is given by

P(X > 700) = P(Z > 1.19), where Z = (X − 662)/32

= 0.5 − P(0 ≤ Z ≤ 1.19)

= 0.5 − 0.3830 = 0.1170

Hence in a batch of 1,000 plots, the expected number of plots with yield over 700 kilos is
1,000 × 0.117 = 117.
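A hedged numerical check of this example via the erf-based normal c.d.f. (the use of math.erf in place of a z-table is an implementation choice):

```python
from math import erf, sqrt

mu, sigma = 662, 32
z = (700 - mu) / sigma            # 1.1875, rounded to 1.19 in the table lookup
p = 0.5 * (1 - erf(z / sqrt(2)))  # P(Z > z)
print(p, round(1000 * p))         # ~0.117 -> about 117 plots out of 1,000
```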
