P&S UNIT-3 Probability and Distributions
Sample Space :
The set of all possible outcomes of a random experiment is called a sample space. It is denoted by S.
Ex: If we toss a coin, the possible outcomes are Head and Tail.
So, the sample space S = { H, T }.
Sample spaces are of two types. They are i) Discrete Sample Space
ii) Continuous Sample Space
Discrete & Continuous Sample Space:
If the number of outcomes in a sample space is finite, then it is called a Discrete Sample Space.
If the number of outcomes in a sample space is infinite, then it is called a Continuous Sample Space.
Ex: i) If we toss a coin, the possible outcomes are Head and Tail.
So, the sample space S = { H, T }, which is discrete.
ii) Measuring the height of a person. In this experiment, we get infinitely many possible outcomes, so the sample space is continuous.
Types of Events:
i) Simple Event : An event in a trial that cannot be split further is called a simple event or elementary event.
Ex: If we toss a coin, the possible outcomes are Head and Tail.
So, the sample space S = { H, T }.
In this case, E₁ = H and E₂ = T are simple events, because E₁ and E₂ cannot be split further.
ii) Equally Likely Events : Events are said to be equally likely if all of them have the same probability in a random experiment.
Ex: When a card is drawn from a pack, any card may be obtained. In this trial, all 52 cards (or events) have the same probability.
iii) Mutually Exclusive Events : Two events are said to be mutually exclusive if they cannot occur together, i.e., their intersection is empty.
Ex: If we toss a coin, the possible outcomes are Head and Tail.
So, the sample space S = { H, T }.
In this case, E₁ = H and E₂ = T are mutually exclusive events because E₁ ∩ E₂ = ∅.
iv) Exhaustive Events : A set of events that includes all possible outcomes of a random experiment is called exhaustive.
v) Complementary Events : Two events E₁ and E₂ are said to be complementary if their intersection is empty and their union is the entire sample space,
i.e., E₁ ∩ E₂ = ∅ and E₁ ∪ E₂ = S.
Ex: If we toss a coin, the possible outcomes are Head and Tail.
So, the sample space S = { H, T }.
In this case, E₁ = H and E₂ = T are complementary events because E₁ ∩ E₂ = ∅ and E₁ ∪ E₂ = S.
Sampling without replacement : The object that was drawn is put aside and is not returned before the next draw.
Axioms of Probability :
Addition theorem for three events:
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C).
If A and B are mutually exclusive (A ∩ B = ∅, so P(A ∩ B) = 0), then:
i) P(A ∪ B) = P(A) + P(B)
ii) P(A̅ ∩ B̅) = 1 − P(A ∪ B) = 1 − [P(A) + P(B)] = 1 − P(A) − P(B)
By De Morgan's laws, for any events A and B:
i) P(A̅ ∪ B̅) = P((A ∩ B)ᶜ) = 1 − P(A ∩ B)
ii) P(A̅ ∩ B̅) = P((A ∪ B)ᶜ) = 1 − P(A ∪ B)
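These identities can be checked by brute-force enumeration on a small sample space. The die and the events "even" and "greater than 3" below are illustrative choices, not from the notes; a minimal sketch:

```python
from fractions import Fraction

# Sample space for one roll of a fair six-sided die.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # event "even number"
B = {4, 5, 6}   # event "greater than 3"

def P(E):
    # Classical probability: favourable outcomes / total outcomes.
    return Fraction(len(E), len(S))

# Addition theorem: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
lhs = P(A | B)
rhs = P(A) + P(B) - P(A & B)
assert lhs == rhs

# De Morgan: P(A̅ ∩ B̅) = 1 − P(A ∪ B)
assert P((S - A) & (S - B)) == 1 - P(A | B)
```

Here A and B are not mutually exclusive (they share 4 and 6), so the subtraction of P(A ∩ B) is essential.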
Conditional Probability :
Let A and B be two events in a sample space S with P(A) ≠ 0. The probability of B after the event A has happened is called the conditional probability of B given A, denoted by P(B|A), and is defined as
P(B|A) = P(A ∩ B) / P(A) = n(A ∩ B) / n(A).
Similarly, if P(B) ≠ 0,
P(A|B) = P(A ∩ B) / P(B) = n(A ∩ B) / n(B).
Independent Events :
If the occurrence of the event B is not affected by the occurrence of the event A, then B is said to be independent of A, and in that case P(B|A) = P(B).
In that case the multiplication theorem becomes P(A ∩ B) = P(A) · P(B).
The probability of the occurrence of at least one of the independent events A, B, C is
P(A ∪ B ∪ C) = 1 − P((A ∪ B ∪ C)ᶜ) = 1 − P(A̅ ∩ B̅ ∩ C̅) = 1 − P(A̅) · P(B̅) · P(C̅) [since A̅, B̅, C̅ are independent events].
Bayes' Theorem :
Let E₁, E₂, E₃, …, Eₙ be n mutually exclusive and exhaustive events with P(Eᵢ) > 0 (i = 1, 2, …, n) in a sample space S, and let A be any other event in S intersecting every Eᵢ, with P(A) > 0.
If Eᵢ is any of the events E₁, E₂, E₃, …, Eₙ, where P(E₁), P(E₂), …, P(Eₙ) and P(A|E₁), P(A|E₂), …, P(A|Eₙ) are known, then
P(Eᵢ|A) = P(Eᵢ) · P(A|Eᵢ) / [P(E₁) · P(A|E₁) + P(E₂) · P(A|E₂) + … + P(Eₙ) · P(A|Eₙ)].
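The formula above can be sketched numerically. The factory scenario, the output shares, and the defect rates below are made-up numbers for illustration only:

```python
# Hypothetical example: three machines E1, E2, E3 produce 30%, 45%, 25% of
# the output, with defect rates 2%, 3%, 2%. Given that an item is defective
# (event A), Bayes' theorem gives the probability it came from each machine.
priors = [0.30, 0.45, 0.25]        # P(E_i)
likelihoods = [0.02, 0.03, 0.02]   # P(A | E_i)

# Total probability: P(A) = Σ P(E_i) · P(A | E_i)  -- the denominator above.
p_a = sum(p * l for p, l in zip(priors, likelihoods))

# Bayes: P(E_i | A) = P(E_i) · P(A | E_i) / P(A)
posteriors = [p * l / p_a for p, l in zip(priors, likelihoods)]
```

Note that the posteriors always sum to 1, because the Eᵢ are exhaustive.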
Random Variable:
A random variable is a real-valued function that assigns real values to the sample points.
i.e., X : S → R is a random variable, defined by X(s) = r, where the domain S is a sample space and the co-domain R is the set of all real numbers.
Example:
If we toss two coins at a time, the sample space contains the outcomes (or sample points)
S = {s₁, s₂, s₃, s₄}, where s₁ = HH, s₂ = HT, s₃ = TH, s₄ = TT, and
X(s₁) = 2, X(s₂) = 1, X(s₃) = 1, X(s₄) = 0 (the number of heads).
In this example, the function X assigns real values to the sample points in the sample space, so it is a random variable.
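The two-coin example can be sketched directly as a mapping from sample points to real values:

```python
from itertools import product

# Sample space for tossing two coins: S = {HH, HT, TH, TT}
S = [''.join(t) for t in product('HT', repeat=2)]

# The random variable X assigns to each sample point the number of heads.
X = {s: s.count('H') for s in S}
# X maps HH -> 2, HT -> 1, TH -> 1, TT -> 0
```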
A distribution function F(x) satisfies i) 0 ≤ F(x) ≤ 1 and ii) F(−∞) = 0 and F(∞) = 1.
The function P(X) is called a Probability Mass Function if it satisfies two conditions. They are i) every pᵢ ≥ 0 and ii) Σ pᵢ = 1.
For the two-coin example above, X(s₁) = 2, X(s₂) = 1, X(s₃) = 1, X(s₄) = 0.
Here,

X            0      1      2
P(xᵢ) = pᵢ   1/4    2/4    1/4

Clearly, every P(xᵢ) = pᵢ > 0, i = 1, 2, 3, and Σᵢ₌₁³ P(xᵢ) = 1/4 + 2/4 + 1/4 = 1.
So, this function is a Probability Mass Function.
In this concept, we will learn how to find the expectation (mean), variance, and standard deviation of a distribution function.
Expectation : E(X) = Σᵢ pᵢ xᵢ. Similarly, E(Xʳ) = Σᵢ pᵢ xᵢʳ.
1. E(X + K) = E(X) + K
2. E(K) = K
3. E(KX) = K · E(X)
4. E(aX ± b) = a · E(X) ± b
5. E(X + Y) = E(X) + E(Y)
6. E(X + Y + Z) = E(X) + E(Y) + E(Z)
7. If X, Y are independent random variables, then E(XY) = E(X) · E(Y).
Similarly, if X, Y, Z are independent random variables, then E(XYZ) = E(X) · E(Y) · E(Z).
Note :
1. E(1/X) and 1/E(X) are not the same.
2. Sometimes we denote the mean by μ.
Mean:
The mean value μ of the discrete distribution function is given by
μ = (Σ pᵢ xᵢ) / (Σ pᵢ) = (Σ pᵢ xᵢ) / 1 = Σ pᵢ xᵢ = E(X).
Variance:
If X is a random variable, then the variance of X is defined as the mathematical expectation of (X − μ)².
i.e., Var(X) = V(X) = E[(X − μ)²] = E(X²) − [E(X)]² = E(X²) − μ² = Σᵢ pᵢ xᵢ² − μ².
Note:
1. The variance of a constant is zero, i.e., V(K) = 0, where K is a constant.
2. If K is a constant, then V(KX) = K² · V(X).
3. If X is a random variable and K is a constant, then V(X + K) = V(X).
4. If X is a discrete random variable and a, b are constants, then V(aX + b) = a² · V(X).
5. If X and Y are two independent random variables, then V(X + Y) = V(X) + V(Y).
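The mean and variance formulas above can be sketched on the two-coin PMF from the earlier table:

```python
from fractions import Fraction

# PMF from the two-coin example: X = number of heads.
pmf = {0: Fraction(1, 4), 1: Fraction(2, 4), 2: Fraction(1, 4)}

# Mean: μ = Σ p_i x_i
mu = sum(p * x for x, p in pmf.items())

# Variance: Var(X) = E(X²) − μ² = Σ p_i x_i² − μ²
var = sum(p * x**2 for x, p in pmf.items()) - mu**2
```

Using `Fraction` keeps the arithmetic exact: μ = 1 and Var(X) = 1/2 for this distribution.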
Let f(x) be the probability density function of the random variable X. Then
i) Expectation (Mean) : μ = E(X) = ∫_{−∞}^{∞} x · f(x) dx.
If X is defined from a to b, then μ = E(X) = ∫_{a}^{b} x · f(x) dx.
ii) Variance : σ² = ∫_{−∞}^{∞} x² · f(x) dx − μ².
If X is defined from a to b, then σ² = ∫_{a}^{b} x² · f(x) dx − μ².
iii) Median : To get the value of the median M, we solve ∫_{a}^{M} f(x) dx = ∫_{M}^{b} f(x) dx = 1/2.
iv) Mean Deviation : m = ∫_{−∞}^{∞} |x − μ| · f(x) dx.
Note:
1. The cumulative distribution function of a continuous random variable X is denoted by F(x) and is defined as
F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt.
2. P(X ≤ k) = ∫_{−∞}^{k} f(x) dx, P(X ≥ k) = ∫_{k}^{∞} f(x) dx, and P(X > k) = 1 − P(X ≤ k), where k is a constant.
3. P(a ≤ x ≤ b) = ∫_{a}^{b} f(x) dx and P(a < x < b) = ∫_{a}^{b} f(x) dx.
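The integral formulas above can be sketched numerically. The density f(x) = 2x on [0, 1] is a hypothetical example (not from the notes), and the integrals are approximated with a simple midpoint Riemann sum:

```python
# Hypothetical density: f(x) = 2x on [0, 1], zero elsewhere.
def f(x):
    return 2 * x

def integrate(g, a, b, n=10_000):
    # Midpoint Riemann sum approximation of the definite integral.
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, 0, 1)                              # = 1, so f is a valid density
mu = integrate(lambda x: x * f(x), 0, 1)                # E(X) = 2/3
var = integrate(lambda x: x**2 * f(x), 0, 1) - mu**2    # 1/2 − 4/9 = 1/18
```

In practice one would evaluate these integrals analytically when possible; the numeric sum just confirms the formulas.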
In some random experiments, we cannot express all possible outcomes of an event according to the number of trials.
In that case, we have to use the following theoretical distributions. They are
1. Binomial Distribution
2. Poisson Distribution
1) Definition:
A random variable X has a Binomial distribution if it assumes only non-negative values and its probability mass function is given by
P(X = r) = P(r) = nCr · pʳ · qⁿ⁻ʳ for r = 0, 1, 2, …, n, and 0 otherwise.
2) Mean:
The mean of a Binomial distribution is E(X) = μ = n·p.
3) Variance :
The variance of a Binomial distribution is V(X) = n·p·q.
4) Standard Deviation :
The standard deviation of a Binomial distribution is √(n·p·q).
5) The Binomial frequency distribution is given by N(p + q)ⁿ, where N = total frequency.
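The PMF, mean, and variance formulas can be sketched with Python's standard library; n = 10 and p = 0.3 are arbitrary illustrative values:

```python
from math import comb

# Binomial PMF: P(X = r) = nCr · p^r · q^(n−r)
def binom_pmf(r, n, p):
    q = 1 - p
    return comb(n, r) * p**r * q**(n - r)

n, p = 10, 0.3
pmf = [binom_pmf(r, n, p) for r in range(n + 1)]

# Mean and variance computed from the PMF should match n·p and n·p·q.
mean = sum(r * pr for r, pr in enumerate(pmf))
var = sum(r**2 * pr for r, pr in enumerate(pmf)) - mean**2
```

The probabilities sum to 1 because (p + q)ⁿ = 1ⁿ = 1 by the binomial theorem.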
Definition :
An event associated with a random trial is called Success, and the complementary event is called Failure. The probability of success is denoted by p and the probability of failure by q.
Ex: A fair coin is tossed once. What is the probability of getting a Head?
In this question, our random experiment is associated with the outcome Head. So, in this example,
p = probability of success = probability of getting a Head = 1/2
q = probability of failure = probability of not getting a Head = 1 − p = 1/2
1) Definition : A random variable X is said to follow a Poisson distribution if it assumes only non-negative values and its probability mass function is given by
P(X = r) = P(r) = e⁻�James λʳ / r! for r = 0, 1, 2, …, and 0 otherwise.
2) Mean :
The mean of a Poisson distribution is E(X) = μ = λ (= n·p).
3) Variance :
The variance of a Poisson distribution is V(X) = λ (= n·p).
4) Standard Deviation :
The standard deviation of a Poisson distribution is σ = √λ = √(n·p).
5) The Poisson distribution function is F_X(x) = P(X ≤ x) = e⁻λ Σ_{r=0}^{x} λʳ / r!.
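The Poisson PMF and the equality of mean and variance can be sketched numerically; λ = 2.5 is an arbitrary illustrative value, and the infinite sums are truncated where the tail is negligible:

```python
from math import exp, factorial

# Poisson PMF: P(X = r) = e^(−λ) · λ^r / r!
def poisson_pmf(r, lam):
    return exp(-lam) * lam**r / factorial(r)

# Distribution function: F(x) = P(X ≤ x) = e^(−λ) Σ_{r=0..x} λ^r / r!
def poisson_cdf(x, lam):
    return sum(poisson_pmf(r, lam) for r in range(x + 1))

lam = 2.5
# Truncated sums over r = 0..99 (the tail beyond is negligibly small).
terms = range(100)
mean = sum(r * poisson_pmf(r, lam) for r in terms)
var = sum(r**2 * poisson_pmf(r, lam) for r in terms) - mean**2
```

Both `mean` and `var` come out equal to λ, which is the distinguishing property of the Poisson distribution.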
Normal Distribution:
A random variable X is said to have a Normal distribution if its density function or probability distribution is given by
f(x; μ, σ) = (1 / (σ√(2π))) · e^(−(x−μ)² / (2σ²)), −∞ < x, μ < ∞, σ > 0,
where μ = mean and σ = standard deviation.
Standardized Variable:
If a random variable X has mean μ and standard deviation σ, then the corresponding variable z = (x − μ)/σ is called the standardized variable corresponding to X.
1. The shape of a normal curve is bell-shaped and symmetrical with respect to the mean (x = μ, or z = 0).
2. The area under the normal curve represents the total population.
3. The x-axis is an asymptote to the curve.
4. The total area under the normal curve is 1.
5. It extends from −∞ to ∞.
6. The mean, median, and mode of the normal distribution coincide at z = 0, so the normal curve is unimodal (it has only one maximum point).
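Standardization is how normal probabilities are computed in practice: convert x to z, then look up the standard normal CDF. The mean 50, standard deviation 10, and cut-off 65 below are hypothetical values; the CDF is expressed through the error function from Python's standard library:

```python
from math import erf, sqrt

# Standard normal CDF via the error function: Φ(z) = (1 + erf(z/√2)) / 2
def phi(z):
    return (1 + erf(z / sqrt(2))) / 2

# Hypothetical X with mean μ = 50 and standard deviation σ = 10.
mu, sigma = 50, 10
x = 65

# Standardize: z = (x − μ)/σ, then P(X ≤ x) = Φ(z).
z = (x - mu) / sigma     # = 1.5
prob = phi(z)            # P(X ≤ 65) = Φ(1.5) ≈ 0.9332
```

The symmetry property from item 1 shows up as Φ(0) = 1/2 and Φ(−z) = 1 − Φ(z).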