
Probability Distribution
Module 3
Topics Covered

Concept of probability
Counting rules for determining the number of outcomes - Permutation and Combination
Rules of probability - Addition and Multiplication, Bayes' Theorem
Concept of probability distribution
Theoretical probability distributions - Binomial, Poisson, Normal
(Problems only on Binomial, Poisson and Normal)
Concept of Probability

❖ Probability denotes the possibility of the outcome of any random event; it expresses the extent to which an event is likely to happen.
❖ Probability is the measure of the likelihood of an event happening. It measures the
certainty of the event.
❖ The formula for probability is given by:

P(E) = n(E)/n(S)
Here,
n(E) = Number of outcomes favorable to event E
n(S) = Total number of outcomes
For example, when we flip a coin in the air, what is
the probability of getting a head? There are 2 equally likely outcomes and 1 of them is a head, so P(head) = 1/2.
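As a quick illustration of the formula P(E) = n(E)/n(S), here is a minimal Python sketch; the die example is an added illustration, not from the slides:

```python
# Classical probability: favorable outcomes divided by total outcomes.
from fractions import Fraction

def probability(favorable: int, total: int) -> Fraction:
    """P(E) = n(E) / n(S) as an exact fraction."""
    return Fraction(favorable, total)

print(probability(1, 2))   # getting a head on one coin flip -> 1/2
print(probability(3, 6))   # rolling an even number on a die -> 1/2
```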
Terminologies to be known

•Random Experiment: An experiment whose result cannot be predicted until it is
observed is called a random experiment. For example, when we throw a die,
the result is uncertain to us. We can get any outcome from 1 to 6.
•Random Variables: The variables which denote the possible outcomes of a random
experiment are called random variables. They are of two types:
1. Discrete Random Variables: take only distinct, countable values.
2. Continuous Random Variables: can take an infinite number of possible values.
•Independent Event: When the probability of occurrence of one event has no impact on
the probability of another event, the two events are termed independent of each
other. For example, flipping a coin and throwing a die at the same time: the outcome of one does not affect the other.
Fundamental Counting Principle

•The Fundamental Counting Principle (FCP) is a way to figure out the
number of possible outcomes for a given situation.
•The fundamental counting principle is a rule which counts all the possible
ways for an event to happen, or the total number of possible outcomes in a
situation.
•It states that when there are n ways to do one thing and m ways to do
another thing, the number of ways to do both things can be
obtained by taking their product. This is expressed as n × m.
•This is called the multiplicative rule.
Example: An ice cream seller sells 3 flavors of ice cream,
vanilla, chocolate and strawberry, giving his customers 6
different choices of cones.
How many choices of ice cream does Wendy have if she goes
to this ice cream seller? She has 3 × 6 = 18 choices.
Multiplicative rule

Example: Suppose you have 2 pairs of shoes and 3 pairs of
socks. In how many ways can you wear them? The
possible ways of choosing a pair of shoes are 2, since 2
pairs of shoes are available, and for each of these there are 3 choices of socks,
giving 2 × 3 = 6 ways in all (see the sketch below).
Can you tell?
Another example: Suppose you have 3 pairs of shoes and 4
pairs of socks. In how many ways can you wear them? 3 × 4 = 12 ways.
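Here is a small Python sketch of the multiplicative rule for the shoes-and-socks example above; the item names are made up for illustration:

```python
# Count outfits by enumerating every (shoes, socks) pair.
from itertools import product

shoes = ["shoes-1", "shoes-2"]             # 2 pairs of shoes
socks = ["socks-A", "socks-B", "socks-C"]  # 3 pairs of socks

outfits = list(product(shoes, socks))
print(len(outfits))                 # 6, i.e. every pairing listed explicitly
print(len(shoes) * len(socks))      # multiplicative rule gives the same count
```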
Permutation and Combination

❖ Permutation and Combination are the ways to select
certain objects from a group of objects to form subsets, with or
without replacement.
❖ They define the various ways to arrange or select a certain group of data.
❖ When we arrange the selected data or objects in a definite order, it is
said to be a permutation,
❖ whereas a selection in which the order does not matter is called a
combination.
Permutation rule
Permutation relates to the act of arranging all the members of a set
into some sequence or order. In other words, if the set is already ordered,
then the rearranging of its elements is called the process of permuting.
Permutation Formula
A permutation is the choice of r things from a set of n things without
replacement and where the order matters:
nPr = n! / (n − r)!
Combination rule
The combination is a way of selecting items from a collection, such that
(unlike permutations) the order of selection does not matter. In smaller
cases, it is possible to count the number of combinations directly. Combination refers to the
selection of n things taken k at a time without repetition.
Combination Formula
A combination is the choice of r things from a set of n things without replacement and
where order does not matter:
nCr = n! / (r! (n − r)!)
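As a sketch of the two formulas above, the Python standard library's math.perm and math.comb (Python 3.8+) compute nPr and nCr directly; the values n = 5 and r = 2 are arbitrary illustrations:

```python
import math

n, r = 5, 2
print(math.perm(n, r))   # nPr = n!/(n-r)!      -> 20 ordered arrangements
print(math.comb(n, r))   # nCr = n!/(r!(n-r)!)  -> 10 unordered selections

# The same values from the factorial definitions, for comparison:
print(math.factorial(n) // math.factorial(n - r))                        # 20
print(math.factorial(n) // (math.factorial(r) * math.factorial(n - r)))  # 10
```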
Basic Probability Rules
1) Possible values for probabilities range from 0 to 1
0 = impossible event
1 = certain event

2) Probability is described as a fraction between 0 and 1, zero being a surety that the event
will not occur (impossible event) and one being a certainty that it will occur (certain event).
This translates directly to percentage, 0% to 100%, or decimal, 0.0 to 1.0.

3) The sum of all the probabilities for all possible outcomes is equal to 1.

4) Marginal Probability, or unconditional probability, is the likelihood of an event occurring
without the influence of other events.
It is written P(A), or the probability (P) of an event (A) occurring. Drawing a 3 from
the deck on the first draw would be marginal, because no other factors have
influenced the cards.

5) Addition Rule - Whenever an event is the union of two other events, the
probability that one or both events occur is:
mutually exclusive events:
P(A or B) = P(A) + P(B)
not mutually exclusive events:
P(A or B) = P(A) + P(B) - P(A and B)
P(A∪B) = P(A) + P(B) − P(A∩B)
6) Multiplication Rule - the probability that both events occur
together
Independent events: P(A and B) = P(A) * P(B)
Dependent events: P(A and B) = P(A) * P(B|A)
P(A∩B)=P(A)⋅P(B∣A)
Also known as Joint Probability -the likelihood of two events
occurring at the same time.
Ex: Drawing both a queen and a diamond would be joint, because
the card can be both a diamond and a queen.
7) Conditional Probability - When event A is already known to
have occurred and the probability of event B is desired, then
P(B given A) = P(A and B) / P(A). It can be vice-versa in the case of event B.
P(B∣A) = P(A∩B) / P(A)
8) Complementary Rule: Whenever an event is the complement of
another event, specifically, if A is an event, then P(not A) = 1 − P(A), or
P(A') = 1 - P(A).
P(A) + P(A′) = 1
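A short Python sketch of the addition, multiplication and complement rules, using the queen/diamond card example mentioned above; the exact counts assume a standard 52-card deck:

```python
from fractions import Fraction

P_queen   = Fraction(4, 52)            # marginal probability of drawing a queen
P_diamond = Fraction(13, 52)           # marginal probability of drawing a diamond
P_queen_and_diamond = Fraction(1, 52)  # the queen of diamonds (joint probability)

# Addition rule (not mutually exclusive): P(A or B) = P(A) + P(B) - P(A and B)
print(P_queen + P_diamond - P_queen_and_diamond)   # 16/52 = 4/13

# Multiplication rule for independent events: P(A and B) = P(A) * P(B)
# (rank and suit are independent on a single draw)
print(P_queen * P_diamond)                          # 1/52

# Complement rule: P(not A) = 1 - P(A)
print(1 - P_queen)                                  # 48/52 = 12/13
```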
Bayes' Theorem
•Bayes' theorem is a theorem in probability and statistics, named after the
Reverend Thomas Bayes, that helps in determining the probability of an event
based on some event that has already occurred.
•Bayes' theorem is based on finding P(A | B) when P(B | A) is given.
•Bayes' theorem, in simple words, determines the conditional probability of
event A given that event B has already occurred, based on the following:
• Probability of B given A
• Probability of A
• Probability of B
Bayes' rule states that the conditional probability of an event A, given the occurrence of another event
B, is equal to the product of the likelihood of B given A and the probability of A, divided by the
probability of B. It is given as:
P(A | B) = P(B | A) · P(A) / P(B)
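A minimal Python sketch of Bayes' theorem; the medical-test numbers below are purely illustrative assumptions, not from the slides:

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
def bayes(p_b_given_a: float, p_a: float, p_b: float) -> float:
    return p_b_given_a * p_a / p_b

p_disease = 0.01              # P(A): prior probability of having the disease
p_pos_given_disease = 0.95    # P(B|A): probability of a positive test if diseased
p_pos_given_healthy = 0.05    # false-positive rate (assumed)

# Total probability of a positive test, P(B):
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Probability of disease given a positive test:
print(bayes(p_pos_given_disease, p_disease, p_pos))   # ~0.161
```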
Probability Distribution
● A probability distribution is an idealized frequency distribution.
● A probability distribution is a table or an equation that
interconnects each outcome of a statistical experiment with its
probability of occurrence.
● In simple words, a probability distribution gives the possibility
of every outcome pertaining to a random experiment or event. It
sets forth the probabilities of the various possible occurrences.
● Definition: A probability distribution is the set of all
possible outcomes of a random experiment or event, together with their probabilities.
•A random experiment is an experiment whose result cannot be predicted
in advance. For instance,
while flipping a coin, one cannot predict the outcome, that
is, whether it will be a head or a tail.
•P(X = x) denotes the probability that the random variable X
takes a particular value x. For
example, P(X = 1) is the probability that the
random variable X is equal to 1.
Types of Probability Distribution:

❖ There are two types of probability distribution
which are used for distinct purposes and various
types of data generation processes:
1. Normal or Continuous Probability Distribution
2. Binomial or Discrete Probability Distribution
Normal / Continuous Probability Distribution
In this distribution, the set of all possible outcomes can take
their values on a continuous range.
For example, a variable that can take any real-number value in a
range follows a continuous distribution. Real-life
scenarios such as the temperature on a day are examples of
continuous distributions.
Binomial / Discrete Probability Distribution
It is also termed a discrete probability
function, where the set of outcomes is discrete in
nature. For example, if a die is rolled, all its
possible outcomes are discrete, and each outcome
carries a probability mass. Such a distribution is also described
by a probability mass function.
•In probability theory and statistics, the binomial
distribution is the discrete probability distribution that
gives only two possible results in an experiment, either
Success or Failure.
•In the binomial probability distribution,
success/yes/true/one is represented with probability ‘p’
and failure/no/false/zero with probability ‘q’, where q
= 1 − p.
•The binomial distribution formula, for any random variable X, is given by:

P(X = r) = nCr p^r (1 − p)^(n − r)

or

P(X = r) = nCr p^r q^(n − r)

Where,
n = the number of experiments
x = r = number of successes desired
p = Probability of Success in a single experiment
q = Probability of Failure in a single experiment = 1 – p
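A small Python sketch of the binomial formula above; the example values (n = 10 trials of a fair coin, p = 0.5) are assumptions for illustration:

```python
# Binomial probability: P(X = r) = nCr * p^r * (1 - p)^(n - r)
from math import comb

def binomial_pmf(n: int, r: int, p: float) -> float:
    return comb(n, r) * p**r * (1 - p)**(n - r)

print(binomial_pmf(10, 3, 0.5))   # P(exactly 3 heads in 10 flips) ~ 0.117
print(sum(binomial_pmf(10, r, 0.5) for r in range(11)))  # probabilities sum to 1.0
```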
Conditions

The binomial distribution describes the behavior of a
count variable X if the following conditions apply:
1: The number of observations n is fixed.
2: Each observation is independent.
3: Each observation represents one of two outcomes
("success" or "failure").
4: The probability of "success" p is the same for each
outcome.
Poisson Distribution
❖ In 1830, the Poisson distribution model was introduced by
Siméon Denis Poisson. Poisson distribution definition is used to
model a discrete probability of an event where independent
events are occurring in a fixed interval of time and have a known
constant mean rate.
❖ In other words, the Poisson distribution is used to estimate how
many times an event is likely to occur within a given period of
time. λ is the Poisson rate parameter: the expected
number of events in the fixed time interval.
❖ The Poisson distribution formula is applied when there is a large number
of possible outcomes. For a random discrete variable X that follows the
Poisson distribution, and λ is the average rate of value, then the
probability of x is given by:

f(x) = P(X = x) = (e^(−λ) λ^x) / x!

Where
x = 0, 1, 2, 3, ...
e is Euler's number (e ≈ 2.718)
λ is the average rate, i.e. the expected number of events in the interval; λ also equals the variance, and λ > 0
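A short Python sketch of the Poisson formula above; the rate λ = 4 events per interval is an illustrative assumption:

```python
# Poisson probability: P(X = x) = e^(-lambda) * lambda^x / x!
from math import exp, factorial

def poisson_pmf(x: int, lam: float) -> float:
    return exp(-lam) * lam**x / factorial(x)

lam = 4.0                        # average number of events per interval (assumed)
print(poisson_pmf(2, lam))       # P(exactly 2 events) ~ 0.1465
print(poisson_pmf(0, lam))       # P(no events)        ~ 0.0183
```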
Properties of Poisson Distribution

•The events are independent.
•The Poisson distribution is the limiting case of the binomial distribution
when the number of trials n is indefinitely large and p is very small.
•For the Poisson distribution, the mean and the variance of the
distribution are equal (mean = variance = λ).
•np = λ is finite, where λ is constant.
•For the Poisson distribution, λ is always greater than
zero.
Normal Distribution

•In a normal distribution, data is symmetrically distributed with no skew.
When plotted on a graph, the data follows a bell shape, with most values
clustering around a central region and tapering off as they go further away
from the center.
•Normal distributions are also called Gaussian distributions or bell curves
because of their shape.
•The mean, median and mode are exactly the same.
•The distribution is symmetric about the mean—half the values fall below the mean
and half above the mean.
•The distribution can be described by two values: the mean and the standard
deviation.
(The mean is the location parameter while the standard
deviation is the scale parameter. The mean determines where
the peak of the curve is centered. Increasing the mean moves
the curve right, while decreasing it moves the curve left)
Empirical rule

•The empirical rule, or the 68-95-99.7 rule, tells you where most of your
values lie in a normal distribution:
•Around 68% of values are within 1 standard deviation from the mean.
•Around 95% of values are within 2 standard deviations from the mean.
•Around 99.7% of values are within 3 standard deviations from the mean.
Formula of the normal curve
•Once you have the mean and standard deviation of a normal distribution, you can
fit a normal curve to your data using a probability density function:

f(x) = (1 / (σ√(2π))) e^(−(x − μ)² / (2σ²))

•In a probability density function, the area under the curve tells you probability.
The normal distribution is a probability distribution, so the total area under
the curve is always 1, or 100%.
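A brief Python sketch of the normal density function above and a numerical check of the 68-95-99.7 rule, using math.erf for the normal CDF; the standard-normal parameters (μ = 0, σ = 1) are chosen for illustration:

```python
from math import exp, pi, sqrt, erf

def normal_pdf(x: float, mu: float, sigma: float) -> float:
    """Height of the normal curve at x."""
    return exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

def normal_cdf(x: float, mu: float, sigma: float) -> float:
    """Area under the normal curve to the left of x."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

mu, sigma = 0.0, 1.0
print(round(normal_pdf(mu, mu, sigma), 4))   # peak height at the mean ~ 0.3989

for k in (1, 2, 3):   # area within k standard deviations of the mean
    area = normal_cdf(mu + k * sigma, mu, sigma) - normal_cdf(mu - k * sigma, mu, sigma)
    print(k, round(area, 4))   # ~0.6827, 0.9545, 0.9973
```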
What is the standard normal distribution?

•The standard normal distribution, also called the z-distribution, is a
special normal distribution where the mean is 0 and the standard deviation is 1.
•Every normal distribution is a version of the standard normal distribution.
•While individual observations from normal distributions are referred to as x,
they are referred to as z in the z-distribution. Every normal distribution can be
converted to the standard normal distribution by turning the individual values
into z-scores.
•Z-scores tell you how many standard deviations away from the mean each value
lies.

Standard normal distribution
•Each normal random variable X can easily be converted into a z-score
using the normal distribution z formula:
•z = (X − μ) / σ
•X is a normal random variable.
•μ is the mean of the data.
•σ is the standard deviation of the data.
Eg: If X = 69, what will be the value of Z if the mean is 66 and the S.D. is 1.5?
Z = (69 − 66) / 1.5 = 2.
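A tiny Python sketch of the z-score formula, reproducing the example on this slide:

```python
# z-score: how many standard deviations a value lies from the mean.
def z_score(x: float, mu: float, sigma: float) -> float:
    return (x - mu) / sigma

print(z_score(69, 66, 1.5))   # 2.0
```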
https://greenbeltacademy.com/gb-body-of-knowledge/measure/normal-distribution/
