Module 1 Lecture 4 - Probability Distributions
Probability Theory
1. Sample space
2. Event
3. Outcome
4. Trial
Probability Theory Basics
Random Experiments Example
• Suppose a coin is tossed. The two possible outcomes are getting a head or a tail.
The outcome of this experiment cannot be predicted before it has been performed.
Furthermore, it can be conducted many times under the same conditions. Thus,
tossing a coin is an example of a random experiment.
• If the sample space S consists of a finite (or countably infinite) number of outcomes, assign a probability to each outcome.
• The sum of all probabilities equals 1.
• If there are k (finite) outcomes in the sample space S, all equally likely, then each individual outcome has probability 1/k (a short sketch follows this list).
• Exhaustive Events
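A minimal Python sketch of the equally-likely assignment above; the coin sample space is only an illustration:

# Equally likely outcomes: each of the k outcomes is assigned probability 1/k
S = ["H", "T"]                 # sample space of a single coin toss
k = len(S)
P = {outcome: 1 / k for outcome in S}

print(P)                       # {'H': 0.5, 'T': 0.5}
print(sum(P.values()))         # 1.0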
A. Random Variable
B. Discrete Probability Distributions
C. Bernoulli Distribution
D. Binomial Distribution
E. Geometric Distribution
F. Poisson Distribution
Discrete Probability Distributions
A. Random Variable
• In probability, a random variable is a real-valued function whose domain is the sample space of the random experiment.
• It means that each outcome of a random experiment is associated with a single real number, and
the single real number may vary with the different outcomes of a random experiment.
• Hence, it is called a random variable and it is generally represented by the letter “X”.
For example, let us consider an experiment of tossing a coin two times.
Hence, the sample space for this experiment is S = {HH, HT, TH, TT}
If X is a random variable and it denotes the number of heads obtained, then its values are represented as follows:
X(HH) = 2, X(HT) = 1, X(TH) = 1, X(TT) = 0.
Similarly, we can define the number of tails obtained using another variable, say Y,
i.e., Y(HH) = 0, Y(HT) = 1, Y(TH) = 1, Y(TT) = 2.
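A minimal Python sketch of this mapping from outcomes to real numbers; the variable names are illustrative only:

from itertools import product

# Sample space for tossing a coin two times
S = ["".join(t) for t in product("HT", repeat=2)]   # ['HH', 'HT', 'TH', 'TT']

# Random variables as functions on the sample space:
# X = number of heads, Y = number of tails
X = {outcome: outcome.count("H") for outcome in S}
Y = {outcome: outcome.count("T") for outcome in S}

print(X)   # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}
print(Y)   # {'HH': 0, 'HT': 1, 'TH': 1, 'TT': 2}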
Discrete Probability Distributions
Random Variable
A random variable can be either discrete or continuous.
A. Discrete random variables have a countable number of outcomes
Examples: dead/alive, treatment/placebo, dice, counts, etc.
• A binary random variable is a discrete random variable where the finite set of outcomes is in {0, 1}.
• A categorical random variable is a discrete random variable where the finite set of outcomes is in {1, 2, …, K}, where K is the total number of unique outcomes.
Discrete Probability Distributions
(Example: roll of a die)
• The relationship between the events for a discrete random variable and their probabilities is called the discrete probability distribution or probability mass function (PMF).
• For outcomes that can be ordered, the probability of an event equal to or less than a given value is defined by the cumulative distribution function (CDF).
• The inverse of the CDF is called the percentage-point function (PPF) and will give the discrete outcome that is less than or equal to a probability.
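As a sketch of these three functions, the snippet below models a fair six-sided die; the use of scipy.stats.randint is one convenient choice, not the only one:

from scipy.stats import randint

# Fair six-sided die: integer outcomes 1..6, each with probability 1/6
die = randint(1, 7)   # upper bound is exclusive

print(die.pmf(4))     # PMF: P(X = 4) = 1/6 ≈ 0.1667
print(die.cdf(4))     # CDF: P(X <= 4) = 4/6 ≈ 0.6667
print(die.ppf(0.5))   # percentage-point function: smallest x with CDF(x) >= 0.5 -> 3.0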
Discrete Probability Distributions
Probability Mass Function
• The relationship between the events for a discrete random variable and their probabilities is called the discrete probability distribution or probability mass function (PMF).
• The returned probability lies in the range [0, 1], and the sum of the probabilities over all states equals one.
Discrete Probability Distributions
Probability Mass Function (PMF)
• Suppose a die is rolled; then the probability of getting a number equal to 4 is an example of a probability mass function. The sample space is S = {1, 2, 3, 4, 5, 6}, and let X be the random variable denoting the number obtained.
x   P(X = x)
1   1/6
2   1/6
3   1/6
4   1/6
5   1/6
6   1/6
Sum = 1.0
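A tiny sketch of this PMF as a plain Python dictionary, for illustration only:

from fractions import Fraction

# PMF of a fair die: each face has probability 1/6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

print(pmf[4])             # P(X = 4) = 1/6
print(sum(pmf.values()))  # the probabilities sum to 1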
Discrete Probability Distributions
There are many common discrete probability distributions:
1. Bernoulli Distribution
2. Binomial Distribution
3. Geometric Distribution
4. Poisson Distribution
Discrete Probability Distributions
1. Bernoulli Distribution
• The Bernoulli distribution is a discrete probability distribution of a single binary random variable, which takes either the value 1 or the value 0.
• The Bernoulli distribution is a model giving the set of possible outcomes for a single experiment that can be answered with a simple yes/no question.
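A short sketch using scipy.stats.bernoulli; the success probability p below is an assumed illustration:

from scipy.stats import bernoulli

p = 0.3                 # assumed probability of "success" (outcome 1)
X = bernoulli(p)

print(X.pmf(1))         # P(X = 1) = 0.3
print(X.pmf(0))         # P(X = 0) = 0.7
print(X.rvs(size=10))   # ten simulated yes/no trials (array of 0s and 1s)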
Discrete Probability Distributions
3. Geometric Distribution
• The geometric distribution gives the probability that the first success occurs on the n-th trial of a sequence of independent trials, each with success probability p. It assumes:
• Independence of the trials
• For each trial, there are only two possible outcomes
• The probability of success is the same for every trial
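A matching sketch with scipy.stats.geom, which models the trial number of the first success; p is again an assumed value:

from scipy.stats import geom

p = 0.25         # assumed per-trial success probability
X = geom(p)

print(X.pmf(1))  # P(first success on trial 1) = 0.25
print(X.pmf(3))  # P(first success on trial 3) = (0.75**2) * 0.25 ≈ 0.1406
print(X.cdf(4))  # P(success within the first 4 trials) ≈ 0.6836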
Discrete Probability Distributions
4. Poisson Distribution
• It is usually used in scenarios where we are counting the occurrences of certain
events in an interval of time or space. In practice, it is often an approximation of a
real-life random variable.
Example:
• The number of emails that I get on a weekday can be modeled by a Poisson distribution with an average of 0.2 emails per minute.
• Let X be the number of emails that I get in a 5-minute interval. Then, under this assumption, X is a Poisson random variable with parameter λ = 5(0.2) = 1.
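A sketch of this email example with scipy.stats.poisson, using the λ = 1 derived above:

from scipy.stats import poisson

lam = 5 * 0.2        # expected emails in a 5-minute interval: λ = 1
X = poisson(lam)

print(X.pmf(0))      # P(no emails in 5 minutes) = e**-1 ≈ 0.3679
print(X.pmf(1))      # P(exactly one email) ≈ 0.3679
print(1 - X.cdf(1))  # P(two or more emails) ≈ 0.2642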
Continuous Random Variable
• A continuous random variable can be defined as a random variable that can take on an infinite number of possible values.
• The probability that a continuous random variable will take on an exact value is 0.
• The cumulative distribution function (CDF) and the probability density
function (PDF) are used to describe the characteristics of a continuous
random variable.
Cumulative Distribution Function (CDF)
• The cumulative distribution function of a continuous random variable can be determined by integrating the probability density function.
• It can be defined as the probability that the random variable, X, will take on a value that is less than or equal to a particular value, x.
• The formula for the CDF of a continuous random variable, evaluated between two points a and b, is given below:
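In symbols, for a continuous random variable X with probability density function f(x), the standard relationships are:
F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt
P(a < X ≤ b) = F(b) − F(a) = ∫_{a}^{b} f(x) dx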
CDF Properties
• Non-decreasing
• F_X(x) tends to 1 as x → ∞
Probability Density Function (PDF)
• The probability density function of a continuous random variable can be defined as a function that gives the probability that the value of the random variable will fall within a given range of values.
• Let X be a continuous random variable with CDF F(x); then the pdf f(x) satisfies f(x) = dF(x)/dx, so that P(a ≤ X ≤ b) = ∫_{a}^{b} f(x) dx.
For the pdf of a continuous random variable to be valid, it must satisfy the following conditions:
• ∫_{−∞}^{∞} f(x) dx = 1. This means that the total area under the graph of the pdf must be equal to 1.
• f(x) ≥ 0. This implies that the probability density function of a continuous random variable cannot be negative.
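A quick numerical sketch of these two conditions; the standard normal density is used here only as an assumed example of a pdf:

import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Candidate pdf to check: the standard normal density
f = norm(loc=0, scale=1).pdf

total_area, _ = quad(f, -np.inf, np.inf)
print(total_area)           # ≈ 1.0: total area under the pdf is 1

xs = np.linspace(-5, 5, 101)
print(np.all(f(xs) >= 0))   # True: the density is never negative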
Mean and Variance of Continuous Random Variable
Mean of Continuous Random Variable
• The mean of a continuous random variable can be defined as the weighted average value of
the random variable, X.
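In symbols, for a continuous random variable X with pdf f(x) and mean μ, the mean and variance are defined as:
E[X] = μ = ∫_{−∞}^{∞} x f(x) dx
Var(X) = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² f(x) dx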
Continuous Random Variable Types
• Uniform Random Variable
• Exponential Random Variable
• Normal Random Variable
Uniform Random Variable
• A continuous random variable that is used to describe a uniform distribution is known as a uniform random variable.
Normal Random Variable
• If the parameters of a normal distribution are given as X ∼ N(μ, σ²), then the formula for its pdf is given below.
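The pdf of X ∼ N(μ, σ²) is the standard normal-density formula:
f(x) = (1 / (σ√(2π))) · exp(−(x − μ)² / (2σ²)), for −∞ < x < ∞
A brief sketch checking this formula against scipy.stats.norm; the μ and σ values below are illustrative only:

import numpy as np
from scipy.stats import norm

mu, sigma = 10.0, 2.0   # assumed parameters of X ~ N(mu, sigma^2)

def normal_pdf(x, mu, sigma):
    """Normal density evaluated directly from the formula above."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

x = 12.0
print(normal_pdf(x, mu, sigma))          # ≈ 0.1210
print(norm(loc=mu, scale=sigma).pdf(x))  # same value from scipy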