Lecture 12 (PROBABILITY DISTRIBUTIONS)

This lecture introduces the concepts of probability distributions, including discrete and continuous types, and their applications in various business situations. It explains the relationship between probability distributions and random variables, emphasizing the importance of identifying relevant random variables for decision-making. Additionally, the lecture outlines how to calculate the expected value of a random variable and provides references for further study.

Lecture 12

CO1: Apply the knowledge of the concepts of probability and statistics to computer
applications.
CO2: Evaluate the ideas of probability and random variables, and the various discrete and
continuous probability distributions and their properties.

PROBABILITY DISTRIBUTIONS

INTRODUCTION

A probability distribution is essentially an extension of the theory of probability,
which we have already discussed in the previous unit. This unit introduces the
concept of a probability distribution and shows how the basic probability
distributions (binomial, Poisson, and normal) are constructed. All these
probability distributions have immensely useful applications and describe a
wide variety of business situations that call for the computation of desired
probabilities.

By the theory of probability,

P(H1) + P(H2) + … + P(Hn) = 1

This means that the unit probability of a certain event is distributed over a set
of disjoint events making up a complete group. In general, a tabular recording
of the probabilities of all the possible outcomes that could result if a random
(chance) experiment is performed is called a "probability distribution". It is also
termed a theoretical frequency distribution.

Frequency Distribution and Probability Distribution

One gets a better idea of a probability distribution by comparing it with a
frequency distribution. Recall that frequency distributions are based on
observation and experimentation. For instance, we may study the profits
(during a particular period) of the firms in an industry and classify the data
into two columns, with class intervals for profits in the first column and the
corresponding class frequencies (number of firms) in the second column.

A probability distribution is also a two-column presentation, with the values of
the random variable in the first column and the corresponding probabilities in
the second column. These distributions are obtained from expectations based on
theoretical considerations or past experience. Thus, probability distributions
are related to theoretical or expected frequency distributions.

In a frequency distribution, the class frequencies add up to the total number of
observations (N), whereas in a probability distribution the probabilities of the
possible outcomes add up to one. Like the former, a probability distribution is
also described by a curve and has its own mean, dispersion, and skewness.
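This relationship can be illustrated with a short Python sketch: dividing each class frequency by N turns an observed frequency distribution into the corresponding probability distribution. The profit classes and firm counts below are hypothetical, purely for illustration.

```python
# Hypothetical frequency distribution of firms' profits (illustrative data,
# not from the lecture): class interval -> number of firms.
freq = {
    "0-10 lakh": 12,
    "10-20 lakh": 25,
    "20-30 lakh": 8,
    "30-40 lakh": 5,
}

# Total number of observations N (class frequencies add up to N).
N = sum(freq.values())

# Dividing each frequency by N gives the probability distribution,
# whose probabilities add up to one.
prob = {k: f / N for k, f in freq.items()}

print(N)                               # 50
print(round(sum(prob.values()), 10))   # 1.0
```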
Let us consider an example of a probability distribution. Suppose we toss a fair
coin twice; the possible outcomes (HH, HT, TH, TT) are shown below.

Possible Outcomes from a Two-toss Experiment of a Fair Coin

Now we are interested in framing a probability distribution of the number of
heads from the two-toss experiment of a fair coin. We would begin by recording
any result that does not contain a head, i.e., only the fourth outcome. Next come
the outcomes containing only one head, i.e., the second and third outcomes, and
finally we would record that the first outcome contains two heads. The same
results are recorded again below to highlight the number of heads contained in
each outcome.

Probability Distribution of the Possible No. of Heads from a Two-toss Experiment of a Fair Coin

We must note that the above tables are not the real outcome of tossing a fair
coin twice. Rather, they are a theoretical outcome, i.e., they represent the way in
which we expect our two-toss experiment of an unbiased coin to behave over time.
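The two-toss experiment above can be reproduced by enumerating the equally likely outcomes directly; a minimal Python sketch:

```python
from itertools import product
from collections import Counter

# Enumerate all equally likely outcomes of tossing a fair coin twice:
# ('H','H'), ('H','T'), ('T','H'), ('T','T').
outcomes = list(product("HT", repeat=2))

# Count how many outcomes contain 0, 1, and 2 heads.
heads = Counter(outcome.count("H") for outcome in outcomes)

# Probability distribution of the number of heads.
dist = {h: count / len(outcomes) for h, count in sorted(heads.items())}

print(dist)   # {0: 0.25, 1: 0.5, 2: 0.25}
```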

TYPES OF PROBABILITY DISTRIBUTION

Probability distributions are broadly classified under two heads:

(i) Discrete Probability Distribution, and

(ii) Continuous Probability Distribution.

i) Discrete Probability Distribution: A discrete random variable is allowed to
take on only a limited number of values. For example, the probability
distribution of the month of your birthday is discrete, as there are only 12
possible outcomes, representing the 12 months of a year.

ii) Continuous Probability Distribution: In a continuous probability
distribution, the variable of interest may take on any value within a given
range. Suppose we are planning to release water for hydropower generation.
Depending on how much water we have in the reservoir, viz., whether it is above
or below the normal level, we decide on the amount and time of release. The
variable indicating the difference between the actual reservoir level and the
normal level can take positive or negative values, integer or otherwise.
Moreover, this value is contingent upon the inflow to the reservoir, which in
turn is uncertain. This type of random variable, which can take an infinite
number of values, is called a continuous random variable, and the probability
distribution of such a variable is called a continuous probability distribution.

Before we take up discrete and continuous probability distributions, the concept
of a random variable, which is central to the theme, needs to be elaborated.

CONCEPT OF RANDOM VARIABLES

A random variable is a variable (numerical quantity) that can take different
values as a result of the outcomes of a random experiment. When a random
experiment is carried out, the totality of outcomes of the experiment forms a set
which is known as the sample space of the experiment. Like a probability
distribution, a random variable may be discrete or continuous.

In the example given in the Introduction, we saw that the outcomes of the
experiment of two tosses of a fair coin were expressed in terms of the number of
heads. We found in the example that H (the number of heads) can assume values
of 0, 1, and 2, and that a probability is associated with each value. This uncertain
real variable H, which assumes different numerical values depending on the
outcomes of an experiment, and to each of whose values a probability assignment
can be made, is known as a random variable. The resulting representation of all
the values with their probabilities is termed the probability distribution of H.


It is customary to present the distribution as shown.

Probability Distribution of No. of Heads

No. of Heads (H):    0      1      2
Probability P(H):   0.25   0.50   0.25

In this case, as H takes only discrete values, the variable H is called a discrete
random variable, and the resulting distribution is a discrete probability
distribution. The function that specifies the probability distribution of a
discrete random variable is called the probability mass function (p.m.f.).

In the above situations, we have seen that the random variable takes a limited
number of values. There are certain situations where the variable under
consideration may take infinitely many values. Consider, for example, that we
are interested in ascertaining the probability distribution of the weight of
one-kg coffee packs. We have reason to believe that the packing process is such
that a certain percentage of the packs weigh slightly below one kg, and some
packs weigh slightly above one kg. It is easy to see that it is essentially by
chance that a pack will weigh exactly 1 kg, and there are an infinite number of
values that the random variable 'weight' can take. In such cases, it makes sense
to talk of the probability that the weight will lie between two values, rather
than the probability of the weight taking any specific value. These types of
random variables, which can take an infinitely large number of values, are called
continuous random variables, and the resulting distribution is called a
continuous probability distribution. The function that specifies the probability
distribution of a continuous random variable is called the probability density
function (p.d.f.).
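As a sketch of this idea, suppose (purely as an assumption, not stated in the lecture) that pack weights follow a normal distribution with mean 1.00 kg and standard deviation 0.01 kg. Then the probability of any exact weight is zero, but the probability that the weight lies between two values can be computed from the cumulative distribution function:

```python
import math

# Assumed (illustrative) parameters: pack weight ~ Normal(1.00 kg, 0.01 kg).
mu, sigma = 1.00, 0.01

def normal_cdf(x, mu, sigma):
    """P(X <= x) for a normal random variable, via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# For a continuous random variable we ask for the probability that the
# weight lies BETWEEN two values, e.g. between 0.98 kg and 1.02 kg
# (i.e., within two standard deviations of the mean).
p_between = normal_cdf(1.02, mu, sigma) - normal_cdf(0.98, mu, sigma)

print(round(p_between, 4))   # 0.9545
```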

Sometimes, for the sake of convenience, a discrete situation with a large number
of outcomes is approximated by a continuous distribution. For example, if we
find that the demand for a product is a random variable taking values of 1, 2, 3,
… up to 1,000, it may be worthwhile to treat it as a continuous variable.

In a nutshell, if the random variable is restricted to taking only a limited
number of values, it is termed a discrete random variable, and if it is allowed
to take any value within a given range, it is termed a continuous random
variable.

It should be clear from the above discussion that a probability distribution is
defined only in the context of a random variable or a function of a random
variable. Thus, in any situation, it is important to identify the relevant random
variable and to find its probability distribution to facilitate decision-making.

Expected Value of a Random Variable

The expected value is a fundamental idea in the study of probability
distributions. To find the expected value of a discrete random variable, we
multiply each value that the random variable can assume by its corresponding
probability of occurrence and then sum all the products. For example, consider
finding the expected value of the discrete random variable (RV) 'Daily Visas
Cleared' given:
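Since the original table of values for 'Daily Visas Cleared' is not reproduced here, the sketch below uses hypothetical figures only to illustrate the method: E(X) is the sum of each value times its probability.

```python
# Hypothetical data (the lecture's actual table is not available here):
# possible numbers of visas cleared per day and their probabilities.
values = [0, 1, 2, 3, 4]
probs  = [0.1, 0.2, 0.4, 0.2, 0.1]   # probabilities must sum to 1

# E(X) = sum of (value * probability) over all values.
expected = sum(v * p for v, p in zip(values, probs))

print(round(expected, 6))   # 2.0
```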

Now, we will examine situations involving discrete random variables and

discuss the methods for assessing them.

TEXT BOOKS

 T1 = H. K. Dass, Higher Engineering Mathematics, S. Chand Publishers, 3rd revised
edition, 2014.

 T2 = B. S. Grewal, Higher Engineering Mathematics, Khanna Publishers, 42nd ed.,
2013, New Delhi.

 T3 = N. P. Bali and Manish Goyal, A Textbook of Engineering Mathematics, Laxmi
Publications, Reprint 2008.

REFERENCE BOOKS

 R1 = R. K. Jain and S. R. K. Iyengar, Advanced Engineering Mathematics, 3rd Edition,
Narosa Publishing House, 2004, New Delhi.

 R2 = B. V. Ramana, Advanced Engineering Mathematics, McGraw-Hill, July 2006, New
Delhi.

 S. P. Gupta, Statistical Methods, S. Chand & Sons, 2017, New Delhi, ISBN 9789351610281.

Video Lecture:

https://www.youtube.com/watch?v=c06FZ2Yq9rk
