UNIVERSITY OF CENTRAL PUNJAB
Assignment no: 03
Topic: Probability
Submitted by:
EMMAN TARIQ L1F20BSAF0059
Submitted to:
Dr. Zahid Ahmad
Section: “B”
Date of Submission:
12 July 2021
ASSIGNMENT NO. 3
PROBABILITY
INTRODUCTION TO PROBABILITY:
In its literal meaning, probability is the possibility of a favorable outcome. The word probability has two
basic meanings, i.e. a quantitative measure of uncertainty and a measure of the degree of belief in a
particular statement or problem. The foundations of probability were laid by two French
mathematicians of the seventeenth century, Blaise Pascal (1623-1662) and Pierre de Fermat
(1601-1665), in connection with gambling problems. Later on it was developed by Jakob
Bernoulli (1654-1705), Abraham De Moivre (1667-1754) and Pierre Simon Laplace (1749-
1827). The modern treatment of probability was developed during the twenties and thirties of the twentieth century.
WHAT IS PROBABILITY?
A probability is a number that reflects the chance or likelihood that a particular event
will occur. Probabilities can be expressed as proportions that range from 0 to 1, and they can
also be expressed as percentages ranging from 0% to 100%.
“Probability is a mathematical tool used to study randomness. It deals with the chance of an event
occurring.”
EXAMPLE:
There are 8 pillows on a bed: 3 are red, 2 are yellow and 3 are blue. What is the probability of
picking a yellow pillow? The probability is equal to the number of yellow pillows on the bed
divided by the total number of pillows, i.e. 2/8 = 1/4.
Three Definitions of probability
• Classical definition
• Empirical definition
• Subjective definition
The Classical or A Priori Definition of Probability:
If a random experiment can produce n mutually exclusive and equally likely outcomes, and if m
out of these outcomes are considered favorable to the occurrence of a certain event A, then the
probability of the event A, denoted by P(A), is defined as the ratio m/n. Symbolically, we
write
P (A) = m/n = Number of favorable outcomes / Total number of possible outcomes
Explanation:
This definition was formulated by the French mathematician P. S. Laplace (1749-1827) and can
be used very conveniently in experiments where the total number of possible outcomes and the
number of outcomes favorable to an event can be determined.
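As a quick illustration of the classical definition, here is a minimal Python sketch; the event "an even number appears on a fair die" is an assumed example, not taken from the text.

```python
# Classical probability: P(A) = favorable outcomes / total outcomes
# Assumed example: event A = "an even number appears" on a fair six-sided die.
outcomes = [1, 2, 3, 4, 5, 6]                     # all equally likely outcomes (n)
favorable = [x for x in outcomes if x % 2 == 0]   # outcomes favorable to A (m)

p_a = len(favorable) / len(outcomes)              # m / n
print(p_a)                                        # 0.5
```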
Empirical probability
Empirical probability, also known as experimental probability, refers to a probability that is
based on historical data. In other words, empirical probability illustrates the likelihood of an
event occurring based on historical data.
Formula
P′(A) = n(A) / n
P′(A) = Number of Times Occurred / Total No. of Times Experiment Performed
Where;
• Number of Times Occurred refers to the number of times the favorable event occurred;
and
• Total No. of Times Experiment Performed refers to the total number of times the
experiment was performed.
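A minimal sketch of the empirical formula P′(A) = n(A) / n; the counts below are assumed illustrative values, not figures from the text.

```python
# Empirical probability from historical data: P'(A) = n(A) / n
times_occurred = 45        # assumed: the favorable event was observed 45 times
times_performed = 100      # assumed: the experiment was performed 100 times

p_empirical = times_occurred / times_performed
print(p_empirical)         # 0.45
```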
Subjective probability
This is a type of probability derived from an individual's personal judgment or own experience
about whether a specific outcome is likely to occur. It involves no formal calculations and only
reflects the subject's (the person's) opinions and past experience.
Example
A person submits his CV to a company. He observes that all the candidates are talented and assumes that the
competition is very tough, so he puts the probability of his selection for this job at 75%.
What are the basic elements of probability?
The basic elements used in probability are the following:
❖ Experiment
❖ Random Experiment
❖ Sample Space
❖ Outcome
❖ Event
Experiment
An operation which can produce some well-defined outcomes is called an Experiment.
Example:
When we toss a coin, we know that either head or tail shows up. So, the operation of tossing a coin
may be said to have two well-defined outcomes, namely, (a) heads showing up; and (b) tails
showing up.
Random Experiment:
When we roll a die, we are well aware of the fact that any of the numerals 1, 2, 3, 4, 5 or 6 may appear
on the upper face, but we cannot say which exact number will show up. Such an experiment, in
which all possible outcomes are known but the exact outcome cannot be predicted in advance, is
called a Random Experiment.
Sample Space:
All the possible outcomes of an experiment, taken as a whole, form the Sample Space:
the set of all possible outcomes of a statistical experiment.
Example:
When we roll a die, we can get any outcome from 1 to 6. All the possible numbers which can appear
on the upper face form the Sample Space (denoted by S). Hence, the Sample Space of a die roll
is S = {1, 2, 3, 4, 5, 6}.
Outcome
Any possible result out of the Sample Space S for a Random Experiment is called an Outcome.
Example:
When we roll a die, we might obtain 3 or when we toss a coin, we might obtain heads.
Event
Any subset of the Sample Space S is called an Event (denoted by E). When an outcome which
belongs to the subset E takes place, it is said that an Event has occurred. Whereas, when an
outcome which does not belong to subset E takes place, the Event has not occurred.
Classification:
Events are classified into the following types:
• Mutually exclusive events
• Equally likely events
• Independent events
• Dependent events
Example:
Consider the experiment of throwing a die. Here the Sample Space S =
{1, 2, 3, 4, 5, 6}. Let E denote the event of 'a number appearing less than 6'. Thus the Event E =
{1, 2, 3, 4, 5}. If the number 1 appears, we say that Event E has occurred. Similarly, if the outcome
is 2 or 3, we can say Event E has occurred, since these outcomes belong to subset E.
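The sample space and event ideas above map naturally onto Python sets; the sketch below uses the die example with S = {1, 2, 3, 4, 5, 6} and E = 'a number less than 6'.

```python
# Sample space and event for one roll of a die, represented as sets.
S = {1, 2, 3, 4, 5, 6}            # sample space
E = {x for x in S if x < 6}       # event: "a number appearing less than 6"

outcome = 3                       # an outcome of one trial
print(E)                          # {1, 2, 3, 4, 5}
print(outcome in E)               # True -> event E has occurred
print(len(E) / len(S))            # classical probability of E = 5/6
```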
Trial:
By a trial, we mean performing a random experiment.
Example:
(i) Tossing a fair coin,
(ii) rolling an unbiased die.
PURPOSE OF PROBABILITY:
Probability is valuable for measuring how likely it is that a given event will actually happen. The
concept of probability is as important as it is misunderstood. It is vital to have an understanding
of the nature of chance and variation in life in order to be a well-informed citizen. One area in
which this is extremely important is in understanding risk and relative risk.
CLASSIFICATION OF PROBABILITIES
Probability is classified into three types:
❖ Theoretical Probability.
❖ Experimental Probability.
❖ Axiomatic Probability.
Theoretical probability:
It is probability that is determined on the basis of reasoning.
Example:
if you have four raffle tickets and 200 tickets were sold:
• Number of favorable outcomes: 4
• Number of possible outcomes: 200
• Ratio = number of favorable outcomes / number of possible outcomes = 4/200 = 0.02
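Checking the ratio above in Python, using the ticket numbers given in the example:

```python
# Theoretical probability of winning with 4 raffle tickets out of 200 sold.
favorable = 4
possible = 200
print(favorable / possible)   # 0.02
```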
Experimental probability:
Experimental probability is probability that is determined on the basis of the results of an experiment
repeated many times. Probability is a value between (and including) zero (0) and one (1).
Example:
if a die is rolled 7000 times and the number '6' occurs 890 times, then the experimental probability that
'6' shows up on the die is 890/7000 = 0.1271.
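The same idea can be simulated; a minimal sketch that rolls a virtual die 7000 times and estimates the probability of a '6' (the simulated count will vary around 1/6 from run to run, unlike the fixed 890 quoted in the example).

```python
import random

# Experimental probability: roll a fair die 7000 times and count the sixes.
rolls = 7000
sixes = sum(1 for _ in range(rolls) if random.randint(1, 6) == 6)
print(sixes / rolls)   # close to 1/6 = 0.1667; varies run to run
```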
Axiomatic probability
Axiomatic probability is a unifying probability theory. It sets down a set of axioms (rules) that
apply to all types of probability, including frequentist probability and classical probability.
These rules, based on Kolmogorov’s Three Axioms, set starting points for mathematical
probability.
Kolmogorov’s Three Axioms
The three axioms are:
• For any event A, P(A) ≥ 0. In English, that’s “For any event A, the probability of A is
greater than or equal to 0”.
• When S is the sample space of an experiment; i.e., the set of all possible outcomes, P(S)
= 1. In English, that’s “The probability of any of the outcomes happening is one hundred
percent”, or—paraphrasing— “anytime this experiment is performed, something
happens”.
• If A and B are mutually exclusive outcomes, P(A ∪ B) = P(A) + P(B).
• Here ∪ stands for ‘union’. We can read this by saying “If A and B are mutually exclusive
outcomes, the probability of either A or B happening is the probability of A happening
plus the probability of B happening”
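A small sketch that checks the three axioms for the die experiment; the uniform probabilities are an assumption of a fair die.

```python
# Check Kolmogorov's axioms for a fair six-sided die (assumed fair).
P = {face: 1 / 6 for face in range(1, 7)}

# Axiom 1: every probability is non-negative.
print(all(p >= 0 for p in P.values()))            # True

# Axiom 2: the probability of the whole sample space is 1.
print(abs(sum(P.values()) - 1) < 1e-9)            # True

# Axiom 3: for mutually exclusive events A and B, P(A U B) = P(A) + P(B).
A, B = {1, 2}, {5}                                # disjoint events
p_union = sum(P[x] for x in A | B)
p_sum = sum(P[x] for x in A) + sum(P[x] for x in B)
print(abs(p_union - p_sum) < 1e-9)                # True
```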
RELATION BETWEEN STATISTICS AND PROBABILTY:
Probability and statistics are fundamentally inter-related. Probability is often called the vehicle
of statistics. The area of inferential statistics, in which we are mainly concerned with drawing
inferences from experiments or situations involving an element of uncertainty, leans heavily upon
probability theory. Uncertainty is also an inherent part of statistical inference, as inferences are
based on a sample, and a sample, being a small part of the larger population, contains incomplete
information. A similar type of uncertainty occurs when we toss a coin, draw a card or throw dice,
etc. The uncertainty in all these cases is measured in terms of probability.
LAWS OF PROBABILITY:
Law of Complementation
P (not A) = 1 - P(A)
Addition Law
i. For mutually exclusive (“M.E”) events:
P (A∪B) = P(A) + P(B)
ii. For events that are not mutually exclusive (“Not M.E”):
P (A∪B) = P(A) + P(B) – P(A∩B)
Multiplication Laws:
P(A∩B) = P(A) ∙ P(B) [it is used for independent events]
P(A∩B) = P(A) ∙ P(B/A) [it is used for dependent events]
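A minimal numerical sketch of these laws, using a single draw from a standard 52-card deck as an assumed example (A = 'a king', B = 'a heart'):

```python
# Laws of probability illustrated with one draw from a 52-card deck.
p_king = 4 / 52             # P(A)
p_heart = 13 / 52           # P(B)
p_king_of_hearts = 1 / 52   # P(A n B)

# Complementation: P(not A) = 1 - P(A)
print(1 - p_king)                            # 48/52

# Addition law (A and B are NOT mutually exclusive here):
print(p_king + p_heart - p_king_of_hearts)   # 16/52

# Multiplication law for independent events: P(A n B) = P(A) * P(B)
print(p_king * p_heart)                      # 1/52, matches p_king_of_hearts
```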
CONDITIONAL PROBABILITY:
Conditional probability is defined as the probability of an event or outcome occurring,
based on the occurrence of a previous event or outcome. A conditional probability is calculated by
multiplying the probability of the preceding event by the updated probability of the
succeeding, or conditional, event.
FORMULA OF CONDITIONAL PROBABILITY:
The formula of conditional probability is stated below
P (B|A) = P (A and B) / P (A)
P (B|A) = P (A∩B) / P (A)
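Plugging numbers into the formula; the values below are assumed for illustration only.

```python
# Conditional probability: P(B|A) = P(A and B) / P(A)
p_a = 0.5           # assumed P(A)
p_a_and_b = 0.2     # assumed P(A and B)

p_b_given_a = p_a_and_b / p_a
print(p_b_given_a)  # 0.4
```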
INTRODUCTION TO RANDOM VARIABLES:
Usually, we are not interested in a particular outcome of a random experiment; rather, our interest
relates to some numerical description of the outcome. For example, when two coins are tossed,
we may be interested only in the number of heads which appear, and not in the actual sequence of
heads and tails, which is not a numerical quantity.
DEFINITION OF RANDOM VARIABLE:
A numerical quantity whose value is determined by the outcome of a random experiment is
called a random variable. A random variable is also called a chance variable, a stochastic
variable or simply a variate.
REPRESENTATION:
A random variable is commonly abbreviated as r.v.
EXPLANATION OF RANDOM VARIABLE:
Random variables are usually denoted by capital Latin letters such as X, Y, Z, while the values
taken by them are represented by the corresponding small letters such as x, y, z. It is clear that the
numbers 1, 2, 3, 4, …, 50 are random quantities determined by the outcomes of random
experiments.
FEATURES OF RANDOM VARIABLE:
❖ It takes only real values.
❖ If X is a random variable and B is a constant, then BX is also a random variable.
❖ If X1 and X2 are two random variables, then X1 + X2 and X1X2 are also random variables.
❖ For any constants B1 and B2, B1X1 + B2X2 is also a random variable.
❖ |X| is a random variable.
TYPES OF RANDOM VARIABLE:
There are two types of random variable:
❖ Discrete
❖ Continuous
DISCRETE RANDOM VARIABLE:
A discrete random variable is a variable taking on numerical values determined by the result of a
random phenomenon. A discrete random variable has a countable number of possible
values. The probability of each value of a discrete random variable is between 0 and 1, and the
sum of all the probabilities is equal to 1.
CONTINUOUS RANDOM VARIABLE:
A continuous random variable is a random variable whose values can take infinitely
many values. For example, a random variable measuring the time taken for something to
be done is continuous, since there are an infinite number of possible times that can be
taken.
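A brief sketch contrasting the two types, using the two-coin toss (discrete: number of heads) and a measured waiting time (continuous) as assumed examples:

```python
import random

# Discrete random variable: X = number of heads when two fair coins are tossed.
x = sum(random.choice([0, 1]) for _ in range(2))   # takes one of the values 0, 1, 2
print(x)

# Continuous random variable: T = time taken for a task, here an assumed value
# drawn uniformly between 0 and 10 minutes (it can take infinitely many values).
t = random.uniform(0, 10)
print(t)
```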
RANDOM VARIABLE AND ITS PROBABILITY DISTRIBUTION:
The probability distribution for a random variable describes how the probabilities are distributed
over the values of the random variable. From it, we can find the corresponding probability for any
event of a random experiment. Likewise, for different values of the random variable, one can
find its individual probability. Further, the values of the random variable along with their
corresponding probabilities constitute the probability distribution of the random variable.
PROPERTIES OF PROBABILITY DISTRIBUTION:
❖ The probability distribution of a random variable X is P(X = xi) = pi for x = xi and P(X =
xi) = 0 for x ≠ xi.
❖ The range of probability distribution for all possible values of a random variable is from
0 to 1, i.e., 0 ≤ p(x) ≤ 1.
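The sketch below writes out the distribution of X = number of heads in two tosses of a fair coin (the example mentioned in the random-variable introduction) and checks the two properties:

```python
# Probability distribution of X = number of heads in two tosses of a fair coin.
distribution = {0: 0.25, 1: 0.50, 2: 0.25}   # P(X = x) for each possible x

# Property 1: the probability is 0 for any x outside the possible values x_i.
print(distribution.get(3, 0))                            # 0

# Property 2: every probability lies between 0 and 1 and they sum to 1.
print(all(0 <= p <= 1 for p in distribution.values()))   # True
print(sum(distribution.values()))                        # 1.0
```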
SPECIAL PROBABILITY DISTRIBUTIONS:
Following are some special probability distributions:
❖ The binomial distribution
❖ The Poisson distribution
❖ The normal distribution
THE BINOMIAL DISTRIBUTION:
Two of the most widely used discrete probability distributions are the binomial and the
Poisson. The binomial probability mass function, P(X = x) = C(n, x) p^x (1 − p)^(n − x), gives the probability that x successes
will occur in n trials of a binomial experiment.
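A minimal sketch of the binomial probability mass function using only the Python standard library; n, p and x below are assumed illustrative values.

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) = C(n, x) * p**x * (1 - p)**(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Assumed example: probability of exactly 2 successes in 5 trials with p = 0.3.
print(binomial_pmf(2, 5, 0.3))   # about 0.3087
```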
THE POISSON DISTRIBUTION:
The Poisson probability distribution is often used as a model of the number of
arrivals at a facility within a given period of time. For instance, a random variable may be
defined as the number of telephone calls coming into an airline reservation system during a
period of 15 minutes. If the mean number of arrivals during a 15-minute interval is known, the
Poisson probability mass function, P(X = x) = e^(−λ) λ^x / x!, can be used to compute the probability of x
arrivals.
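A minimal sketch of the Poisson probability mass function; the mean of 10 calls per 15-minute interval is an assumed illustrative value.

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) = e**(-lam) * lam**x / x!"""
    return exp(-lam) * lam**x / factorial(x)

# Assumed example: mean of 10 calls per 15 minutes; probability of exactly 8 calls.
print(poisson_pmf(8, 10))   # about 0.1126
```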
THE NORMAL DISTRIBUTION:
The normal distribution, also known as the Gaussian distribution, is a probability
distribution that is symmetric about the mean, showing that data near the mean are more
frequent in occurrence than data far from the mean. In graph form, the normal distribution
appears as a bell curve.
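A minimal sketch of the normal (Gaussian) density, evaluated at the mean and one standard deviation away; mean 0 and standard deviation 1 are assumed values.

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu=0.0, sigma=1.0):
    """f(x) = (1 / (sigma * sqrt(2*pi))) * exp(-(x - mu)**2 / (2 * sigma**2))."""
    return exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

print(normal_pdf(0))   # about 0.3989, the peak of the bell curve at the mean
print(normal_pdf(1))   # about 0.2420, lower because it is farther from the mean
```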
Examples
Example#1
Ali needs 4 chairs from a total of 15 chairs (8 red chairs and 7 blue chairs). If he selects 4 at
random, what is the probability that
i. all four will be red chairs?
ii. at least one will be a blue chair?
Solution:
8 red chairs + 7 blue chairs = 15
n (S) = C(15, 4) = 1365
i. All four will be red chairs:
Let A be the event that the 4 selected chairs are all red.
n (A) = C(8, 4); as the red chairs are selected from the 8 red chairs.
n (A) = 70
P (A) = n (A) / n (S) = 70/1365
ii. At least one will be a blue chair:
Let B be the event that at least one blue chair is selected.
n (B) = C(7, 2)·C(8, 2) + C(7, 3)·C(8, 1) + C(7, 1)·C(8, 3) + C(7, 4)·C(8, 0)
n (B) = 588 + 280 + 392 + 35
n (B) = 1295
P (B) = n (B) / n (S)
P (B) = 1295/1365
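The counts in Example#1 can be verified in Python with math.comb:

```python
from math import comb

n_S = comb(15, 4)                                             # 1365
n_A = comb(8, 4)                                              # 70, all four red
n_B = (comb(7, 2) * comb(8, 2) + comb(7, 3) * comb(8, 1)
       + comb(7, 1) * comb(8, 3) + comb(7, 4) * comb(8, 0))   # 1295, at least one blue

print(n_A / n_S)   # P(A) = 70/1365, about 0.0513
print(n_B / n_S)   # P(B) = 1295/1365, about 0.9487; note n_B = n_S - n_A
```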
Example#2
A company has 20 men and 40 women as employees, of which half the men and half the women
have job experience. Find the probability that a person chosen at random is a man or has
job experience.
Solution:
Class [20 men + 40 women = 60]
Having experience [10 men + 20 women = 30]
Let A be the event that the person is a man.
P (A) = 20/60
Let B be the event that the person has experience.
P (B) = 30/60
Let (A∩B) be the event that the person is a man and has experience.
Since there are 10 men having experience,
P (A∩B) = 10/60
P (A∪B) = P (A) + P (B) – P (A∩B)
= 20/60 + 30/60 – 10/60 = 40/60
P (A∪B) = 2/3
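A quick check of Example#2 with the addition law:

```python
# Addition law for Example#2: P(man or experience) = P(A) + P(B) - P(A n B)
p_man = 20 / 60
p_experience = 30 / 60
p_man_and_experience = 10 / 60

print(p_man + p_experience - p_man_and_experience)   # 0.666..., i.e. 40/60 = 2/3
```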
Example#3
A tool box contains 5 screwdrivers and 5 nuts. 2 items are selected from the box. What is the
probability that both are nuts?
Solution:
Box [5 screwdrivers + 5 nuts = 10]
n (S) = C(10, 2) = 45
Let A be the event that the 2 selected items are both nuts.
n (A) = C(5, 2) = 10
P (A) = n (A) / n (S) = 10/45 ≈ 0.22
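Verifying Example#3 with math.comb:

```python
from math import comb

# Example#3: probability that both selected items are nuts.
p_both_nuts = comb(5, 2) / comb(10, 2)   # 10 / 45
print(p_both_nuts)                       # about 0.222
```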
Bayes' Theorem
Statement:
Bayes' theorem, named after the 18th-century British mathematician Thomas Bayes,
is a mathematical formula for determining conditional probability.
It is a theorem describing how the conditional probability of each of a set of possible causes of
a given observed outcome can be computed from knowledge of the probability of each cause
and the conditional probability of the outcome given each cause.
Expression
P (Ai | B) = P (Ai) · P (B | Ai) / Σ P (Aj) · P (B | Aj)
where the sum in the denominator runs over all the possible causes Aj.
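A minimal numerical sketch of the formula with two assumed causes A1 and A2; the prior and conditional probabilities are illustrative values only.

```python
# Bayes' theorem: P(Ai | B) = P(Ai) * P(B | Ai) / sum_j P(Aj) * P(B | Aj)
priors = {"A1": 0.6, "A2": 0.4}          # assumed P(Ai)
likelihoods = {"A1": 0.2, "A2": 0.5}     # assumed P(B | Ai)

evidence = sum(priors[a] * likelihoods[a] for a in priors)   # denominator, P(B)
posterior_A1 = priors["A1"] * likelihoods["A1"] / evidence
print(posterior_A1)   # 0.375
```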