Probability by Ken Black

Probability

Jacob Bernoulli (1654-1705), Abraham de Moivre (1667-1754),
Reverend Thomas Bayes (1702-1761) and Joseph Lagrange (1736-1813)
developed probability formulas and techniques.
Probability theory was successfully applied at the gambling tables and,
more relevantly, to social and economic problems.
The mathematical theory of probability is the basis for statistical
applications in both social and decision making research.

Probability:
Introduction
Statistical inference is a methodology through which
we learn about the characteristics of a population by
analyzing samples of elements drawn from that
population.

Suppose that a friend asks you to invest $10000 in a


joint business venture. Although your friend’s
presentation of the potential for profit is convincing,
you investigate and find that he has initiated three
previous business ventures, all of which failed. Would
you think that the current proposed venture would
have more than a 50/50 chance of succeeding?
Intro (Contd.)
In pondering this question you must wonder about
the likelihood of observing three failures in a sample
of three elements from the process by which your
friend chooses and executes business ventures if, in
fact, more than half the population of ventures
originating from that process will be successful.

This line of thinking is an essential part of statistical


inference because we are constantly asking ourselves,
in one way or other, what the likelihood is of
observing a particular sample if the population
characteristics are what they are claimed to be.
Intro (Contd.)

Much of statistical inference involves making an


hypothesis about the characteristics of a population
(which we will later call the null hypothesis) and then
seeing whether the sample has a low or high chance
of occurring if that hypothesis is true.
Let us begin our study of probability by starting with
a population whose characteristics are known to us
and inquire about the likelihood or chances of
observing various samples from that population.
Basic Terminology in Probability

Two broad categories of decision making


problems:
Deterministic Model
Probabilistic (Random) Models.
Probability is the chance of something happening.
Probabilities are expressed as fractions/decimals
between zero and one.
A probability of zero means that something can never
happen; a probability of one means that something
will always happen.
In probability theory, an event is one or more of
the possible outcomes of doing something.

Basic Terminology in Probability

The activity that produces such an event is referred to in probability


theory as an experiment.
Events are said to be mutually exclusive if one and only one of them
can take place at a time.
When a list of the possible events that can result from an experiment
includes every possible outcome, the list is said to be collectively
exhaustive.

Three Types of Probability

There are three basic ways of classifying probability.


These three represent rather different conceptual
approaches to the study of probability theory:

Methods of Assigning Probabilities

Classical method of assigning probability


(rules and laws)
Relative frequency of occurrence
(cumulated historical data)
Subjective Probability (personal intuition
or reasoning)
Three Types of Probability
Classical Approach:
Defines ‘Probability’ as the ratio of favorable outcomes
to the total outcomes.
Also known as a priori probability.
It rests on a number of assumptions, so it is the most
restrictive approach and the least useful in real-life
situations.

Number of outcomes leading to the event divided by


the total number of outcomes possible
P(E) = ne/N
where N = total number of outcomes, and
ne = number of outcomes in event E
Each outcome is equally likely
Applicable to games of chance
Objective -- everyone correctly using the method
assigns an identical probability
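As a quick sketch, the classical formula P(E) = ne/N can be checked in a few lines of Python; the fair-die event used here is an assumed example, not one from the slides:

```python
from fractions import Fraction

# Classical method: P(E) = n_e / N, with every outcome equally likely.
# Assumed example: probability that one fair six-sided die shows an even number.
N = 6                  # total number of outcomes {1, 2, 3, 4, 5, 6}
n_e = len({2, 4, 6})   # number of outcomes in event E ("even")
p_even = Fraction(n_e, N)
print(p_even)          # 1/2
```

Because the method is objective, anyone applying it to the same experiment assigns the same probability.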
Relative Frequency Approach:
Defines ‘Probability’ as observed relative frequency of an
event in a very large number of trials.
It makes fewer assumptions, but it requires the event to be
capable of being repeated a large number of times.
Relative frequency of occurrence is not based on rules or
laws but on what has occurred in the past.
For example, a company wants to determine the probability
that its inspectors are going to reject the next batch of raw
materials from a supplier. Data gathered from company
record books show that the supplier sent the company 90
batches in the past, and inspectors rejected 10 of them. By
the method of relative frequency of occurrence, the
probability of the inspectors rejecting the next batch is 10/
90, or .11. If the next batch is rejected, the relative
frequency of occurrence probability for the subsequent
shipment would change to 11/91 = .12.
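The slide's arithmetic can be reproduced directly; the batch counts are the ones given above:

```python
# Relative frequency: P(reject) = rejected batches / total batches, from past records.
rejected, total = 10, 90
print(round(rejected / total, 2))  # 0.11

# If the next batch is also rejected, the estimate updates:
rejected, total = rejected + 1, total + 1
print(round(rejected / total, 2))  # 0.12
```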
Three Types of Probability

Subjective Probability:
Deals with specific or unique situations typical of the business or
management world.
Based upon some belief or educated guess of the decision maker.
Subjective assessments of probability permit the widest flexibility
of the three concepts, also known as Personal Probability.
Useful for unique (single-trial) experiments
New product introduction
Site selection decisions
Sporting events

Structure of Probability
Experiment – is a process that produces an outcome
Ex: Rolling two six-sided dice and calculating their sum
Event – an outcome of an experiment
Ex: The sum is at least 10
Elementary event – events that cannot be decomposed or broken down into other
events
Ex: The first die is a six
Sample Space – a complete roster/listing of all elementary events for an experiment
Ex: {(1,1), (1,2), (1,3), …, (2,1), (2,2), …., (6,5), (6,6)}
Trial: one repetition of the process
Ex: Roll the dice…
Structure of Probability

Mutually Exclusive Events – events such that the occurrence of one


prohibits the occurrence of the other
These events have no intersection
Independent Events – the occurrence or nonoccurrence of one has no
effect on the occurrence of the others
Collectively Exhaustive Events – listing of all possible elementary events
for an experiment
Complementary Events – two events, one of which comprises all the
elementary events of an experiment that are not in the other event
Difference between Mutually exclusive events
and Independent events
Suppose an office building is for sale and two different
potential buyers have placed bids on the building. It is
not possible for both buyers to purchase the building;
therefore, the event of buyer A purchasing the building
is mutually exclusive with the event of buyer B
purchasing the building.
Now suppose buyer A's liking the building is independent of buyer B's
liking it: whether A likes it does not affect whether B likes it in any way.
Probabilities Under Conditions of Statistical
Independence
Occurrence of one event has no effect on the probability of the occurrence of any other
event.
There are three types of probabilities under statistical independence
1. Marginal:
P(A) – the probability of the occurrence of an event
2. Joint:
P(A ∩ B) = P(A) × P(B)
3. Conditional:
P(A|B) = P(A) and P(B|A) = P(B)
In statistical independence, assumption is that events are not related.
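A minimal sketch of the three probabilities under independence; two independent fair coin flips are an assumed example, not from the slides:

```python
# Under statistical independence:
#   marginal:    P(A)
#   joint:       P(A and B) = P(A) * P(B)
#   conditional: P(A|B) = P(A)
# Assumed example: two independent fair coin flips.
p_a = 0.5            # first flip is heads
p_b = 0.5            # second flip is heads
p_joint = p_a * p_b  # joint probability of two heads
p_a_given_b = p_a    # knowing B tells us nothing about A
print(p_joint)       # 0.25
```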

Probabilities Under Conditions of Statistical
Dependence
When the probability of some event is dependent on or affected by the occurrence of
some other event.
There are three types of probabilities under statistical dependence
1. Conditional:
P(A|B) = P(A ∩ B)/P(B) and P(B|A) = P(A ∩ B)/P(A)
2. Joint:
P(A ∩ B) = P(A|B) × P(B) and P(A ∩ B) = P(B|A) × P(A)
3. Marginal:
Marginal probabilities under statistical dependence are computed by summing up the probabilities of all the
joint events in which the simple event occurs.
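A short sketch of the joint formula for dependent events; drawing two cards without replacement is an assumed example, not one from the slides:

```python
from fractions import Fraction

# Dependent events, assumed example: drawing two cards without replacement.
# Joint: P(A1 and A2) = P(A2|A1) * P(A1)
p_a1 = Fraction(4, 52)           # first card is an ace
p_a2_given_a1 = Fraction(3, 51)  # second card is an ace, given the first was
p_both = p_a2_given_a1 * p_a1
print(p_both)                    # 1/221
```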

Union of Sets

The union of two sets contains an instance of each
element of the two sets.

X = {1, 4, 7, 9}
Y = {2, 3, 4, 5, 6}
X ∪ Y = {1, 2, 3, 4, 5, 6, 7, 9}

C = {IBM, DEC, Apple}
F = {Apple, Grape, Lime}
C ∪ F = {IBM, DEC, Apple, Grape, Lime}
Intersection of Sets

The intersection of two sets contains only those
elements common to the two sets.

X = {1, 4, 7, 9}
Y = {2, 3, 4, 5, 6}
X ∩ Y = {4}

C = {IBM, DEC, Apple}
F = {Apple, Grape, Lime}
C ∩ F = {Apple}
Mutually Exclusive Events

Events with no common outcomes –
the occurrence of one event precludes the occurrence
of the other event.

X = {1, 7, 9}
Y = {2, 3, 4, 5, 6}
X ∩ Y = ∅, so P(X ∩ Y) = 0

C = {IBM, DEC, Apple}
F = {Grape, Lime}
C ∩ F = ∅
Four Types of Probability

Marginal: P(X) – the probability of X occurring
Union: P(X ∪ Y) – the probability of X or Y occurring
Joint: P(X ∩ Y) – the probability of X and Y occurring
Conditional: P(X|Y) – the probability of X occurring given that Y has occurred
General Law of Addition

P(X ∪ Y) = P(X) + P(Y) − P(X ∩ Y)
General Law of Addition -- Example

P(N ∪ S) = P(N) + P(S) − P(N ∩ S)

P(N) = .70
P(S) = .67
P(N ∩ S) = .56

P(N ∪ S) = .70 + .67 − .56 = .81

For example, in the office design problem, noise
reduction would be on one side of the table and
increased storage space on the other.

In this problem, a Yes row and a No row would be


created for one variable and a Yes column and a No
column would be created for the other variable, as
shown in Table
Office Design Problem Probability Matrix

Increase
Storage Space
Yes No Total
Noise Yes .56 .14 .70
Reduction No .11 .19 .30
Total .67 .33 1.00

P(N ∪ S) = P(N) + P(S) − P(N ∩ S)
= .70 + .67 − .56
= .81
Demonstration Problem

If a worker is randomly selected from the company


described in Demonstration Problem 4.1 (below),
what is the probability that the worker is either
technical or clerical? What is the probability that the
worker is either a professional or a clerical?
Type of Position   Male   Female   Total
Managerial 8 3 11
Professional 31 13 44
Technical 52 17 69
Clerical 9 22 31
Total 100 55 155
Demonstration Problem

Examine the raw value matrix of the company’s human


resources data shown in Demonstration Problem 4.1. In
many raw value and probability matrices like this one, the
rows are non-overlapping or mutually exclusive, as are
the columns. In this matrix, a worker can be classified as
being in only one type of position and as either male or
female but not both. Thus, the categories of type of
position are mutually exclusive, as are the categories of
sex, and the special law of addition can be applied to the
human resource data to determine the union
probabilities.
Demonstration Problem

Let T denote technical, C denote clerical, and P denote


professional. The probability that a worker is either
technical or clerical is

P(T U C) = P (T) + P (C) = 69/155 + 31/155 = 100/155 = .645

The probability that a worker is either professional or


clerical is

P (P U C) = P (P) + P (C) = 44/155 + 31/155 = 75/155 = .484


Demonstration Problem

Type of Position   Male   Female   Total
Managerial 8 3 11
Professional 31 13 44
Technical 52 17 69
Clerical 9 22 31
Total 100 55 155

P(T ∪ C) = P(T) + P(C)
= 69/155 + 31/155
= .645
Demonstration Problem

Type of Position   Male   Female   Total
Managerial 8 3 11
Professional 31 13 44
Technical 52 17 69
Clerical 9 22 31
Total 100 55 155

P(P ∪ C) = P(P) + P(C)
= 44/155 + 31/155
= .484
Law of Multiplication
Demonstration Problem

P(X ∩ Y) = P(X) × P(Y|X) = P(Y) × P(X|Y)

P(M) = 80/140 = 0.5714
P(S|M) = 0.20
P(M ∩ S) = P(M) × P(S|M)
= (0.5714)(0.20) = 0.1143
Law of Multiplication

The intersection of two events is called the joint


probability
General law of multiplication is used to find the joint
probability
General law of multiplication gives the probability
that both events x and y will occur at the same time
P(x|y) is a conditional probability that can be stated
as the probability of x given y
Law of Multiplication

If a probability matrix is constructed for a problem,


the easiest way to solve for the joint probability is to
find the appropriate cell in the matrix and select the
answer
Law of Multiplication
Demonstration Problem
Probability Matrix
of Employees

Married
Supervisor Yes No Total
Yes .1143 .1000 .2143
No .4571 .3286 .7857
Total .5714 .4286 1.00

P(S) = 30/140 = 0.2143
P(M) = 80/140 = 0.5714
P(S|M) = 0.20
P(M ∩ S) = P(M) × P(S|M) = (0.5714)(0.20) = 0.1143
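The joint probability in the matrix's upper-left cell can be re-derived with the general law of multiplication, using the employee numbers from the slide:

```python
# General law of multiplication: P(M and S) = P(M) * P(S|M).
# 80 of the 140 employees are married; 20% of married employees are supervisors.
p_m = 80 / 140           # marginal P(M) = 0.5714
p_s_given_m = 0.20       # conditional P(S|M)
p_m_and_s = p_m * p_s_given_m
print(round(p_m_and_s, 4))  # 0.1143
```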
Law of Conditional Probability

If X and Y are two events, the conditional probability


of X occurring given that Y is known or has occurred
is expressed as P(X|Y)
The conditional probability of X given Y is the joint
probability of X and Y divided by the marginal
probability of Y.

P(X|Y) = P(X ∩ Y)/P(Y) = [P(Y|X) × P(X)]/P(Y)
Law of Conditional Probability - Example

70% of respondents believe noise reduction would


improve productivity.
56% of respondents believed both noise reduction
and increased storage space would improve
productivity
A worker is selected randomly and asked about
changes in the office design
What is the probability that a randomly selected
person believes storage space would improve
productivity given that the person believes noise
reduction improves productivity?
Law of Conditional Probability

P(N) = .70
P(N ∩ S) = .56

P(S|N) = P(N ∩ S)/P(N) = .56/.70 = .80
Independent Events

Recall: Two events are independent when the occurrence of one does not affect the probability
of occurrence of the other one
When X and Y are independent, the conditional probability is equal to the marginal probability
Demo Problem

Test the cross tabulation for the 200 executive


responses to determine whether industry type is
independent of geographic location.
Independent Events
Demonstration Problem
Geographic Location
Northeast Southeast Midwest West
D E F G
Finance A 24 10 8 14 56

Manufacturing B 30 6 22 12 70

Communications C 28 18 12 16 74

82 34 42 42 200

The next slide shows that industry and geographic location
are not independent, because at least one exception to
the independence test is present.
Independent Events
Demonstration Problem
Geographic Location
Northeast Southeast Midwest West
D E F G
Finance A .12 .05 .04 .07 .28

Manufacturing B .15 .03 .11 .06 .35

Communications C .14 .09 .06 .08 .37

.41 .17 .21 .21 1.00

P(A|G) = P(A ∩ G)/P(G) = 0.07/0.21 = 0.33
P(A) = 0.28
P(A|G) = 0.33 ≠ P(A) = 0.28
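The independence check on this cell can be sketched directly from the matrix values:

```python
# Independence test: X and Y are independent only if P(A|G) == P(A) for every cell.
p_a = 0.28          # marginal: Finance
p_g = 0.21          # marginal: West
p_a_and_g = 0.07    # joint: Finance and West
p_a_given_g = p_a_and_g / p_g
print(round(p_a_given_g, 2))         # 0.33
print(round(p_a_given_g, 2) == p_a)  # False -> not independent
```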
Revising Prior Estimates of probabilities: Bayes’
Theorem
Bayes’ Theorem expresses how a subjective degree of
belief should rationally change to account for evidence.
Probabilities can be revised as more (additional)
information is gained. New probability is known as
‘Posterior Probability’.
In the Bayesian interpretation, Bayes' theorem is
fundamental to Bayesian statistics, and has application
in fields including science, engineering, medicine and
law.
In the Bayesian (or epistemological) interpretation,
probability measures a degree of belief. Bayes' theorem
then links the degree of belief in a proposition before
and after accounting for evidence.

Revision of Probabilities: Bayes’ Rule

Allows for reversing the order of conditioning (i.e.


P(A|B) can be calculated if you know P(B|A) & P(A))
An extension to the conditional law of probabilities

P(Xi|Y) = P(Y|Xi)·P(Xi) / [P(Y|X1)·P(X1) + P(Y|X2)·P(X2) + ··· + P(Y|Xn)·P(Xn)]
Bayes’ Rule

Note, the numerator of Bayes’ Rule and the law of


conditional probability are the same
The denominator is a collective exhaustive listing of
mutually exclusive outcomes of Y
The denominator is a weighted average of the conditional
probabilities with the weights being the prior probabilities
of the corresponding event
Bayes’ Rule: Ribbon Problem

P(A) = 0.65
P(SJ) = 0.35
P(d|A) = 0.08
P(d|SJ) = 0.12

P(A|d) = P(d|A)·P(A) / [P(d|A)·P(A) + P(d|SJ)·P(SJ)]
= (0.08)(0.65) / [(0.08)(0.65) + (0.12)(0.35)] = 0.553

P(SJ|d) = P(d|SJ)·P(SJ) / [P(d|A)·P(A) + P(d|SJ)·P(SJ)]
= (0.12)(0.35) / [(0.08)(0.65) + (0.12)(0.35)] = 0.447
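Bayes' rule can be sketched as a small helper function and applied to the ribbon problem's numbers (the function name `bayes` is illustrative):

```python
# Bayes' rule: posterior P(X_i | Y) for each hypothesis X_i.
def bayes(priors, likelihoods):
    # Denominator: weighted average of the likelihoods, weights = priors.
    total = sum(p * l for p, l in zip(priors, likelihoods))
    return [p * l / total for p, l in zip(priors, likelihoods)]

priors = [0.65, 0.35]       # P(A), P(SJ)
likelihoods = [0.08, 0.12]  # P(d|A), P(d|SJ)
posteriors = bayes(priors, likelihoods)
print([round(p, 3) for p in posteriors])  # [0.553, 0.447]
```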
Bayes’ Rule: Ribbon Problem
Alternative Approach using Tree Diagram

Alamo (A): prior 0.65
  Defective (d): 0.08 → joint (0.65)(0.08) = 0.052
  Acceptable: 0.92
South Jersey (SJ): prior 0.35
  Defective (d): 0.12 → joint (0.35)(0.12) = 0.042
  Acceptable: 0.88

P(d) = 0.052 + 0.042 = 0.094
P(SJ|d) = 0.042/0.094 = 0.447
