
Probability

Reference:
Probability and Statistics for Engineers and
Scientists
By:
Ronald E. Walpole , Raymond H. Myers
Contents

 Probability
 Random Variables and Probability Distributions
 Mathematical Expectation
 Some Discrete Probability Distributions
 Some Continuous Probability Distributions
1- Probability
Sample Space
 The set of all possible outcomes of a statistical
experiment is called the sample space: “S”
 A coin is tossed:
S={H,T}
 Tossing a die:
S={1,2,3,4,5,6}

 Three items selected at random from a manufacturing


process. Each item is classified as defective (D) or
nondefective (N):
S={DDD,DDN,DND,NDD,DNN,NDN,NND,NNN}
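A sample space built from repeated two-outcome trials, like the one above, can be enumerated mechanically; a minimal sketch in Python:

```python
from itertools import product

# Sample space for classifying three items as defective (D) or nondefective (N)
S = {"".join(outcome) for outcome in product("DN", repeat=3)}

print(sorted(S))
print(len(S))  # 2 * 2 * 2 = 8 outcomes, matching the multiplication rule
```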
Events
 An event is a subset of a sample space.
 Example:
{H} in tossing a coin, {1,3,5} in tossing a die, and {DDD} in
selecting three items from a manufacturing process.

 The complement of an event A with respect to S is
the set of all elements of S that are not in A, denoted A′:
A′ = {x ∈ S : x ∉ A}

 The intersection of two events A and B is the event
containing all elements that are common to A and B:
A∩B = {x : x ∈ A and x ∈ B}
Events (cont)
 Two events A and B are mutually exclusive or disjoint
if:
A∩B = ∅
 The union of two events A and B is the event
containing all of the elements that belong to A or B
or both:
A∪B = {x : x ∈ A or x ∈ B}
Counting sample points
 If an operation can be performed in n1 ways, and for each of
these a second operation can be performed in n2 ways, then the two
operations can be performed together in n1n2 ways.
 Example: A pair of dice is thrown once:
n1n2=(6)(6)=36

 For k operations instead of two, there are n1n2…nk ways
of performing the k operations in sequence.
Counting sample points (cont)
 A permutation is an arrangement of all or part of a set
of objects
 The number of permutations of n distinct objects is
n!
 The number of permutations of n distinct objects taken r at a
time is:
nPr = n!/(n−r)!
 The number of permutations of n distinct objects arranged in
a circle is:
(n−1)!
Counting sample points (cont)
 The number of distinct permutations of n things of
which n1 are of one kind, n2 of a second kind, …, nk of
a kth kind is:
n!/(n1! n2! ··· nk!)
 Example: How many different ways can 3 red, 4 yellow
and 2 blue bulbs be arranged in a string with 9 sockets?
9!/(3! 4! 2!) = 362880/288 = 1260
Counting sample points (cont)
 The number of ways of partitioning a set of n objects
into r cells with n1 elements in the first cell, n2
elements in the second, … is:
n!/(n1! n2! ··· nr!)
where n1+n2+…+nr = n

 The number of combinations of n distinct objects
taken r at a time is:
C(n,r) = n!/(r!(n−r)!)
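The counting rules above map directly onto the standard library; a minimal sketch (the 9-socket bulb example from the earlier slide is reused here):

```python
import math

n, r = 9, 2
p_part = math.perm(n, r)           # permutations taken r at a time: n!/(n-r)! = 72
circular = math.factorial(n - 1)   # circular permutations: (n-1)!
c = math.comb(n, r)                # combinations: n!/(r!(n-r)!) = 36

# Distinct arrangements of 3 red, 4 yellow and 2 blue bulbs in 9 sockets:
bulbs = math.factorial(9) // (math.factorial(3) * math.factorial(4) * math.factorial(2))
print(p_part, c, bulbs)  # 72 36 1260
```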
Probability of an event
 The probability of an event A is the sum of the weights
of all sample points in A. Therefore:
0 ≤ P(A) ≤ 1,  P(∅) = 0,  P(S) = 1

 If an experiment can result in any of N different equally likely
outcomes, and if exactly n of these outcomes correspond to event A, then
P(A) = n/N
Additive rules
 If A and B are any two events:
P(A∪B) = P(A) + P(B) − P(A∩B)

 If A and B are mutually exclusive:
P(A∪B) = P(A) + P(B)

 If A1, A2, …, An are mutually exclusive:
P(A1∪A2∪…∪An) = P(A1) + P(A2) + … + P(An)

 If A1, A2, …, An is a partition of sample space S:
P(A1∪A2∪…∪An) = P(S) = 1

Additive rules (cont)
 For three events A, B and C:
P(A∪B∪C) = P(A) + P(B) + P(C) − P(A∩B) − P(A∩C) − P(B∩C) + P(A∩B∩C)

 If A and A′ are complementary events:
P(A) + P(A′) = 1

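The additive rule can be verified by direct enumeration on a small equally-likely sample space; a sketch using a single die toss (the events A and B here are illustrative choices, not from the slides):

```python
from fractions import Fraction

# Die-toss sample space; each outcome equally likely.
S = {1, 2, 3, 4, 5, 6}
P = lambda E: Fraction(len(E & S), len(S))

A, B = {1, 2, 3}, {3, 4}
# Inclusion-exclusion: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
lhs = P(A | B)
rhs = P(A) + P(B) - P(A & B)
print(lhs, rhs)  # 2/3 2/3
```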
Conditional probability
 The conditional probability of B, given A, is
defined by:
P(B|A) = P(A∩B)/P(A),  provided P(A) > 0

 Example: For a regularly scheduled flight, the probability that it
departs on time is P(D)=0.83, that it
arrives on time is P(A)=0.82, and that it
departs and arrives on time is P(D∩A)=0.78.
The probability that
a plane arrives on time given that it departed on time is:
P(A|D) = P(D∩A)/P(D) = 0.78/0.83 ≈ 0.94
Conditional probability (cont)
The probability that a plane departed on time given that it arrived on time is:
P(D|A) = P(D∩A)/P(A) = 0.78/0.82 ≈ 0.95
 Two events A and B are independent if and only if:

P(B|A) = P(B)
and
P(A|B) = P(A)
Otherwise A and B are dependent.
Multiplicative rules
 If in an experiment the events A and B can both
occur:
P(A∩B) = P(A) P(B|A)

• Generally, P(A∩B) = P(B) P(A|B)

 Two events A and B are independent if and only if
P(A∩B) = P(A) P(B)
2- Random Variables
and Probability
Distributions
Concept of a random variable
Sample space → Real numbers

 A random variable is a function X
that associates a real number with
each element in the sample space.

 Example: Two balls are drawn from
an urn containing 4 red and 3 black
balls. The possible outcomes and the
values y of the random variable Y
(the number of red balls) are:

Sample Space   y
RR             2
RB             1
BR             1
BB             0
Concept of a random variable (cont)
Example: A stockroom clerk returns three safety helmets at random to
three steel mill employees who had previously checked them. If Smith,
Jones and Brown, in that order, each receive one of the three hats, list the sample
points for the possible orders of returning the helmets, and find the values m of the
random variable M that represents the number of correct matches.

Sample Space m
SJB 3
SBJ 1
JSB 1
JBS 0
BSJ 0
BJS 1
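The table above can be reproduced by enumerating the 3! equally likely return orders; a minimal sketch:

```python
from itertools import permutations
from fractions import Fraction
from collections import Counter

owners = ("S", "J", "B")
# Count correct matches for each of the 3! equally likely return orders.
counts = Counter(sum(ret == own for ret, own in zip(order, owners))
                 for order in permutations(owners))
dist = {m: Fraction(k, 6) for m, k in sorted(counts.items())}
print(dist)  # m=0 with prob 1/3, m=1 with 1/2, m=3 with 1/6
```

Note that m = 2 is impossible: if two hats match, the third must match as well.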
Concept of a random variable (cont)
 Types of sample spaces:

 If a sample space contains a finite number of possibilities or


an unending sequence with as many elements as there are
whole numbers, it is called a discrete sample space:
Example: Throwing a die

 If a sample space contains an infinite number of


possibilities equal to the number of points on a line
segment, it is called a continuous sample space.
Example: The distance that a certain make of automobile will travel
on 5 liters of gasoline.
Discrete probability distributions
 The set of ordered pairs (x, f(x)) is a probability
function, probability mass function or probability
distribution of the discrete random variable X if, for
each possible outcome x:

1- f(x) ≥ 0

2- Σx f(x) = 1

3- P(X = x) = f(x)
Discrete probability distributions (cont)
 Example: In safety helmets example the possible values m
of M and their probabilities are given by:

m 0 1 3
P(M=m) 1/3 1/2 1/6

 Example: A shipment of 8 similar microcomputers to retail
outlets contains 3 that are defective. If a school makes a
random purchase of 2 of these computers, find the
probability distribution for the number of defectives.

X: the random variable, the number of defectives
x: its possible values 0, 1, 2

f(x) = P(X = x) = C(3,x) C(5,2−x) / C(8,2),  x = 0, 1, 2
Discrete probability distributions (cont)

x      0      1      2

f(x)   10/28  15/28  3/28
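The distribution above follows from the counting formula f(x) = C(3,x)C(5,2−x)/C(8,2); a sketch that tabulates it exactly:

```python
from math import comb
from fractions import Fraction

# f(x) = C(3, x) * C(5, 2 - x) / C(8, 2): number of defectives among
# 2 computers drawn from 8, of which 3 are defective.
f = {x: Fraction(comb(3, x) * comb(5, 2 - x), comb(8, 2)) for x in range(3)}
print(f)                # Fraction reduces 10/28 to 5/14
print(sum(f.values()))  # 1, as a pmf must
```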
Discrete probability distributions (cont)
 Example: 50% of the automobiles sold by an agency for a
certain foreign car are equipped with diesel engines. Find a
formula for the probability distribution of the number of
diesel models among the next 4 cars sold by this agency.

Probability of selling a diesel model: 0.5

Thus all points in the sample space are equally likely to occur, and there are 2⁴ = 16 of them.

Selling x diesel models and 4−x gasoline models can occur in C(4,x) ways.

Thus:
f(x) = C(4,x)/16,  x = 0, 1, 2, 3, 4
Discrete probability distributions (cont)
 The cumulative distribution F(x) of a discrete random
variable X with probability distribution f(x) is:
F(x) = P(X ≤ x) = Σt≤x f(t),  for −∞ < x < ∞

 Example: In the safety helmets example:
F(m) = 0 for m < 0;  1/3 for 0 ≤ m < 1;  5/6 for 1 ≤ m < 3;  1 for m ≥ 3
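The discrete CDF is just a running sum of the pmf; a sketch using the safety-helmet distribution:

```python
from fractions import Fraction
from itertools import accumulate

# Safety-helmet pmf: P(M=0)=1/3, P(M=1)=1/2, P(M=3)=1/6
values = [0, 1, 3]
pmf = [Fraction(1, 3), Fraction(1, 2), Fraction(1, 6)]
cdf = dict(zip(values, accumulate(pmf)))  # F(m) = sum of f(t) for t <= m
print(cdf)  # F(0)=1/3, F(1)=5/6, F(3)=1
```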
Discrete probability distributions (cont)
 Example: In the diesel engine cars example:

x      0     1     2     3     4

f(x)   1/16  4/16  6/16  4/16  1/16

 Bar Chart:
Discrete probability distributions (cont)

 Probability Histogram

 Discrete cumulative
distribution
Continuous probability distributions
 The function f(x) is a probability density function for
the continuous random variable X, defined over the
set of real numbers R, if:

1- f(x) ≥ 0 for all x ∈ R

2- ∫−∞^∞ f(x) dx = 1

3- P(a < X < b) = ∫a^b f(x) dx
Continuous probability distributions (cont)
 Typical density functions:
Continuous probability distributions (cont)
 Example: A continuous random variable X having the
probability density function :

Verify condition 2 of the probability density function


definition.
Find
Continuous probability distributions (cont)
 The cumulative distribution F(x) of a continuous
random variable X with density function f(x) is given
by:
F(x) = P(X ≤ x) = ∫−∞^x f(t) dt,  for −∞ < x < ∞

 This definition has two results:

1- P(a < X < b) = F(b) − F(a)

2- f(x) = dF(x)/dx
Continuous probability distributions (cont)
 Example: For the density function of the previous example,
find F(x) and use it to evaluate

Therefore

And
Continuous probability distributions (cont)

Continuous cumulative distribution


Empirical distributions
 Consider the data in the table below, which represent the lives
of 40 similar car batteries, recorded to the nearest tenth of a
year:

2.2 4.1 3.5 4.5 3.2 3.7 3.0 2.6
3.4 1.6 3.1 3.3 3.8 3.1 4.7 3.7
2.5 4.3 3.4 3.6 2.9 3.3 3.9 3.1
3.3 3.1 3.7 4.4 3.2 4.1 1.9 3.4
4.7 3.8 3.2 2.6 3.9 3.0 4.2 3.5
Empirical distributions (cont)
 Relative frequency distribution and histogram for the battery lives
are as follows:

Class      Class      Frequency   Relative
Interval   midpoint   f           Frequency
1.5-1.9    1.7        2           0.050
2.0-2.4    2.2        1           0.025
2.5-2.9    2.7        4           0.100
3.0-3.4    3.2        15          0.375
3.5-3.9    3.7        10          0.250
4.0-4.4    4.2        5           0.125
4.5-4.9    4.7        3           0.075
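The frequency column can be reproduced from the raw battery data; a sketch, where the index arithmetic (lower bound 1.5, class width 0.5) is one way to bin the values into the table's classes:

```python
# Battery lives (years); bin into the classes 1.5-1.9, 2.0-2.4, ..., 4.5-4.9
lives = [2.2, 4.1, 3.5, 4.5, 3.2, 3.7, 3.0, 2.6,
         3.4, 1.6, 3.1, 3.3, 3.8, 3.1, 4.7, 3.7,
         2.5, 4.3, 3.4, 3.6, 2.9, 3.3, 3.9, 3.1,
         3.3, 3.1, 3.7, 4.4, 3.2, 4.1, 1.9, 3.4,
         4.7, 3.8, 3.2, 2.6, 3.9, 3.0, 4.2, 3.5]

freq = [0] * 7
for x in lives:
    # class index from tenths-of-a-year, lower bound 1.5, width 0.5
    freq[(round(x * 10) - 15) // 5] += 1

rel = [f / len(lives) for f in freq]
print(freq)  # [2, 1, 4, 15, 10, 5, 3], matching the table
```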
Empirical distributions (cont)
 Estimating the probability density function:

 Skewness of data:
a) Skewed to the left
b) Symmetric
c) Skewed to the right
Empirical distributions (cont)
 Relative cumulative frequency
distribution of battery lives:

Class            Relative
Boundaries       Cumulative Frequency
Less than 1.45   0.000
Less than 1.95   0.050
Less than 2.45   0.075
Less than 2.95   0.175
Less than 3.45   0.550
Less than 3.95   0.800
Less than 4.45   0.925
Less than 4.95   1.000
Joint probability distributions
 The function f(x,y) is a joint probability distribution or joint
probability mass function of the discrete random variables X
and Y if:

1- f(x,y) ≥ 0 for all (x,y)

2- Σx Σy f(x,y) = 1

3- P(X = x, Y = y) = f(x,y)

 For any region A in the xy plane,
P[(X,Y) ∈ A] = ΣΣ(x,y)∈A f(x,y)

Joint probability distributions (cont)
 The function f(x,y) is a joint density function of the
continuous random variables X and Y if:

1- f(x,y) ≥ 0 for all (x,y)

2- ∫−∞^∞ ∫−∞^∞ f(x,y) dx dy = 1

3- P[(X,Y) ∈ A] = ∫∫A f(x,y) dx dy
for any region A in the xy plane
Joint probability distributions (cont)
Example: A joint density function is given by:

 Verify condition 2 of the definition.

 Find , where A is the region


Joint probability distributions (cont)
 The marginal distributions of X alone and of Y alone are
given by:

g(x) = Σy f(x,y)   and   h(y) = Σx f(x,y)
for the discrete case, and by

g(x) = ∫−∞^∞ f(x,y) dy   and   h(y) = ∫−∞^∞ f(x,y) dx
for the continuous case
Joint probability distributions (cont)
 The conditional distribution of the random variable Y, given that
X = x, is given by
f(y|x) = f(x,y)/g(x),  g(x) > 0

 Similarly, the conditional distribution of the random variable X,
given that Y = y, is given by
f(x|y) = f(x,y)/h(y),  h(y) > 0

 For a discrete or continuous random variable X, one has:
P(a < X < b | Y = y) = Σa<x<b f(x|y)  (discrete)  or  ∫a^b f(x|y) dx  (continuous)
Joint probability distributions (cont)
 Example: given the joint density function:

find g(x), h(y), f(x|y) and evaluate


Joint probability distributions (cont)
 It is known from the previous slide that
f(x|y) = f(x,y)/h(y)
If f(x|y) does not depend on y, we may write:
f(x,y) = g(x) h(y)

 The random variables X and Y are statistically independent if
and only if
f(x,y) = g(x) h(y)

 Generally, X1, X2, …, Xn are mutually statistically independent
if and only if
f(x1, x2, …, xn) = f1(x1) f2(x2) ··· fn(xn)
Joint probability distributions (cont)
Example: The shelf life of a certain food product is a random
variable with a probability density function given by:

If X1, X2, X3 represent the shelf lives for three of these containers,
selected independently, find P(X1<2, 1<X2<3, X3>2).
3- Mathematical
Expectation
Mean of a random variable
 Let X be a random variable with probability
distribution f(x). The mean or expected value of X is:

μ = E(X) = Σx x f(x)   if X is discrete

μ = E(X) = ∫−∞^∞ x f(x) dx   if X is continuous
Mean of a random variable (cont)
Example: In a gambling game a man is paid $5 if he gets all
heads or all tails when three coins are tossed, and he pays out $3 if
either one or two heads show. What is his expected gain?

The random variable of interest is Y, the amount the
gambler can win.
The possible values of Y are $5 if event E1={HHH,TTT} occurs
and -$3 if event E2={HHT,HTH,THH,HTT,THT,TTH} occurs.
Thus:
E(Y) = 5·P(E1) − 3·P(E2) = 5(1/4) − 3(3/4) = −1
On average the gambler loses $1 per game.
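The expected gain can be checked by enumerating all eight equally likely coin outcomes; a minimal sketch:

```python
from fractions import Fraction
from itertools import product

# Payoff: +$5 for HHH or TTT, -$3 otherwise, over the 8 equally likely tosses.
payoffs = [5 if len(set(toss)) == 1 else -3 for toss in product("HT", repeat=3)]
expected = Fraction(sum(payoffs), len(payoffs))
print(expected)  # -1: the gambler loses $1 per game on average
```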
Mean of a random variable (cont)
Example: X is a random variable that denotes the life of a certain
device. The probability density function is:

Find the expected life of this type of device.


Mean of a random variable (cont)
 Theorem: Let X be a random variable with
probability distribution f(x). The mean or expected
value of the random variable g(X) is:

μg(X) = E[g(X)] = Σx g(x) f(x)   (discrete)

μg(X) = E[g(X)] = ∫−∞^∞ g(x) f(x) dx   (continuous)

 Example: For a given distribution f(x), find the expected
value of g(X) = 4X + 3:
Mean of a random variable (cont)
 Expected value of g(X,Y):
E[g(X,Y)] = Σx Σy g(x,y) f(x,y)   (discrete)

E[g(X,Y)] = ∫−∞^∞ ∫−∞^∞ g(x,y) f(x,y) dx dy   (continuous)

For constants a and b, E[aX + bY] = aE[X] + bE[Y]

 Example: Let

Find :
Variance and Covariance
 Let X be a random variable with probability
distribution f(x) and mean μ. The variance
of X is:

σ² = E[(X − μ)²] = Σx (x − μ)² f(x)   (discrete)

σ² = E[(X − μ)²] = ∫−∞^∞ (x − μ)² f(x) dx   (continuous)

 Theorem: The variance of a random variable X is:
σ² = E(X²) − μ²
Variance and Covariance (cont)
 The variance of a random variable g(X) is:

σ²g(X) = Σx [g(x) − μg(X)]² f(x)   (discrete)

σ²g(X) = ∫−∞^∞ [g(x) − μg(X)]² f(x) dx   (continuous)

For constants a and b, Var[aX + b] = a²Var[X]

 Let X and Y be random variables with joint probability
distribution f(x,y). The covariance of X and Y is:

σXY = E[(X − μX)(Y − μY)]
Variance and Covariance (cont)
 Theorem: The covariance of two random variables X
and Y is:

σXY = E(XY) − μX μY

 If Cov(X, Y) > 0:
positive relationship between X and Y

 If Cov(X, Y) < 0:
negative relationship between X and Y
Variance and Covariance (cont)
 If X and Y are independent, then Cov(X, Y) = 0
 Correlation coefficient:
ρXY = σXY / (σX σY)

 It can be shown that:
−1 ≤ ρXY ≤ 1

note: Cov(X, Y) is scale dependent;
ρXY is scale-free
Linear Combinations of R.V’s
 Properties of Expectation:
① E(aX) = aE(X) , a is a constant.
② E(X + b) = E(X) + b, b is a constant.
③ E[g(X) h(Y)] = E[g(X)] E[h(Y)].
④ E(XY) = E(X)E(Y) if X & Y are independent

 Exercise: Find the expected value of Y =

x 0 1 2 3
f(x) 1/3 1/2 0 1/6
Linear Combinations of R.V’s (cont)
 Properties of Variance
① Var(aX) = a2Var(X), a is a constant.
② Var(X + b) = Var(X) , b is a constant.
③ Var(aX + bY) = Var(aX) + Var(bY) + 2 Cov(aX, bY)
= a2Var(X) + b2Var(Y) + 2abCov(X, Y)
Var(aX – bY) = a2Var(X) + b2Var(Y) – 2abCov(X, Y)

 Also, when X and Y are independent (so Cov(X, Y) = 0):
Var(aX ± bY) = a²Var(X) + b²Var(Y)
Chebyshev’s Theorem
 The probability that any random variable X will
assume a value within k standard deviations of the
mean is at least 1 − 1/k²:

P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k²
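Chebyshev's bound can be checked empirically; a sketch in Python, using an exponential sample (mean and standard deviation both 2, a skewed, deliberately non-normal case) as the illustrative choice:

```python
import random

random.seed(0)
# Exponential sample: mean = std = 2, skewed and decidedly non-normal.
xs = [random.expovariate(0.5) for _ in range(100_000)]
mu = sum(xs) / len(xs)
sigma = (sum((x - mu) ** 2 for x in xs) / len(xs)) ** 0.5

k = 2
inside = sum(mu - k * sigma < x < mu + k * sigma for x in xs) / len(xs)
bound = 1 - 1 / k**2
print(inside, ">=", bound)  # empirical mass within 2 sigma vs. the 0.75 bound
```

For this distribution the empirical mass within 2σ is close to 0.95, well above the (deliberately loose) bound of 0.75.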
Chebyshev’s Theorem (cont)
 With full information about f(x), P{a<X<b} can be computed exactly;
with only partial information (μ and σ), Chebyshev's theorem gives a bound.

 Proof:
4- Some Discrete
Probability
distributions
Discrete Uniform Distribution
 If X assumes the values x1, x2, …, xk with equal probability, then:
f(x; k) = 1/k,  x = x1, x2, …, xk

 Mean: μ = (Σi xi)/k

 Variance: σ² = Σi (xi − μ)²/k

 Example: tossing a die (k = 6, f(x) = 1/6)
Binomial and Multinomial Distributions
 Bernoulli process:
n separate independent trials
Only two possible outcomes for each trial

 Example: Toss a die three times.


“success” = outcome is 6 , X = number of successes.

Possible
NNN NN6 N6N 6NN N66 6N6 66N 666
Outcome

X 0 1 1 1 2 2 2 3

Distribution of X  Binomial Distribution


Binomial and Multinomial Distributions (cont)

 Binomial Distribution:
number of "successes" in n Bernoulli trials, where
P(success) = p, P(failure) = q = 1 − p
 The probability distribution of the binomial random variable
X, the number of successes in n independent trials, is:

b(x; n, p) = C(n,x) pˣ qⁿ⁻ˣ,  x = 0, 1, 2, …, n

 For this process:
μ = np,  σ² = npq
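The binomial pmf and its mean are easy to tabulate; a sketch reusing the diesel-cars setting (n = 4, p = 0.5) from the earlier example:

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) = C(n, x) p^x (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 4, 0.5
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]
mean = sum(x * f for x, f in enumerate(pmf))
print(pmf)   # [0.0625, 0.25, 0.375, 0.25, 0.0625], i.e. C(4,x)/16
print(mean)  # 2.0 = n * p
```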
Binomial and Multinomial Distributions (cont)
 Example: The probability that a certain kind of component
will survive a given shock is . Find the probability that
exactly 2 of the next 4 components tested survive.
 Solution:

 Note:
Binomial and Multinomial Distributions (cont)

 Multinomial Distribution
 k possible outcomes E1, E2, …, Ek
 Each outcome Ei has probability pi
 In n independent trials, observe (X1, X2, …, Xk),
where Xi = number of times that Ei occurs.

 The probability distribution function is:

f(x1, …, xk; p1, …, pk, n) = [n!/(x1! ··· xk!)] p1^x1 ··· pk^xk

with Σi xi = n , Σi pi = 1
Binomial and Multinomial Distributions (cont)
 Example: If a pair of dice is tossed 6 times, what is the
probability of obtaining a total of 7 or 11 twice, a matching
pair once, and any other combination 3 times?

 Solution: The possible events are:

E1: a total of 7 or 11    E2: a matching pair
E3: neither a pair nor a total of 7 or 11
The corresponding probabilities are:
p1 = 8/36 = 2/9,  p2 = 6/36 = 1/6,  p3 = 1 − 2/9 − 1/6 = 11/18
The required probability is:
f(2, 1, 3; 2/9, 1/6, 11/18, 6) = [6!/(2! 1! 3!)] (2/9)² (1/6) (11/18)³ ≈ 0.1127
Hypergeometric Distribution
 n items are selected from N possible items (without replacement)
containing k successes and (N − k) failures
 X = number of successes in the n items selected

h(x; N, n, k) = C(k,x) C(N−k, n−x) / C(N,n)
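The hypergeometric pmf can be computed directly from the counting formula; a sketch that reuses the earlier shipment example (N = 8 computers, k = 3 defective, n = 2 drawn):

```python
from math import comb
from fractions import Fraction

def hypergeom_pmf(x, N, n, k):
    """P(X = x) = C(k, x) C(N - k, n - x) / C(N, n)."""
    return Fraction(comb(k, x) * comb(N - k, n - x), comb(N, n))

# Shipment example: N = 8, k = 3 defective, draw n = 2.
dist = {x: hypergeom_pmf(x, 8, 2, 3) for x in range(3)}
print(dist)  # same distribution as the earlier table: 10/28, 15/28, 3/28
mean = sum(x * f for x, f in dist.items())
print(mean)  # n*k/N = 2*3/8 = 3/4
```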
Hypergeometric Distribution (cont)

 Mean: μ = nk/N

 Variance: σ² = [(N−n)/(N−1)] · n · (k/N) · (1 − k/N)

 Relationship with the Binomial:

 When n/N is small, k/N plays the role of p in the binomial,

i.e., h(x; N, n, k) ≈ b(x; n, k/N)
Hypergeometric Distribution (cont)

 Multivariate Hypergeometric distribution

 N items, of which a1 are of type 1, ···, ak are of type k.

 From the N items, select n with x1 of type 1, ···, xk of type k:

f(x1, …, xk; a1, …, ak, N, n) = [C(a1,x1) ··· C(ak,xk)] / C(N,n)
Geometric and Negative Binomial Distributions

 Geometric Distribution
Perform independent trials, each with success probability p, until a success occurs
X = number of trials until the first success occurs

g(x; p) = p qˣ⁻¹,  x = 1, 2, 3, …

Mean: μ = 1/p
Variance: σ² = (1 − p)/p²
Geometric and Negative Binomial Distributions (cont)

 Negative Binomial Distribution

 X = number of trials until the kth success occurs:
b*(x; k, p) = C(x−1, k−1) pᵏ qˣ⁻ᵏ,  x = k, k+1, k+2, …

 Y = number of failures before the kth success occurs:
Y = X − k
Poisson Distribution and Poisson Process
 Distribution of the number of outcomes in a
specified region  Poisson Distribution
 number of monthly car accidents in Tehran
 number of customers who visit a pharmacy per hour
 number of yearly breakdowns of an elevator

 This region may be:

 a time interval, a length, an area, a volume, a page of a book
Poisson Distribution and Poisson Process (cont)
 Poisson Process
 The number of outcomes occurring in one time interval is independent of
the number that occurs in any other disjoint time interval.

 The probability that a single outcome will occur in a short time interval is
proportional to the length of the time interval.

 The probability that more than one outcome will occur in a short time
interval is negligible.

 N(t): number of outcomes that have occurred by time t
Poisson Distribution and Poisson Process (cont)

 Poisson Distribution
X = number of outcomes occurring in a given time
interval t:

p(x; λt) = e^(−λt) (λt)ˣ / x!,  x = 0, 1, 2, …

where λ = average number of outcomes per unit time.

 Mean: μ = λt
Variance: σ² = λt
 Thus the mean and variance of a Poisson random variable are equal.
Poisson Distribution and Poisson Process (cont)
 Example: The average number of oil tankers arriving each
day at a certain city is 10. The facilities can handle at most
15 tankers per day. What is the probability that on a given
day tankers will have to be sent away?

 Solution:
X: number of tankers arriving each day, Poisson with λt = 10.
P(X > 15) = 1 − P(X ≤ 15) = 1 − 0.9513 = 0.0487
Poisson Distribution and Poisson Process (cont)

 Poisson Distribution as a Limiting Form of the Binomial

 X = a binomial random variable.
 When n → ∞, p → 0, and μ = np remains constant:
b(x; n, p) → p(x; μ)
5- Some Continuous
Probability
distributions
(Continuous) Uniform Distribution
 X has a Uniform Distribution with parameters α and β if

f(x) = 1/(β − α),  α ≤ x ≤ β
       0,          otherwise

• Random numbers follow the
Uniform distribution between 0 and 1
Normal Distribution

 Describes many phenomena

 Good approximation to the binomial and hypergeometric
 Limiting distribution of sample averages
 Two parameters: mean (μ), standard deviation (σ)

 X has a normal distribution with mean μ and variance σ²:

n(x; μ, σ) = [1/(σ√(2π))] e^(−(x−μ)²/(2σ²)),  −∞ < x < ∞
Normal Distribution (cont)
 The curve is symmetric about μ
 Mode: x = μ ; inflection points at x = μ ± σ
Areas under Normal Curve

P(x1 < X < x2) = ∫x1^x2 n(x; μ, σ) dx
= the area under the curve between x1 and x2
Areas under Normal Curve (cont)
 To tabulate normal curve areas, transform to the
standard normal: Z = (X − μ)/σ, so that μZ = 0, σZ = 1
Areas under Normal Curve (cont)
P(Z < z) = area to the left of z under the standard normal curve

Example: a fragment of the standard normal table:

z     .00     .01     .02     …   .09
0.0   0.5000  0.5040  0.5080      0.5359
0.1   0.5398  0.5438  0.5478      0.5753
0.2   0.5793  0.5832  0.5871      0.6141
Areas under Normal Curve (cont)
 Example: Z = a standard normal R.V.
(a)
(b)
(c)

 Example: X ~ normal, ,
(a)
(b)
(c)
Application of Normal Distribution
 Example:
A certain type of storage battery lasts on average 3.0
years, with a standard deviation of 0.5 year. Assuming a normal
distribution, find the probability that a given battery will
last less than 2.3 years.

 Solution:
P(X < 2.3) = P(Z < (2.3 − 3.0)/0.5) = P(Z < −1.4) = 0.0808
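The table lookup can be replaced by a direct evaluation of the normal CDF via the error function; a sketch for the battery example:

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X < x) for X ~ N(mu, sigma^2), via the error function."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

# Battery example: mu = 3.0, sigma = 0.5, P(X < 2.3)
p = normal_cdf(2.3, 3.0, 0.5)
print(round(p, 4))  # 0.0808
```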
Application of Normal Distribution (cont)
 Example:
The diameter of a ball bearing has a normal distribution
with mean 3.0 and standard deviation 0.005. The buyer sets
the specifications on the diameter to be 3.0 ± 0.01. How
many ball bearings will be scrapped?
 Solution:
z = ±0.01/0.005 = ±2.0
P(scrap) = P(Z < −2) + P(Z > 2) = 2(0.0228) = 0.0456

4.56% will be scrapped


Normal Approximation to Binomial
 Poisson vs. Binomial: good for n large and p small
 Normal vs. Binomial: good for n large and p near 1/2

 Theorem: X = binomial, μ = np, σ² = npq.

Then, as n → ∞, the distribution of Z = (X − np)/√(npq)
approaches the standard normal n(z; 0, 1).

 The approximation becomes better as:

n gets larger and p stays closer to 1/2
Gamma and Exponential Distributions
 Exponential Distribution
 Useful in modeling the time between arrivals at service facilities
 One parameter: β (the mean); rate λ = 1/β

 A special case of the Gamma distribution

f(x) = (1/β) e^(−x/β),  x > 0
       0,               otherwise

Mean: μ = β
Variance: σ² = β²
(mean = standard deviation)
Gamma and Exponential Distributions (cont)
 For an Exponential Distribution one can define:
 CDF:
F(x) = 1 − e^(−x/β),  x > 0

 Tail probability:
P(X > x) = e^(−x/β),  x > 0
Gamma and Exponential Distributions (cont)
 Example: The time to failure of a certain type of component, in
years, is distributed exponentially with β = 5. If 5 of these
components are installed in different systems, what is the
probability that at least 2 are still functioning at the end of 8
years?

 Solution:
P(one component survives 8 years) = e^(−8/5) ≈ 0.2
P(at least 2 of 5 survive) = Σx=2..5 C(5,x) (0.2)ˣ (0.8)⁵⁻ˣ ≈ 0.26
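The two-stage computation (exponential survival probability, then a binomial tail over the 5 components) can be sketched as follows, keeping the exact value e^(−8/5) rather than rounding to 0.2:

```python
from math import comb, exp

beta = 5            # mean time to failure (years)
p = exp(-8 / beta)  # P(one component still works after 8 years) = e^(-8/5)

# P(at least 2 of 5 independent components survive): binomial tail
prob = sum(comb(5, x) * p**x * (1 - p) ** (5 - x) for x in range(2, 6))
print(round(p, 4), round(prob, 4))  # 0.2019 0.2666
```

With the rounded value p = 0.2, the same sum gives ≈ 0.2627, matching the slide.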
Gamma and Exponential Distributions (cont)
 Relationship to the Poisson Process:
if the number of events in any time interval t has a Poisson
distribution with parameter λt, then the distribution of the
elapsed time between two successive events is
exponential with mean β = 1/λ.

Why? Poisson: P(no events in t) = e^(−λt)

Let X = time until the first event.
Then P(X > t) = P(no events in t) = e^(−λt)
i.e., F(t) = 1 − e^(−λt) = CDF of the exponential with β = 1/λ
Gamma and Exponential Distributions (cont)
 Model for Component Lifetime
 The Exponential Distribution is useful due to its "memoryless property"
 Memoryless property:
if T ~ exponential with rate λ (> 0), then
P(T > t + Δt | T > t) = P(T > Δt)

 The distribution of the additional lifetime = the original
distribution of the lifetime (memoryless property)
Gamma and Exponential Distributions (cont)
 Gamma Distribution
 Gamma function: Γ(α) = ∫0^∞ x^(α−1) e^(−x) dx,  α > 0

Properties: Γ(α) = (α − 1)Γ(α − 1);  Γ(n) = (n − 1)! for positive integer n

- X ~ Gamma with parameters α and β if

f(x) = [1/(β^α Γ(α))] x^(α−1) e^(−x/β),  x > 0
       0,                                otherwise
where α > 0, β > 0
Gamma and Exponential Distributions (cont)
 In the Gamma Distribution:
the standard gamma has β = 1

Mean: μ = αβ
Variance: σ² = αβ²

 If α = 1:
f(x) = (1/β) e^(−x/β),  x > 0, β > 0   Exponential Distribution

 CDF: the incomplete gamma function
F(x; α) = ∫0^x [y^(α−1) e^(−y) / Γ(α)] dy
Chi-Square Distribution
 Special case of the Gamma distribution:
 α = n/2, β = 2, where n = a positive integer
 X ~ Chi-Square with parameter n (degrees of freedom) if

f(x) = [1/(2^(n/2) Γ(n/2))] x^(n/2−1) e^(−x/2),  x > 0
       0,                                        otherwise
Chi-Square Distribution (cont)
 Useful in statistical inference, hypothesis testing
 The shape of Chi-Square Distribution
Weibull Distribution

 X has a Weibull Distribution with parameters α and β if

f(x) = αβ x^(β−1) e^(−α x^β),  x > 0
       0,                      otherwise

 If β = 1: f(x) = α e^(−αx) (exponential with rate α, i.e. mean 1/α)

 Useful in reliability and life-testing problems
Weibull Distribution (cont)
 The shape of Weibull Distribution
