Probability B and W PDF For Students
Reference:
Probability and Statistics for Engineers and
Scientists
By:
Ronald E. Walpole, Raymond H. Myers
Contents
Probability
Random Variables and Probability Distributions
Mathematical Expectation
Some Discrete Probability Distributions
Some Continuous Probability Distributions
1- Probability
Sample Space
The set of all possible outcomes of a statistical
experiment is called the sample space: “S”
A coin is tossed:
S={H,T}
Tossing a die:
S={1,2,3,4,5,6}
Generally, the number of ways of partitioning a set of n objects into r cells with n1, n2, …, nr elements respectively is
n! / (n1! n2! … nr!), where n1 + n2 + … + nr = n
Example: Two balls are drawn from an urn containing 4 red and 3 black balls. The possible outcomes and the values y of the random variable Y (the number of red balls) are:

Sample Space   y
RR             2
RB             1
BR             1
BB             0
Concept of a random variable (cont)
Example: A stockroom clerk returns three safety helmets at random to three steel mill employees, who had previously checked them. If Smith, Jones and Brown, in that order, each receive one of the three hats, list the sample points for the possible orders of returning the helmets and find the values m of the random variable M that represents the number of correct matches.
Sample Space   m
SJB            3
SBJ            1
JSB            1
JBS            0
BSJ            0
BJS            1
Concept of a random variable (cont)
Types of sample spaces:
1- Finite
2- Countably infinite
3- Continuous (uncountable)
A sample space of type 1 or 2 is discrete; a sample space of type 3 is continuous.
Discrete probability distributions (cont)
Example: In the safety-helmets example the possible values m of M and their probabilities are given by:

m        0     1     3
P(M=m)  1/3   1/2   1/6
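The distribution above can be checked by brute force, enumerating all six equally likely orders in which the helmets can be returned (a quick sketch, not part of the original slides):

```python
from itertools import permutations
from fractions import Fraction

owners = ("S", "J", "B")  # Smith, Jones, Brown, in that order

# Count correct matches for every way the three helmets can be returned.
counts = {}
for returned in permutations(owners):
    m = sum(r == o for r, o in zip(returned, owners))
    counts[m] = counts.get(m, 0) + 1

# Each of the 6 orders is equally likely.
dist = {m: Fraction(c, 6) for m, c in sorted(counts.items())}
print(dist)
```

The enumeration recovers P(M=0) = 1/3, P(M=1) = 1/2, P(M=3) = 1/6.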
Discrete probability distributions (cont)
Example: If 50% of the automobiles sold by an agency for a certain foreign car are equipped with diesel engines, find a formula for the probability distribution of the number of diesel models among the next 4 cars sold by this agency.
Since p = 1/2, each of the 2^4 = 16 points in the sample space is equally likely. Thus:
f(x) = C(4, x) / 16,  x = 0, 1, 2, 3, 4
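The formula f(x) = C(4, x)/16 for the number of diesel models can be tabulated in a few lines (an illustrative sketch, not from the slides):

```python
from math import comb
from fractions import Fraction

# P(diesel) = 1/2 for each of the next n = 4 cars sold;
# X = number of diesel models among them.
n = 4
f = {x: Fraction(comb(n, x), 2**n) for x in range(n + 1)}
print(f)

# A valid probability distribution must sum to 1.
assert sum(f.values()) == 1
```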
Discrete probability distributions (cont)
The cumulative distribution F(x) of a discrete random variable X with probability distribution f(x) is:
F(x) = P(X ≤ x) = Σ_{t ≤ x} f(t),  for −∞ < x < ∞
For the diesel example:

x      0      1      2      3      4
f(x)  1/16   4/16   6/16   4/16   1/16

Bar Chart:
Discrete probability distributions (cont)
Probability histogram and discrete cumulative distribution (figures).
Continuous probability distributions
The function f(x) is a probability density function for the continuous random variable X, defined over the set of real numbers R, if:
1- f(x) ≥ 0, for all x in R
2- ∫_{−∞}^{∞} f(x) dx = 1
3- P(a < X < b) = ∫_a^b f(x) dx
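The three conditions can be checked numerically for a hypothetical density, here f(x) = 2x on (0, 1), chosen only for illustration (not a density from the slides):

```python
# Hypothetical example density: f(x) = 2x on (0, 1), 0 elsewhere.
def f(x):
    return 2 * x if 0 < x < 1 else 0.0   # condition 1: never negative

def integrate(g, a, b, n=100_000):
    # Simple midpoint rule; exact for linear integrands like this one.
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, 0, 1)        # condition 2: should be 1
p = integrate(f, 0.25, 0.5)       # condition 3: P(0.25 < X < 0.5)
print(total, p)
```

Here P(0.25 < X < 0.5) = 0.5² − 0.25² = 0.1875, which the numerical integral reproduces.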
Continuous probability distributions (cont)
Typical density functions:
Continuous probability distributions (cont)
Continuous probability distributions (cont)
Example: A continuous random variable X having the probability density function:
1-
2-
Continuous probability distributions (cont)
Example: For the density function of the previous example, find F(x) and use it to evaluate probabilities of the form
P(a < X ≤ b) = F(b) − F(a)
Continuous probability distributions (cont)
Skewness of data:
a) Skewed to the left
b) Symmetric
c) Skewed to the right
Empirical distributions (cont)
Relative cumulative frequency distribution of battery lives:

Class Boundaries   Relative Cumulative Frequency
Less than 1.45     0.000
Less than 1.95     0.050
Less than 2.45     0.075
Less than 2.95     0.175
Less than 3.45     0.550
Less than 3.95     0.800
Less than 4.45     0.925
Less than 4.95     1.000
Joint probability distributions
The function f(x,y) is a joint probability distribution or joint probability mass function of the discrete random variables X and Y if:
1- f(x,y) ≥ 0, for all (x, y)
2- Σ_x Σ_y f(x,y) = 1
3- P(X = x, Y = y) = f(x,y)
The function f(x,y) is a joint density function of the continuous random variables X and Y if:
1- f(x,y) ≥ 0, for all (x, y)
2- ∫∫ f(x,y) dx dy = 1, integrating over the whole plane
3- P[(X, Y) ∈ A] = ∫∫_A f(x,y) dx dy, for any region A in the xy-plane
X1, X2, X3 represent the shelf lives for three of those containers, selected independently. Find P(X1 < 2, 1 < X2 < 3, X3 > 2).
3- Mathematical Expectation
Mean of a random variable
Let X be a random variable with probability distribution f(x). The mean or expected value of X is:
μ = E(X) = Σ_x x f(x)          if X is discrete
μ = E(X) = ∫ x f(x) dx         if X is continuous (integral over all x)
Mean of a random variable (cont)
Example: In a gambling game a man is paid $5 if he gets all heads or all tails when three coins are tossed, and he pays out $3 if either one or two heads show. What is his expected gain?
Example: Find the expected value of g(X) = 4X + 3.
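For the gambling example, the expected gain can be verified by enumerating the eight equally likely coin outcomes (an illustrative sketch):

```python
from fractions import Fraction
from itertools import product

# Enumerate the 8 equally likely outcomes of tossing three coins.
payoffs = []
for toss in product("HT", repeat=3):
    heads = toss.count("H")
    # $5 for all heads or all tails, -$3 for one or two heads.
    payoffs.append(5 if heads in (0, 3) else -3)

expected_gain = Fraction(sum(payoffs), 8)
print(expected_gain)
```

The player wins $5 on 2 of the 8 outcomes and loses $3 on the other 6, so the expected gain is 10/8 − 18/8 = −$1 per game.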
Mean of a random variable (cont)
Expected Value of g(X,Y):
μ_g(X,Y) = E[g(X,Y)] = Σ_x Σ_y g(x,y) f(x,y)       (discrete)
μ_g(X,Y) = E[g(X,Y)] = ∫∫ g(x,y) f(x,y) dx dy      (continuous)
Example: Let
Find:
Variance and Covariance
Let X be a random variable with probability distribution f(x) and mean μ. The variance of X is:
σ² = E[(X − μ)²] = Σ_x (x − μ)² f(x)        (discrete)
σ² = E[(X − μ)²] = ∫ (x − μ)² f(x) dx      (continuous)
Equivalently, σ² = E(X²) − μ².
Variance and Covariance (cont)
The variance of a random variable g(X) is:
σ²_g(X) = E{[g(X) − μ_g(X)]²} = Σ_x [g(x) − μ_g(X)]² f(x)       (discrete)
σ²_g(X) = E{[g(X) − μ_g(X)]²} = ∫ [g(x) − μ_g(X)]² f(x) dx      (continuous)
The covariance of X and Y is:
σ_XY = Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)]
Variance and Covariance (cont)
Theorem: The covariance of two random variables X and Y with means μ_X and μ_Y is:
σ_XY = E(XY) − μ_X μ_Y
If Cov(X, Y) > 0: positive relationship between X and Y.
If Cov(X, Y) < 0: negative relationship between X and Y.
Variance and Covariance (cont)
If X and Y are independent, then Cov(X, Y) = 0.
Correlation coefficient: ρ_XY = σ_XY / (σ_X σ_Y), with −1 ≤ ρ_XY ≤ 1.

x      0     1     2     3
f(x)  1/3   1/2    0    1/6
Linear Combinations of R.V’s (cont)
Properties of Variance
① Var(aX) = a2Var(X), a is a constant.
② Var(X + b) = Var(X) , b is a constant.
③ Var(aX + bY) = Var(aX) + Var(bY) + 2 Cov(aX, bY)
= a2Var(X) + b2Var(Y) + 2abCov(X, Y)
Var(aX – bY) = a2Var(X) + b2Var(Y) – 2abCov(X, Y)
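Property ③ also holds exactly for sample moments, which the following simulation sketch illustrates (the data and the constants a, b are arbitrary choices, not from the slides):

```python
import random

random.seed(0)

# Empirical check of Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
# on simulated, deliberately dependent data.
n = 10_000
X = [random.gauss(0, 1) for _ in range(n)]
Y = [x + random.gauss(0, 0.5) for x in X]   # Y depends on X
a, b = 2.0, -3.0

def mean(v): return sum(v) / len(v)
def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / len(v)
def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((x - mu) * (y - mv) for x, y in zip(u, v)) / len(u)

lhs = var([a * x + b * y for x, y in zip(X, Y)])
rhs = a * a * var(X) + b * b * var(Y) + 2 * a * b * cov(X, Y)
print(lhs, rhs)   # the two agree up to floating-point rounding
```

Because the identity is algebraic, the two sides match exactly (to rounding error) even though the sample moments themselves are noisy.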
Chebyshev's Theorem (cont)
The probability that any random variable X assumes a value within k standard deviations of its mean is at least 1 − 1/k²:
P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k²
With full information (the density f(x)), P(a < X < b) can be computed exactly; with only partial information (μ and σ), Chebyshev's theorem still gives a lower bound.
4- Some Discrete
Probability
distributions
Discrete Uniform Distribution
If X assumes the values x1, x2, …, xk with equal probability, then:
f(x; k) = 1/k,  x = x1, x2, …, xk
Mean = μ = (Σ_i xi) / k
Variance = σ² = (Σ_i (xi − μ)²) / k
(For k = 5, each of x1, …, x5 has probability 1/5.)
Example: Toss a die.
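For the die example, the mean and variance formulas give 7/2 and 35/12 (a quick check using exact fractions):

```python
from fractions import Fraction

# Discrete uniform distribution on a fair die: f(x) = 1/6, x = 1..6
xs = range(1, 7)
k = Fraction(6)
mean = sum(Fraction(x) for x in xs) / k                 # (1+...+6)/6
var = sum((Fraction(x) - mean) ** 2 for x in xs) / k    # spread about the mean
print(mean, var)
```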
Binomial and Multinomial Distributions
Bernoulli process:
n repeated independent trials
Only two possible outcomes for each trial
For example, with X = number of sixes in three tosses of a die (6 = six, N = not a six):

Possible Outcome   NNN  NN6  N6N  6NN  N66  6N6  66N  666
x                   0    1    1    1    2    2    2    3
Binomial Distribution
A binomial random variable counts the number of "successes" in n Bernoulli trials where P(success) = p, P(failure) = q = 1 − p.
The probability distribution of the binomial random variable X, the number of successes in n independent trials, is:
b(x; n, p) = C(n, x) p^x q^(n−x),  x = 0, 1, 2, …, n
Note: the mean is μ = np and the variance is σ² = npq.
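A minimal sketch of the binomial formula, confirming numerically that the mean and variance come out to np and npq (the parameter values are chosen only for illustration):

```python
from math import comb

# b(x; n, p) = C(n, x) p^x (1-p)^(n-x)
def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 10, 0.3
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]
mean = sum(x * f for x, f in enumerate(pmf))
var = sum((x - mean) ** 2 * f for x, f in enumerate(pmf))
print(mean, var)   # np = 3.0 and npq = 2.1
```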
Binomial and Multinomial Distributions (cont)
Multinomial Distribution
k possible outcomes E1, E2, …, Ek
Each outcome Ei has probability pi (i = 1, 2, …, k)
In n independent trials,
f(x1, x2, …, xk; p1, p2, …, pk, n) = [n! / (x1! x2! … xk!)] p1^x1 p2^x2 … pk^xk
where Xi = number of times that Ei occurs, with Σ xi = n and Σ pi = 1.
Binomial and Multinomial Distributions (cont)
Example: If a pair of dice is tossed 6 times, what is the probability of obtaining a total of 7 or 11 twice, a matching pair once, and any other combination 3 times?
p1 = P(7 or 11) = 8/36 = 2/9,  p2 = P(matching pair) = 6/36 = 1/6,  p3 = 1 − p1 − p2 = 11/18
f(2, 1, 3; 2/9, 1/6, 11/18, 6) = [6! / (2! 1! 3!)] (2/9)² (1/6) (11/18)³ ≈ 0.1127
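The multinomial computation for the dice example can be reproduced exactly with rational arithmetic (an illustrative sketch):

```python
from math import factorial
from fractions import Fraction

# p1 = P(total of 7 or 11), p2 = P(matching pair), p3 = everything else
p1 = Fraction(8, 36)    # 6 ways to roll 7, 2 ways to roll 11
p2 = Fraction(6, 36)    # (1,1), (2,2), ..., (6,6)
p3 = 1 - p1 - p2

# Multinomial probability: 6 tosses with counts (2, 1, 3)
coef = factorial(6) // (factorial(2) * factorial(1) * factorial(3))
prob = coef * p1**2 * p2 * p3**3
print(float(prob))   # ≈ 0.1127
```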
Hypergeometric Distribution (cont)
Mean = μ = nk/N
Variance = σ² = [(N − n)/(N − 1)] · n · (k/N) · (1 − k/N)
i.e., the binomial mean and variance with p = k/N, with the variance corrected by the factor (N − n)/(N − 1).
Geometric Distribution
Perform independent trials, each with probability p of success, until a success occurs.
X = number of the trial on which the first success occurs:
g(x; p) = p q^(x−1),  x = 1, 2, 3, …
Mean = μ = 1/p
Variance = σ² = (1 − p)/p²
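A quick numerical check of the geometric mean and variance formulas, truncating the infinite sum far out in the tail (p = 0.25 is an arbitrary choice for illustration):

```python
# g(x; p) = p (1-p)^(x-1), x = 1, 2, ...  ->  mean 1/p, variance (1-p)/p^2
p = 0.25
pmf = [p * (1 - p) ** (x - 1) for x in range(1, 2001)]  # tail is negligible
mean = sum(x * f for x, f in enumerate(pmf, start=1))
var = sum((x - mean) ** 2 * f for x, f in enumerate(pmf, start=1))
print(mean, var)   # close to 1/p = 4 and (1-p)/p^2 = 12
```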
Geometric and Negative Binomial Distributions (cont)
Negative Binomial Distribution
X = number of the trial on which the k-th success occurs:
b*(x; k, p) = C(x − 1, k − 1) p^k q^(x−k),  x = k, k + 1, k + 2, …
Poisson Distribution and Poisson Process (cont)
Poisson Process
1- The number of outcomes occurring in one time interval is independent of the number that occur in any other disjoint time interval.
2- The probability that a single outcome will occur in a short time interval is proportional to the length of the interval.
3- The probability that more than one outcome will occur in such a short time interval is negligible.
N(t): number of outcomes that have occurred by time t.
Poisson Distribution and Poisson Process (cont)
Poisson Distribution
X = number of outcomes occurring in a given time interval t:
p(x; λt) = e^(−λt) (λt)^x / x!,  x = 0, 1, 2, …
Mean = μ = λt
Variance = σ² = λt
Thus the mean and the variance of the Poisson distribution are equal.
Poisson Distribution and Poisson Process (cont)
Example: The average number of oil tankers arriving each day at a certain city is 10. The facilities can handle at most 15 tankers per day. What is the probability that on a given day tankers will have to be sent away?
Solution:
X: number of tankers arriving each day, Poisson with λt = 10.
P(X > 15) = 1 − P(X ≤ 15) = 1 − Σ_{x=0}^{15} p(x; 10) = 1 − 0.9513 = 0.0487
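The tanker probability can be computed directly from the Poisson formula (a sketch of the calculation):

```python
from math import exp, factorial

# X ~ Poisson with mean λt = 10 tankers per day; the port handles at most 15.
mu = 10
cdf_15 = sum(exp(-mu) * mu**x / factorial(x) for x in range(16))  # P(X <= 15)
p_turned_away = 1 - cdf_15
print(round(p_turned_away, 4))   # ≈ 0.0487
```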
Continuous Uniform Distribution
The density function of the continuous uniform random variable X on the interval [α, β] is:
f(x; α, β) = 1/(β − α),  α ≤ x ≤ β
f(x) = 0, otherwise
Normal Distribution
n(x; μ, σ) = [1/(√(2π) σ)] e^(−(x−μ)²/(2σ²)),  −∞ < x < ∞
Normal Distribution (cont)
The normal curve is symmetric about μ; its mode is at x = μ, and its inflection points are at x = μ ± σ.
Areas under Normal Curve
P(x1 < X < x2) = ∫_{x1}^{x2} n(x; μ, σ) dx = area under the curve between x1 and x2.
Areas under Normal Curve (cont)
To tabulate normal-curve areas, transform to the standard normal variable
Z = (X − μ)/σ, which has mean 0 and variance 1.
Then P(x1 < X < x2) = P(z1 < Z < z2), where z1 = (x1 − μ)/σ and z2 = (x2 − μ)/σ.
Example: X ~ normal, ,
(a)
(b)
(c)
Application of Normal Distribution
Example:
A certain type of storage battery lasts on average 3.0 years, with a standard deviation of 0.5 year. Assuming a normal distribution, find the probability that a given battery will last less than 2.3 years.
Solution:
P(X < 2.3) = P(Z < (2.3 − 3.0)/0.5) = P(Z < −1.4) = 0.0808
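The battery probability can be reproduced with the standard normal CDF, which is expressible through the error function (a sketch, not part of the slides):

```python
from math import erf, sqrt

# Standard normal CDF via the error function.
def phi(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 3.0, 0.5
z = (2.3 - mu) / sigma          # z = -1.4
print(round(phi(z), 4))         # P(X < 2.3) ≈ 0.0808
```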
Application of Normal Distribution (cont)
Example:
The diameter of a ball bearing has a normal distribution with mean 3.0 and standard deviation 0.005. The buyer sets the specifications on the diameter to be 3.0 ± 0.01. How many ball bearings will be scrapped?
Solution:
P(X < 2.99) + P(X > 3.01) = 2 P(Z < −2.0) = 2(0.0228) = 0.0456,
so about 4.56% of the bearings will be scrapped.
Theorem: If X is a binomial random variable with mean μ = np and variance σ² = npq, then, as n → ∞, the distribution of
Z = (X − np)/√(npq)
approaches the standard normal distribution n(z; 0, 1).
Gamma and Exponential Distributions
Exponential Distribution:
f(x) = (1/β) e^(−x/β),  x > 0, β > 0
f(x) = 0, otherwise
Mean = β and standard deviation = β (mean = standard deviation).
Gamma and Exponential Distributions (cont)
For an exponential distribution one can define:
CDF: F(x) = P(X ≤ x) = 1 − e^(−x/β),  x > 0
Tail probability: P(X > x) = e^(−x/β),  x > 0
Gamma and Exponential Distributions (cont)
Example: The time to failure of a certain type of component, in years, is distributed exponentially with mean β = 5. If 5 of these components are installed in different systems, what is the probability that at least 2 are still functioning at the end of 8 years?
Solution:
P(one component functions beyond 8 years) = e^(−8/5) ≈ 0.2
Let Y = number of components still functioning after 8 years; Y is binomial with n = 5, p = 0.2:
P(Y ≥ 2) = 1 − Σ_{y=0}^{1} b(y; 5, 0.2) = 1 − 0.7373 = 0.2627
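A sketch of the calculation; note that keeping the unrounded survival probability e^(−8/5) ≈ 0.2019 gives ≈ 0.2666, slightly above the 0.2627 obtained when p is rounded to 0.2:

```python
from math import comb, exp

# P(one component survives 8 years) for an exponential mean life beta = 5:
p = exp(-8 / 5)                  # ≈ 0.2019 (often rounded to 0.2)

# P(at least 2 of 5 independent components survive): binomial tail
p_at_least_2 = 1 - sum(
    comb(5, y) * p**y * (1 - p) ** (5 - y) for y in (0, 1)
)
print(round(p_at_least_2, 4))    # ≈ 0.2666 with the unrounded p
```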
Gamma and Exponential Distributions (cont)
Relationship to the Poisson Process
If the number of events in any time interval of length t has a Poisson distribution with parameter λt, then the distribution of the elapsed time between two successive events is exponential with parameter β = 1/λ.
Gamma Distribution:
f(x) = [1/(β^α Γ(α))] x^(α−1) e^(−x/β),  x > 0
f(x) = 0, otherwise
where α > 0, β > 0, and Γ(α) = ∫_0^∞ x^(α−1) e^(−x) dx.
Gamma and Exponential Distributions (cont)
In the Gamma Distribution:
If β = 1: the standard gamma distribution.
If α = 1:
f(x) = (1/β) e^(−x/β),  x > 0, β > 0   (Exponential Distribution)
f(x) = 0, otherwise
Chi-Square Distribution (cont)
A special case of the gamma distribution with α = v/2 and β = 2, where v is the number of degrees of freedom.
Useful in statistical inference and hypothesis testing.
The shape of the Chi-Square Distribution (figure).
Weibull Distribution