
Lecture Statistics and Probability for Engineering Stat 2010

CHAPTER 4
4. Random Variables and Some Basic Probability Distributions
Simple probabilities can be computed from elementary considerations or by the methods
described in the previous chapter. However, in dealing with the probabilities of whole classes of
events, we need more efficient methods of analysis. For this purpose we introduce the concept of
a probability distribution. A probability distribution is defined for random variables (r.v.s).
4.1. Random Variables (Discrete and Continuous)
Definition 4.1:
Let S be the sample space of an experiment and let X be a real-valued function defined on the
sample space S; then X is called a random variable (or stochastic variable). A random
variable, usually shortened to r.v. (rv), is thus a function defined on a sample space S and taking
values in the real line ℝ, and is denoted by a capital letter, such as X, Y, or Z. The value of the
r.v. X at the sample point s is X(s), and the set of all values of X, that is, the range of X, is
usually denoted by X(S) or RX.
Example 4.1: Consider tossing three distinct coins once, so that the sample space is
S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}. Then a random variable X can be
defined by X(s) = number of heads (H's). Hence, with each sample point we can associate
a number for X as shown in the table below.
S HHH HHT HTH THH HTT THT TTH TTT
X(s) 3 2 2 2 1 1 1 0
Thus, X(HHH) = 3, X(HHT) = X(HTH) = X(THH) = 2, X(HTT) = X(THT) = X(TTH) = 1, and
X(TTT) = 0, so that X(S) = {0, 1, 2, 3}.
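The mapping in Example 4.1 can be reproduced programmatically; a minimal sketch in Python (standard library only):

```python
from itertools import product

# Enumerate the sample space of tossing three distinct coins once and
# evaluate the random variable X(s) = number of heads, as in Example 4.1.
sample_space = ["".join(s) for s in product("HT", repeat=3)]
X = {s: s.count("H") for s in sample_space}

print(X["HHH"])                  # 3
print(sorted(set(X.values())))   # the range X(S): [0, 1, 2, 3]
```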
4.1.1. Discrete Random Variables
Definition 4.2:
 A random variable X is called discrete (or of discrete type) if X takes on a finite or countably
infinite number of values; that is, either finitely many values such as x1, ..., xn, or countably
infinitely many values such as x0, x1, x2, . . . .
Equivalently, a discrete random variable can be described as one that:
 Takes whole-number values (like 0, 1, 2, 3, etc.)
 Takes a finite or countably infinite number of values
 Jumps from one value to the next and cannot take any value in between.
Example 4.2:
Experiment                           Random Variable (X)        Variable values
Children of one gender in a family   Number of girls            0, 1, 2, …
Answer 23 questions of an exam       Number of correct answers  0, 1, 2, ..., 23

4.1.1.1 Probability Distribution of Discrete Random Variables


Definition 4.3:

If X is a discrete random variable, the function given by f(x) = P(X = x) for each x within the
range of X is called the probability distribution or probability mass function (pmf) of X.

Example 4.3: Find the probability mass function corresponding to the random variable X of
Example 4.1. That is, the r.v. X with X(S) = {0, 1, 2, 3}.

p(0) = P(X=0) = 1/8, p(1) = P(X=1) = 3/8, p(2) = P(X=2) = 3/8, p(3) = P(X=3) = 1/8

x      0     1     2     3
p(x)  1/8   3/8   3/8   1/8

Remark:- The probability distribution function or probability mass function p(x) of a discrete
random variable X should satisfy the following two conditions:

i) p(x) ≥ 0
ii) ∑x p(x) = 1 (the summation is taken over all possible values of x)

Example 4.4: Could p(x), for x = 1, 2, 3, 4, 5, be a p.m.f.? Justify your answer.
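As a quick check of the Remark's two conditions, the coin-toss p.m.f. of Example 4.3 can be verified numerically; a sketch using exact fractions:

```python
from fractions import Fraction
from math import comb

# p.m.f. of X = number of heads in three fair coin tosses (Example 4.3):
# p(x) = C(3, x) / 8 for x = 0, 1, 2, 3.
p = {x: Fraction(comb(3, x), 8) for x in range(4)}

assert all(px >= 0 for px in p.values())   # condition i): p(x) >= 0
assert sum(p.values()) == 1                # condition ii): probabilities sum to 1
print(p[1])   # 3/8
```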

4.1.1.2. Distribution Functions of Discrete Random Variables


Definition 4.4:

If X is a discrete random variable, the function given by

F(x) = P(X ≤ x) = ∑_{t ≤ x} f(t) for all x in ℝ,

where f(t) is the value of the probability distribution (p.m.f.) of X at t, is called the distribution
function, or the cumulative probability distribution function, of X.

If X takes on only a finite number of values x1, x2, . . . , xn, then the distribution function is
given by

F(x) = 0 for x < x1; F(x) = f(x1) + . . . + f(xi) for xi ≤ x < xi+1, i = 1, . . . , n−1; and F(x) = 1 for x ≥ xn.

Example 4.5: Find the distribution function F of the total number of heads obtained in four
tosses of a balanced coin.
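A sketch of Example 4.5: build the p.m.f. p(x) = C(4, x)/16 and accumulate it into F:

```python
from fractions import Fraction
from math import comb

# Example 4.5: X = number of heads in four tosses of a balanced coin,
# so p(x) = C(4, x) / 16 for x = 0, 1, 2, 3, 4.
p = {x: Fraction(comb(4, x), 16) for x in range(5)}

def F(x):
    """Distribution function F(x) = P(X <= x) = sum of p(t) over t <= x."""
    return sum(px for t, px in p.items() if t <= x)

print(F(1))   # 5/16
print(F(4))   # 1
```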

4.1.2. Continuous Random Variables

Definition 4.5:

A r.v. X is called continuous (of continuous type) if X takes all values in a proper interval I ⊆ ℝ.
Equivalently, a continuous random variable can be described as one that:
 Takes values from the set of real numbers.
 Is usually obtained by measuring.
 Takes infinitely many values in an interval.
 Has too many values to list, unlike a discrete variable.
Example 4.6: The following examples are continuous r.v.s
Experiment Random Variable X Variable values
Weigh 100 People Weight 45.1, 78, ...
Measure Time Between Arrivals Inter-Arrival time 0, 1.3, 2.78, ...

4.1.2.1 Probability Distribution Functions of Continuous Random Variables


Definition 4.6:

A function f(x), defined over the set of real numbers, is called the probability density function of
a continuous random variable X if and only if

P(a ≤ X ≤ b) = ∫_a^b f(x) dx for any real constants a ≤ b.

Probability density functions are also called probability densities (p.d.f.s), or simply densities.
Remark:
 The probability density function f(x) of a continuous random variable X has the following
properties (it always satisfies the following conditions):
i) f(x) ≥ 0 for all x, i.e. for −∞ < x < ∞

ii) ∫_{−∞}^{∞} f(x) dx = 1

If X is a continuous random variable and a and b are real constants with a ≤ b, then
P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b)
Example 4.7: If X is a random variable with probability density function f(x),
find the constant k and compute P(0.5 ≤ X ≤ 1).
Example 4.8: Show that f(x), for 0 < x < 1, can represent a density function.
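The densities of Examples 4.7 and 4.8 are not reproduced above, so the sketch below works through the same two steps (finding the normalizing constant, then computing a probability) for an assumed density f(x) = k·x² on (0, 1); the density choice is illustrative only:

```python
# Illustrative only: the density of Example 4.7 is not reproduced above, so
# we assume f(x) = k * x**2 on (0, 1) and 0 elsewhere.  Condition ii) forces
# 1 = integral from 0 to 1 of k*x^2 dx = k/3, hence k = 3.
k = 3.0
f = lambda x: k * x**2

def integrate(f, a, b, n=100_000):
    """Composite midpoint-rule numerical integration of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

print(round(integrate(f, 0.0, 1.0), 4))  # total probability: 1.0
print(round(integrate(f, 0.5, 1.0), 4))  # P(0.5 <= X <= 1) = 1 - 0.5^3 = 0.875
```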
4.1.2.2 Distribution Functions of Continuous Random Variables
Definition 4.7:
If X is a continuous random variable and the value of its probability density at t is f(t), then the
function given by

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt

is called the distribution function, or the cumulative distribution function, of the r.v. X.
Theorem 4.1:
If f(x) and F(x) are the values of the density and the distribution function of X, then
P(a ≤ X ≤ b) = F(b) − F(a)
for any real constants a and b with a ≤ b, and
f(x) = dF(x)/dx where the derivative exists at x.
Example 4.9: Find the distribution function of the random variable X and evaluate P(0.5 ≤ X ≤ 1)
using Theorem 4.1, where f(x) is the probability density of X.
Example 4.10: If the function f(x) is the density function of a r.v. X, find
(a) The constant C (b) P(1 < X < 2)
(c) The distribution function of the r.v. X (d) Using the result of (c), P(1 < X ≤ 2)
4.2. Expectation
The expectation of a random variable X is very often called the mean of X and is denoted by
E(X). The mean, or expectation, of the random variable X gives a single value that acts as a
representative or average of the values of the random variable X.
4.2.1. Expectation of a Random Variable
Definition 4.9:
Let X be a discrete random variable which takes values x1, . . . , xn with corresponding
probabilities P(X = xi) = p(xi), i = 1, . . . , n. Then the expectation of X (or mathematical
expectation, or mean value of X) is denoted by E(X) and is defined as:

E(X) = x1 p(x1) + . . . + xn p(xn) = ∑_{i=1}^{n} xi p(xi) = ∑_x x p(x)

The last summation is taken over all appropriate values of x.


Example 4.11: Let a random experiment involve tossing two distinct coins once. If the
random variable X is the number of heads, what is the expected value of X?
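A sketch of Example 4.11 using exact arithmetic:

```python
from fractions import Fraction
from math import comb

# Example 4.11: X = number of heads in tossing two distinct coins once,
# so p(0) = 1/4, p(1) = 1/2, p(2) = 1/4.
p = {x: Fraction(comb(2, x), 4) for x in range(3)}
EX = sum(x * px for x, px in p.items())
print(EX)   # 1
```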
Definition 4.10:
Let the random variable X be continuous with p.d.f. f(x); its expected value is defined by:

E ( X )   x f ( x) dx , provided that the integral exists.

Example 4.12: Suppose that you are expecting a message at some time past 5 P.M. From
experience it is know that X, the number of hours after 5 P.M. until the message arrives, is a
random variable with the following probability density function:

What will be the expected amount of time past 5 P.M. until the message arrives?
Properties of Expectation
 If X = K is a constant, then E(X) = K.
 If K is a constant and X is a random variable, then E(KX) = K E(X).
 If X is a random variable and a and b are constants, then E(aX + b) = a E(X) + b.
 For any two random variables X and Y, E(X + Y) = E(X) + E(Y). This can be generalized
to n random variables; that is, if X1, X2, X3, . . . , Xn are random variables, then
E(X1 + X2 + X3 + . . . + Xn) = E(X1) + E(X2) + E(X3) + . . . + E(Xn).
 If X and Y are independent random variables, then
E(XY) = E(X) E(Y)
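The additivity and independence properties above can be verified exactly on two small independent r.v.s (a fair coin and a fair die, chosen here only for illustration):

```python
from fractions import Fraction
from itertools import product

# Verify E(X + Y) = E(X) + E(Y) and, for independent X and Y,
# E(XY) = E(X) E(Y), exactly, on two small illustrative r.v.s:
# X = heads on one fair coin toss, Y = face of one fair die.
pX = {x: Fraction(1, 2) for x in (0, 1)}
pY = {y: Fraction(1, 6) for y in range(1, 7)}

E = lambda p: sum(v * pv for v, pv in p.items())
EX, EY = E(pX), E(pY)                      # 1/2 and 7/2

# Joint probabilities under independence: p(x, y) = p(x) p(y)
EXpY = sum((x + y) * pX[x] * pY[y] for x, y in product(pX, pY))
EXY  = sum(x * y * pX[x] * pY[y] for x, y in product(pX, pY))

assert EXpY == EX + EY    # additivity of expectation
assert EXY == EX * EY     # product rule for independent r.v.s
print(EX + EY)            # 4
```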
Example 4.13: If the random variables X and Y have density functions f(x) and g(y),
defined for 0 < x < 4 and 1 < y < 5 respectively, find

a. E(X) b. E(Y) c. E(X + Y) d. E(2X + 3Y) e. E(XY), if X and Y are independent.


4.3. Variance of a Random Variable
The variance (or the standard deviation) of a r.v is a measure of the dispersion, or scatter, of the
values of the random variable about the mean.
Definition 4.11:
Let X be a random variable. The variance of X, denoted by V(X), Var(X), or σ²X, is defined as:
V(X) = E(X − E(X))²
 The positive square root of V(X) is called the standard deviation of X and is denoted by σX.
Theorem 4.2:
Let X be a random variable; then the variance of X is equivalently given by
V(X) = E(X²) − [E(X)]²
Example 4.14: Find the variance and standard deviation of the random variable X, the number
of heads that come up in tossing three distinct coins once.
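A sketch of Example 4.14 using Theorem 4.2:

```python
from fractions import Fraction
from math import comb, sqrt

# Example 4.14: X = number of heads in tossing three distinct coins once,
# with p(x) = C(3, x) / 8.
p = {x: Fraction(comb(3, x), 8) for x in range(4)}

EX  = sum(x * px for x, px in p.items())     # E(X) = 3/2
EX2 = sum(x**2 * px for x, px in p.items())  # E(X^2) = 3
var = EX2 - EX**2                            # Theorem 4.2
print(var)                  # 3/4
print(round(sqrt(var), 4))  # standard deviation, approx 0.866
```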
4.3.1. Properties of Variance
 If X is a random variable and K is a constant, then V(X + K) = V(X).
 For a constant K and a random variable X, V(KX) = K² V(X).
 If X1, X2, X3, . . . , Xn are n independent random variables, then
V(X1 + X2 + X3 + . . . + Xn) = V(X1) + V(X2) + V(X3) + . . . + V(Xn).
 Let X be a random variable with finite variance. Then for any real number a,
V(X) = E[(X − a)²] − [E(X) − a]².
Example 4.15: Let a continuous random variable X have probability density function f(x),
for a constant K. Find
(a) The variance of X (b) The standard deviation of X (c) Var(KX) (d) Var(K + X)
4.4. Common Discrete Probability Distributions

4.4.1. Binomial Probability Distribution


Repeated trials play an important role in probability and statistics, especially when the number of
trials is fixed, the parameter p (the probability of success) is the same for each trial, and the trials
are all independent. Several random variables arise in connection with repeated trials. The one
we shall study here concerns the total number of successes.
Definition 4.12:
 A random variable X has a Binomial distribution, and is referred to as a Binomial random
variable, if and only if its probability distribution is given by

p(x) = (n choose x) p^x (1 − p)^(n−x) for x = 0, 1, . . . , n

 Properties of the Binomial Probability Distribution

Let X be a Binomial random variable with n trials and probability of success p; then:

 Mean: E(X) = µ = ∑_{x=0}^{n} x (n choose x) p^x (1 − p)^(n−x) = np

 Variance: Var(X) = E(X − E(X))² = np(1 − p)
Example 4.16: Suppose the probability that an item produced by a certain machine will be
defective is 0.1. Find the probability that a sample of 10 items will contain at most one defective
item. Assume that the quality of successive items is independent.
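A numeric sketch of Example 4.16, computing P(X ≤ 1) for X ~ Binomial(10, 0.1), with a check that the mean matches np:

```python
from math import comb

# Example 4.16: X ~ Binomial(n = 10, p = 0.1); find P(X <= 1).
n, p = 10, 0.1
binom = lambda x: comb(n, x) * p**x * (1 - p)**(n - x)

prob = binom(0) + binom(1)   # P(X = 0) + P(X = 1)
print(round(prob, 4))        # approx 0.7361

# Numeric check of the mean property E(X) = np:
mean = sum(x * binom(x) for x in range(n + 1))
print(round(mean, 4))        # 1.0
```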
4.4.2. Hypergeometric Probability Distribution
If an experiment consists of counting the number of successes x in a sample of size n
selected without replacement from N items, among which M are labeled "success", then the
probability distribution of the experiment is hypergeometric.
Definition 4.13: A random variable X has a hypergeometric distribution, and is referred to as a
hypergeometric random variable, if and only if its probability distribution is given by

p(x) = (M choose x)(N−M choose n−x) / (N choose n) for x = 0, 1, 2, . . . , n; x ≤ M, n−x ≤ N−M

 Properties of the Hypergeometric Distribution

Let X be a hypergeometric random variable with N items, sample size n, and M items labeled as
success; then:

 Mean: E(X) = µ = ∑_x x (M choose x)(N−M choose n−x) / (N choose n) = nM/N

 Variance: Var(X) = E(X − E(X))² = n (M/N)(1 − M/N)(N − n)/(N − 1)

Example 4.17: The components of a 6-component system are to be randomly chosen from a bin
of 20 used components. The resulting system will be functional if at least 4 of its 6 components
are in working condition. If 15 of the 20 components in the bin are in working condition, what is
the probability that the resulting system will be functional?
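A sketch of Example 4.17 using the hypergeometric p.m.f.:

```python
from fractions import Fraction
from math import comb

# Example 4.17: N = 20 components in the bin, M = 15 in working condition,
# n = 6 chosen at random without replacement; the system is functional if
# at least 4 of the 6 chosen components work.
N, M, n = 20, 15, 6
h = lambda x: Fraction(comb(M, x) * comb(N - M, n - x), comb(N, n))

prob = sum(h(x) for x in range(4, n + 1))   # P(X >= 4)
print(round(float(prob), 4))                # approx 0.8687
```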

4.4.3. Poisson Distribution

When the number of trials n is large, computation of binomial probabilities using Definition 4.12
involves a prohibitive amount of work. The Poisson distribution is presented here as a probability
distribution that can be used to approximate binomial probabilities of this kind, especially when
n → ∞ and p → 0 while np remains constant. Letting np = λ,

the binomial distribution can be written as

p(x) = (n choose x) (λ/n)^x (1 − λ/n)^(n−x) for x = 0, 1, . . . , n

After some mathematical derivation we obtain the following probability distribution.
Definition 4.14:
 A random variable X has a Poisson distribution, and is referred to as a Poisson random
variable, if and only if its probability distribution is given by

p(x) = λ^x e^(−λ) / x! for x = 0, 1, 2, . . .

Thus, when n → ∞ and p → 0 while np = λ remains constant, the number of successes is a
random variable having a Poisson distribution with the parameter λ.

 Properties of the Poisson Distribution

Let X be a Poisson random variable with parameter λ (the average number of times the event
occurs); then:

 Mean: E(X) = ∑_x x λ^x e^(−λ) / x! = λ

 Variance: Var(X) = E(X − E(X))² = λ


Example 4.18:

A Poisson random variable X can be, for example,
a. The number of car accidents in a given period of time
b. The number of customers arriving during a given period of time.

Example 4.19:
Suppose that the average number of accidents occurring weekly on a particular stretch of a
highway equals 3. Calculate the probability that there is at least one accident this week.
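A sketch of Example 4.19: with λ = 3, P(at least one accident) = 1 − P(X = 0):

```python
from math import exp, factorial

# Example 4.19: X ~ Poisson(lam = 3) accidents per week.
lam = 3
poisson = lambda x: lam**x * exp(-lam) / factorial(x)

prob = 1 - poisson(0)    # P(X >= 1) = 1 - P(X = 0) = 1 - e^(-3)
print(round(prob, 4))    # approx 0.9502
```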

4.5. Common Continuous Probability Distributions


4.5.1. Normal Distribution
The normal distribution in many ways is the cornerstone of modern statistical theory.
Definition 4.15:
 A continuous random variable X has a normal distribution, and is referred to as a normal
random variable, if and only if its probability density is given by

f(x) = (1 / (σ √(2π))) e^(−(x − µ)² / (2σ²)) for −∞ < x < ∞,

where µ is the mean and σ is the standard deviation.

 Properties of the Normal Distribution

Let X be a normal random variable with parameters µ and σ²; then:

 Mean: E(X) = ∫_{−∞}^{∞} x f(x) dx = µ

 Variance: Var(X) = E(X − E(X))² = σ²

Example 4.20: Suppose we are looking at a national examination result whose scores are
approximately normally distributed with mean µ = 500 and standard deviation σ = 100. What is
the probability that a score falls between 600 and 750?
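Example 4.20 can be checked numerically; the standard normal c.d.f. can be written in terms of the error function, Φ(z) = (1 + erf(z/√2))/2:

```python
from math import erf, sqrt

# Example 4.20: scores ~ N(mu = 500, sigma = 100); find P(600 < X < 750).
mu, sigma = 500, 100

def Phi(z):
    """Standard normal c.d.f., written in terms of the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Standardize: Z = (X - mu) / sigma, so the bounds become z1 = 1, z2 = 2.5.
z1, z2 = (600 - mu) / sigma, (750 - mu) / sigma
prob = Phi(z2) - Phi(z1)
print(round(prob, 4))   # approx 0.1524
```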

4.5.1.1. Standard Normal Distribution


Definition 4.16:
The normal distribution with mean µ = 0 and standard deviation σ = 1 is referred to as the
standard normal distribution, which is given by:

f(z) = (1 / √(2π)) e^(−z²/2) for −∞ < z < ∞

 Transformation to Standard Normal:

We can transform any normal random variable X with mean µ and standard deviation σ into a
standard normal random variable Z with mean 0 and standard deviation 1. The linear
transformation is

Z = (X − µ) / σ
Example 4.21:
1. Find the area under the standard normal distribution which lies
a. Between Z = 0 and Z = 0.96 d. Between Z = -0.67 and Z = 0.75
b. Between Z = - 1.45 and Z = 0 e. To the left of Z = 0.35
c. To the right of Z = - 0.35 f. Between Z = 0.25 and Z = 1.25
2. Find the value of Z if
a. The normal curve between 0 and Z(positive) is 0.4726
b. The area to the left of Z is 0.9868
3. Suppose we are looking at a national examination whose scores are approximately
normal with µ = 500 and σ = 100. What is the probability that a score falls between 600
and 750? (Use the standard normal distribution table.)
4.5.2. Gamma Distribution
The Gamma distribution is defined based on the Gamma function, which is defined as

Γ(α) = ∫_0^∞ x^(α−1) e^(−x) dx for α > 0.

Moreover, it can be shown that this integral is finite for α > 0.

Definition 4.17: A random variable X has a Gamma distribution, and is referred to as a Gamma
random variable, if and only if its probability distribution is given by

f(x) = (1 / (β^α Γ(α))) x^(α−1) e^(−x/β) for x > 0, α > 0, β > 0, and f(x) = 0 elsewhere.

The Gamma distribution is suitable for describing waiting times between successive occurrences
of a random event and is also used for describing survival times.

 Properties of the Gamma Distribution

Let X be a Gamma random variable with parameters α and β; then:

 Mean: E(X) = ∫_0^∞ x f(x) dx = αβ
 Variance: Var(X) = E(X − E(X))² = αβ²

Example 4.22:

The lifetime of certain equipment is described by a r.v. X whose distribution is Gamma with
parameters α = 2 and β = 13.
a) Find the corresponding p.d.f. for x > 0. b) Determine the expected lifetime.
c) Find the variance. d) Find the probability that the lifetime is at least 1 unit of time.
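A numeric sketch of Example 4.22, taking the stated parameters α = 2, β = 13 at face value:

```python
from math import exp, gamma

# Example 4.22: X ~ Gamma(alpha = 2, beta = 13), using the parameters as
# stated in the text.  f(x) = x^(alpha-1) e^(-x/beta) / (beta^alpha Gamma(alpha)).
alpha, beta = 2, 13
f = lambda x: x**(alpha - 1) * exp(-x / beta) / (beta**alpha * gamma(alpha))

print(alpha * beta)       # b) expected lifetime: E(X) = alpha * beta = 26
print(alpha * beta**2)    # c) variance: Var(X) = alpha * beta^2 = 338

# d) P(X >= 1), by midpoint-rule integration of f over [1, T] for large T
# (the tail beyond T = 400 is negligible for beta = 13).
n, T = 200_000, 400.0
h = (T - 1.0) / n
prob = sum(f(1.0 + (i + 0.5) * h) for i in range(n)) * h
print(round(prob, 4))     # approx 0.9972
```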

4.5.3. Exponential Distribution

When events occur uniformly at random over time at a rate of λ events per unit time, the
random variable X giving the time to the first event has an exponential distribution. It is a special
case of the gamma distribution with α = 1 and β = 1/λ.

Definition 4.18: A random variable X has an Exponential distribution, and is referred to as an
Exponential random variable, if and only if its probability distribution is given by:

f(x) = λ e^(−λx) for x > 0, and f(x) = 0 elsewhere.

 Properties of the Exponential Distribution

Let X be an exponential random variable with parameter λ; then:

 Mean: E(X) = ∫_0^∞ x λ e^(−λx) dx = 1/λ
 Variance: Var(X) = E(X − E(X))² = 1/λ²

Example 4.23: The time X between customer demands for call connections is assumed to follow
an exponential distribution with λ = 2. Find P(1 < X < 4).
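A sketch of Example 4.23 using the exponential c.d.f. F(x) = 1 − e^(−λx):

```python
from math import exp

# Example 4.23: X ~ Exponential(lam = 2), with c.d.f. F(x) = 1 - e^(-lam*x).
lam = 2
F = lambda x: 1 - exp(-lam * x)

prob = F(4) - F(1)       # P(1 < X < 4) = e^(-2) - e^(-8)
print(round(prob, 4))    # approx 0.135
```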
4.5.3.1. Application of Exponential Distribution on Reliability Theory
Reliability may refer to the probability that a specified mission will be completed or to the
probability that a lifetime of a continuous random variable exceeds a specified time limit.
Reliability is applied in many areas of engineering, including design of mechanical devices,
electronic equipment, power transmission systems and so on. The exponential distribution is
often used in reliability studies as the model for the time until failure of a device.

Example 4.24: The time to failure (in hours) of fans in a personal computer can be modeled by
an exponential distribution with λ = 0.0003.
(a) What is the probability that the fan will last at least 10,000 hours?
(b) What is the probability (Proportion) that the fan will last at most 7,000 hours?
(c) What is the probability that the fan will last between 7,000 and 10,000 hours?
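A sketch of Example 4.24, using the reliability (survival) function R(t) = P(X ≥ t) = e^(−λt):

```python
from math import exp

# Example 4.24: fan time-to-failure X ~ Exponential(lam = 0.0003 per hour).
lam = 0.0003
R = lambda t: exp(-lam * t)    # reliability (survival) function P(X >= t)

print(round(R(10_000), 4))             # (a) approx 0.0498
print(round(1 - R(7_000), 4))          # (b) approx 0.8775
print(round(R(7_000) - R(10_000), 4))  # (c) approx 0.0727
```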
