Unit 4 Notes

U4-1  Variability and Randomness

Statistics involves the study of variability. But how can we work with something that involves so much uncertainty? To understand this, we consider the idea of random behaviour.

The key lies in the fact that random behaviour is unpredictable in the short-run, but has a regular and predictable distribution in the long-run.

U4-2  Probability

Roll a die, toss a coin, or buy a lottery ticket with your lucky numbers. Outcomes cannot be predicted ahead of time, but can nevertheless be described by a regular pattern, one which emerges only after many repeated trials.

This fact is the foundation for the concept of probability.

U4-3  Probability

We toss a coin and record the proportion of heads that has been observed after each toss. Assuming the coin is fair, the likelihood of observing a Head is the same as that for observing a Tail. There is a 50% chance of either outcome.

Suppose we observe the following sequence of tosses: H T T H T H

We record the proportions 1.0, 0.5, 0.33, 0.5, 0.4, 0.5

U4-4  Probability

The proportions vary quite a bit early on, but in the long-run, we are bound to see proportions very close to 0.5 consistently.

Eventually the proportion gets close to 0.5 and stays there. (Toss a fair coin 100,000 times and you will almost surely observe between 49,000 and 51,000 Heads.)

We say that 0.5 is the probability of observing a Head.

U4-5  Probability

[Figure: the running proportion of Heads plotted against the trial number (1 to 10,000); after early fluctuation the proportion settles near 0.5.]

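A quick way to see this long-run behaviour for yourself is to simulate it. The following is a minimal Python sketch (not part of the original notes; the checkpoints and seed are illustrative) that tosses a fair coin 100,000 times and prints the running proportion of Heads at a few points:

    import random

    random.seed(1)                      # fixed seed so the run is reproducible
    heads = 0
    checkpoints = {10, 100, 1000, 10_000, 100_000}

    for toss in range(1, 100_001):
        heads += random.random() < 0.5  # each toss is a Head with probability 0.5
        if toss in checkpoints:
            print(f"after {toss:>6} tosses: proportion of Heads = {heads / toss:.4f}")

The early proportions wander, but by 100,000 tosses the printed value is very close to 0.5, matching the claim above.
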
U4-6  Probability

We call a phenomenon random if individual outcomes are uncertain, but there is a regular distribution of outcomes in a large number of repetitions.

Note that the term "random" is not the same in statistics as it is in everyday language. We often associate "randomness" with a haphazard or even chaotic event.

U4-7  Probability

This may be due to the fact that we do not observe the phenomenon enough times to observe a long-run pattern, after which regularity would be sure to emerge.

The probability of any outcome of a random phenomenon is the proportion of times the outcome would occur in an infinitely long series of trials.

U4-8  Proportion vs. Probability

What is the difference between proportions and probability?

A proportion is a known or observed value, while a probability is a theoretical value of a proportion after an infinitely long series of trials.

We speak of proportions in the present tense, whereas probability relates to future events.

U4-9  Probability

Probability theory is the branch of mathematics that describes random behaviour. We must deal with mathematical models, because probability itself can never be observed.

We could toss a coin forever, but our understanding of randomness enables us to describe the long-run behaviour.

U4-10  Probability Model

The mathematical model used to describe random behaviour is called a probability model.

A probability model has two components:
- a list of possible outcomes
- a probability for each outcome

U4-11  Sample Space

The sample space S of a random phenomenon is the set of all possible outcomes.

Sample spaces can be very simple. If we toss a coin one time, our sample space is simply S = {H, T}.

They can also be very complicated. Consider the set of all possible combinations of six numbers drawn in Lotto 6/49 from a population of 49 numbers. There are almost 14 million of them!

U4-12  Sample Space

If we toss a coin three times, the sample space is

S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}

Suppose we are only interested in how many Heads are observed. Then our sample space becomes S = {0, 1, 2, 3}.

U4-13  Example

We roll two six-sided dice. The sample space of possible outcomes is

S = {11, 12, 13, 14, 15, 16,
     21, 22, 23, 24, 25, 26,
     31, 32, 33, 34, 35, 36,
     41, 42, 43, 44, 45, 46,
     51, 52, 53, 54, 55, 56,
     61, 62, 63, 64, 65, 66}

U4-14  Example

The first number in each outcome represents the number on the top face of the first die, and the second number represents the number on the top face of the second die. Note that there are 6 × 6 = 36 outcomes.

If we were specifically interested in the sum of the numbers on the two dice, our sample space would be S = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}.

U4-15  Example

We have a fair six-sided die with three faces painted blue, two faces painted red, and one face painted green. We roll the die twice, and the outcome of interest is the colour facing up on each roll. The sample space is

S = {BB, BR, BG, RB, RR, RG, GB, GR, GG}

U4-16  Probabilities of Outcomes

Suppose the sample space for an experiment is

S = {O1, O2, . . . , On}

The probability of outcome Oi is denoted as pi. The probabilities of the outcomes must satisfy the following two conditions:

(i) 0 ≤ pi ≤ 1 for all i = 1, 2, . . ., n
(ii) p1 + p2 + · · · + pn = 1

U4-17  Events

An event is any subset of outcomes in the sample space. For example, when rolling two dice, the event "At Least One 4" can be written as

A = {14, 24, 34, 41, 42, 43, 44, 45, 46, 54, 64}

The event "Sum is 9" can be written as

B = {36, 45, 54, 63}

U4-18  Equal Probabilities

The probability of any event is equal to the sum of the probabilities of the outcomes contained in that event. In particular, if all outcomes in the sample space are equally likely, then the probability of an event A is equal to

P(A) = (# of outcomes in A) / (# of outcomes in S)

U4-19  Example

Since 11 of the 36 possible outcomes in S (the two-dice sample space listed in U4-13) are contained in the event A = {At Least One 4}, it follows that

P(A) = 11/36 = 0.3056

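Because the 36 outcomes are equally likely, this rule can be checked by direct enumeration. A minimal Python sketch (not part of the notes; names are illustrative):

    from itertools import product

    S = list(product(range(1, 7), repeat=2))   # all 36 ordered pairs (die 1, die 2)
    A = [o for o in S if 4 in o]               # outcomes with at least one 4
    print(len(A), len(S), len(A) / len(S))     # 11  36  0.3055...
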

U4-20  Example

Similarly, 4 of the 36 possible outcomes are contained in the event B = {Sum is 9}, so it follows that

P(B) = 4/36 = 0.1111

U4-21  Intersection

The intersection of two events, A ∩ B, consists of the outcomes that are contained in both A and B. The probability of this event, P(A ∩ B), is the probability that both events occur simultaneously.

In the previous example, there are only two outcomes that contain at least one 4 and for which the sum is nine:

A ∩ B = {45, 54}

U4-22  Example

It follows that

P(A ∩ B) = 2/36 = 0.0556

U4-23  Union

The union of two events, A ∪ B, consists of the outcomes that are contained in at least one of the events A or B. The probability of this event, P(A ∪ B), is the probability that either of the two events occurs. Note that this includes the probability that they both occur.

In the previous example, there are 13 outcomes that contain at least one 4 or for which the sum is 9:

A ∪ B = {14, 24, 34, 36, 41, 42, 43, 44, 45, 46, 54, 63, 64}

U4-24  Example

It follows that

P(A ∪ B) = 13/36 = 0.3611

U4-25  Probability of Union

The probability of the union of two events can be calculated as

P(A ∪ B) = P(A) + P(B) – P(A ∩ B)

The reason for this can be seen in the following diagram, known as a Venn diagram.

U4-26  Probability of Union

If we add P(A) + P(B), we have included the probability of the intersection P(A ∩ B) twice, so we must subtract it once.

[Venn diagram: two overlapping circles A and B inside the sample space S.]

U4-27  Probability of Union

We could have used this rule to calculate the probability in the previous example:

P(A ∪ B) = P(A) + P(B) – P(A ∩ B) = 11/36 + 4/36 – 2/36 = 13/36

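The same enumeration idea verifies the union rule for these two events. A short Python sketch (illustrative, not part of the notes):

    from itertools import product

    S = list(product(range(1, 7), repeat=2))
    A = {o for o in S if 4 in o}               # at least one 4
    B = {o for o in S if sum(o) == 9}          # sum is 9

    p = lambda E: len(E) / len(S)
    print(p(A | B))                            # 13/36 = 0.3611...
    print(p(A) + p(B) - p(A & B))              # same value, by inclusion-exclusion
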

U4-28  Mutually Exclusive Events

Two events are mutually exclusive (or disjoint) if they have no outcomes in common:

A ∩ B = ∅

where ∅ is the empty set, a set containing no outcomes, i.e., P(∅) = 0.

It follows that two events are mutually exclusive if and only if P(A ∩ B) = 0.

U4-29  Example

Consider the events

A = {First Die Shows a 1}
B = {Sum is at Least 8}

If the first die shows a 1, the sum can be at most 7, so A and B have no outcomes in common: they are mutually exclusive.

U4-30  Mutually Exclusive Events

We have seen that

P(A ∪ B) = P(A) + P(B) – P(A ∩ B)

So when two events are mutually exclusive,

P(A ∪ B) = P(A) + P(B)

In the previous example,

P(A ∪ B) = P(A) + P(B) = 6/36 + 15/36 = 21/36

U4-31  Exhaustive Events

Two events A and B are exhaustive of the sample space S if, together, they contain all outcomes in S. For example, consider the events

A = {First Die Shows a 5 or a 6}
B = {Sum is Less Than 11}

U4-32  Example

The events A and B are exhaustive, as each outcome in the sample space is covered by at least one of them.

U4-33  Complements

The complement Ac of an event A is the event consisting of all outcomes in the sample space which are not contained in A. It follows that

P(Ac) = 1 – P(A)

In other words, two events are complements if they are both mutually exclusive and exhaustive of the sample space.

U4-34  Example

Consider the events

A = {Exactly One Die Shows an Odd Number}
B = {Sum is Even}

These events are complements of one another, as together they contain all possible outcomes, but they contain no outcomes in common.

U4-35  Example

A = Bc and B = Ac

U4-36  Some Properties of Unions & Intersections

- P(A) = P(A ∩ B) + P(A ∩ Bc)
- P(A ∩ B) = P(B ∩ A)
- P(A ∪ B) = P(B ∪ A)

[Venn diagram: two overlapping circles A and B inside the sample space S.]

U4-37  Some Properties of Unions & Intersections

If an event A is contained within an event B, A ⊂ B, then

- P(A ∩ B) = P(A)
- P(A ∪ B) = P(B)

U4-38  Some Properties of Unions & Intersections

- P(A ∩ Ac) = 0
- P(A ∪ Ac) = 1
- P(A ∩ (B ∩ C)) = P((A ∩ B) ∩ C)
- P(A ∪ (B ∪ C)) = P((A ∪ B) ∪ C)
- P((A ∪ B)c) = P(Ac ∩ Bc)
- P((A ∩ B)c) = P(Ac ∪ Bc)

U4-39  Example

A hardware store accepts both Visa and MasterCard. Suppose that 62% of its customers have a Visa card, 33% have a MasterCard, and 21% have both.

Calculate the probability that a randomly selected customer has
(i) only one of the two cards
(ii) neither card

U4-40  Example

(i) P(only one card) = P(V ∩ Mc) + P(Vc ∩ M)
    = (P(V) – P(V ∩ M)) + (P(M) – P(V ∩ M))
    = (0.62 – 0.21) + (0.33 – 0.21) = 0.41 + 0.12 = 0.53

[Venn diagram: circles V and M inside S, with P(V ∩ Mc) = 0.41, P(V ∩ M) = 0.21 and P(Vc ∩ M) = 0.12.]

U4-41  Example

(ii) P(neither card) = P(Vc ∩ Mc) = P((V ∪ M)c)
     = 1 – P(V ∪ M) = 1 – (P(V) + P(M) – P(V ∩ M))
     = 1 – (0.62 + 0.33 – 0.21) = 1 – 0.74 = 0.26

[Venn diagram: the same circles, with probability 0.26 outside both.]

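Both answers follow directly from the three given percentages; a tiny Python sketch (illustrative variable names):

    p_v, p_m, p_both = 0.62, 0.33, 0.21

    p_only_one = (p_v - p_both) + (p_m - p_both)   # Visa only + MasterCard only
    p_neither  = 1 - (p_v + p_m - p_both)          # complement of the union
    print(p_only_one, p_neither)                   # 0.53  0.26 (up to rounding)
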
U4-42  Conditional Probability

The probability assigned to an event depends on what is known about the experimental situation when the assignment is made. Subsequent to the initial assignment, additional information relevant to the outcome of the experiment may become available. Such information may cause us to revise our probability assignments.

U4-43  Conditional Probability

The conditional probability P(B|A) is the probability that the event B occurs, given that an event A has occurred.

For example, odds-makers have estimated that the probability the Toronto Blue Jays win their next game against the New York Yankees is 0.6. The day before the game, it is announced that the Blue Jays' star pitcher cannot play due to an injury. The probability that Toronto wins, given that their star pitcher is hurt, will be lower than 0.6.

U4-44  Conditional Probability

The conditional probability that an event B occurs, given that an event A has occurred, is

P(B|A) = P(A ∩ B) / P(A)

Equivalently, P(A|B) = P(A ∩ B) / P(B).

[Venn diagram: overlapping circles A and B inside the sample space S.]

U4-45  Example

If we roll two fair dice, what is the probability that the sum is at least ten, given that both dice show an even number?

Let A = {Both Numbers Even} and B = {Sum at Least 10}.

A = {22, 24, 26, 42, 44, 46, 62, 64, 66}
B = {46, 55, 56, 64, 65, 66}
A ∩ B = {46, 64, 66}

So P(B|A) = P(A ∩ B) / P(A) = (3/36) / (9/36) = 1/3.

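The conditional probability can be confirmed by enumerating the 36 outcomes. A minimal Python sketch (illustrative, not part of the notes):

    from itertools import product

    S = list(product(range(1, 7), repeat=2))
    A = {o for o in S if o[0] % 2 == 0 and o[1] % 2 == 0}   # both numbers even
    B = {o for o in S if sum(o) >= 10}                      # sum at least 10

    p_B_given_A = len(A & B) / len(A)    # (3/36) / (9/36) = 1/3
    print(p_B_given_A)
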

U4-46  Example

A box contains 100 Christmas lights, in three colours. Some of the lights are still functioning, and some are burnt out. The following table displays the number of lights in each colour, as well as how many are functioning or burnt out:

              Red   White   Green   Total
Functioning    15      30      15      60
Burnt Out      20      10      10      40
Total          35      40      25     100

U4-47  Example

You randomly select one light from the box. If the bulb is white, what is the probability that it is functioning?

What is the probability that the bulb is red, given that it is burnt out?

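With the table counts stored in a small data structure, both questions reduce to dividing one cell by a row or column total. A sketch in Python (the dictionary layout is just one convenient choice, not from the notes):

    # counts[colour] = (functioning, burnt_out)
    counts = {"red": (15, 20), "white": (30, 10), "green": (15, 10)}

    white_total = sum(counts["white"])                      # 40 white bulbs
    p_func_given_white = counts["white"][0] / white_total   # 30/40 = 0.75

    burnt_total = sum(c[1] for c in counts.values())        # 40 burnt-out bulbs
    p_red_given_burnt = counts["red"][1] / burnt_total      # 20/40 = 0.50

    print(p_func_given_white, p_red_given_burnt)
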
U4-48  Example

From past records, the professor of a large class knows that 15% of students get an A+ on the midterm, 12% get an A+ on the final exam, and 9% get an A+ on both the midterm and the final.

What is the probability that a student gets an A+ on the final, given he got an A+ on the midterm?

P(A+ on final | A+ on midterm) = 0.09 / 0.15 = 0.60

U4-49  Probability of Intersections

It follows from the definition of conditional probability that

P(A ∩ B) = P(A)P(B|A) = P(B)P(A|B)

For example, if we randomly select two bulbs from the box, what is the probability that they are both white?

Let A = {first bulb is white} and B = {second bulb is white}.

P(A ∩ B) = P(A)P(B|A) = (40/100)(39/99) = 0.1576

U4-50  Probability of Intersections

By the definition of conditional probability, P(C|A ∩ B) = P(A ∩ B ∩ C) / P(A ∩ B), and so

P(A ∩ B ∩ C) = P(A ∩ B)P(C|A ∩ B) = P(A)P(B|A)P(C|A ∩ B)

U4-51  Probability of Intersections

More generally, the probability of the intersection of n events A1, A2, . . . , An is calculated as

P(A1 ∩ A2 ∩ · · · ∩ An) = P(A1)P(A2|A1)P(A3|A1 ∩ A2) · · · P(An|A1 ∩ A2 ∩ · · · ∩ An-1)

U4-52  Example

Suppose that you are putting up Christmas lights and one strand has four bulbs that aren't working. To replace them, you randomly select four bulbs from the box. What is the probability that none of the bulbs is burnt out?

Let A1 = {first bulb is functioning}
    A2 = {second bulb is functioning}
    A3 = {third bulb is functioning}
    A4 = {fourth bulb is functioning}

U4-53  Example

Then

P(all four bulbs are functioning) = P(A1 ∩ A2 ∩ A3 ∩ A4)
  = P(A1)P(A2|A1)P(A3|A1 ∩ A2)P(A4|A1 ∩ A2 ∩ A3)
  = (60/100)(59/99)(58/98)(57/97) = 0.1244

Note that the conditional probabilities are changing because we are sampling without replacement. When sampling with replacement, the probabilities remain constant for each draw.

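The chain-rule product can also be checked by simulating draws without replacement from a box of 60 functioning and 40 burnt-out bulbs. A sketch (illustrative; the seed and trial count are arbitrary):

    import random
    from math import prod

    # exact value via the multiplication rule: (60/100)(59/99)(58/98)(57/97)
    exact = prod((60 - k) / (100 - k) for k in range(4))

    # simulation, sampling four bulbs without replacement each trial
    random.seed(2)
    box = ["F"] * 60 + ["B"] * 40
    trials = 100_000
    hits = sum(all(b == "F" for b in random.sample(box, 4)) for _ in range(trials))

    print(exact, hits / trials)    # both close to 0.1244
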
U4-54  Independence

Two events A and B are said to be independent if

P(B|A) = P(B)

In other words, A and B are independent if the knowledge that A has occurred does not change the probability that B will occur. If A and B are independent, we must also have

P(A|B) = P(A)

U4-55  Independence

It follows that A and B are independent if and only if

P(A ∩ B) = P(A)P(B)

since P(A ∩ B) = P(A)P(B|A) and P(B|A) = P(B) for independent events.

In general, if A1, A2, . . . , An are all independent, then

P(A1 ∩ A2 ∩ · · · ∩ An) = P(A1)P(A2) · · · P(An)

U4-56  Example

For the dice experiment, consider the events

A = {first die shows a 4}
B = {sum is 7}
C = {sum is 8}

Are A and B independent? Are A and C independent?

We have

A = {41, 42, 43, 44, 45, 46}    B = {16, 25, 34, 43, 52, 61}    C = {26, 35, 44, 53, 62}
A ∩ B = {43}    A ∩ C = {44}

U4-57  Example

P(A ∩ B) = 1/36
P(A)P(B) = (1/6)(1/6) = 1/36

Since P(A ∩ B) = P(A)P(B), the events A and B are independent.

P(A ∩ C) = 1/36
P(A)P(C) = (1/6)(5/36) = 5/216

Since P(A ∩ C) ≠ P(A)P(C), the events A and C are not independent (i.e., they are dependent).

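A small enumeration confirms both conclusions. A Python sketch (illustrative, not part of the notes):

    from itertools import product

    S = list(product(range(1, 7), repeat=2))
    p = lambda E: len(E) / len(S)

    A = {o for o in S if o[0] == 4}        # first die shows a 4
    B = {o for o in S if sum(o) == 7}      # sum is 7
    C = {o for o in S if sum(o) == 8}      # sum is 8

    print(p(A & B), p(A) * p(B))   # 1/36 and 1/36  -> independent
    print(p(A & C), p(A) * p(C))   # 1/36 vs 5/216  -> dependent
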

U4-58  Example

Recall the Christmas lights example:

              Red   White   Green   Total
Functioning    15      30      15      60
Burnt Out      20      10      10      40
Total          35      40      25     100

Are the event that the selected bulb is red and the event that the bulb is burnt out independent? What about getting a green bulb and a burnt-out bulb?

U4-59  Example

When randomly selecting one light bulb from the box,

P(R ∩ B) = 0.2
P(R)P(B) = (0.35)(0.4) = 0.14

Since P(R ∩ B) ≠ P(R)P(B), R and B are not independent.

P(G ∩ B) = 0.1
P(G)P(B) = (0.25)(0.4) = 0.1

Since P(G ∩ B) = P(G)P(B), G and B are independent.

U4-60  Example

Suppose we have the following facts about people in a certain city:

- 60% read Newspaper A.
- 37% read Newspaper B.
- 58% read Newspaper B or C.
- 14% read Newspapers A and B.
- 18% read Newspapers A and C.
- 9% read Newspapers B and C.
- 3% read all three newspapers.

U4-61  Example

1. We randomly select one person in the city. The outcome of interest is which newspapers they read (if any). List the complete sample space of outcomes.
2. What is P(A ∪ B)?
3. What is P(C)?
4. What is the probability someone reads Newspaper C if we know they read Newspaper A?
5. What is the probability someone reads Newspaper A if we know they don't read Newspaper B?
6. What is the probability a person reads exactly one of the three newspapers?
7. What is the probability a person reads none of the three papers?
8. Are any two of the events A, B and C independent?

U4-62  Example

In some cases, independence is clearly present. For example, suppose we randomly select three people. The outcome of interest is whether each of the three people is left-handed or right-handed. The sample space for this experiment is

S = {LLL, LLR, LRL, RLL, LRR, RLR, RRL, RRR}

Suppose it is known that 10% of people are left-handed.

U4-63  Example

It is clear that whether one person is left-handed is independent of any other person. Therefore, we can find the probability of each outcome in the sample space by multiplying the probabilities for each of the three people. For example, consider the outcome in which the first person selected is left-handed and the next two are right-handed:

P(LRR) = P(L)P(R)P(R) = (0.1)(0.9)(0.9) = 0.081

U4-64  Example

The probabilities for all outcomes are calculated similarly and are shown below:

Outcome   Probability     Outcome   Probability
LLL          0.001        RRL          0.081
LLR          0.009        RLR          0.081
LRL          0.009        LRR          0.081
RLL          0.009        RRR          0.729

U4-65  Example

What is the probability that exactly one of the three people in the sample is left-handed?

Since outcomes are mutually exclusive,

P(one person is left-handed) = P(LRR ∪ RLR ∪ RRL)
  = P(LRR) + P(RLR) + P(RRL) = 3(0.081) = 0.243

U4-66  Calculating Probabilities of Events

In general, to find the probability of an event:

1) List all outcomes contained in the event.
2) Find the probability of each of these outcomes.
3) Add these probabilities.

U4-67  Probability Distribution

Let X be the number of people in our sample that are left-handed.

P(X = 0) = P(RRR) = 0.729
P(X = 1) = P(RRL) + P(RLR) + P(LRR) = 3(0.081) = 0.243
P(X = 2) = P(LLR) + P(LRL) + P(RLL) = 3(0.009) = 0.027
P(X = 3) = P(LLL) = 0.001

The probability distribution of X is shown below:

x          0       1       2       3
P(X = x)   0.729   0.243   0.027   0.001

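The distribution of X can be built mechanically by enumerating the eight outcomes, multiplying the individual probabilities (independence), and grouping by the number of L's. A Python sketch (illustrative):

    from itertools import product

    p_left = 0.1
    dist = {k: 0.0 for k in range(4)}

    for outcome in product("LR", repeat=3):            # LLL, LLR, ..., RRR
        prob = 1.0
        for hand in outcome:                           # independence: multiply
            prob *= p_left if hand == "L" else 1 - p_left
        dist[outcome.count("L")] += prob

    print(dist)   # {0: 0.729, 1: 0.243, 2: 0.027, 3: 0.001} (up to rounding)
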

U4-68  Example

Recall the example where we had a fair six-sided die with three faces painted blue, two faces painted red, and one face painted green. We roll the die twice, and the outcome of interest is the colour facing up on each roll. The sample space is

S = {BB, BR, BG, RB, RR, RG, GB, GR, GG}

What is the probability of getting the same colour on both rolls?

U4-69  Example

P(same colour) = P(BB) + P(RR) + P(GG)
  = P(B)P(B) + P(R)P(R) + P(G)P(G)
  = (3/6)(3/6) + (2/6)(2/6) + (1/6)(1/6) = 14/36 = 7/18

U4-70  Infinite Geometric Series

A series of the form

a + ar + ar^2 + ar^3 + · · · = Σ (n = 0 to ∞) ar^n

is called an infinite geometric series, where a is the first term and r is the common ratio. If –1 < r < 1, it can be shown that the series converges, and

Σ (n = 0 to ∞) ar^n = a / (1 – r)

U4-71  Infinite Geometric Series

For example, consider the series

Σ (n = 0 to ∞) (1/2)^n = 1 + 1/2 + 1/4 + 1/8 + · · ·

This is an infinite geometric series with first term a = 1 and common ratio r = 1/2, and so the sum is 1 / (1 – 1/2) = 2.

U4-72  Infinite Geometric Series

Note that an infinite geometric series doesn't always start with n = 0. For example, consider the series

Σ (n = 3 to ∞) (1/3)(4/5)^n

This is an infinite geometric series with first term a = (1/3)(4/5)^3 and common ratio r = 4/5, and so the sum is

a / (1 – r) = (1/3)(4/5)^3 / (1 – 4/5) = 64/75

U4-73  Example

You repeatedly roll a fair six-sided die. What is the probability it takes an even number of rolls to get the first 6?

Let X be the number of rolls required to get the first 6. Then

P(X is even) = P(X = 2) + P(X = 4) + P(X = 6) + · · ·
  = (5/6)(1/6) + (5/6)^3(1/6) + (5/6)^5(1/6) + · · ·

This is an infinite geometric series with first term a = (5/6)(1/6) = 5/36 and common ratio r = (5/6)^2 = 25/36, and so

P(X is even) = (5/36) / (1 – 25/36) = (5/36) / (11/36) = 5/11

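Both the series value and a simulation agree on 5/11 ≈ 0.4545. A Python sketch (illustrative; the seed and trial count are arbitrary):

    import random

    a, r = (5/6) * (1/6), (5/6) ** 2
    print(a / (1 - r))                       # 5/11 = 0.4545..., the series sum

    random.seed(3)
    trials = 200_000
    even = 0
    for _ in range(trials):
        rolls = 1
        while random.randint(1, 6) != 6:     # roll until the first 6 appears
            rolls += 1
        even += (rolls % 2 == 0)
    print(even / trials)                     # close to 0.4545
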

U4-74  Example

A bird starts at position 2 in the diagram below. Each second, he flies one position to the left with probability 0.53 or one position to the right with probability 0.47. His movements are independent.

[Diagram: positions 0, 1, 2 and 3 in a row, with the cat at position 0 and the cage at position 3.]

The bird's flight will continue until he either returns to his cage (position 3) or he encounters a hungry waiting cat (position 0). What is the probability the bird makes it safely back to his cage?

U4-75  Partitions

Let A1, A2, . . . , An be a partition of a sample space S, so that the events Ai are mutually exclusive and exhaustive of S, i.e.,

S = A1 ∪ A2 ∪ · · · ∪ An   and   Ai ∩ Aj = ∅ for all i ≠ j

U4-76  Law of Total Probability

Then the probability of an event B is equal to

P(B) = P(A1)P(B|A1) + P(A2)P(B|A2) + · · · + P(An)P(B|An)

[Diagram: the sample space S partitioned into A1, A2, A3, A4 and A5, with the event B cutting across the partition.]

U4-77  Example

We randomly select one coloured ball from the first box shown below and place it in the second box. We then randomly select one ball from the second box.

[Figure: two boxes of coloured balls; the contents of the boxes appear only in the image and are not recoverable here.]

U4-78  Law of Total Probability

The probability that a red ball is selected from the second box is found using the Law of Total Probability (the numerical details depend on the box contents shown in the figure).

U4-79  Law of Total Probability

In this case, we partition the sample space according to the ball selected from the first box:

P(R2) = P(G1)P(R2|G1) + P(R1)P(R2|R1) + P(B1)P(R2|B1)

[Diagram: S partitioned into G1, R1 and B1 (the colour drawn from the first box), with the event R2 (a red ball is drawn from the second box) intersecting each piece: G1 ∩ R2, R1 ∩ R2, B1 ∩ R2.]

U4-80  Example

Of all components produced in a large factory, 60% are produced by Machine 1, 30% are produced by Machine 2 and 10% are produced by Machine 3.

It is known that 3% of the components produced by Machine 1 are defective, as are 5% of components produced by Machine 2 and 2% of components produced by Machine 3.

What is the probability that a randomly selected component produced in the factory is defective?

U4-81  Example

The probability that the selected item is defective is

P(D) = P(M1)P(D|M1) + P(M2)P(D|M2) + P(M3)P(D|M3)
     = (0.60)(0.03) + (0.30)(0.05) + (0.10)(0.02)
     = 0.018 + 0.015 + 0.002 = 0.035

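The Law of Total Probability turns this into a weighted sum over the three machines. A Python sketch (illustrative names, not part of the notes):

    p_machine = {1: 0.60, 2: 0.30, 3: 0.10}      # P(M_i)
    p_def     = {1: 0.03, 2: 0.05, 3: 0.02}      # P(D | M_i)

    p_D = sum(p_machine[i] * p_def[i] for i in p_machine)
    print(p_D)    # 0.018 + 0.015 + 0.002 = 0.035
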

U4-82  Law of Total Probability

In this case, we partition the sample space according to the machine on which the item is made.

[Diagram: S partitioned into M1, M2 and M3, with the event D (the component is defective) intersecting each piece: M1 ∩ D, M2 ∩ D, M3 ∩ D.]

U4-83  Bayes' Theorem

You randomly select a component produced in the factory and find that it is defective. What is the probability it was produced by Machine 2?

The equation used to solve this problem is called Bayes' Rule.

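As a worked check of the question above (the slide itself does not show the number), Bayes' Rule, stated formally on the next slide, can be applied as in this Python sketch, reusing the total probability just computed:

    p_machine = {1: 0.60, 2: 0.30, 3: 0.10}
    p_def     = {1: 0.03, 2: 0.05, 3: 0.02}

    p_D = sum(p_machine[i] * p_def[i] for i in p_machine)     # 0.035
    p_M2_given_D = p_machine[2] * p_def[2] / p_D               # Bayes' Rule
    print(p_M2_given_D)    # 0.015 / 0.035 = 0.4286 (approximately)
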
U4-84  Bayes' Theorem

We may want to use the probabilities P(Ai) and P(B|Ai) to find the conditional probabilities

P(A1|B), P(A2|B), . . . , P(An|B)

From the definition of conditional probability and the Law of Total Probability, we calculate these probabilities as follows:

P(Ai|B) = P(Ai ∩ B) / P(B) = P(Ai)P(B|Ai) / [P(A1)P(B|A1) + P(A2)P(B|A2) + · · · + P(An)P(B|An)]

U4-85  Example

Suppose we have the following information about jury trials in a certain U.S. state:

- It is known that 85% of defendants are guilty.
- If a defendant is innocent, the jury finds him guilty 8% of the time and not guilty 92% of the time.
- If a defendant is guilty, the jury finds him guilty 77% of the time and not guilty 23% of the time.

U4-86  Example

If a defendant is found to be guilty by a jury, what is the probability he is actually innocent?

We have the following information:

P(G) = 0.85        P(I) = 0.15
P(JG|I) = 0.08     P(JNG|I) = 0.92
P(JG|G) = 0.77     P(JNG|G) = 0.23

U4-87  Example

By Bayes' Theorem,

P(I|JG) = P(I)P(JG|I) / [P(I)P(JG|I) + P(G)P(JG|G)]
        = (0.15)(0.08) / [(0.15)(0.08) + (0.85)(0.77)]
        = 0.012 / 0.6665 = 0.0180

Despite the fact that the jury is wrong 8% of the time when the defendant is innocent and 23% of the time when the defendant is guilty, less than 2% of convicted defendants are actually innocent!

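The same calculation in code, as a small Python sketch (illustrative names):

    p_G, p_I = 0.85, 0.15                      # prior: guilty vs innocent
    p_JG_given_I, p_JG_given_G = 0.08, 0.77    # jury returns a guilty verdict

    p_JG = p_I * p_JG_given_I + p_G * p_JG_given_G    # total probability of a guilty verdict
    p_I_given_JG = p_I * p_JG_given_I / p_JG          # Bayes' Theorem
    print(p_I_given_JG)    # 0.012 / 0.6665 = 0.018 (about 1.8%)
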

U4-88  Application: System Reliability

Reliability analysis is an important component of engineering work. The reliability of a product is the probability of the product fulfilling its intended function during its life cycle.

Suppose we have a system consisting of several components. The reliability of a component, denoted by r, is the probability that the component is successful in performing its intended task. The probability of the component failing is therefore equal to 1 – r.

U4-89  Application: System Reliability

Consider the following system of components. The system operates successfully only if it is possible to progress from one side of the diagram to the other through components that have not failed.

[Diagram: a network of twelve components with reliabilities r1 through r12, arranged in a mix of series and parallel paths.]

U4-90  Application: System Reliability

If the reliabilities ri of the components are known, then, assuming the components operate independently, we can calculate the reliability rS of the entire system.

Let Ci be the event that the ith component functions and let R be the event that the entire system functions. Then

P(Ci) = ri   and   P(R) = rS

U4-91  Application: System Reliability

The components in a system can be configured in either of two ways. If the components are placed in series, then each component must function for the entire system to function.

[Diagram: components r1, r2, r3, . . ., rn connected in series.]

U4-92  Application: System Reliability

Since components function independently, when they are placed in series, the reliability of the system is

rS = r1 × r2 × · · · × rn

Consider the following system:

[Diagram: four components in series with reliabilities 0.99, 0.90, 0.98, 0.95.]

rS = (0.99)(0.90)(0.98)(0.95) = 0.8295

U4-93  Application: System Reliability

If the components in a system are placed in parallel, then the system will function if at least one of the components functions.

[Diagram: components r1, r2, r3, . . ., rn connected in parallel.]

U4-94  Application: System Reliability

When components are placed in parallel, the reliability of the system is

rS = 1 – (1 – r1)(1 – r2) · · · (1 – rn)

Since we only need one of the components to function, the probability that the system functions is one minus the probability that the components all fail.

U4-95  Application: System Reliability

Consider the following system:

[Diagram: three components in parallel with reliabilities 0.80, 0.90, 0.85.]

rS = 1 – (1 – 0.80)(1 – 0.90)(1 – 0.85) = 1 – (0.20)(0.10)(0.15) = 1 – 0.003 = 0.997

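The two rules are easy to capture as helper functions, assuming independent components. A Python sketch (illustrative, not part of the notes):

    from math import prod

    def series(reliabilities):
        # every component must work
        return prod(reliabilities)

    def parallel(reliabilities):
        # the system fails only if every component fails
        return 1 - prod(1 - r for r in reliabilities)

    print(series([0.99, 0.90, 0.98, 0.95]))   # 0.8295
    print(parallel([0.80, 0.90, 0.85]))       # 0.997
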
U4-96  Example

A complex system may consist of components placed both in series and in parallel.

[Diagram: a network of twelve components with reliabilities 0.97, 0.98, 0.96, 0.99, 0.95, 0.92, 0.95, 0.98, 0.90, 0.95, 0.99 and 0.94, arranged in a mix of series and parallel paths.]

U4-97  Example

The reliability of a complex system can be calculated by decomposing the system into a series of simpler modules.

The reliabilities of each module can be calculated first, and then the system reliability can be calculated by combining the modules in the appropriate manner.

U4-98  Example

Let Mi be the event that a module functions.

[Diagram: the system from U4-96 with its components grouped into Modules 1 through 8.]

U4-99  Example

For this system,

rS = P(M1)P(M6)P(M7)P(M8)

U4-100  Example

We know P(M1) = 0.95 and P(M8) = 0.98.

U4-101  Example

Now we find P(M6) by first calculating P(M2), P(M3), P(M4) and P(M5).

U4-102  Example

P(M2) = P(C3 ∩ C4) = r3 × r4 = (0.97)(0.98) = 0.9506

[Diagram: components C3 and C4 in series.]

U4-103  Example

P(M3) = P(M2 ∪ C5) = 1 – (1 – P(M2))(1 – P(C5))
      = 1 – (1 – P(M2))(1 – r5)
      = 1 – (1 – 0.9506)(1 – 0.92) = 0.996048

[Diagram: Module 2 (components C3 and C4) in parallel with component C5.]

U4-104  Example

P(M4) = P(C2 ∩ M3) = r2 P(M3) = (0.99)(0.996048) = 0.986088

[Diagram: component C2 in series with Module 3.]

U4-105  Example

P(M5) = P(C6 ∩ C7 ∩ C8) = r6 × r7 × r8 = (0.90)(0.95)(0.99) = 0.84645

[Diagram: components C6, C7 and C8 in series, with reliabilities 0.90, 0.95 and 0.99.]

U4-106  Example

Now,

P(M6) = P(M4 ∪ M5) = 1 – (1 – P(M4))(1 – P(M5))
      = 1 – (1 – 0.986088)(1 – 0.84645) = 1 – 0.002136 = 0.997864

U4-107  Example

P(M7) = P(C9 ∪ C10 ∪ C11) = 1 – (1 – r9)(1 – r10)(1 – r11)
      = 1 – (1 – 0.96)(1 – 0.95)(1 – 0.94)
      = 1 – 0.00012 = 0.99988

[Diagram: components C9, C10 and C11 in parallel.]

U4-108  Example

So

rS = P(M1)P(M6)P(M7)P(M8)
   = (0.95)(0.997864)(0.99988)(0.98) = 0.9289

[Diagram: the full system with Modules 1 through 8.]

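The whole decomposition can be reproduced with a few nested series/parallel calculations of the kind given in U4-94. A Python sketch that checks rS ≈ 0.9289 (variable names are illustrative):

    from math import prod

    series   = lambda rs: prod(rs)
    parallel = lambda rs: 1 - prod(1 - r for r in rs)

    M2 = series([0.97, 0.98])            # C3, C4 in series
    M3 = parallel([M2, 0.92])            # Module 2 in parallel with C5
    M4 = series([0.99, M3])              # C2 in series with Module 3
    M5 = series([0.90, 0.95, 0.99])      # C6, C7, C8 in series
    M6 = parallel([M4, M5])              # Module 4 in parallel with Module 5
    M7 = parallel([0.96, 0.95, 0.94])    # C9, C10, C11 in parallel
    rS = series([0.95, M6, M7, 0.98])    # Modules 1, 6, 7, 8 in series
    print(rS)                            # about 0.9289
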
