HL Mathematics – Probability Notes
1 Prior Knowledge - Terms, symbols and notation
Probability theory is the study of the chance (or likelihood) of events happening.
Consider this: You throw a six-sided die and you want to find the probability of getting a 4.
You do this thirty times and record the number on the top face of the die each time.

Experiment What you are doing; e.g. throwing a die.

Number of Trials (n) How many times you do the experiment; e.g. n = 30.

Outcome What happens each time; e.g. the number on the top face of the die.

Equally Likely All possible outcomes have the same chance of happening.

Fair An experiment is fair if all possible outcomes are equally likely.

Sample Space (U) All the possible outcomes; e.g. U = {1, 2, 3, 4, 5, 6}.

n(U) The total number of all possible outcomes; e.g. n(U) = 6.

Event (E) The “thing” we want to happen; e.g. getting a 4.

n(E) The total number of “things” we want to happen; e.g. n(E) = 1.

Probability P(E) = n(E) / n(U); e.g. P(4) = 1/6.

Expected Number Expected Number = P(E) × n;
e.g. How many 4's would you expect to get in 30 throws?
∴ Expected Number = P(4) × 30 = (1/6) × 30 = 5.
Theoretical Probability In theory, using arguments of symmetry, what the probability should be; e.g. P(4) = 1/6.

Experimental Probability What you actually get when you do the experiment a number of times; e.g. If you threw the die 30 times and 4 came up 7 times, then P(4) = 7/30.
NOTE: Experimental probability = relative frequency.
Frequency The number of times that an outcome is observed;
e.g. If 4 came up 7 times the frequency = 7.

Relative Frequency The frequency of that outcome expressed as a fraction or a percentage of the total number of trials; e.g. If "4" came up 7 times, relative frequency = 7/30.
Long-run Relative Frequency In practice, many probabilities can only be determined by observation of a large number of repeated trials, called long-run relative frequency.

Random Variable A variable whose value is a number determined by the outcome(s) of an experiment. We distinguish between the random variable X and one of its possible values, x.

Discrete Random Variables Take on some value as the result of a counting process. The question "How many?" is being asked. We can find the probability that a discrete random variable is equal to a particular value.
e.g. X = The number of days with rain in a year.

Continuous Random Variables Are determined by a measurement of some kind. Typical questions being asked are "How long?", "How heavy?" or "How much (capacity)?". All that can be predicted about the value of a continuous random variable is that it lies between two values or within an interval.
e.g. X = The length of time a candle burns for.

1.1 An interpretation of different probabilities


The probability scale runs from 0 (impossible) to 1 (certain):

0 – impossible
close to 0 – very unlikely to happen
below 0.5 – not likely to happen
0.5 – equal chance of happening as not happening
above 0.5 – likely to happen
close to 1 – very likely to happen
1 – certain

2 Probability functions
Every random variable has an associated probability function, usually shown as a formula or
in a table.
For a discrete random variable, X say, we would have:

Values of x:   x1, x2, ..., xn−1, xn
P(X = x):      P(x1), P(x2), ..., P(xn−1), P(xn)

Example: Let X be the random variable representing the number of heads obtained when a
fair coin is tossed twice.
For this situation, U = {TT, TH, HT, HH}, and the probability distribution can be represented in the following table:

Values of x:   0, 1, 2
P(X = x):      0.25, 0.5, 0.25

Note 1: The bottom row of the table should always add to 1.


Note 2: Continuous probability density functions will be covered later in the Normal
Distribution section of this topic and in the Integration topic covered later in this course.
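If you want to check a table like this on a computer rather than by hand, the short Python sketch below (an addition, not part of the original notes) lists the sample space for two tosses of a fair coin and rebuilds the table above.

from itertools import product
from collections import Counter

sample_space = list(product("HT", repeat=2))        # U = {HH, HT, TH, TT}
counts = Counter(o.count("H") for o in sample_space)

probabilities = {x: counts[x] / len(sample_space) for x in sorted(counts)}
print(probabilities)                   # {0: 0.25, 1: 0.5, 2: 0.25}
print(sum(probabilities.values()))     # 1.0 -- the bottom row always adds to 1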


3 Complementary events
Definition: Two events are called complementary if they have no outcomes in common and their union is the sample space.
Given an event, A , and its complementary event, A' , we have the following:

• A ∪ A' = U , the sample space or universal set.

• P( A) + P( A ' ) = 1 OR P( A) = 1 − P( A' ) .

Example: The probability that it will rain tomorrow is 0.35.


Find the probability that it will not rain tomorrow.

Solution: Let R be the random variable that it will rain tomorrow.

P( R' ) = 1 − P(R)
= 1 − 0.35
= 0.65

4 Independent events
Definition: Two events A and B are independent if :

P( A ∩ B) = P( A) . P(B)

In practice, it means that the first event A should have no effect on the outcome of the
second event B .

Example: The probability that a car chosen at random is exceeding 50 km h-1 is 0.3.
The probability that a car chosen at random is without a current warrant of
fitness is 0.05. Find the probability that a car chosen at random is both
exceeding 50 km h-1 and does not have a current warrant of fitness.
Solution: Let S be the event that the car is exceeding 50 km h-1 and W' the event that it does not have a current warrant of fitness. Since the events are independent:

P(S ∩ W') = P(S) × P(W')
= 0.3 × 0.05
= 0.015

5 Mutually exclusive events


Definition: Two events A and B are mutually exclusive or disjoint if :

P( A ∪ B) = P( A) + P(B)

In practice, it means that the first event A should have nothing in common with the
second event B .


Note 1: Mutually exclusive means that A ∩ B = ∅, or equivalently P(A ∩ B) = 0.

Note 2: ∅ is the symbol for the empty set.

Example: The probability that a bank receives its daily computer printouts for a
previous day's transactions before 9:00 am is 0.12. The probability that it
receives it after 5:00 pm is 0.35. If office hours are from 9:00 am to 5:00 pm,
find the probability that, on a day chosen at random, the printout arrives
outside of office hours.

Solution: Let T be the time at which the printout arrives.

P(T < 9:00 am ∪ T > 5:00 pm) = P(T < 9:00 am) + P(T > 5:00 pm)
= 0.12 + 0.35
= 0.47

6 Using Venn diagrams for probabilities

Definition: If A and B are not disjoint, then:

P( A ∪ B) = P( A) + P(B) − P( A ∩ B)

Note 1: We subtract the intersection to avoid adding it in twice.


Note 2: The ∪ means or so P( A ∪ B) means P( A or B) .
The outcome can lie in either A or B or both.
Note 3: The ∩ means and so P( A ∩ B) means P( A and B) .
The outcome must lie in both A and B.

Example: The probability of John passing the Statistics Test is 0.74.


The probability that he passes the Calculus Test is 0.65.
The probability that he passes both tests is 0.57.
Find the probability that he fails both tests.

Solution: Let S be the event that John passes the Statistics Test and C the event that he passes the Calculus Test. Draw a Venn diagram showing all this information first.

P(fails both tests) = P((S ∪ C)')
= 1 − P(S ∪ C)
= 1 − (0.74 + 0.65 − 0.57)
= 0.18

Note 1: 0.17 + 0.57 = 0.74 and 0.57 + 0.08 = 0.65

Note 2: 0.17 + 0.57 + 0.08 + 0.18 = 1
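Since the Venn diagram itself is not reproduced here, the following short Python sketch (an addition, not from the notes) checks the arithmetic behind the four regions quoted above, with S = passes Statistics and C = passes Calculus.

p_s, p_c, p_both = 0.74, 0.65, 0.57

only_s = p_s - p_both            # region in S only
only_c = p_c - p_both            # region in C only
p_union = p_s + p_c - p_both     # P(S ∪ C)
fails_both = 1 - p_union         # region outside both circles

print(round(only_s, 2), round(p_both, 2), round(only_c, 2), round(fails_both, 2))  # 0.17 0.57 0.08 0.18
print(round(only_s + p_both + only_c + fails_both, 10))                            # 1.0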


7 Probability Trees
• The tree branches from left to right.
• To find probabilities of events shown on the right, we multiply probabilities along the
branches leading to them.
• At each stage, probabilities must add vertically to 1.

Example 1: A basketball team plays 2 out of 5 games at home. If they play at home, the
probability of them winning is 0.7. If they are playing away, their chances of
winning are 0.5.
Find the probability that they win their game if it is not known whether the
game is going to be played at home or away.

Solution: Draw a probability tree for this situation.

Therefore, P(win) = (0.4)(0.7) + (0.6)(0.5) = 0.28 + 0.30 = 0.58

Example 2: Roman and Jan play darts. They have alternate throws at the dart board.
The first person to hit 20 wins the game.
The probability of Roman hitting 20 on a single throw is 0.12.
The probability of Jan hitting 20 on a single throw is 0.16.
If Roman starts the game, find the probability that he wins.
Solution:


P( R wins) = P( R wins throw 1) + P( R wins throw 3) + P(R wins throw 5) + ... ...
= 0.12 + ( 0.88)(0.84)(0.12) + (0.88)(0.84)( 0.88)( 0.84)(0.12) + ... ...

This is an infinite geometric series where u1 = 0.12 and r = (0.88)(0.84).

S∞ = u1 / (1 − r)
= 0.12 / (1 − (0.88)(0.84))
= 0.460

Therefore, the probability that Roman wins the darts game is 0.460.
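As a quick numerical check (an illustrative addition, not part of the notes), the Python sketch below compares the closed-form sum with a partial sum of the same series.

p_roman, p_jan = 0.12, 0.16
r = (1 - p_roman) * (1 - p_jan)     # both players miss on a pair of throws: 0.88 * 0.84

s_infinity = p_roman / (1 - r)                        # sum of the infinite geometric series
partial = sum(p_roman * r**k for k in range(200))     # first 200 terms, effectively converged

print(round(s_infinity, 3))   # 0.46
print(round(partial, 3))      # 0.46 -- agrees with the closed form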

8 Conditional Probability
Some events cannot occur unless another event has occurred first. Such probabilities are
called conditional.

The probability of event A occurring is influenced by whether event B has occurred first.

The occurrence of event B may make event A certain, more likely, less likely or
impossible to happen.

The conditional probability that event A occurs, given that event B has occurred is
written P( A | B)

The probabilities in the branches of a probability tree are conditional ones.

Multiplying along the branches of the probability tree gives us the following formula.

P(B) · P(A | B) = P(A ∩ B)

P(A | B) = P(A ∩ B) / P(B)

Note: If A and B are independent events then P( A | B) = P(A) = P(A | B')


Example: A pensioner goes to an ex-servicemen's club. The probability that he plays a game of snooker is 0.35, the probability that he plays darts is 0.65 and the probability that he plays both is 0.20.

Find the probability that if he plays a game of darts, he also plays a game of
snooker.

Solution: We are given P( S) = 0.35 , P( D) = 0.65 and P (S ∩ D ) = 0.2 .

P(S | D) = P(S ∩ D) / P(D)
= 0.20 / 0.65
= 0.308
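The same calculation, written as a short Python sketch (an addition, not from the notes), also shows that these two events are not independent.

p_s, p_d, p_s_and_d = 0.35, 0.65, 0.20

p_s_given_d = p_s_and_d / p_d
print(round(p_s_given_d, 3))             # 0.308

# Not independent, since P(S) * P(D) is not equal to P(S ∩ D):
print(round(p_s * p_d, 4), p_s_and_d)    # 0.2275 vs 0.2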

9 Parameters for Discrete Random Variables


9.1 Expectation
The expectation of a random variable can be thought of as a theoretical average value that
the random variable can take; i.e. a long-run average if the experiment was repeated a large
number of times.
We write the expected value of X as E(X) . It is also called the mean of X or the
population mean. Another symbol for E(X) is μ .

Example 1: When a game of bridge is played (four players with 52 cards) and X is the number of hearts in a player's hand, we have:

E(X) = 13/4 = 3.25          Since there are 13 hearts in a normal pack of cards.

This is the theoretical average value of the number of hearts in each player's hand.

For a discrete random variable, X say, we would have:

Values of x:   x1, x2, ..., xn−1, xn
P(X = x):      P(x1), P(x2), ..., P(xn−1), P(xn)

Hence, E(X) can be defined more formally as:

E(X) = x1·P(x1) + x2·P(x2) + ... + xn−1·P(xn−1) + xn·P(xn)

OR

E(X) = ∑ xi·P(xi), summing from i = 1 to n


Example 2: A fair six-sided die is tossed once. Show that the expectation of X , the number
shown on the top face of the die, is 3.5 .

Solution: The probability distribution for a fair, six-sided die is given by:

Values of x:   1, 2, 3, 4, 5, 6
P(X = x):      1/6, 1/6, 1/6, 1/6, 1/6, 1/6

E(X) = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6)
= 21/6
= 3.5
Note: We can deduce that:
E( a X ) = a E (X )
E( X + b) = E( X ) + b
E(a X + b) = a E ( X ) + b
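The sketch below (a Python addition, not part of the notes; the constants a and b are illustrative) computes E(X) for the fair die from the definition and checks the linearity rules quoted above.

from fractions import Fraction

dist = {x: Fraction(1, 6) for x in range(1, 7)}      # P(X = x) = 1/6 for each face

def expectation(d):
    return sum(x * p for x, p in d.items())

e_x = expectation(dist)
print(e_x, float(e_x))                # 7/2 = 3.5

a, b = 2, 5                           # illustrative constants
e_ax_b = expectation({a * x + b: p for x, p in dist.items()})
print(e_ax_b == a * e_x + b)          # True: E(aX + b) = a E(X) + b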

9.2 Variance and Standard Deviation


The variance, Var(X) = σ², and/or standard deviation, σ, of a random variable provides
information about how outcomes are spread out from one experiment to another.

Remember: standard deviation = √ variance .

Var(X) can be defined more formally as the expected squared distance from its mean; i.e.
it averages these distances out. It can be found using the following equivalent formulae:

Var(X) = E[(X − μ)²]          OR          Var(X) = E(X²) − μ²

Example 3: A single six-sided fair die is tossed once. Find the variance of the number of
dots shown on the top face of the die.

Solution: Using the table shown above, we have:

Var(X) = 1²(1/6) + 2²(1/6) + 3²(1/6) + 4²(1/6) + 5²(1/6) + 6²(1/6) − 3.5²
= 2.92

Note: We can also deduce that:


Var(aX) = a² Var(X)
Var(aX + b) = a² Var(X)
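A quick numerical check (a Python addition, not from the notes; a and b are illustrative constants) of the formula Var(X) = E(X²) − μ² for the fair die, and of the scaling rule above.

from fractions import Fraction

dist = {x: Fraction(1, 6) for x in range(1, 7)}
mu = sum(x * p for x, p in dist.items())                       # 7/2
var = sum(x**2 * p for x, p in dist.items()) - mu**2           # 35/12

print(float(var))                                              # 2.9166... ~ 2.92
print(float(var) ** 0.5)                                       # standard deviation ~ 1.71

a, b = 3, 4                                                    # illustrative constants
var_axb = sum((a*x + b)**2 * p for x, p in dist.items()) - (a*mu + b)**2
print(var_axb == a**2 * var)                                   # True: Var(aX + b) = a^2 Var(X)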


10 The Binomial Distribution


10.1 The Binomial General Formula
An experiment in which there are only two outcomes is called a Bernoulli trial;
e.g. tossing a coin. If the experiment is repeated several times, and the probability of
"success" at each trial remains the same, and each trial is independent, then the total number
of "successes" obtained over all the trials has a binomial distribution.
Consider: A spinner is made of green and red cardboard in such a way that P(G) = 1/3 and P(R) = 2/3.
Example: Find the probability of getting in three spins:
(a) Two greens and a red in that order.
(b) Two greens and a red in any order.

Solutions:

(a) P(GGR) = (1/3)² × (2/3) = 2/27

(b) P(RGG or GRG or GGR) = 3 × (1/3)² × (2/3) = 2/9

Extending this further, what would be the probability of getting 6 greens and 4 reds from 10 spins?

Answer = ?? × (1/3)⁶ × (2/3)⁴, where ?? is the number of ways of getting 6 greens out of 10 spins.

Therefore, Answer = 10C6 × (1/3)⁶ × (2/3)⁴.

Hence, the probability of getting r greens out of n spins will be given by

Answer = nCr × (1/3)^r × (2/3)^(n − r).
Therefore, the general formula is

P(X = x) = nCx · p^x · q^(n − x)
where X is the binomial random variable representing the number of successes,
n is the number of trials, p is the probability of success in an individual trial and
q is the probability of failure; that is, q = 1 − p .
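The general formula translates directly into a few lines of Python (a sketch added here, not part of the notes; it uses only the standard library), checked against the spinner example above.

from math import comb

def binom_pmf(x, n, p):
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

# Two greens (p = 1/3) in three spins, in any order -- matches 2/9 above:
print(round(binom_pmf(2, 3, 1/3), 4))        # 0.2222

# Six greens out of ten spins:
print(binom_pmf(6, 10, 1/3))                 # 10C6 * (1/3)^6 * (2/3)^4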


10.2 Conditions for a Binomial Distribution


1. There is a fixed number of trials.

2. The trials are independent.

3. Each trial has only two outcomes; i.e. success or failure.

4. The probability of success, p , at each trial must be the same.

Example 1: A traffic officer checks 5 cars in succession. He knows from experience that the probability of a car not having a warrant of fitness is 1/6.

Find the probability that three of the cars don't have a warrant of fitness.

Solution: We have n = 5, p = 1/6 and q = 5/6.

P(X = 3) = 5C3 × (1/6)³ × (5/6)²
= 10 × (1/216) × (25/36)
= 125/3888
= 0.0322

Or, using your GDC:

1. Press 2nd vars which selects the "dist" function.

2. Scroll down and select "A:binompdf" (Note: p for an exact probability).

3. Enter the following data: trials: 5


P: 1/6
x value: 3

Scroll down to paste then press enter .

On the screen will appear "binompdf(5, 1/6, 3)".

4. Press enter again to get the answer = 0.0322.


Example 2: The organisers of the Singapore F1 Grand Prix know that 35% of racing cars
develop some sort of engine trouble during a race. If there are 15 cars in the
race, find the probability that at least 2 cars develop engine trouble during the
race.

Solution: We have n = 15 , p = 0.35 and q = 0.65 .

P(X ≥ 2) = 1 − P(X ≤ 1)
= 1 − (P(X = 0) + P(X = 1))
= 1 − (15C0 × (0.35)⁰ × (0.65)¹⁵ + 15C1 × (0.35)¹ × (0.65)¹⁴)
= 1 − 0.0141787846
= 0.986

Or, using your GDC:

1. Type 1 −, then press 2nd vars which selects the "dist" function.

2. Scroll down and select "B:binomcdf" (Note: c for cumulative probability).

3. Enter the following data: trials: 15


P: 0.35
x value: 1

Scroll down to paste then press enter .

On the screen will appear "binomcdf(15, 0.35, 1)".

4. Press enter again to get the answer = 0.9858212154
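For comparison, the same cumulative calculation can be reproduced without a GDC; the short Python sketch below (an addition, not from the notes) sums the two binomial terms directly.

from math import comb

def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 15, 0.35
p_at_most_1 = binom_pmf(0, n, p) + binom_pmf(1, n, p)
print(round(1 - p_at_most_1, 3))     # 0.986 -- P(X >= 2)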

10.3 Mean, Variance and Standard Deviation of a Binomial Distribution


The following formula can be found in your formulae booklet.

• Mean = E(X) = μ = np.

• Variance = σ² = npq.

• Standard Deviation = σ = √(npq).

Note: We sometimes write X ∼B( n , p) to represent the random variable, X , of a


binomial distribution with parameters n and p .
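As a small illustration (an added Python sketch, not part of the notes), these formulae applied to the Grand Prix example, X ∼ B(15, 0.35):

from math import sqrt

n, p = 15, 0.35
q = 1 - p
mean = n * p                 # 5.25
variance = n * p * q         # 3.4125
sd = sqrt(n * p * q)         # ~1.847

print(mean, variance, round(sd, 3))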


11 The Normal Distribution


11.1 The Probability Density Function
If X is normally distributed then its probability density function is given by:
f(x) = 1/(σ√(2π)) · e^(−½((x − μ)/σ)²)

where μ is the mean and σ² is the variance of the distribution.

This function represents a family of bell-shaped curves. We sometimes write


X ∼ N(μ, σ²), where each bell-shaped curve is specified by the parameters μ and σ².
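The density formula can be typed out directly and compared with the standard library's NormalDist (a Python sketch added here, assuming Python 3.8+; the bolt parameters used below come from the example later in this section).

from math import exp, pi, sqrt
from statistics import NormalDist

def normal_pdf(x, mu, sigma):
    return (1 / (sigma * sqrt(2 * pi))) * exp(-0.5 * ((x - mu) / sigma) ** 2)

mu, sigma = 4, 0.05          # the bolt example that follows
x = 3.95
print(normal_pdf(x, mu, sigma))
print(NormalDist(mu, sigma).pdf(x))   # same value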

11.2 Areas and Probabilities


The area shaded under the bell-shaped curve represents the probability that X lies between
two values, say a and b; that is, a ≤ X ≤ b .

Note: The total area under the curve is equal to 1.

For a normal distribution with mean μ and standard deviation σ , the proportional
breakdown of where the random variable could lie is given below.

P(μ − σ ≤ X ≤ μ + σ) = 0.6826          P(μ − 2σ ≤ X ≤ μ + 2σ) = 0.9544          P(μ − 3σ ≤ X ≤ μ + 3σ) = 0.9974

Example: The lengths of bolts made by a company are normally distributed with a
mean of 4 cm and a standard deviation of 0.05 cm.

(a) Find the probability that a bolt chosen at random will be between 3.9 cm and 4 cm.

(b) Find the probability that a bolt chosen at random will be less than 3.9 cm.

(c) Find the probability that a bolt chosen at random will be greater than 4.05 cm.


Solutions 1: (In a non-calculator paper, using what we know about areas and the number
of standard deviation from the mean.)

(a)

We can deduce that 3.9 cm is 2 × 0.05 = 2 σ from the mean.

P(3.9 < X < 4) = 0.9544 ÷ 2
= 0.4772

(b)

P ( X < 3.9) = 0.5 − (0.9544÷2)


= 0.0228

(c)

We can deduce that 4.05 cm is 1 × 0.05 = 1 σ from the mean.

P ( X > 4.05) = 0.5 − (0.6826÷2)


= 0.1587

Note: You should always draw a quick sketch and shade the area you are trying to find.

Solutions 2: (In a calculator paper, using your GDC.)

We have μ = 4 and σ = 0.05.

(a) For P(3.9 < X < 4), use normalcdf(3.9, 4, 4, 0.05) = 0.477

(b) For P(X < 3.9), use normalcdf(−1E99, 3.9, 4, 0.05) = 0.0228

(c) For P(X > 4.05), use normalcdf(4.05, 1E99, 4, 0.05) = 0.159
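The same three probabilities can also be found with Python's statistics.NormalDist (a sketch added to these notes; the cdf method plays the same role as normalcdf on the GDC).

from statistics import NormalDist

bolts = NormalDist(mu=4, sigma=0.05)

print(round(bolts.cdf(4) - bolts.cdf(3.9), 4))   # (a) P(3.9 < X < 4)  ~ 0.4772
print(round(bolts.cdf(3.9), 4))                  # (b) P(X < 3.9)      ~ 0.0228
print(round(1 - bolts.cdf(4.05), 4))             # (c) P(X > 4.05)     ~ 0.1587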


11.3 Standard Normal Distribution


What happens when we require values other than exactly 1, 2 or 3 standard deviations?

Consider the following example:

A product has a normally distributed weight, with a mean 800 g and a standard
deviation of 100 g. Find the probability that the weight of a randomly chosen item
is between 800 g and 950 g.

If x represents the weight of a randomly chosen item, then, if x = 950, it is 950 − 800 = 150 g from the mean, which is 150/100 = 1.5 standard deviations from the mean. If we let z equal this value, we get the following formula:

z = (x − μ) / σ

This is called the z score. If X is normally distributed, then z is said to have a


standard normal distribution where μ = 0 and σ = 1 .

Therefore, to do the problem above, we need to use the standard normal distribution
and the z score.

z = (x − μ) / σ
= (950 − 800) / 100
= 1.5

P(800 < X < 950) = P(0 < Z < 1.5)
= 0.433          Using GDC with normalcdf(0, 1.5, 0, 1)

Note: Using your GDC with μ = 800 and σ = 100, normalcdf(800, 950, 800, 100) will also give 0.433.
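A quick check (an added sketch, assuming Python 3.8+) that standardising gives the same probability: P(800 < X < 950) with X ∼ N(800, 100²) equals P(0 < Z < 1.5) for a standard normal Z.

from statistics import NormalDist

weights = NormalDist(mu=800, sigma=100)
z = NormalDist()                     # standard normal: mu = 0, sigma = 1

print(round(weights.cdf(950) - weights.cdf(800), 4))   # 0.4332
print(round(z.cdf(1.5) - z.cdf(0), 4))                 # 0.4332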


11.4 Inverse Normal Function


This is the inverse of finding probabilities. To do this, we use the inverse normal function
on our GDCs.

Note 1: The invNorm( function on your GDC uses the area from − ∞ to a in order to
find the value of a .
This is the white area in the diagram below.

Note 2: Sometimes we need to convert to z scores to find the answers. When trying to find
the value of μ or the value of σ , we always have to use z scores.

Example 1: Find the value of a if P (Z > a ) = 0.8 .

Solution: The diagram for this situation is a standard normal curve with the area to the right of a equal to 0.8.

Therefore, a must be negative.

P ( Z > a ) = 0.8
Therefore, P ( Z ≤ a ) = 0.2 Using GDC with invNorm(0.2 , 0 , 1) which is the
white area in the diagram above.
Therefore, a = − 0.842 If we used invNorm(0.8 , 0 , 1), we would get + 0.842.

Example 2: A university professor determines that no more than 75% of this year's
History students should pass the final examination. The examination results
were approximately normally distributed with mean 62 and standard
deviation 13. Find the lowest score, k , necessary to pass the test.

Solution: Let X denote the final examination result.

We have μ = 62 and σ = 13 .

The diagram for this situation is a normal curve with the area to the right of k equal to 0.75.

P ( X ≥ k ) = 0.75
Hence, P ( X < k ) = 0.25
Therefore, k = 53.23163... Using GDC with invNorm(0.25 , 62 , 13)

Hence, the lowest score to pass the test is 54%.
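The inverse-normal step above can be reproduced with NormalDist.inv_cdf, the standard-library analogue of the GDC's invNorm (a sketch added to the notes).

from statistics import NormalDist
from math import ceil

scores = NormalDist(mu=62, sigma=13)
k = scores.inv_cdf(0.25)             # the value with P(X < k) = 0.25
print(k)                             # ~53.23
print(ceil(k))                       # 54 -- the lowest whole-number passing score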


Example 3: An adult scallop population is known to be normally distributed with a


standard deviation of 5.9 g. If 15% of scallops weigh less than 58.2 g,
find the mean weight of the population.

Solution: Let the mean weight of the population be μ g.


Let X be the weight of an adult scallop where σ = 5.9 .

The diagram for this situation is a normal curve with the area to the left of 58.2 equal to 0.15.

P(X ≤ 58.2) = 0.15

Therefore, P(Z < (58.2 − μ)/5.9) = 0.15          Converting to a z score using z = (x − μ)/σ.

(58.2 − μ)/5.9 = −1.03643...          Using GDC with invNorm(0.15, 0, 1)
58.2 − μ = −6.11495...
Therefore, μ = 64.3

Therefore, the mean weight of an adult scallop is 64.3 g.
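The same rearrangement, μ = 58.2 − 5.9z, done numerically (a short added sketch):

from statistics import NormalDist

z = NormalDist().inv_cdf(0.15)       # ~ -1.0364
mu = 58.2 - 5.9 * z
print(round(mu, 1))                  # 64.3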

12 The Poisson Distribution


The Poisson distribution is often called the distribution of rare events.

A Poisson process is where discrete events occur within a continuous, but finite, interval of
time or space.

The following conditions must apply:

1. For a small interval, the probability of the event occurring is proportional to the size
of the interval.
2. The probability of more than one occurrence in the small interval is negligible.
3. Events must not occur simultaneously.
4. Each occurrence must be independent of others.
5. The event must occur at random.

Note: The Poisson random variable, X say, is the number of such occurrences within the
fixed interval. In theory, there is no upper limit on the number of such occurrences.

Examples of situations where the Poisson Distribution would apply:


• The number of telephone calls received by a switchboard in one hour.
• The number of surface defects on a sheet of stainless steel of a particular size.
• The number of organisms present in a litre of milk.


12.1 Parameters of a Poisson distribution


• The Poisson distribution is discrete, meaning that only whole number values of X are
possible.
• It is a one parameter distribution; i.e. probabilities can be evaluated given just one piece
of information.
• The parameter is λ (lambda) and is the mean number of occurrences over the whole
interval.
• λ is also the variance. Therefore, the standard deviation = √ λ .
• We sometimes write X ∼Po( λ) to represent the random variable, X , of a Poisson
distribution with parameter λ .

12.2 The Probability Function f ( x)


The probability function, f ( x ) , for the Poisson random variable, X , is given by:

f(x) = (e^(−λ) · λ^x) / x!

for x = 0, 1, 2, 3, ... where λ is the mean number of occurrences.
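This probability function translates directly into Python (a sketch added to the notes, standard library only); it also confirms that the probabilities sum to 1 over all x.

from math import exp, factorial

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

print(round(poisson_pmf(3, 5), 3))                           # 0.140 -- see Example 1 below
print(round(sum(poisson_pmf(x, 5) for x in range(50)), 6))   # ~1.0 -- probabilities sum to 1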

Example 1: The average number of accidents at a pedestrian crossing every year is 5.


Find the probability that there are exactly 3 accidents there this year.

Solution: λ = 5 and we want P(X = 3).

P(X = 3) = (e⁻⁵ · 5³) / 3!
= 0.140

Or, using your GDC:

1. Press 2nd vars which selects the "dist" function.

2. Scroll down and select "C:poissonpdf" (Note: p for an exact probability).

3. Enter the following data: λ:5


x value: 3

Scroll down to paste then press enter .

On the screen will appear "poissonpdf(5, 3)".

4. Press enter again to get the answer = 0.140


Example 2: A 1200 m length of telephone cable has 10 faults in it. If a 100 m length of
similar cable is chosen at random:

(a) Find the mean and standard deviation of the number of faults per
100 m.

(b) Find the probability that it has at least two faults.

Solution: (a)

λ = (1/12) × 10 = 5/6

σ = √(5/6) = 0.913

(b)

P(X ≥ 2) = 1 − P(X ≤ 1)
= 1 − P(X = 0) − P(X = 1)
= 1 − (e^(−5/6) · (5/6)⁰)/0! − (e^(−5/6) · (5/6)¹)/1!
= 0.203

Or, using your GDC:

1. Type 1 −, then press 2nd vars which selects the "dist" function.

2. Scroll down and select "D:poissoncdf" (Note: c for cumulative probability).

3. Enter the following data: λ : 5/6


x value: 1

Scroll down to paste then press enter .

On the screen will appear "poissoncdf(5/6, 1)".

4. Press enter again to get the answer = 0.203


Example 3: Over a long period of time, the number of road accidents per day on a certain
road has been found to have approximately a Poisson Distribution with mean
2 on weekdays and 3 at the weekend.
(a) Find the probability that there are 3 accidents on a day chosen at
random.
(b) If there are no accidents on a day, find the probability that it is a
weekday.

Solutions: Draw a probability tree to visualise the situation: P(weekday) = 5/7 with λ = 2, and P(weekend) = 2/7 with λ = 3.

(a)

P(X = 3) = (5/7) × P(X = 3 | λ = 2) + (2/7) × P(X = 3 | λ = 3)
= 0.1289 + 0.0640
= 0.193          Rounding to 3 s.f.

(b)

P(weekday | no accidents) = P(weekday ∩ no accidents) / P(no accidents)
= 0.0966 / (0.0142 + 0.0966)
= 0.872          Rounding to 3 s.f.
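Since the tree diagram is not reproduced here, the sketch below (an added Python check, not from the notes) recomputes both branch totals.

from math import exp, factorial

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

p_weekday, p_weekend = 5/7, 2/7      # 5 weekdays vs 2 weekend days

# (a) total probability of exactly 3 accidents
p_three = p_weekday * poisson_pmf(3, 2) + p_weekend * poisson_pmf(3, 3)
print(round(p_three, 3))                                   # 0.193

# (b) P(weekday | no accidents) by conditional probability
num = p_weekday * poisson_pmf(0, 2)
den = num + p_weekend * poisson_pmf(0, 3)
print(round(num / den, 3))                                 # 0.872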

Example 4: In samples of material taken from a textile factory, 40% had at least one
fault. Estimate the mean number of faults per sample.
Solution: Let the random variable X be the number of faults per sample.
X has a Poisson distribution.

P(X ≥ 1) = 0.4
P(X = 0) = 0.6          Since P(X = 0) = 1 − P(X ≥ 1)
(e^(−λ) · λ⁰) / 0! = 0.6
e^(−λ) = 0.6
ln(e^(−λ)) = ln(0.6)
−λ = ln(0.6)
λ = 0.511          Rounding to 3 s.f.
