Fluid Lab Manual
COURSE CONTENT
CONTINUOUS FUNCTIONS
Combinatorics can be regarded as the art of arranging objects according to specified rules.
First, we want to know whether a particular arrangement is possible at all, and then, if so, in
how many ways it can be done.
Revision on Permutation
The number of different permutations of n different (distinct) objects, taking r at a time without repetition, is:
$^nP_r = \dfrac{n!}{(n-r)!} = n(n-1)(n-2)\cdots(n-r+1)$
The number of different permutations of n different (distinct) objects, taking r at a time with
repetition is:
$^nP_r = n^r$, and in particular $^nP_n = n^n$
The number of permutations of n different objects all at a time round a circle is (n-1)!
The number of permutations of n objects taking all at a time, when n1 objects are alike of one
kind, n2 objects are alike of second kind, ..., nk objects are alike of kth kind is given by
$\dfrac{n!}{n_1!\,n_2!\cdots n_k!}$
For example, total number of arrangements of the letters of the word COMMUNICATION
taking all at a time is given by:
$\dfrac{13!}{2!\,2!\,2!\,2!\,2!} = \dfrac{6{,}227{,}020{,}800}{32} = 194{,}594{,}400$
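The count above can be verified with a short Python snippet (a quick check, not part of the original notes), letting `Counter` tally the repeated letters:

```python
from math import factorial
from collections import Counter

# Arrangements of the letters of COMMUNICATION: 13 letters,
# with C, O, M, N and I each appearing twice.
word = "COMMUNICATION"
counts = Counter(word)
arrangements = factorial(len(word))
for c in counts.values():
    arrangements //= factorial(c)  # divide out permutations of identical letters
print(arrangements)  # 194594400
```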
Revision on Combination
The number of combinations of n distinct objects taken r at a time is $^nC_r = \binom{n}{r} = \dfrac{n!}{r!\,(n-r)!}$, and with repetition it is $^{n+r-1}C_r$ or $\binom{n+r-1}{r}$.
$^nC_0,\ ^nC_1,\ \ldots,\ ^nC_n$ are known as Binomial Coefficients and the following holds:
(1) $^nC_0 = 1 = {}^nC_n$
(2) $^nC_r = {}^nC_{n-r}$; $r = 0, 1, 2, \ldots, n$
(3) $^nC_r + {}^nC_{r-1} = {}^{n+1}C_r$
(4) $^nC_0 + {}^nC_1 + {}^nC_2 + \cdots + {}^nC_n = 2^n$
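These four identities can be checked numerically (here for the arbitrary choice n = 10) with Python's built-in `math.comb`:

```python
from math import comb

n = 10
# (1) nC0 = 1 = nCn
assert comb(n, 0) == 1 == comb(n, n)
# (2) symmetry: nCr = nC(n-r)
assert all(comb(n, r) == comb(n, n - r) for r in range(n + 1))
# (3) Pascal's rule: nCr + nC(r-1) = (n+1)Cr
assert all(comb(n, r) + comb(n, r - 1) == comb(n + 1, r) for r in range(1, n + 1))
# (4) row sum: sum over r of nCr = 2^n
total = sum(comb(n, r) for r in range(n + 1))
print(total)  # 1024
```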
Example 1.1
Four cards are drawn at random from a pack of 52 cards. Find the probability that the draw contains: (i) one king, one queen, one jack and one ace; (ii) two kings and two queens; (iii) four diamonds; (iv) two red and two black cards; (v) one card from each suit; (vi) two diamonds and two hearts.
Solution
Total number of ways of drawing 4 cards from a well-shuffled pack of 52 cards without any restriction is $^{52}C_4 = 270{,}725$.
(i) One king can be drawn out of the 4 kings in $^4C_1 = 4$ ways; similarly for one queen, one jack and one ace. So the required probability is
$\dfrac{^4C_1 \times {}^4C_1 \times {}^4C_1 \times {}^4C_1}{^{52}C_4} = \dfrac{256}{270725} \approx 0.000946$
(ii) Required probability is $\dfrac{^4C_2 \times {}^4C_2}{^{52}C_4} = \dfrac{36}{270725} \approx 0.000133$
(iii) There are 13 cards of diamonds in a pack of cards, so 4 cards can be drawn out of the 13 in $^{13}C_4$ ways. The required probability is $\dfrac{^{13}C_4}{^{52}C_4} = \dfrac{715}{270725} \approx 0.002641$
(iv) There are 26 red cards (of diamonds and hearts) and 26 black cards (of spades and clubs) in a pack of cards, so the required probability is
$\dfrac{^{26}C_2 \times {}^{26}C_2}{^{52}C_4} = \dfrac{325 \times 325}{270725} = \dfrac{105625}{270725} \approx 0.39016$
(v) In a pack of cards there are 13 cards of each suit (diamonds, hearts, spades and clubs). The required probability is $\dfrac{^{13}C_1 \times {}^{13}C_1 \times {}^{13}C_1 \times {}^{13}C_1}{^{52}C_4} = \dfrac{28561}{270725} \approx 0.1055$
(vi) Required probability is $\dfrac{^{13}C_2 \times {}^{13}C_2}{^{52}C_4} = \dfrac{78 \times 78}{270725} = \dfrac{6084}{270725} \approx 0.022473$
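The card-drawing probabilities above can be checked with `math.comb`:

```python
from math import comb

total = comb(52, 4)  # 270725 ways to draw 4 cards

p_i = comb(4, 1) ** 4 / total             # one king, one queen, one jack, one ace
p_ii = comb(4, 2) * comb(4, 2) / total    # two kings and two queens
p_iii = comb(13, 4) / total               # four diamonds
p_iv = comb(26, 2) * comb(26, 2) / total  # two red and two black
p_v = comb(13, 1) ** 4 / total            # one card from each suit
p_vi = comb(13, 2) * comb(13, 2) / total  # two cards from each of two given suits

print(round(p_i, 6), round(p_iv, 5), round(p_vi, 6))  # 0.000946 0.39016 0.022473
```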
Example 1.2
An urn has r red and b black balls. Suppose n balls are selected without replacement, what is
the probability that the sample will contain m red balls?
Solution
Let the total number of balls in the urn be N = r + b. Since there is a total of r red balls in the urn, $m \le r$, and there are $\binom{r}{m}$ ways to select m red balls from the r red balls; for each such selection, there are $\binom{b}{n-m}$ ways to select the remaining n − m balls from the b black balls. The total number of samples with the property mentioned in the question is therefore $\binom{r}{m}\binom{b}{n-m}$. Thus, for $n - b \le m \le r$, the probability that the sample will contain m red balls is
$P(m) = \dfrac{\binom{r}{m}\binom{b}{n-m}}{\binom{N}{n}}$   (1.1)
Example 1.3
Consider the binomial formula for the expansion of $(a+b)^n = \sum_{k=0}^{n} \binom{n}{k} a^k b^{n-k}$.
Example 1.4
There are 4 roses and 5 lilies. Five flowers are randomly selected. Find the probability that
the bouquet will contain 2 roses and 3 lilies.
Solution
This can be solved using the hypergeometric probability density function given in equation
1.1 where r = 4, b = 5, m = 2, n - m =3, n = 5, N = 9
$P(2) = \dfrac{\binom{4}{2}\binom{5}{3}}{\binom{9}{5}} = \dfrac{6 \times 10}{126} = \dfrac{60}{126} \approx 0.4762$
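The hypergeometric probability (1.1) is easy to express as a small helper function and check against the bouquet example:

```python
from math import comb

def hypergeom_pmf(m, r, b, n):
    """P(a sample of n from r 'red' and b 'black' objects contains exactly m red), eq. (1.1)."""
    return comb(r, m) * comb(b, n - m) / comb(r + b, n)

# Example 1.4: r = 4 roses, b = 5 lilies, sample of n = 5, want m = 2 roses.
p = hypergeom_pmf(2, 4, 5, 5)
print(round(p, 4))  # 0.4762
```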
5
Ordered samples
Ordered sampling implies that the order of the elements is important. It can be with
replacement as well as without replacement.
This is the case where there are n elements and we want to draw r samples from the set such that order is important and repetition is allowed. To illustrate this, consider the following set of 4 types of computers, A = {Apple, Dell, HP, Lenovo}. If two computers are to be selected at random one after the other with replacement, then r = 2 and we have 16 possible ordered pairs as follows:
1 Apple, Dell
2 Apple, HP
3 Apple, Lenovo
4 Apple, Apple
5 Dell , Apple
6 Dell , HP
7 Dell , Lenovo
8 Dell , Dell
9 HP, Apple
10 HP, Dell
11 HP, Lenovo
12 HP, HP
13 Lenovo, Apple
14 Lenovo, Dell
15 Lenovo, HP
16 Lenovo, Lenovo
We can say generally that if there are n items in a set from which r are to be selected in an ordered manner with replacement, then the total number of possible ways to make the selection is $n^r$, because each of the r draws can result in any of the n items, as shown in the example above.
This is the case where the order of the selection is important but repetition is not allowed. If the selection above is made under this condition, we have the following 12 possible selections:
1 Apple, Dell
2 Apple, HP
3 Apple, Lenovo
4 Dell , Apple
5 Dell , HP
6 Dell , Lenovo
7 HP, Apple
8 HP, Dell
9 HP, Lenovo
10 Lenovo, Apple
11 Lenovo, Dell
12 Lenovo, HP
Since the balls are selected without replacement, there are n possible ways to select the first ball, n − 1 ways to select the second ball, and so on. In the last, i.e. the rth draw, we select one of n − (r − 1) = n − r + 1 balls. Thus the total number of samples is equal to the quantity
$n(n-1)\cdots(n-r+1) = \dfrac{n!}{(n-r)!}$
which is the one referred to as n-permutations taking r at a time, $^nP_r$.
The events of interest can then be generated from these possible outcomes. For example, in
the case with replacement, what is the event that the selection will contain at least one Apple
computer?
This is
E = {(Apple, Dell), (Apple, HP), (Apple, Lenovo), (Apple, Apple), (Dell, Apple), (HP, Apple), (Lenovo, Apple)}
The probability that the selection contains at least one Apple computer is then $\dfrac{7}{16}$.
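The two ordered-sampling counts and the at-least-one-Apple event can be enumerated directly with `itertools`:

```python
from itertools import permutations, product

computers = ["Apple", "Dell", "HP", "Lenovo"]

with_repl = list(product(computers, repeat=2))   # ordered, with replacement: n^r
without_repl = list(permutations(computers, 2))  # ordered, without replacement: nPr
at_least_one_apple = [s for s in with_repl if "Apple" in s]

print(len(with_repl), len(without_repl), len(at_least_one_apple))  # 16 12 7
p = len(at_least_one_apple) / len(with_repl)  # 7/16
```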
Event: an event is a set of outcomes, and is a subset of the sample space. For example, in the sample space given above, the event that the female student selected is from the south-western part of the country is E1 = {Mofe, Bisi}.
Union of events: a union of events A, B, C, ... is an event consisting of all the outcomes in all these events. It occurs if any of A, B, C, ... occurs, and therefore corresponds to the word “OR”: A or B or C or ...
Collectively exhaustive events: events A, B, C, ... are exhaustive if their union equals the whole sample space, i.e.
$A \cup B \cup C \cup \cdots = S$   (2.1)
(1) The sample space is sure to occur, therefore P(S) = 1, while an empty event never occurs, i.e. $P(\varnothing) = 0$.
(2) The sum of the probabilities of the individual outcomes of an event is equal to the probability of the entire event, i.e.
$P(E) = \sum_{\omega_k \in E} P(\omega_k) = P(\omega_1) + \cdots + P(\omega_n)$   (2.2)
(3) If events A and B are mutually exclusive, i.e. they have no common outcomes, then their union $A \cup B$ consists of all their outcomes put together, hence
$P(A \cup B) = P(A) + P(B)$. This rule extends to any number of mutually exclusive events.
Example 2.1
If a job sent to a printer appears first in line with probability 60% and second in line with
probability 30%, what is the probability that it appears either first or second in line?
Solution
Since the two events (appearing first in line and appearing second in line) are mutually exclusive, the required probability is $P(A \cup B) = P(A) + P(B) = 60\% + 30\% = 90\%$.
Example 2.2
Consider a network on which a blackout occurs on Monday with probability 0.7, and on Tuesday with probability 0.5. Does a blackout then occur on Monday or Tuesday with probability 0.7 + 0.5 = 1.2? Obviously not, because a probability must always be between 0 and 1! The rule above does not apply here because blackouts on Monday and Tuesday are not mutually exclusive. In other words, it is not impossible to see blackouts on both days.
In Example 2.2, we see that naively applying the rule for the union of mutually exclusive events clearly overestimates the actual probability. This is because in the sum P(A) + P(B), all the common outcomes are counted twice, whereas each outcome should be counted only once! To correct the formula, subtract the probabilities of the common outcomes, $P(A \cap B)$, so we have the probability of a union as
$P(A \cup B) = P(A) + P(B) - P(A \cap B)$   (2.3)
which reduces to $P(A \cup B) = P(A) + P(B)$ for mutually exclusive events.
For 3 events we have;
$P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(A \cap B) - P(A \cap C) - P(B \cap C) + P(A \cap B \cap C)$   (2.4)
Example 2.3
In Example 2.2, suppose there is a probability 0.35 of experiencing a blackout on both Monday and Tuesday. What is the probability of having a blackout on Monday or Tuesday?
Solution
By (2.3), $P(A \cup B) = 0.7 + 0.5 - 0.35 = 0.85$.
(4) Since events A and $A^c$ are exhaustive, $A \cup A^c = S$. They are also mutually exclusive (disjoint). Hence,
$P(A) + P(A^c) = P(A \cup A^c) = P(S) = 1$
Since $P(A) + P(A^c) = 1$, we can solve for $P(A^c)$ and obtain the complement rule given as
$P(A^c) = 1 - P(A)$   (2.5)
Example 2.4
If a system is protected against a new computer virus with probability 0.45, what is the probability that it will have at least one virus?
Solution
This can be solved using the complement rule given in (2.5). The required probability is thus
1 - 0.45 = 0.55
(5) Events E1, . . . , En are independent if they occur independently of each other, i.e. the occurrence of one event does not affect the occurrence of the others. The rule for independent events is given as
$P(E_1 \cap E_2 \cap \cdots \cap E_n) = P(E_1) \times P(E_2) \times \cdots \times P(E_n)$   (2.6)
Example 2.5
There is a 1% probability for a hard drive to crash. Therefore, it has two backups, each
having a 2% probability to crash, and all three components are independent of each other.
The stored information is lost only in an unfortunate situation when all three devices crash.
What is the probability that the information is saved?
Solution
Two rules are applicable here. First we apply the rule for independent events to get the probability that the information is lost: $P(\text{lost}) = 0.01 \times 0.02 \times 0.02 = 0.000004$. Then we apply the complement rule to obtain the probability that the information is saved: $P(\text{saved}) = 1 - 0.000004 = 0.999996$.
Example 2.6
Suppose a shuttle’s launch depends on three key devices that operate independently of each
other and malfunction with probabilities 0.01, 0.02, and 0.02, respectively. If any of the key
devices malfunctions, the launch will be postponed. Compute the probability for the shuttle to
be launched on time, according to its schedule.
Solution
Let A, B and C denote the events that the three key devices malfunction. Then $P(A') = 1 - 0.01 = 0.99$, $P(B') = 1 - 0.02 = 0.98$ and $P(C') = 1 - P(C) = 1 - 0.02 = 0.98$. Hence the required probability is
$P(A' \cap B' \cap C') = 0.99 \times 0.98 \times 0.98 \approx 0.9508$
The rule stated in (3) is also referred to as the additive law of probability, while the fifth rule is also called the multiplicative law.
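Examples 2.5 and 2.6 both combine the independence rule (2.6) with the complement rule (2.5); the arithmetic can be verified in a few lines of Python:

```python
# Example 2.5: information is lost only if all three independent devices crash.
p_lost = 0.01 * 0.02 * 0.02
p_saved = 1 - p_lost

# Example 2.6: the launch happens only if no key device malfunctions.
p_on_time = (1 - 0.01) * (1 - 0.02) * (1 - 0.02)

print(round(p_saved, 6), round(p_on_time, 4))  # 0.999996 0.9508
```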
[2.3] Conditional probability
Given that two events A and B are dependent, the probability of A occurring is evaluated with regard to the occurrence of B, and likewise the probability of B occurring. Consider the following example for illustration.
Example 2.7
In the joint toss of two dice, calculate the probability of A or B, if A = both dice show odd numbers and B = the sum of the scores is 4.
Solution
A = {(1,1), (1,3), (1,5), (3,3), (3,1), (5,1), (3,5), (5,3), (5,5)}, n(A) = 9
B= {(1,3), (3,1), (2,2)}, n(B) = 3
$A \cap B$ = {(1,3), (3,1)}, $n(A \cap B)$ = 2
$P(A) = \dfrac{9}{36}$, $P(B) = \dfrac{3}{36}$, $P(A \cap B) = \dfrac{2}{36}$
The required probability is then $P(A \cup B) = P(A) + P(B) - P(A \cap B) = \dfrac{9}{36} + \dfrac{3}{36} - \dfrac{2}{36} = \dfrac{10}{36} = \dfrac{5}{18}$
We may want the probability that the sum of the scores is 4 given that both dice show odd numbers. This is the same as P(B given A), which is usually written as P(B|A). It is a conditional probability: the probability of B occurring given that A has occurred or must always occur.
For this example, $P(B|A) = \dfrac{P(A \cap B)}{P(A)} = \dfrac{2/36}{9/36} = \dfrac{2}{9}$
Note that since A must always occur, the total number of possible outcomes (sample space)
for evaluating the conditional probability is no longer n(S) = 36 but n(A) = 9
$P(B|A) = \dfrac{P(A \cap B)}{P(A)}$ and $P(A|B) = \dfrac{P(A \cap B)}{P(B)}$   (2.7)
Note that $P(A \cap B)$ can simply be written as P(AB), that is, the probability that A will occur and B will occur. Thus, the multiplicative law for dependent events can be written from (2.7) as
$P(AB) = P(A)\,P(B|A) = P(B)\,P(A|B)$   (2.8)
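Because the two-dice sample space is small, Example 2.7 and the conditional probability $P(B|A)$ can be verified by brute-force enumeration:

```python
from itertools import product

dice = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes
A = {d for d in dice if d[0] % 2 == 1 and d[1] % 2 == 1}  # both dice odd
B = {d for d in dice if sum(d) == 4}                      # sum of scores is 4

p_B_given_A = len(A & B) / len(A)                    # P(B|A) = P(A n B)/P(A) = 2/9
p_union = (len(A) + len(B) - len(A & B)) / 36        # P(A u B) = 5/18

print(len(A), len(B))  # 9 3
```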
Example 2.8
Ninety percent of flights depart on time. Eighty percent of flights arrive on time. Seventy-five
percent of flights depart on time and arrive on time.
(a) If you are to meet a flight that departed on time, what is the probability that it will
arrive on time?
(b) If you have met a flight, and it arrived on time, what is the probability that it
departed on time?
(c) Are the events, departing on time and arriving on time, independent?
Solution
A = {arriving on time}, D = {departing on time}, with P(A ∩ D) = 0.75, P(A) = 0.8, P(D) = 0.9.
(a) $P(A|D) = \dfrac{P(A \cap D)}{P(D)} = \dfrac{0.75}{0.9} \approx 0.8333$
(b) $P(D|A) = \dfrac{P(A \cap D)}{P(A)} = \dfrac{0.75}{0.8} = 0.9375$
(c) The two events are not independent because
$P(A|D) \ne P(A)$, $P(D|A) \ne P(D)$, $P(A \cap D) \ne P(A)\,P(D)$
Bayes' rule: if the events $A_1, \ldots, A_n$ partition the sample space, then for any event B,
$P(A_i|B) = \dfrac{P(A_i)\,P(B|A_i)}{\sum_{i=1}^{n} P(A_i)\,P(B|A_i)}$   (2.9)
where the denominator in the formula is the total probability earlier mentioned, i.e.
$P(B) = \sum_{i=1}^{n} P(A_i)\,P(B|A_i)$
Proof
The proof of Bayes’ theorem is easily done with the use of Venn diagram in set theory.
Fig 1: Venn diagram of the sample space partitioned into $A_1, A_2, \ldots, A_n$, with the event B intersecting each $A_i$.
$P(B) = \sum_{i=1}^{n} P(B \cap A_i)$
But $P(B \cap A_i) = P(A_i)\,P(B|A_i)$
Therefore $P(B) = \sum_{i=1}^{n} P(A_i)\,P(B|A_i)$ = the total probability
For any i, $P(A_i|B) = \dfrac{P(A_i \cap B)}{P(B)} = \dfrac{P(A_i)\,P(B|A_i)}{\sum_{i=1}^{n} P(A_i)\,P(B|A_i)}$
Example 2.9
In a bolt factory, machines A, B, C manufacture respectively 25%, 35% and 40% of the total.
Of their output 5, 4, and 2 percent respectively are known to be defective bolts. A bolt is
drawn at random from the product and is found to be defective. What is the probability that it
was manufactured by:
(i) Machine A
(ii) Machine B or C
Solution
Let M1, M2 and M3 denote the events that the bolt selected at random is manufactured by machine A, B and C respectively, and let D denote the event that it is defective. Then we have P(M1) = 0.25, P(M2) = 0.35, P(M3) = 0.40 and P(D|M1) = 0.05, P(D|M2) = 0.04, P(D|M3) = 0.02.
(i) The probability that a defective bolt chosen at random is manufactured by machine A is given by Bayes' rule as:
$P(M_1|D) = \dfrac{P(M_1)\,P(D|M_1)}{\sum_{i=1}^{3} P(M_i)\,P(D|M_i)} = \dfrac{0.0125}{0.0345} \approx 0.36$
(ii) We first get
$P(M_2|D) = \dfrac{P(M_2)\,P(D|M_2)}{\sum_{i=1}^{3} P(M_i)\,P(D|M_i)} = \dfrac{0.0140}{0.0345} \approx 0.41$ and
$P(M_3|D) = \dfrac{P(M_3)\,P(D|M_3)}{\sum_{i=1}^{3} P(M_i)\,P(D|M_i)} = \dfrac{0.0080}{0.0345} \approx 0.23$
That is, the probability that a defective bolt chosen at random is manufactured by machine B
or C is P( M 2 D) P( M 3 D) 0.41 0.23 0.64
This probability can also be obtained directly using the complement rule:
$1 - P(M_1|D) = 1 - 0.36 = 0.64$
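The Bayes' rule computation of Example 2.9 is a one-pass calculation over the priors and likelihoods:

```python
# Priors: share of output from machines A, B, C; likelihoods: their defective rates.
prior = [0.25, 0.35, 0.40]
p_def = [0.05, 0.04, 0.02]

total = sum(p * d for p, d in zip(prior, p_def))          # total probability P(D)
posterior = [p * d / total for p, d in zip(prior, p_def)]  # P(Mi | D)

print(round(total, 4), [round(x, 2) for x in posterior])  # 0.0345 [0.36, 0.41, 0.23]
```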
[3.0] DISTRIBUTION OF RANDOM VARIABLES
[3.1] Basic concepts
A random variable is a quantity that depends on chance; it is a function of an outcome. Precisely, its value depends on which elementary outcome has occurred when the experiment under consideration has been performed.
For example, when tossing a coin twice, we may be interested only in the number of heads,
regardless of the order they appeared. Possible results are given in the table below.
Outcomes Number of heads
1 HH 2
2 HT 1
3 TH 1
4 TT 0
So, while there are four outcomes, the number of heads may take on three values: either 0, 1,
or 2. We also see that the number of heads is completely determined by the outcome.
In the general case, when considering an experiment with a sample space Ω, we define a random variable as a function X(ω) on the space Ω.
In the case of rolling two dice, the sample space consists of 36 outcomes. The r.v. X(ω), the sum of the numbers on the dice, may assume all integer values from 2 to 12.
A random variable can be discrete or continuous. It is discrete if it can assume a finite number of values or a countable infinity of values (as many values as there are whole numbers). For example, the number of mobile phones sold weekly by a sales representative is a discrete random variable that can take on a finite set of values, while the number of times a die is rolled before a 6 comes up is a discrete random variable that can take on a countable infinity of values 1, 2, 3, 4, .... A random variable is continuous if it can be measured on a continuous scale, i.e. it can take infinitely many values in any interval, e.g. time, weight, height, distance, etc.
A p.m.f. f of a discrete random variable satisfies $f(x) \ge 0$ for all $x \in R$, and $\sum_{x \in R} f(x) = 1$.
Example 3.1
A six-sided die with outcomes 1, 2, 3, 4, 5, 6 is rolled twice. Let X be the sum of the two
outcomes. Then the possible values of X are 2, 3, 4, 5, 6, . . ., 12. The p.m.f of X can be
written as
$f(x) = \dfrac{6 - |7 - x|}{36}$, for $x = 2, 3, \ldots, 12$.
Hence, f(2)=1/36, f(3) = 2/36, f(4) = 3/36, f(5) = 4/36, f(6) = 5/36, f(7) = 6/36, f(8) = 5/36,
f(9) = 4/36, f(10) = 3/36, f(11) = 2/36, f(12) = 1/36
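The closed form for the two-dice sum can be confirmed by enumerating all 36 outcomes and comparing exact fractions:

```python
from itertools import product
from fractions import Fraction

# Count how often each sum appears among the 36 equally likely outcomes.
counts = {}
for a, b in product(range(1, 7), repeat=2):
    counts[a + b] = counts.get(a + b, 0) + 1

f = {x: Fraction(counts[x], 36) for x in counts}
# Compare against the closed form f(x) = (6 - |7 - x|)/36.
closed = {x: Fraction(6 - abs(7 - x), 36) for x in range(2, 13)}
print(f == closed)  # True
```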
Bernoulli Distribution
This is based on a random experiment whose outcome can be classified in only one of two
mutually exclusive and exhaustive ways such as success/failure, defective/non-defective,
male/female, life/death, rain/no rain, e.t.c.
We can represent the probability of success by p, and the probability of failure by q = 1-p.
The p.m.f of X, if X follows a Bernoulli trial, is given as
$f(x) = p^x (1-p)^{1-x}$, $x = 0, 1$
Binomial Distribution
When a Bernoulli trial is performed n times, we have a sequence of Bernoulli trials, which gives rise to a binomial experiment. The interest here is the total number of successes. If we denote this by X = x, then the number of failures is n − x. The trials are independent, and the probabilities of success and failure on each trial are p and q = 1 − p respectively, so each particular sequence of x successes and n − x failures occurs with probability $p^x (1-p)^{n-x}$. Also, the number of ways of arranging the x successes in the n trials is $^nC_x = \dfrac{n!}{x!\,(n-x)!}$. Hence the p.m.f is
$f(x) = \dbinom{n}{x} p^x (1-p)^{n-x}$, $x = 0, 1, 2, \ldots, n$
That is, $X \sim b(n, p)$, where n and p are the parameters of the binomial distribution.
An example of a binomial experiment is a lottery with 20% winning tickets and 80% losing. Here the probability of success p is the probability of winning. If n = 8 tickets are bought and X represents the number of winning tickets, then
$f(2) = P(X = 2) = \dbinom{8}{2}(0.2)^2 (0.8)^6 \approx 0.2936$
Thus, we say that the distribution of the random variable X is b(8, 0.2), i.e. n = 8, p = 0.2.
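The binomial p.m.f. translates directly into code, and reproduces the lottery answer:

```python
from math import comb

def binom_pmf(x, n, p):
    """Binomial p.m.f.: nCx * p^x * (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

# Lottery with 20% winning tickets, 8 tickets bought: P(exactly 2 winners).
p2 = binom_pmf(2, 8, 0.2)
print(round(p2, 4))  # 0.2936
```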
Geometric Distribution
This is a distribution dealing with the probability that the first success will occur on a given trial. If we represent the trial on which the first success occurs by x, this means that x − 1 failures have already taken place. Hence, the probability that the first success will occur on the xth trial is
$f(x) = p\,(1-p)^{x-1}$, for $x = 1, 2, 3, \ldots$
Example 3.1.2
What is the probability that if a balanced die is rolled repeatedly, the first six will occur on the fifth roll?
Solution
$f(x) = p\,(1-p)^{x-1}$, with $p = \tfrac{1}{6}$, $1 - p = \tfrac{5}{6}$ and $x = 5$.
Therefore, $f(5) = \tfrac{1}{6}\left(\tfrac{5}{6}\right)^4 = \dfrac{625}{7776} \approx 0.0804$
Hypergeometric Distribution
At times, one might be interested in choosing n objects from a set of a objects of one kind (successes) and b objects of another kind (failures). If the selection is without replacement and we are interested in the probability of getting x successes and n − x failures, then it is given as
$f(x) = \dfrac{\binom{a}{x}\binom{b}{n-x}}{\binom{a+b}{n}}$ for $x = 0, 1, 2, \ldots, n$
Example 3.1.3
In a lot of 100 light bulbs, there are 5 bad bulbs. An inspector inspects 10 bulbs selected at random without replacement. Find the probability of observing:
i) 2 defective bulbs
ii) at least 1 defective bulb
Solution
i) $P(X = 2) = f(2) = \dfrac{\binom{5}{2}\binom{95}{8}}{\binom{100}{10}} = \dfrac{10 \times 1.216 \times 10^{11}}{1.731 \times 10^{13}} = \dfrac{1.216 \times 10^{12}}{1.731 \times 10^{13}} \approx 0.07$
ii) $P(X = 0) = f(0) = \dfrac{\binom{5}{0}\binom{95}{10}}{\binom{100}{10}} = \dfrac{1.010 \times 10^{13}}{1.731 \times 10^{13}} \approx 0.583$
$P(X \ge 1) = P(X=1) + P(X=2) + P(X=3) + P(X=4) + P(X=5) = 1 - P(X = 0) = 1 - 0.583 = 0.417$
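The bulb probabilities can be computed exactly with `math.comb` (note that keeping full precision throughout gives 0.416 for part ii; the 0.417 above comes from rounding P(X = 0) to 0.583 first):

```python
from math import comb

def hyper_pmf(x, a, b, n):
    """Hypergeometric p.m.f.: aCx * bC(n-x) / (a+b)Cn."""
    return comb(a, x) * comb(b, n - x) / comb(a + b, n)

p2 = hyper_pmf(2, 5, 95, 10)               # exactly 2 defectives
p_at_least_1 = 1 - hyper_pmf(0, 5, 95, 10)  # complement rule
print(round(p2, 2), round(p_at_least_1, 3))  # 0.07 0.416
```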
Poisson distribution
This is the distribution of a discrete random variable that counts the number of times a particular event occurs in a given time interval or on a given physical object, e.g. the number of phone calls arriving at a switchboard between 9 and 10 a.m., or the number of flaws in 100 feet of wire. For a random variable X to approximately follow a Poisson distribution, the events must occur singly, independently, and at a constant average rate λ. The p.m.f is
$f(x) = \dfrac{\lambda^x e^{-\lambda}}{x!}$, $x = 0, 1, 2, \ldots$
Example 3.1.4
Note
The binomial distribution can be approximated by the Poisson distribution when n is large and p is small, with λ = np. As a rule of thumb, this works when n ≥ 100 and np < 10.
Example 3.1.5
Suppose that 28 of 4,000 sales invoices contain error. If a CPA randomly chooses 150 of
them for an audit, what is the probability that exactly two of them contain errors?
Solution
$\lambda = np = 150 \times \dfrac{28}{4000} = 1.05$ (< 10, i.e. small, so we can use the Poisson approximation)
$P(X = 2) = \dfrac{(1.05)^2\,e^{-1.05}}{2!} \approx 0.1929$
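The quality of the Poisson approximation in Example 3.1.5 can be checked by computing the exact binomial probability alongside it:

```python
from math import comb, exp

n, p = 150, 28 / 4000
lam = n * p  # 1.05

poisson = lam**2 * exp(-lam) / 2                # Poisson approximation to P(X = 2)
binom = comb(n, 2) * p**2 * (1 - p) ** (n - 2)  # exact binomial P(X = 2)
print(round(poisson, 4), round(binom, 4))       # ~0.1929 vs ~0.1936: a close match
```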
Discrete Uniform Distribution
This is a p.m.f that puts equal probability on each of the points in its space. Let X equal an integer that is selected at random from the first m positive integers; then we say that X has a discrete uniform distribution with p.m.f
$f(x) = \dfrac{1}{m}$, $x = 1, 2, \ldots, m$
In a lottery, a three-digit integer is selected at random from 000 to 999, inclusive. Let X equal the integer that is selected on a particular day. Find the p.d.f of X.
Solution
The p.d.f of X is $P(X = x) = f(x) = \dfrac{1}{1000}$, $x = 000, 001, \ldots, 999$
3.2 CONTINUOUS DISTRIBUTION
When outcome spaces are not composed of a countable number of points but are intervals, or a union of intervals, the random variables representing such outcomes are continuous. A continuous random variable X has a p.d.f f(x) such that $P(a < X < b) = \int_a^b f(x)\,dx$, and f must satisfy the following conditions:
a) $f(x) \ge 0$
b) $\int_{-\infty}^{\infty} f(x)\,dx = 1$
There are several distribution functions that are continuous; some of them are the Uniform, Exponential and Normal distributions.
Uniform Distribution
Let the random variable X denote the outcome when a point is selected at random from an interval [a, b], −∞ < a < b < ∞. If the experiment is performed in a fair manner, we can assume that the probability the point is selected from the interval [a, x], a ≤ x ≤ b, is $\dfrac{x-a}{b-a}$. That is, the probability is proportional to the length of the interval, so the p.d.f of X is
$f(x) = \dfrac{1}{b-a}$, $a \le x \le b$
i.e. $X \sim U(a, b)$. It is also referred to as the rectangular distribution because of the shape of the graph of f(x).
Example 3.2.1
Customers arrive randomly at a bank teller's window. Given that one customer arrived during a particular 10-minute period, let X equal the time within the 10 minutes at which the customer arrived. Find:
i) The pdf of X
ii) P(X ≥ 8)
iii) P(2 ≤ X ≤ 8)
Solution
i) The pdf of X is $f(x) = \dfrac{1}{10}$, $0 \le x \le 10$
ii) $P(X \ge 8) = \int_8^{10} \dfrac{1}{10}\,dx = \dfrac{10 - 8}{10} = \dfrac{2}{10} = 0.2$
iii) $P(2 \le X \le 8) = \int_2^8 \dfrac{1}{10}\,dx = \dfrac{8 - 2}{10} = \dfrac{6}{10} = 0.6$
Exponential Distribution
The pdf of a random variable X which has an exponential distribution is given as
$f(x) = \dfrac{1}{\theta}\,e^{-x/\theta}$, $0 \le x < \infty$
The waiting time W until the first change in a Poisson process has an exponential distribution with $\theta = \dfrac{1}{\lambda}$.
Example 3.2.2
Let X have an exponential distribution with mean θ = 20. Find:
a) P(10 < X < 30)
b) P(X > 30)
Solution
$f(x) = \dfrac{1}{20}\,e^{-x/20}$, $0 \le x < \infty$
a) $P(10 < X < 30) = \int_{10}^{30} \dfrac{1}{20}\,e^{-x/20}\,dx = \left[-e^{-x/20}\right]_{10}^{30} = e^{-10/20} - e^{-30/20} = e^{-1/2} - e^{-3/2} = 0.6065 - 0.2231 = 0.3834$
b) $P(X > 30) = \int_{30}^{\infty} \dfrac{1}{20}\,e^{-x/20}\,dx = e^{-30/20} = e^{-3/2} \approx 0.2231$
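Since the exponential c.d.f. is $F(x) = 1 - e^{-x/\theta}$, both probabilities reduce to differences of exponentials, which a few lines of Python confirm:

```python
from math import exp

theta = 20  # mean of the exponential distribution

# P(a < X < b) = F(b) - F(a) = e^(-a/theta) - e^(-b/theta)
p_a = exp(-10 / theta) - exp(-30 / theta)  # P(10 < X < 30)
p_b = exp(-30 / theta)                     # P(X > 30)
print(round(p_a, 4), round(p_b, 4))  # 0.3834 0.2231
```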
Normal Distribution
The normal distribution is the most important distribution in statistical applications, because many naturally occurring measurements, as well as the sampling distributions of many statistics, are approximately normal. Its p.d.f is
$f(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\,e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}$, $-\infty < x < \infty$, $-\infty < \mu < \infty$, $\sigma > 0$
Note:
The standard normal variable is given by
$Z = \dfrac{X - \mu}{\sigma}$, where $Z \sim N(0, 1)$
If a random variable X has normal distribution, then to find probability of having values for
X, we transform it to the standard normal unit Z and use the Normal table to get the
probability.
Example 3.2.3
Let X be normally distributed with μ = 6 and σ = 5. Find:
a) P(6 ≤ X ≤ 12)
b) P(0 ≤ X ≤ 8)
c) P(−2 < X ≤ 0)
Solution
$Z = \dfrac{X - \mu}{\sigma}$, with $\mu = 6$, $\sigma = 5$.
a) $P(6 \le X \le 12) = P\!\left(\dfrac{6-6}{5} \le Z \le \dfrac{12-6}{5}\right) = P(0 \le Z \le 1.2) = 0.8849 - 0.5000 = 0.3849$
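Instead of reading the Normal table, the standard normal c.d.f. Φ can be evaluated with the error function from Python's `math` module:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal c.d.f. via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 6, 5
p_a = phi((12 - mu) / sigma) - phi((6 - mu) / sigma)  # P(6 <= X <= 12)
print(round(p_a, 4))  # 0.3849
```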
[4.0] MATHEMATICAL EXPECTATION
The mathematical expectation of a random variable can be seen as the worth of that variable under a particular situation. It could be seen as an average value the variable is expected to have, as calculated mathematically. For example, what is the mathematical expectation if someone buys one ticket out of 2000 raffle tickets whose single winning ticket is a trip of \$640's worth? The probability of winning is 1/2000, so the expectation is $640 \times \tfrac{1}{2000} = \$0.32$.
(1) The expected value of a function u(X) of a discrete random variable is given as
$E[u(X)] = \sum_{x \in R} u(x)\,f(x)$, where f(x) is the p.d.f of X.
The expected value of the random variable X itself is the mean, i.e.
$E(X) = \sum_{x} x\,f(x) = \mu$ = mean of X
The expected value of a random variable X of the continuous type is thus given as
$E(X) = \int_R x\,f(x)\,dx$ = mean of X
while the variance of X is $\sigma^2 = E\!\left[(X - \mu)^2\right] = \int_R (x - \mu)^2 f(x)\,dx$
Properties of expectation: (i) E(a) = a for any constant a; (ii) E(aX) = aE(X); (iii) if $a_1$ and $a_2$ are constants and $u_1$ and $u_2$ are functions, then $E[a_1 u_1(X) + a_2 u_2(X)] = a_1 E[u_1(X)] + a_2 E[u_2(X)]$.
Example 4.1
Let X have the p.m.f $f(x) = \dfrac{x}{10}$, $x = 1, 2, 3, 4$. Find E(X).
Solution
$E(X) = \sum_{x=1}^{4} x\,f(x) = 1\!\left(\tfrac{1}{10}\right) + 2\!\left(\tfrac{2}{10}\right) + 3\!\left(\tfrac{3}{10}\right) + 4\!\left(\tfrac{4}{10}\right) = \tfrac{30}{10} = 3$
Example 4.2
Let Y be a continuous random variable with p.d.f f(y) = 2y, 0 < y < 1. Find E(Y) and the variance of Y.
Solution
$E(Y) = \int_0^1 y \cdot 2y\,dy = \left[\tfrac{2}{3}y^3\right]_0^1 = \tfrac{2}{3}$
$E(Y^2) = \int_0^1 y^2 \cdot 2y\,dy = \left[\tfrac{1}{2}y^4\right]_0^1 = \tfrac{1}{2}$
Therefore the variance of Y is $\sigma_Y^2 = E(Y^2) - [E(Y)]^2 = \tfrac{1}{2} - \tfrac{4}{9} = \tfrac{1}{18}$
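The two integrals in Example 4.2 can be checked numerically with a simple midpoint Riemann sum (no external libraries needed):

```python
# Numerical check of E(Y) and Var(Y) for f(y) = 2y on (0, 1).
N = 100000
h = 1.0 / N
ys = [(i + 0.5) * h for i in range(N)]  # midpoints of the subintervals

ey = sum(y * 2 * y * h for y in ys)       # E(Y)   -> 2/3
ey2 = sum(y * y * 2 * y * h for y in ys)  # E(Y^2) -> 1/2
var = ey2 - ey**2                         # -> 1/18
print(round(ey, 4), round(var, 4))  # 0.6667 0.0556
```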
1) Mean and Variance of Bernoulli distribution
$f(x) = p^x (1-p)^{1-x}$, $x = 0, 1$
$\mu = E(X) = \sum_{x=0}^{1} x\,p^x (1-p)^{1-x} = 0 \cdot (1-p) + 1 \cdot p = p$
$E(X^2) = \sum_{x=0}^{1} x^2\,p^x (1-p)^{1-x} = 0^2 (1-p) + 1^2\,p = p$
$\sigma^2 = E(X^2) - \mu^2 = p - p^2 = p(1-p) = pq$
2) Mean and Variance of Binomial distribution
Mean:
$\mu = E(X) = \sum_{x=0}^{n} x \binom{n}{x} p^x (1-p)^{n-x} = \sum_{x=1}^{n} \dfrac{n!}{(x-1)!\,(n-x)!}\,p^x (1-p)^{n-x}$   ...(1)
Substituting k = x − 1:
$= \sum_{k=0}^{n-1} \dfrac{n!}{k!\,(n-k-1)!}\,p^{k+1} (1-p)^{n-k-1}$   ...(2)
$= np \sum_{k=0}^{n-1} \dfrac{(n-1)!}{k!\,(n-1-k)!}\,p^{k} (1-p)^{n-1-k}$   ...(3)
The term inside the sum in (3) is a binomial p.d.f f(k) of b(n − 1, p), which has the property $\sum f(k) = 1$. Hence
$\mu = E(X) = np$
Variance:
$E[X(X-1)] = \sum_{x=0}^{n} x(x-1) \dfrac{n!}{x!\,(n-x)!}\,p^x (1-p)^{n-x} = \sum_{x=2}^{n} \dfrac{n!}{(x-2)!\,(n-x)!}\,p^x (1-p)^{n-x}$   ...(4)
Substituting k = x − 2:
$= \sum_{k=0}^{n-2} \dfrac{n!}{k!\,(n-k-2)!}\,p^{k+2} (1-p)^{n-k-2}$   ...(5)
Using $n! = n(n-1)(n-2)!$ and $p^{k+2} = p^k \cdot p^2$:
$= n(n-1)\,p^2 \sum_{k=0}^{n-2} \dfrac{(n-2)!}{k!\,(n-2-k)!}\,p^{k} (1-p)^{n-2-k}$   ...(6)
The sum in (6) is again $\sum f(k) = 1$, now for b(n − 2, p), so $E[X(X-1)] = n(n-1)p^2$.
Since $X(X-1) = X^2 - X$, we have $E(X^2) = E[X(X-1)] + E(X) = n(n-1)p^2 + np = n^2p^2 - np^2 + np$.
Variance $= E(X^2) - \mu^2 = n^2p^2 - np^2 + np - n^2p^2 = np - np^2 = np(1-p) = npq$
3) Mean and Variance of Geometric distribution
Mean:
$\mu = E(X) = \sum_{x=1}^{\infty} x\,p\,q^{x-1}$   ...(1)
Equation (1) can be evaluated using the sum to infinity of a geometric progression, $\sum_{x=1}^{\infty} a\,r^{x-1} = \dfrac{a}{1-r}$, together with its derivatives:
1st derivative: $\sum_{x=1}^{\infty} x\,r^{x-1} = \dfrac{1}{(1-r)^2}$,  2nd derivative: $\sum_{x=2}^{\infty} x(x-1)\,r^{x-2} = \dfrac{2}{(1-r)^3}$
Equation (1) then becomes
$\mu = \dfrac{p}{(1-q)^2} = \dfrac{p}{p^2} = \dfrac{1}{p}$
Variance:
$E[X(X-1)] = \sum_{x=1}^{\infty} x(x-1)\,q^{x-1}\,p = pq \sum_{x=2}^{\infty} x(x-1)\,q^{x-2} = pq \cdot \dfrac{2}{(1-q)^3} = \dfrac{2pq}{p^3} = \dfrac{2q}{p^2}$   ...(2)
Hence $E(X^2) = \dfrac{2q}{p^2} + \mu = \dfrac{2q}{p^2} + \dfrac{1}{p}$, and
Variance $= E(X^2) - \mu^2 = \dfrac{2q}{p^2} + \dfrac{1}{p} - \dfrac{1}{p^2} = \dfrac{2q + p - 1}{p^2} = \dfrac{2q - q}{p^2} = \dfrac{q}{p^2}$
Therefore the variance of a Geometric distribution is $\dfrac{q}{p^2}$.
Alternatively, starting from $\sum_{x=1}^{\infty} x\,q^{x-1} = \dfrac{1}{(1-q)^2}$, multiply both sides by q:
$\sum_{x=1}^{\infty} x\,q^{x} = \dfrac{q}{(1-q)^2}$
Differentiating both sides with respect to q gives
$\sum_{x=1}^{\infty} x^2\,q^{x-1} = \dfrac{1+q}{(1-q)^3}$
and multiplying both sides by p yields $E(X^2) = \dfrac{1+q}{p^2}$, which leads to the same variance.
4) Mean and Variance of Poisson distribution
Mean:
$\mu = E(X) = \sum_{x=0}^{\infty} x\,\dfrac{\lambda^x e^{-\lambda}}{x!} = \lambda e^{-\lambda} \sum_{x=1}^{\infty} \dfrac{\lambda^{x-1}}{(x-1)!}$
Substituting k = x − 1 and using $e^{\lambda} = \sum_{k=0}^{\infty} \dfrac{\lambda^k}{k!} = 1 + \lambda + \dfrac{\lambda^2}{2!} + \dfrac{\lambda^3}{3!} + \dfrac{\lambda^4}{4!} + \cdots$
Therefore,
$E(X) = \lambda e^{-\lambda} e^{\lambda} = \lambda$
Variance:
$E[X(X-1)] = \sum_{x=0}^{\infty} x(x-1)\,\dfrac{\lambda^x e^{-\lambda}}{x!} = \lambda^2 e^{-\lambda} \sum_{k=0}^{\infty} \dfrac{\lambda^k}{k!} = \lambda^2 e^{-\lambda} e^{\lambda} = \lambda^2$
Now, variance of X = $E(X^2) - \mu^2$. From $E[X(X-1)] = E(X^2) - E(X) = \lambda^2$, we get $E(X^2) = \lambda^2 + \lambda$.
Therefore, Variance $= \lambda^2 + \lambda - \lambda^2 = \lambda$
A joint distribution of two random variables has a pdf f(x, y) which is a function of two variables. It is denoted as $f_{X,Y}(x, y)$. This pdf has the following properties:
a) $f(x, y) \ge 0$
b) $\int\!\int f(x, y)\,dy\,dx = 1$ (with sums in place of integrals in the discrete case)
It is possible to have a joint distribution of two random variables in which one is discrete and one is continuous. The joint distribution of two random variables can be extended to a joint distribution of n random variables.
Given two jointly distributed random variables X and Y, and a function u(X, Y), the expectation is defined to be
$E[u(X, Y)] = \int\!\int u(x, y)\,f(x, y)\,dy\,dx$ for the continuous case, and $E[u(X, Y)] = \sum_x \sum_y u(x, y)\,f(x, y)$ for the discrete case.
Marginal distribution
The marginal distribution of a random variable X can be found from the joint distribution of the random variables X and Y. If f(x, y) is the joint pdf of X and Y, then the marginal pdf of X is
$f_X(x) = \sum_y f(x, y)$ in the discrete case, and $f_X(x) = \int f(x, y)\,dy$ in the continuous case.
Conditional distribution
Given that the random variables X and Y have joint distribution f(x, y), the conditional pdf of Y given X = x is
$f_{Y|X}(y|x) = \dfrac{f(x, y)}{f_X(x)}$; if $f_X(x) > 0$
This is also written as $f(y|x)$. The conditional expectation of Y given X = x is
$E(Y|X = x) = \int y\,f_{Y|X}(y|x)\,dy$ in the continuous case.
Similarly, the conditional pdf of X given Y = y is
$f_{X|Y}(x|y) = \dfrac{f(x, y)}{f_Y(y)}$; if $f_Y(y) > 0$
This is also written as $f(x|y)$.
Example 5.1
If X and Y are discrete random variables which are jointly distributed with the following joint p.m.f f(x, y):
x: −1 0 1
y = 0: 1/9 0 1/6
Find E(XY).
Solution
$E(XY) = \sum_x \sum_y x\,y\,f(x, y) = -\tfrac{1}{18} + \tfrac{1}{6} + \tfrac{1}{6} - \tfrac{1}{9} = \tfrac{3}{18} = \tfrac{1}{6}$
Example 5.2
Continuous random variables X and Y have a joint distribution with pdf
$f(x, y) = \tfrac{3}{2}\,(2 - 2x - y)$ in the region bounded by y = 0, x = 0 and y = 2 − 2x. Obtain the pdf for