Math Theory
= (m1 + m2)/n = m1/n + m2/n = P(A) + P(B).
(2) If A, B are any two events (not mutually exclusive), then
P(A + B) = P(A) + P(B) − P(AB)
or P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
If A and B are any two events, there are some outcomes which favour both A and B. If m3 be their number, then these are included in both m1 and m2. Hence the total number of outcomes favouring either A or B or both is
m1 + m2 − m3.
Thus the probability of occurrence of A or B or both
= (m1 + m2 − m3)/n = m1/n + m2/n − m3/n
Hence P(A + B) = P(A) + P(B) − P(AB)
or P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Obs. When A and B are mutually exclusive, P(AB) or P(A ∩ B) = 0 and we get
P(A + B) or P(A ∪ B) = P(A) + P(B).
In general, for a number of mutually exclusive events A1, A2, ..., An, we have
P(A1 + A2 + ... + An) or P(A1 ∪ A2 ∪ ... ∪ An) = P(A1) + P(A2) + ... + P(An).
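The addition theorem can be checked by direct enumeration (an illustrative sketch, not from the text), taking for a single die the events A = "even number" and B = "number greater than 3":

```python
# Verify P(A ∪ B) = P(A) + P(B) − P(A ∩ B) by enumerating the six
# equally likely outcomes of one die. A and B are illustrative events.
from fractions import Fraction

outcomes = set(range(1, 7))
A = {x for x in outcomes if x % 2 == 0}   # even number: {2, 4, 6}
B = {x for x in outcomes if x > 3}        # greater than 3: {4, 5, 6}

P = lambda E: Fraction(len(E), len(outcomes))

lhs = P(A | B)                  # P(A ∪ B)
rhs = P(A) + P(B) - P(A & B)    # P(A) + P(B) − P(AB)
print(lhs, rhs)   # 2/3 2/3
```

Here A ∩ B = {4, 6} is counted once on the right-hand side, exactly as the m3 term in the derivation above.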
For example, the discrete probability distribution for X, the sum of the numbers which turn up on tossing a pair of dice, is given by the following table:
X = x_i :   2     3     4     5     6     7     8     9     10    11    12
p(x_i)  :  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
[∵ There are 6 × 6 = 36 equally likely outcomes and therefore each has the probability 1/36. We have X = 2 for one outcome, i.e. (1, 1); X = 3 for two outcomes (1, 2) and (2, 1); X = 4 for three outcomes (1, 3), (2, 2) and (3, 1), and so on.]
(2) Distribution function. The distribution function F(x) of the discrete variate X is defined by
F(x) = P(X ≤ x) = Σ_{i = 1}^{x} p(x_i), where x is any integer.
The graph of F(x) will be of stair-step form (Fig. 26.2). The distribution function is also sometimes called the cumulative distribution function.
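The dice-sum table and its stair-step distribution function can both be reproduced by enumeration (an illustrative sketch using exact fractions):

```python
# Build the pmf of X = sum of two dice, then its distribution
# function F(x) = P(X <= x), which rises in steps of p(x).
from fractions import Fraction
from collections import Counter

counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
p = {x: Fraction(c, 36) for x, c in sorted(counts.items())}

F = {}
running = Fraction(0)
for x, px in p.items():
    running += px          # F jumps by p(x) at each support point
    F[x] = running

print(p[7])    # 1/6
print(F[4])    # P(X <= 4) = 6/36 = 1/6
print(F[12])   # 1
```

Plotting F against x would give exactly the stair-step graph of Fig. 26.2.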
CONTINUOUS PROBABILITY DISTRIBUTION
When a variate X takes every value in an interval, it gives rise to a continuous distribution of X. The distributions defined by variates like heights or weights are continuous distributions.
A major conceptual difference, however, exists between discrete and continuous probabilities. When thinking in discrete terms, the probability associated with an event is meaningful. With continuous events, however, where the number of events is infinitely large, the probability that a specific event will occur is practically zero. For this reason, continuous probability statements must be worded somewhat differently from discrete ones. Instead of finding the probability that x equals some value, we find the probability of x falling in a small interval.
Thus the probability distribution of a continuous variate x is defined by a function f(x) such that the probability of the variate x falling in the small interval x − ½dx to x + ½dx is f(x)dx. Symbolically it can be expressed as P(x − ½dx ≤ x ≤ x + ½dx) = f(x)dx. Then f(x) is called the probability density function.
(i) Is the function f(x), defined as follows, a density function?
f(x) = e^(−x), x ≥ 0
     = 0,      x < 0.
(ii) If so, determine the probability that the variate having this density will fall in the interval (1, 2).
(iii) Also find the cumulative probability function F(2).
Solution. (i) f(x) ≥ 0 for every x in (−∞, ∞), and
∫_{−∞}^{∞} f(x) dx = ∫_{−∞}^{0} 0 dx + ∫_{0}^{∞} e^(−x) dx = [−e^(−x)]_{0}^{∞} = 0 + 1 = 1.
HIGHER ENGINEERING MATHEMATICS
Hence the function f (x) satisfies the requirements for a density function .
Fig. 26.3
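The closed forms for this example are P(1 ≤ x ≤ 2) = e^(−1) − e^(−2) ≈ 0.2325 and F(2) = 1 − e^(−2) ≈ 0.8647; a crude numerical integration (an illustrative sketch, not part of the text) confirms them:

```python
# Check the density f(x) = exp(-x), x >= 0: total mass, P(1 < x < 2),
# and the cumulative probability F(2), via a midpoint-rule integral.
import math

f = lambda x: math.exp(-x) if x >= 0 else 0.0

def integrate(a, b, n=100_000):
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(0, 50)   # ≈ 1, so f is a density
p_12  = integrate(1, 2)    # ≈ e^-1 − e^-2 ≈ 0.2325
F2    = integrate(0, 2)    # ≈ 1 − e^-2 ≈ 0.8647

print(round(total, 4), round(p_12, 4), round(F2, 4))
```

The upper limit 50 stands in for ∞; the truncated tail e^(−50) is negligible.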
(1) EXPECTATION
The mean value (µ) of the probability distribution of a variate X is commonly known as its expectation and is denoted by E(X). If f(x) is the probability density function of the variate X, then
E(X) = Σ x_i f(x_i)                     (discrete distribution)
or E(X) = ∫_{−∞}^{∞} x f(x) dx          (continuous distribution)
In general, the expectation of any function φ(x) is given by
E[φ(x)] = Σ φ(x_i) f(x_i)               (discrete distribution)
or E[φ(x)] = ∫_{−∞}^{∞} φ(x) f(x) dx    (continuous distribution)
(2) Variance of a distribution is given by
σ² = Σ (x_i − µ)² f(x_i)                (discrete distribution)
or σ² = ∫_{−∞}^{∞} (x − µ)² f(x) dx     (continuous distribution)
where σ is the standard deviation of the distribution.
(3) The rth moment about the mean (denoted by µ_r) is defined by
µ_r = Σ (x_i − µ)^r f(x_i)              (discrete distribution)
or µ_r = ∫_{−∞}^{∞} (x − µ)^r f(x) dx   (continuous distribution)
(4) Mean deviation from the mean is given by
Σ |x_i − µ| f(x_i)                      (discrete distribution)
or by ∫_{−∞}^{∞} |x − µ| f(x) dx        (continuous distribution)
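These discrete formulas can be illustrated on the dice-sum distribution tabulated earlier (an illustrative sketch): the mean is 7, the variance 35/6, and the mean deviation 35/18.

```python
# Mean, variance (2nd central moment) and mean deviation for the
# distribution of X = sum of two dice, using exact fractions.
from fractions import Fraction
from collections import Counter

counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
p = {x: Fraction(c, 36) for x, c in counts.items()}

mean = sum(x * px for x, px in p.items())                    # E(X)

def central_moment(r):
    return sum((x - mean) ** r * px for x, px in p.items())  # mu_r

variance = central_moment(2)                                 # sigma^2
mean_dev = sum(abs(x - mean) * px for x, px in p.items())

print(mean, variance, mean_dev)   # 7 35/6 35/18
```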
MOMENT GENERATING FUNCTION
(1) The moment generating function (m.g.f.) of the discrete probability distribution of the variate X about the value x = a is defined as the expected value of e^(t(X − a)) and is denoted by M_a(t). Thus
M_a(t) = Σ p_i e^(t(x_i − a))    ...(1)
which is a function of the parameter t only.
Expanding the exponential in (1), we get
M_a(t) = Σ p_i + t Σ p_i (x_i − a) + t²/2! Σ p_i (x_i − a)² + ... + t^r/r! Σ p_i (x_i − a)^r + ...
       = 1 + t µ1′ + t²/2! µ2′ + ... + t^r/r! µr′ + ...    ...(2)
where µr′ is the rth moment about the point x = a. Since µr′ is the coefficient of t^r/r! in M_a(t), we also have
µr′ = [d^r M_a(t)/dt^r]_{t = 0}    ...(3)
Thus the moment about any point x = a can be found from (2) or more conveniently from the formula (3).
Rewriting (1) as M_a(t) = e^(−at) Σ p_i e^(t x_i), we get
M_a(t) = e^(−at) M_0(t)    ...(4)
Thus the m.g.f. about the point a = e^(−at) (m.g.f. about the origin).
PROBABILITY AND DISTRIBUTIONS
(2) If f(x) is the density function of a continuous variate X, then the moment generating function of this continuous probability distribution about x = a is given by
M_a(t) = ∫_{−∞}^{∞} e^(t(x − a)) f(x) dx
Example 26.36. Find the moment generating function of the exponential distribution
f(x) = (1/c) e^(−x/c), 0 ≤ x < ∞, c > 0. Hence find its mean and S.D. (Kurukshetra, 2009)
Solution. The moment generating function about the origin is
M_0(t) = ∫_{0}^{∞} e^(tx) (1/c) e^(−x/c) dx = (1/c) ∫_{0}^{∞} e^(−x(1 − ct)/c) dx = (1 − ct)^(−1), ct < 1.
∴ µ1′ = [d M_0(t)/dt]_{t = 0} = [c(1 − ct)^(−2)]_{t = 0} = c,
µ2′ = [d² M_0(t)/dt²]_{t = 0} = [2c²(1 − ct)^(−3)]_{t = 0} = 2c², and µ2 = µ2′ − (µ1′)² = 2c² − c² = c².
Hence the mean = c and the S.D. = c.
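As a numerical cross-check (an illustrative sketch, taking c = 2), the first two raw moments of this density give mean ≈ 2 and S.D. ≈ 2, in agreement with mean = S.D. = c:

```python
# Numerically check mean = c and S.D. = c for f(x) = (1/c) exp(-x/c),
# taking c = 2 as an illustrative value.
import math

c = 2.0
f = lambda x: math.exp(-x / c) / c

def moment(r, b=80.0, n=200_000):
    # midpoint-rule integral of x^r f(x) over [0, b]; the tail
    # beyond b = 80 is negligible for c = 2
    h = b / n
    return sum(((i + 0.5) * h) ** r * f((i + 0.5) * h) for i in range(n)) * h

mean = moment(1)                 # mu_1' = c
var = moment(2) - mean ** 2      # mu_2 = 2c^2 - c^2 = c^2
print(round(mean, 3), round(math.sqrt(var), 3))   # 2.0 2.0
```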
[d²P(t)/dt²]_{t = 1} = Σ n(n − 1) p_n = µ2′ − µ1′ = µ2 + µ1′² − µ1′, and so on.
Obs. The p.g.f. of the sum of two independent random variables is the product of their p.g.f.'s.
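This Obs. can be illustrated with two dice (an illustrative sketch, not from the text): the p.g.f. of one die is P(t) = (t + t² + ... + t⁶)/6, and multiplying two such polynomials yields exactly the dice-sum probabilities tabulated earlier:

```python
# The p.g.f. of a sum of independent variates is the product of their
# p.g.f.s: multiply two single-die generating polynomials and read off
# the coefficients of t^k as the dice-sum probabilities.
from fractions import Fraction

die = {k: Fraction(1, 6) for k in range(1, 7)}   # coefficient of t^k

def pgf_product(p, q):
    out = {}
    for i, pi in p.items():
        for j, qj in q.items():
            out[i + j] = out.get(i + j, Fraction(0)) + pi * qj
    return out

two_dice = pgf_product(die, die)
print(two_dice[7], two_dice[2])   # 1/6 1/36
```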
REPEATED TRIALS
It is concerned with trials of a repetitive nature in which only the occurrence or non-occurrence, success or failure, acceptance or rejection, yes or no of a particular event is of interest.
If we perform a series of independent trials such that for each trial p is the probability of a success and q that of a failure, then the probability of r successes in a series of n trials is given by nCr p^r q^(n−r), where r takes any integral value from 0 to n. The probabilities of 0, 1, 2, ..., r, ..., n successes are, therefore, given by
q^n, nC1 p q^(n−1), nC2 p² q^(n−2), ..., nCr p^r q^(n−r), ..., p^n.
The probability distribution of the number of successes so obtained is called the binomial distribution, for the simple reason that the probabilities are the successive terms in the expansion of the binomial (q + p)^n.
∴ the sum of the probabilities
= q^n + nC1 p q^(n−1) + nC2 p² q^(n−2) + ... + p^n = (q + p)^n = 1.
(2) Constants of the binomial distribution. The moment generating function about the origin is
M_0(t) = E(e^(tX)) = Σ nCx p^x q^(n−x) e^(tx)    [By (1) § 26.11]
       = Σ nCx (pe^t)^x q^(n−x) = (q + pe^t)^n
Differentiating with respect to t, putting t = 0 and using (3) § 26.11, we get the mean
µ'1 = np .
Since M_a(t) = e^(−at) M_0(t), the m.g.f. of the binomial distribution about its mean (m) = np is given by
M_m(t) = e^(−npt) (q + pe^t)^n = (qe^(−pt) + pe^(qt))^n
       = [1 + pq t²/2! + pq(q² − p²) t³/3! + pq(q³ + p³) t⁴/4! + ...]^n
or 1 + µ1 t + µ2 t²/2! + µ3 t³/3! + µ4 t⁴/4! + ...
       = 1 + npq t²/2! + npq(q − p) t³/3! + npq[1 + 3(n − 2)pq] t⁴/4! + ...
Equating the coefficients of like powers of t on either side, we have
µ2 = npq, µ3 = npq(q − p), µ4 = npq[1 + 3(n − 2)pq].
Also β1 = µ3²/µ2³ = [npq(q − p)]²/(npq)³ = (q − p)²/npq = (1 − 2p)²/npq
and β2 = µ4/µ2² = npq[1 + 3(n − 2)pq]/(npq)² = 3 + (1 − 6pq)/npq.
Thus mean = np, standard deviation = √(npq),
skewness = √β1 = (1 − 2p)/√(npq), kurtosis = β2.
Obs. The skewness is positive for p < ½ and negative for p > ½. When p = ½, the skewness is zero, i.e., the probability curve of the binomial distribution will be symmetrical (bell-shaped).
As n, the number of trials, increases indefinitely, β1 → 0 and β2 → 3.
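These constants can be checked directly against the binomial probabilities (an illustrative sketch with n = 10, p = 0.3):

```python
# Verify mean = np, mu2 = npq and mu3 = npq(q - p) for a binomial
# distribution, using n = 10, p = 0.3 as illustrative values.
from math import comb

n, p = 10, 0.3
q = 1 - p
pmf = [comb(n, r) * p**r * q**(n - r) for r in range(n + 1)]

mean = sum(r * pr for r, pr in enumerate(pmf))
mu2 = sum((r - mean) ** 2 * pr for r, pr in enumerate(pmf))
mu3 = sum((r - mean) ** 3 * pr for r, pr in enumerate(pmf))

print(round(mean, 6), round(mu2, 6), round(mu3, 6))
# mean = np = 3.0, mu2 = npq = 2.1, mu3 = npq(q - p) = 0.84
```

Since p < ½ here, µ3 > 0 and the distribution is positively skewed, as the Obs. states.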
(3) Binomial frequency distribution. If n independent trials constitute one experiment and this experiment be repeated N times, then the frequency of r successes is N · nCr p^r q^(n−r). The possible number of successes together with these expected frequencies constitute the binomial frequency distribution.
(4) Applications of Binomial distribution. This distribution is applied to problems concerning:
(i) Number of defectives in a sample from a production line,
(ii) Estimation of reliability of systems,
(iii) Number of rounds fired from a gun hitting a target,
(iv) Radar detection.
As the number of trials increases, the binomial distribution approaches the normal distribution. The mean np and the variance npq of the binomial distribution will be quite close to the mean and variance of the approximating normal distribution. Thus for n sufficiently large (≥ 30), the binomial distribution with probability of success p is approximated by the normal distribution with µ = np, σ = √(npq).
We must, however, be careful to get the correct values of z. For any success x, the real class interval is (x − 1/2, x + 1/2). Hence
z1 = (x1 − 1/2 − µ)/σ and z2 = (x2 + 1/2 − µ)/σ, where µ = np and σ = √(npq),
so that P(x1 < x < x2) = P(z1 < z < z2) = ∫_{z1}^{z2} φ(z) dz, which can be calculated by using Table III, Appendix 2.
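The quality of this approximation can be seen numerically (an illustrative sketch with n = 100, p = 0.5): the exact binomial probability P(45 ≤ x ≤ 55) and its continuity-corrected normal estimate agree to about three decimal places.

```python
# Compare the exact binomial probability P(45 <= x <= 55) for
# n = 100, p = 0.5 with the normal approximation over the
# continuity-corrected interval (44.5, 55.5).
from math import comb, erf, sqrt

n, p = 100, 0.5
q = 1 - p
mu, sigma = n * p, sqrt(n * p * q)

exact = sum(comb(n, r) * p**r * q**(n - r) for r in range(45, 56))

Phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))   # standard normal c.d.f.
z1 = (44.5 - mu) / sigma
z2 = (55.5 - mu) / sigma
approx = Phi(z2) - Phi(z1)

print(round(exact, 4), round(approx, 4))   # ≈ 0.7287 for both
```

Without the ±1/2 correction the normal estimate would noticeably undershoot the exact value, which is why the real class interval matters.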