Discrete Distribution by MK Roy
... the distribution is platykurtic if $pq > \frac{1}{6}$, mesokurtic ($\beta_2 = 3$) if $pq = \frac{1}{6}$ and leptokurtic if $pq < \frac{1}{6}$.
(iii) If $n$, the number of trials, tends to infinity, then $\beta_1$ tends to zero and $\beta_2$ tends to 3, which shows that the binomial distribution becomes the normal distribution as $n \to \infty$.
(iv) If $p$ or $q$ is very small, but $n$ is very large such that $np$ or $nq$ is a finite constant, then the binomial distribution turns into the Poisson distribution.
(v) The mean of the distribution is always greater than the variance, since $np > npq$.

On the basis of the above results, we can state a theorem.

Theorem 11.3.2. If X is a binomial variate with parameters n and p, then ...

Theorem 11.3.3. If X is a binomial variate with probability function
$$f(x; n, p) = \binom{n}{x}p^x q^{n-x}, \quad x = 0, 1, 2, \ldots, n,$$
then
$$\mu'_{r+1} = np\,\mu'_r + pq\,\frac{d\mu'_r}{dp}.$$

Proof. By definition, the rth moment about the origin is
$$\mu'_r = E[X^r] = \sum_{x=0}^{n} x^r \binom{n}{x}p^x(1-p)^{n-x}. \quad (11.3.2)$$
Differentiating (11.3.2) with respect to p, we have
$$\frac{d\mu'_r}{dp} = \sum_{x=0}^{n} x^r \binom{n}{x}\left[xp^{x-1}(1-p)^{n-x} - (n-x)p^x(1-p)^{n-x-1}\right] = \sum_{x=0}^{n} x^r \binom{n}{x}p^x(1-p)^{n-x}\,\frac{x - np}{pq} = \frac{1}{pq}\left[\mu'_{r+1} - np\,\mu'_r\right].$$
Hence $\mu'_{r+1} = np\,\mu'_r + pq\,\dfrac{d\mu'_r}{dp}$.

Remark: It is noted here that if we put r = 0, 1, 2, 3, …, we get the first, second, third, fourth, … raw moments of the distribution.

Theorem 11.3.4. If X is a binomial variate with parameters n and p, then
$$\mu_{r+1} = pq\left[\frac{d\mu_r}{dp} + nr\,\mu_{r-1}\right] \quad (11.3.3)$$
where $\mu_{r+1}$, $\mu_r$ and $\mu_{r-1}$ are the (r+1)th, rth and (r-1)th central moments of the distribution. The expression (11.3.3) is called the Romanovsky formula. We find the first four central moments with the help of this relation.

Proof. By definition,
$$\mu_r = E[(X - np)^r] = \sum_{x=0}^{n}(x - np)^r\binom{n}{x}p^x(1-p)^{n-x}.$$
Differentiating with respect to p,
$$\frac{d\mu_r}{dp} = -nr\sum_{x=0}^{n}(x - np)^{r-1}\binom{n}{x}p^x(1-p)^{n-x} + \sum_{x=0}^{n}(x - np)^r\binom{n}{x}p^x(1-p)^{n-x}\,\frac{x - np}{pq} = -nr\,\mu_{r-1} + \frac{\mu_{r+1}}{pq}.$$
Hence $\mu_{r+1} = pq\left[\dfrac{d\mu_r}{dp} + nr\,\mu_{r-1}\right]$.

Now, putting r = 1, 2, 3 successively, we have
$$\mu_2 = pq\left[\frac{d\mu_1}{dp} + n\mu_0\right] = npq, \quad \text{since } \mu_0 = 1 \text{ and } \mu_1 = 0;$$
$$\mu_3 = pq\left[\frac{d\mu_2}{dp} + 2n\mu_1\right] = pq\,\frac{d(npq)}{dp} = npq(1 - 2p) = npq(q - p);$$
$$\mu_4 = pq\left[\frac{d\mu_3}{dp} + 3n\mu_2\right] = pq\left[\frac{d}{dp}\,n(p - 3p^2 + 2p^3) + 3n^2pq\right] = pq\left[n(1 - 6p + 6p^2) + 3n^2pq\right] = npq\left[(1 - 6pq) + 3npq\right] = npq\left[1 + 3pq(n - 2)\right],$$
using $npq(q - p) = np(1-p)(1-2p) = n(p - 3p^2 + 2p^3)$ and $1 - 6p + 6p^2 = 1 - 6pq$.

Theorem 11.3.5. If X is a random variable with probability function $f(x; n, p) = \binom{n}{x}p^x q^{n-x}$, x = 0, 1, 2, …, n, and p + q = 1, then
(i) the moment generating function of X is $M_X(t) = (q + pe^t)^n$;
(ii) the characteristic function of X is $\phi_X(t) = (q + pe^{it})^n$; and
(iii) the probability generating function of X is $P[s] = (q + ps)^n$.

Proof. (i) By definition,
$$M_X(t) = E[e^{tX}] = \sum_{x=0}^{n}e^{tx}\binom{n}{x}p^x q^{n-x} = \sum_{x=0}^{n}\binom{n}{x}(pe^t)^x q^{n-x} = (q + pe^t)^n.$$
(ii) By definition,
$$\phi_X(t) = E[e^{itX}] = \sum_{x=0}^{n}e^{itx}\binom{n}{x}p^x q^{n-x} = (q + pe^{it})^n.$$
(iii) By definition,
$$P[s] = E[s^X] = \sum_{x=0}^{n}s^x\binom{n}{x}p^x q^{n-x} = (q + ps)^n.$$

Cumulant generating function and cumulants of binomial distribution
We have the moment generating function of X from theorem 11.3.5 as $M_X(t) = (pe^t + q)^n$. By definition, the cumulant generating function of X is
$$K_X(t) = \log M_X(t) = n\log(pe^t + q) = n\log\left[1 + p(e^t - 1)\right].$$
Expanding and collecting the coefficients of $t^r/r!$ gives
$$k_1 = np, \quad k_2 = npq, \quad k_3 = np(1 - 3p + 2p^2) = np(1-p)(1-2p) = npq(q - p), \quad k_4 = npq(1 - 6pq).$$
Hence $\mu_4 = k_4 + 3k_2^2 = npq(1 - 6pq) + 3n^2p^2q^2$.

Theorem 11.3.6. If X is a binomial variate with parameters n and p, then
$$k_{r+1} = pq\,\frac{dk_r}{dp}, \quad r \ge 1, \quad (11.3.7)$$
where $k_{r+1}$ and $k_r$ are the (r+1)th and rth cumulants of the distribution.

Proof. By definition, the cumulant generating function of X is
$$K_X(t) = \log M_X(t) = n\log(pe^t + q) = n\log\left[pe^t + 1 - p\right],$$
and, by definition,
$$k_r = \left[\frac{\partial^r K_X(t)}{\partial t^r}\right]_{t=0}. \quad (11.3.6)$$
Direct differentiation of $K_X(t)$ shows that
$$pq\,\frac{\partial K_X(t)}{\partial p} = \frac{npq(e^t - 1)}{pe^t + q} = \frac{\partial K_X(t)}{\partial t} - np.$$
Differentiating both sides r times with respect to t and setting t = 0 gives $pq\,\dfrac{dk_r}{dp} = k_{r+1}$ for r ≥ 1. Therefore, one can find $k_2, k_3, k_4, \ldots$ by putting r = 1, 2, 3, … in (11.3.7).

Theorem 11.3.7. If X is a binomial variate with probability function $f(x; n, p) = \binom{n}{x}p^x q^{n-x}$, x = 0, 1, 2, …, n, and p + q = 1, then the rth factorial moment of the distribution is
$$\mu'_{(r)} = n^{(r)}p^r, \quad (11.3.8)$$
where $n^{(r)} = n(n-1)(n-2)\cdots(n-r+1)$.

Proof.
$$\mu'_{(r)} = E[X^{(r)}] = \sum_{x=r}^{n}x(x-1)\cdots(x-r+1)\,\frac{n!}{x!(n-x)!}\,p^x q^{n-x} = n^{(r)}p^r\sum_{x=r}^{n}\binom{n-r}{x-r}p^{x-r}q^{n-x} = n^{(r)}p^r(q + p)^{n-r} = n^{(r)}p^r.$$

One can find $\mu'_{(1)}$, $\mu'_{(2)}$, $\mu'_{(3)}$ and $\mu'_{(4)}$ by putting r = 1, 2, 3 and 4:
$$\text{mean} = \mu'_{(1)} = np, \quad \mu'_{(2)} = n(n-1)p^2, \quad \mu'_{(3)} = n(n-1)(n-2)p^3, \quad \mu'_{(4)} = n(n-1)(n-2)(n-3)p^4.$$
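The closed forms above are easy to sanity-check by brute-force summation over the binomial probability function. Below is a minimal Python sketch (not part of the original text) using numpy and scipy; the parameter values n = 10, p = 0.3 are arbitrary.

```python
import numpy as np
from scipy.special import comb

n, p = 10, 0.3                        # arbitrary test values
q = 1 - p
x = np.arange(n + 1)
f = comb(n, x) * p**x * q**(n - x)    # binomial pmf

mean = np.sum(x * f)                  # should equal np
mu2 = np.sum((x - mean)**2 * f)       # central moments by direct summation
mu3 = np.sum((x - mean)**3 * f)
mu4 = np.sum((x - mean)**4 * f)

assert np.isclose(mean, n * p)
assert np.isclose(mu2, n * p * q)
assert np.isclose(mu3, n * p * q * (q - p))
assert np.isclose(mu4, n * p * q * (1 + 3 * p * q * (n - 2)))

# rth factorial moment E[X(X-1)...(X-r+1)] = n(n-1)...(n-r+1) * p^r
for r in (1, 2, 3, 4):
    falling_x = np.ones_like(x, dtype=float)
    falling_n = 1.0
    for j in range(r):
        falling_x *= (x - j)
        falling_n *= (n - j)
    assert np.isclose(np.sum(falling_x * f), falling_n * p**r)

print("all binomial moment identities verified")
```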
Mean deviation about mean of Binomial distribution

Theorem 11.3.8. If X is a binomial variate with probability function $f(x; n, p) = \binom{n}{x}p^x q^{n-x}$, x = 0, 1, 2, …, n, then the mean deviation $\eta$ about the mean np of the binomial distribution is
$$\eta = 2npq\binom{n-1}{\mu - 1}p^{\mu - 1}q^{n - \mu},$$
where $\mu$ is the greatest integer contained in np + 1.

Proof. By definition, the mean deviation of the distribution about the mean np is
$$\eta = E[|X - np|] = \sum_{x=0}^{n}|x - np|\,f(x; n, p) = \sum_{x=0}^{n}|x - np|\binom{n}{x}p^x q^{n-x}.$$
Since $\sum_x (x - np)\binom{n}{x}p^x q^{n-x} = 0$, the negative and positive parts of the sum balance, so
$$\eta = 2\sum_{x=\mu}^{n}(x - np)\binom{n}{x}p^x q^{n-x},$$
where $\mu$ is the greatest integer contained in np + 1. Now put $t_x = xq\binom{n}{x}p^x q^{n-x}$. Using $(x+1)\binom{n}{x+1} = (n-x)\binom{n}{x}$, we find $t_{x+1} = (n-x)p\binom{n}{x}p^x q^{n-x}$, so that
$$(x - np)\binom{n}{x}p^x q^{n-x} = \left[xq - (n-x)p\right]\binom{n}{x}p^x q^{n-x} = t_x - t_{x+1}.$$
Summing over x from $\mu$ to n, the series telescopes and $t_{n+1} = 0$, leaving
$$\eta = 2t_\mu = 2\mu q\binom{n}{\mu}p^\mu q^{n-\mu} = 2npq\binom{n-1}{\mu - 1}p^{\mu - 1}q^{n - \mu},$$
using $\mu\binom{n}{\mu} = n\binom{n-1}{\mu - 1}$.

Mode of Binomial Distribution
Suppose X is a binomial variate with parameters n and p. Here we have
$$\frac{f(x; n, p)}{f(x-1; n, p)} = \frac{(n - x + 1)p}{xq} = 1 + \frac{(n+1)p - x}{xq},$$
so the probabilities increase as long as $x < (n+1)p$ and decrease once $x > (n+1)p$. Hence the mode is the integer x satisfying $(n+1)p - 1 \le x \le (n+1)p$: if (n+1)p is not an integer, the mode is the integral part of (n+1)p; if (n+1)p is an integer, the distribution has two modes, (n+1)p - 1 and (n+1)p.

Limiting cases of the binomial distribution
(ii) When the number of trials n tends to infinity, the probability of success p tends to 0 and np = λ is a finite constant, the binomial distribution tends to the Poisson distribution.
(iii) When p and q are not so small and n tends to infinity, the binomial distribution tends to the normal distribution.
(iv) The binomial distribution is a limiting case of the hypergeometric distribution. That is, when the population size N → ∞, the hypergeometric distribution tends to the binomial distribution.

Theorem 11.3.13. Suppose X₁ is a binomial variate with parameters n₁ and p, and X₂ is another binomial variate with parameters n₂ and p, independent of X₁. Then the conditional distribution of X₁ for given X = X₁ + X₂ is a hypergeometric distribution.

Proof. We know that X = X₁ + X₂ is a binomial variate with parameters n₁ + n₂ and p. Hence, for X = k,
$$P[X_1 = x \mid X = k] = \frac{P[X_1 = x]\,P[X_2 = k - x]}{P[X = k]} = \frac{\binom{n_1}{x}p^x q^{n_1 - x}\binom{n_2}{k - x}p^{k - x}q^{n_2 - k + x}}{\binom{n_1 + n_2}{k}p^k q^{n_1 + n_2 - k}} = \frac{\binom{n_1}{x}\binom{n_2}{k - x}}{\binom{n_1 + n_2}{k}}.$$
This is the probability function of the hypergeometric distribution.

Example 11.3.6. Over a long period of time it has been observed that a given rifleman can hit a target on a single trial with probability equal to 0.8. Suppose he fires four shots at the target. (i) What is the probability he will hit the target exactly two times? (ii) What is the probability that he will hit the target at least once?

Solution. Here a trial is a single shot at the target, and we can define a success as a hit and a failure as a miss, so that n = 4 and p = 0.8. If we assume that the rifleman's chance of hitting the target does not change from shot to shot, then the number X of times he hits the target is a binomial variable with n = 4 and p = 0.8.
(i) $P[X = 2] = \binom{4}{2}(0.8)^2(0.2)^2 = 6 \times 0.64 \times 0.04 = 0.1536.$ The probability is 0.1536 that he will hit the target exactly two times.
(ii) $P[X \ge 1] = 1 - P[X = 0] = 1 - \binom{4}{0}(0.8)^0(0.2)^4 = 1 - 0.0016 = 0.9984.$

Example 11.3.7. The probability that an MBA graduate from a public university gets an executive job is 0.6. Four MBA graduates applied for executive jobs. What is the probability that (i) exactly 3 will get the job? (ii) at least two will get the job? (iii) all will get the job? (iv) none will get the job?

Solution. We define "an MBA graduate gets a job" as success and "does not get a job" as failure. Let X be the number of MBA graduates who get a job. Then X is a binomial variate with p = 0.6, q = 0.4 and n = 4; that is, X ~ B(4, 0.6). The probability function of X is
$$f(x; 4, 0.6) = \binom{4}{x}(0.6)^x(0.4)^{4-x}, \quad x = 0, 1, 2, 3, 4.$$
(i) $P[X = 3] = \binom{4}{3}(0.6)^3(0.4) = 4 \times 0.216 \times 0.4 = 0.3456.$
(ii) $P[X \ge 2] = P[X = 2] + P[X = 3] + P[X = 4] = \binom{4}{2}(0.6)^2(0.4)^2 + 0.3456 + (0.6)^4 = 6 \times 0.36 \times 0.16 + 0.3456 + 0.1296 = 0.3456 + 0.3456 + 0.1296 = 0.8208.$
(iii) $P[X = 4] = (0.6)^4 = 0.1296.$
(iv) $P[X = 0] = (0.4)^4 = 0.0256.$
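Calculations like those in Examples 11.3.6 and 11.3.7 can be reproduced with any scientific library; here is a minimal Python sketch using scipy.stats.binom.

```python
from scipy.stats import binom

# Example 11.3.6: n = 4 shots, per-shot hit probability p = 0.8
print(binom.pmf(2, 4, 0.8))        # P[X = 2]  -> 0.1536
print(1 - binom.pmf(0, 4, 0.8))    # P[X >= 1] -> 0.9984

# Example 11.3.7: n = 4 graduates, p = 0.6
print(binom.pmf(3, 4, 0.6))        # P[X = 3]  -> 0.3456
print(binom.sf(1, 4, 0.6))         # P[X >= 2] = 1 - P[X <= 1] -> 0.8208
print(binom.pmf(4, 4, 0.6))        # P[X = 4]  -> 0.1296
print(binom.pmf(0, 4, 0.6))        # P[X = 0]  -> 0.0256
```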
Example 11.3.8. The probability that a patient recovers from a rare blood disease is 0.4. If 5 people are known to have contracted this disease, what is the probability that (i) exactly 3 survive, (ii) at least two survive, (iii) at most two survive and (iv) none survive?

Solution. Here we define "survives" as success and "does not survive" as failure. Let X be the number of people who survive. Then X is a binomial variable with probability of success p = 0.4, probability of failure q = 1 - p = 0.6 and n = 5; that is, X ~ B(5, 0.4). The probability function of the binomial distribution is
$$P[X = x] = \binom{5}{x}(0.4)^x(0.6)^{5-x}, \quad x = 0, 1, \ldots, 5.$$
(i) $P[X = 3] = \binom{5}{3}(0.4)^3(0.6)^2 = 10 \times 0.064 \times 0.36 = 0.2304.$
(ii) $P[X \ge 2] = 1 - P[X = 0] - P[X = 1] = 1 - (0.6)^5 - \binom{5}{1}(0.4)(0.6)^4 = 1 - 0.07776 - 5 \times 0.4 \times 0.1296 = 1 - 0.07776 - 0.2592 = 0.66304.$
(iii) $P[X \le 2] = P[X = 2] + P[X = 1] + P[X = 0] = \binom{5}{2}(0.4)^2(0.6)^3 + 0.2592 + 0.07776 = 10 \times 0.16 \times 0.216 + 0.2592 + 0.07776 = 0.3456 + 0.2592 + 0.07776 = 0.68256.$
(iv) $P[X = 0] = (0.6)^5 = 0.07776.$

Example 11.3.9. In a locality 30% of the people are literate. 10 persons are selected at random from that area. What is the probability that (i) at most 3 are literate, (ii) at least 3 are literate, (iii) exactly 3 are literate?

Solution. We define a literate person as success and an illiterate person as failure. Let X be the number of literate persons. Then X is a binomial variate with probability of success p = 30% = 0.3, probability of failure q = 1 - p = 0.7 and n = 10. The probability function of X is
$$f(x; 10, 0.3) = \binom{10}{x}(0.3)^x(0.7)^{10-x}, \quad x = 0, 1, 2, \ldots, 10.$$
(i) $P[X \le 3] = P[X=0] + P[X=1] + P[X=2] + P[X=3] = (0.7)^{10} + \binom{10}{1}(0.3)(0.7)^9 + \binom{10}{2}(0.3)^2(0.7)^8 + \binom{10}{3}(0.3)^3(0.7)^7 = 0.0282 + 0.1211 + 0.2335 + 0.2668 = 0.6496.$
(ii) $P[X \ge 3] = 1 - P[X=0] - P[X=1] - P[X=2] = 1 - (0.0282 + 0.1211 + 0.2335) = 1 - 0.3828 = 0.6172.$
(iii) $P[X = 3] = \binom{10}{3}(0.3)^3(0.7)^7 = 120 \times 0.027 \times 0.0824 = 0.2668.$

Application. The binomial distribution is often used as an approximation to the hypergeometric distribution. The binomial distribution is also used to derive the Poisson and normal distributions: the limiting form of the standardized binomial as n → ∞, when p and q are not so small, is normal, and the limiting form of the binomial when n → ∞ and p → 0 with np = λ is the Poisson distribution. The binomial distribution appears in many statistical models; in fact, it appears whenever assumptions of independent trials with stable probabilities are introduced.

11.4. Poisson distribution
The Poisson distribution was discovered by the French mathematician and physicist Siméon Denis Poisson (1781-1840), who published it in 1837. Poisson was an eminent French mathematician and physicist, an academic administrator of some note and, according to an 1826 letter from the mathematician Abel to a friend, a man who knew "how to behave with a great deal of dignity." One of Poisson's many interests was the application of probability to the law, and in 1837 he wrote "Recherches sur la probabilité des jugements." This text contained a good deal of mathematics, including a limit theorem for the binomial distribution. Although credit for this theorem is given to Poisson, there is some evidence that De Moivre may have discovered it almost a century earlier. Although initially viewed as little more than a welcome approximation for hard-to-compute binomial probabilities, this particular result was destined for bigger things. It was the analytical seed out of which grew what is now one of the most important of all probability models, the Poisson distribution.
The Poisson distribution is a limiting case of the binomial distribution under the following conditions:
(i) the probability of success (or failure) in a Bernoulli trial is very small, that is, p → 0 or q → 0;
(ii) n, the number of trials, is very large; and
(iii) np = λ (say) is a finite constant.

Definition 11.4.1. A discrete random variable X is said to have a Poisson distribution if its probability function is
$$f(x; \lambda) = \begin{cases} \dfrac{e^{-\lambda}\lambda^x}{x!}, & x = 0, 1, 2, \ldots \\ 0, & \text{otherwise} \end{cases} \quad (11.4.1)$$
where e = 2.71828… and λ is the parameter of the distribution, which is the mean number of successes.

It can easily be shown that
(i) $f(x; \lambda) \ge 0$ for all x, and
(ii) $\sum_{x=0}^{\infty} f(x; \lambda) = e^{-\lambda}\sum_{x=0}^{\infty}\frac{\lambda^x}{x!} = e^{-\lambda}e^{\lambda} = 1.$

11.4.1. Derivation of Poisson distribution from binomial distribution
The Poisson distribution can be derived from the binomial distribution under the following conditions:
(i) p, the probability of success in a Bernoulli trial, is very small, that is, p → 0;
(ii) n, the number of trials, is very large, that is, n → ∞; and
(iii) np = λ is a finite constant.
Under these conditions,
$$\lim_{n \to \infty}\binom{n}{x}p^x(1-p)^{n-x} = \frac{e^{-\lambda}\lambda^x}{x!}, \quad x = 0, 1, 2, \ldots$$
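The convergence is easy to see numerically: hold np = λ fixed, let n grow, and compare the binomial probabilities with the Poisson limit. A minimal Python sketch (the parameter values are arbitrary):

```python
from scipy.stats import binom, poisson

lam = 2.0                          # fixed lambda = n * p
for n in (10, 100, 1000, 10000):
    p = lam / n
    # maximum absolute difference between the binomial pmf
    # and its Poisson limit over x = 0..20
    err = max(abs(binom.pmf(x, n, p) - poisson.pmf(x, lam))
              for x in range(21))
    print(n, err)                  # the error shrinks as n grows
```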
The Poisson distribution also arises directly as a probability model for the number of events in a time interval. Let $P_x(t)$ denote the probability of exactly x events in an interval of length t, and suppose events occur singly and independently at a constant rate λ, so that the probability of one event in a small interval dt is λ dt and of more than one event is $O(dt)^2$. Then, for x ≥ 1,
$$P_x(t + dt) = P_x(t)(1 - \lambda\,dt) + P_{x-1}(t)\,\lambda\,dt + O(dt)^2, \quad (11.4.2)$$
so that
$$\frac{P_x(t + dt) - P_x(t)}{dt} = -\lambda P_x(t) + \lambda P_{x-1}(t) + O(dt).$$
Proceeding to the limit as dt → 0, we get
$$\lim_{dt \to 0}\frac{P_x(t + dt) - P_x(t)}{dt} = -\lambda P_x(t) + \lambda P_{x-1}(t).$$
Therefore,
$$P'_x(t) = -\lambda P_x(t) + \lambda P_{x-1}(t), \quad x \ge 1, \quad (11.4.3)$$
where $P'_x(t)$ denotes the differentiation with respect to t.
For x = 0, $P_{x-1}(t) = P_{-1}(t) = P\{(-1) \text{ calls in time } t\} = 0.$
Hence from (11.4.2) we get
$$P_0(t + dt) = P_0(t)(1 - \lambda\,dt) + O(dt)^2,$$
which on taking the limit dt → 0 gives
$$P'_0(t) = -\lambda P_0(t) \;\Rightarrow\; \frac{P'_0(t)}{P_0(t)} = -\lambda.$$
Integrating with respect to t, we get
$$\log P_0(t) = -\lambda t + C,$$
where C is an arbitrary constant to be determined from the condition $P_0(0) = 1$.
Hence $\log 1 = C \Rightarrow C = 0.$
Therefore, $\log P_0(t) = -\lambda t \Rightarrow P_0(t) = e^{-\lambda t}.$
Substituting this value of $P_0(t)$ in (11.4.3) with x = 1, we get
$$P'_1(t) = -\lambda P_1(t) + \lambda e^{-\lambda t}, \quad \text{i.e.} \quad P'_1(t) + \lambda P_1(t) = \lambda e^{-\lambda t}.$$
This is an ordinary linear differential equation whose integrating factor is $e^{\lambda t}$. Hence its solution is
$$e^{\lambda t}P_1(t) = \lambda\int e^{\lambda t}e^{-\lambda t}\,dt + C_1 = \lambda t + C_1,$$
where $C_1$ is an arbitrary constant to be determined from the condition $P_1(0) = 0$, which gives $C_1 = 0$. Hence
$$P_1(t) = e^{-\lambda t}\lambda t.$$
On substituting this in (11.4.3) with x = 2, we get
$$P'_2(t) + \lambda P_2(t) = \lambda e^{-\lambda t}\,\lambda t.$$
The integrating factor of this equation is $e^{\lambda t}$ and its solution is
$$P_2(t)e^{\lambda t} = \int \lambda^2 t\,dt + C_2 = \frac{(\lambda t)^2}{2} + C_2,$$
where $C_2$ is determined from $P_2(0) = 0$, which gives $C_2 = 0$. Hence
$$P_2(t) = \frac{e^{-\lambda t}(\lambda t)^2}{2!}.$$
Proceeding similarly step by step, we shall get
$$P_x(t) = \frac{e^{-\lambda t}(\lambda t)^x}{x!}, \quad x = 0, 1, 2, \ldots$$
This is the probability function of a Poisson distribution with parameter λt.
Poisson probability functions are plotted below for several values of the parameter.
Fig. 11.4.1. Poisson probability functions.
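A figure like Fig. 11.4.1 can be regenerated in a few lines of Python; the parameter values used here (λ = 1, 2.5, 4.8) are only a guess at the original panels and are otherwise arbitrary.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import poisson

x = np.arange(0, 15)
for lam in (1.0, 2.5, 4.8):        # illustrative parameter values
    plt.plot(x, poisson.pmf(x, lam), marker="o",
             label=f"$\\lambda$ = {lam}")
plt.xlabel("x")
plt.ylabel("P[X = x]")
plt.title("Poisson probability functions")
plt.legend()
plt.show()
```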
Theorem 11.4.1. If X is a Poisson variate with parameter λ, then
mean $= \mu'_1 = \lambda$ and variance $= \mu_2 = \lambda$.

Proof. The probability function of a Poisson variate with parameter λ is
$$f(x; \lambda) = \frac{e^{-\lambda}\lambda^x}{x!}, \quad x = 0, 1, 2, \ldots$$
By definition,
$$\text{mean} = E[X] = \sum_{x=0}^{\infty}x\,\frac{e^{-\lambda}\lambda^x}{x!} = \lambda e^{-\lambda}\sum_{x=1}^{\infty}\frac{\lambda^{x-1}}{(x-1)!} = \lambda e^{-\lambda}\sum_{y=0}^{\infty}\frac{\lambda^y}{y!} = \lambda e^{-\lambda}e^{\lambda} = \lambda, \quad \text{where } y = x - 1.$$
Again, $\text{var}[X] = E[X^2] - (E[X])^2$. Now,
$$E[X^2] = E[X(X-1) + X] = \lambda^2 e^{-\lambda}\sum_{x=2}^{\infty}\frac{\lambda^{x-2}}{(x-2)!} + \lambda = \lambda^2 e^{-\lambda}\sum_{y=0}^{\infty}\frac{\lambda^y}{y!} + \lambda = \lambda^2 + \lambda, \quad \text{where } y = x - 2.$$
Thus, $\text{var}[X] = \lambda^2 + \lambda - \lambda^2 = \lambda.$
Hence the mean and variance of a Poisson distribution are equal.
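A quick numerical check of this equality, using scipy.stats.poisson (the value λ = 3.7 is arbitrary):

```python
from scipy.stats import poisson

lam = 3.7                                 # arbitrary parameter
mean, var = poisson.stats(lam, moments="mv")
print(mean, var)                          # both equal 3.7
```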
Theorem 11.4.2. If X is a Poisson variate with parameter λ, then
$$\beta_1 = \frac{1}{\lambda} \quad \text{and} \quad \beta_2 = 3 + \frac{1}{\lambda},$$
where $\beta_1$ and $\beta_2$ are the measures of skewness and kurtosis.

Proof. By definition,
$$\beta_1 = \frac{\mu_3^2}{\mu_2^3} \quad \text{and} \quad \beta_2 = \frac{\mu_4}{\mu_2^2},$$
where $\mu_2$, $\mu_3$ and $\mu_4$ are the second, third and fourth central moments of the distribution.
We have, from theorem 11.4.1, $\mu'_1 = \lambda$, $\mu'_2 = \lambda + \lambda^2$ and $\mu_2 = \lambda$. Now,
$$\mu'_3 = E[X^3] = E[X(X-1)(X-2) + 3X(X-1) + X] = \lambda^3 e^{-\lambda}\sum_{x=3}^{\infty}\frac{\lambda^{x-3}}{(x-3)!} + 3\lambda^2 + \lambda = \lambda^3 + 3\lambda^2 + \lambda,$$
and
$$\mu'_4 = E[X^4] = E[X(X-1)(X-2)(X-3) + 6X(X-1)(X-2) + 7X(X-1) + X] = \lambda^4 + 6\lambda^3 + 7\lambda^2 + \lambda.$$
Hence
$$\mu_3 = \mu'_3 - 3\mu'_2\mu'_1 + 2(\mu'_1)^3 = \lambda^3 + 3\lambda^2 + \lambda - 3\lambda(\lambda + \lambda^2) + 2\lambda^3 = \lambda,$$
and
$$\mu_4 = \mu'_4 - 4\mu'_3\mu'_1 + 6\mu'_2(\mu'_1)^2 - 3(\mu'_1)^4 = \lambda^4 + 6\lambda^3 + 7\lambda^2 + \lambda - 4\lambda(\lambda^3 + 3\lambda^2 + \lambda) + 6\lambda^2(\lambda + \lambda^2) - 3\lambda^4 = 3\lambda^2 + \lambda.$$
Therefore,
$$\beta_1 = \frac{\mu_3^2}{\mu_2^3} = \frac{\lambda^2}{\lambda^3} = \frac{1}{\lambda} \quad \text{and} \quad \beta_2 = \frac{\mu_4}{\mu_2^2} = \frac{3\lambda^2 + \lambda}{\lambda^2} = 3 + \frac{1}{\lambda}.$$
Thus the coefficients of skewness and kurtosis are
$$\gamma_1 = \sqrt{\beta_1} = \frac{1}{\sqrt{\lambda}} \quad \text{and} \quad \gamma_2 = \beta_2 - 3 = \frac{1}{\lambda}.$$
Remarks:
(i) The Poisson distribution is always positively skewed and leptokurtic.
(ii) When λ → ∞, $\beta_1$ tends to zero and $\beta_2$ tends to 3, which shows that a Poisson distribution tends to a normal distribution as λ → ∞.
(iii) The mean, the variance and the third central moment are all equal to λ in magnitude. If the variate is measured in some unit, the variance will be in the square of that unit and $\mu_3$ in its cube, although they are equal in magnitude.
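scipy reports exactly these skewness and kurtosis values; a short check (λ = 4 is arbitrary):

```python
from scipy.stats import poisson

lam = 4.0
mean, var, skew, kurt = poisson.stats(lam, moments="mvsk")
print(skew**2, 1 / lam)      # beta_1 = gamma_1^2 = 1/lambda
print(kurt, 1 / lam)         # gamma_2 = beta_2 - 3 = 1/lambda
```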
11.4.2. Recurrence Relation for the moments of the Poisson Distribution
Theorem 11.4.3. If X is a Poisson variate with parameter λ, then
$$\mu_{r+1} = \lambda\left[r\mu_{r-1} + \frac{d\mu_r}{d\lambda}\right], \quad (11.4.5)$$
where $\mu_{r+1}$, $\mu_r$ and $\mu_{r-1}$ are the (r+1)th, rth and (r-1)th central moments of the distribution.

Proof. By definition, the rth central moment of the distribution is
$$\mu_r = E[(X - E(X))^r] = E[(X - \lambda)^r] = \sum_{x=0}^{\infty}(x - \lambda)^r\,\frac{e^{-\lambda}\lambda^x}{x!}. \quad (11.4.4)$$
Differentiating (11.4.4) with respect to λ, we have
$$\frac{d\mu_r}{d\lambda} = -r\sum_{x=0}^{\infty}(x - \lambda)^{r-1}\frac{e^{-\lambda}\lambda^x}{x!} + \sum_{x=0}^{\infty}(x - \lambda)^r\,\frac{e^{-\lambda}\lambda^x}{x!}\cdot\frac{x - \lambda}{\lambda} = -r\mu_{r-1} + \frac{\mu_{r+1}}{\lambda}.$$
Hence $\mu_{r+1} = \lambda\left[r\mu_{r-1} + \dfrac{d\mu_r}{d\lambda}\right].$
The relation (11.4.5) is known as the recurrence relation for the moments of a Poisson distribution.
Now, if we put r = 1, 2, 3 in (11.4.5), we get
$$\mu_2 = \lambda\left[\mu_0 + \frac{d\mu_1}{d\lambda}\right] = \lambda, \quad \text{since } \mu_0 = 1 \text{ and } \mu_1 = 0;$$
$$\mu_3 = \lambda\left[2\mu_1 + \frac{d\mu_2}{d\lambda}\right] = \lambda; \quad \text{and} \quad \mu_4 = \lambda\left[3\mu_2 + \frac{d\mu_3}{d\lambda}\right] = \lambda(3\lambda + 1) = 3\lambda^2 + \lambda.$$
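The recurrence can also be checked symbolically; a minimal sympy sketch that builds $\mu_2$, $\mu_3$, $\mu_4$ from $\mu_0 = 1$, $\mu_1 = 0$:

```python
import sympy as sp

lam = sp.symbols("lam", positive=True)
mu = [sp.Integer(1), sp.Integer(0)]      # mu_0 = 1, mu_1 = 0
for r in (1, 2, 3):
    # mu_{r+1} = lam * (r * mu_{r-1} + d mu_r / d lam)
    mu.append(sp.expand(lam * (r * mu[r - 1] + sp.diff(mu[r], lam))))
print(mu[2], mu[3], mu[4])               # lam, lam, 3*lam**2 + lam
```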
Theorem 11.4.4. Let X be a Poisson variate with probability function
$$f(x; \lambda) = \frac{e^{-\lambda}\lambda^x}{x!}, \quad x = 0, 1, 2, \ldots$$
Then $M_X(t) = e^{\lambda(e^t - 1)}$, $\phi_X(t) = e^{\lambda(e^{it} - 1)}$ and $P[s] = e^{\lambda(s - 1)}$,
where $M_X(t)$, $\phi_X(t)$ and $P[s]$ are the mgf, cf and pgf of X.

Proof. (i) By definition, the mgf of X is
$$M_X(t) = E[e^{tX}] = \sum_{x=0}^{\infty}e^{tx}\,\frac{e^{-\lambda}\lambda^x}{x!} = e^{-\lambda}\sum_{x=0}^{\infty}\frac{(\lambda e^t)^x}{x!} = e^{-\lambda}e^{\lambda e^t} = e^{\lambda(e^t - 1)}.$$
(ii) The characteristic function of X is
$$\phi_X(t) = E[e^{itX}] = \sum_{x=0}^{\infty}e^{itx}\,\frac{e^{-\lambda}\lambda^x}{x!} = e^{\lambda(e^{it} - 1)}.$$
(iii) The probability generating function of X is
$$P[s] = E[s^X] = \sum_{x=0}^{\infty}s^x\,\frac{e^{-\lambda}\lambda^x}{x!} = e^{-\lambda}e^{\lambda s} = e^{\lambda(s - 1)}.$$

We can find the different moments of the distribution from the mgf, cf or pgf. We have, from theorem 11.4.4,
$$M_X(t) = e^{\lambda(e^t - 1)}, \quad \text{so that} \quad \frac{dM_X(t)}{dt} = \lambda e^t M_X(t).$$
By definition, the rth raw moment of the distribution is $\mu'_r = \left[\dfrac{d^r M_X(t)}{dt^r}\right]_{t=0}$. Hence
$$\mu'_1 = \left[\lambda e^t M_X(t)\right]_{t=0} = \lambda, \quad \text{since } [M_X(t)]_{t=0} = 1;$$
$$\mu'_2 = \left[\lambda e^t M_X(t) + \lambda e^t\frac{dM_X(t)}{dt}\right]_{t=0} = \lambda + \lambda^2;$$
$$\mu'_3 = \left[\lambda e^t M_X(t) + 2\lambda e^t\frac{dM_X(t)}{dt} + \lambda e^t\frac{d^2M_X(t)}{dt^2}\right]_{t=0} = \lambda + 2\lambda^2 + \lambda(\lambda + \lambda^2) = \lambda^3 + 3\lambda^2 + \lambda;$$
$$\mu'_4 = \left[\lambda e^t M_X(t) + 3\lambda e^t\frac{dM_X(t)}{dt} + 3\lambda e^t\frac{d^2M_X(t)}{dt^2} + \lambda e^t\frac{d^3M_X(t)}{dt^3}\right]_{t=0} = \lambda + 3\lambda^2 + 3(\lambda^2 + \lambda^3) + (\lambda^3 + 3\lambda^2 + \lambda)\lambda = \lambda^4 + 6\lambda^3 + 7\lambda^2 + \lambda.$$
Now one can find the different central moments of the distribution from the raw moments $\mu'_1$, $\mu'_2$, $\mu'_3$ and $\mu'_4$.
Theorem 11.4.5. If X is a Poisson variate with parameter λ, then
$$k_r = \lambda, \quad r = 1, 2, 3, \ldots,$$
where $k_r$ is the rth cumulant of the distribution.

Proof. We have the mgf of the distribution from theorem 11.4.4 as $M_X(t) = e^{\lambda(e^t - 1)}$.
Hence the cgf of X is
$$K_X(t) = \log M_X(t) = \lambda(e^t - 1) = \lambda\left(t + \frac{t^2}{2!} + \frac{t^3}{3!} + \cdots\right).$$
Thus $k_r$ = coefficient of $\dfrac{t^r}{r!}$ in $K_X(t)$ = λ.
Hence all cumulants of the Poisson distribution are equal to λ. That is, $k_1 = k_2 = k_3 = \cdots = k_r = \lambda$.
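A symbolic check that every cumulant (the rth derivative of the cgf at t = 0) equals λ:

```python
import sympy as sp

t, lam = sp.symbols("t lam")
K = lam * (sp.exp(t) - 1)              # cgf of the Poisson distribution
for r in range(1, 6):
    k_r = sp.diff(K, t, r).subs(t, 0)  # rth cumulant
    print(r, k_r)                      # always lam
```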
Additive property of Poisson variates
Theorem 11.4.6. The sum of k independent Poisson variates is also a Poisson variate.

Proof. Let X₁, X₂, …, X_k be k independent Poisson variates with parameters λ₁, λ₂, …, λ_k. That is,
$$f(x_j; \lambda_j) = \frac{e^{-\lambda_j}\lambda_j^{x_j}}{x_j!}, \quad x_j = 0, 1, 2, \ldots$$
Then we have to prove that $\sum_j X_j = X$ is also a Poisson variate with parameter $\sum_j \lambda_j = \lambda$.
The mgf of $X_j$ is
$$M_{X_j}(t) = E[e^{tX_j}] = e^{\lambda_j(e^t - 1)} \quad \text{for all } j.$$
Now, the mgf of $\sum_j X_j = X$ is
$$M_X(t) = E\left[e^{t(X_1 + X_2 + \cdots + X_k)}\right] = E[e^{tX_1}]\,E[e^{tX_2}]\cdots E[e^{tX_k}] = e^{\lambda_1(e^t - 1)}e^{\lambda_2(e^t - 1)}\cdots e^{\lambda_k(e^t - 1)} = e^{(\sum_j \lambda_j)(e^t - 1)},$$
which is the mgf of a Poisson distribution with parameter $\lambda = \sum_j \lambda_j$; since the mgf uniquely determines a distribution, X is a Poisson variate.
Therefore, the probability function of X is
$$f(x; \lambda) = \frac{e^{-\lambda}\lambda^x}{x!}, \quad \text{where } \lambda = \sum_{j=1}^{k}\lambda_j \text{ and } x = \sum_{j=1}^{k}x_j.$$
This completes the proof of the theorem.
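A simulation illustrating the theorem: three independent Poisson samples are added, and the sample mean and variance of the sum are compared with λ₁ + λ₂ + λ₃ (the parameter values are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(0)
lams = [1.2, 0.5, 2.3]                    # arbitrary parameters
n = 100_000
total = sum(rng.poisson(lam, size=n) for lam in lams)
print(total.mean(), total.var())          # both close to 4.0 = sum(lams)
```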
Theorem 11.4.7. If X₁ and X₂ are two independent Poisson variates with parameters λ₁ and λ₂, then X = X₁ - X₂ is not a Poisson variate.

Proof. By definition, the mgf of X = X₁ - X₂ is
$$M_X(t) = E[e^{t(X_1 - X_2)}] = E[e^{tX_1}]\,E[e^{-tX_2}] = M_{X_1}(t)\,M_{X_2}(-t) = e^{\lambda_1(e^t - 1)}\,e^{\lambda_2(e^{-t} - 1)},$$
which cannot be put in the form $e^{\lambda(e^t - 1)}$. Therefore, X = X₁ - X₂ is not a Poisson variate, although X = X₁ + X₂ is a Poisson variate. It is to be noted that X = X₁ - X₂ cannot be a Poisson variate because here X can take both positive and negative values, but a Poisson variate can take only non-negative values.
11.4.3. Recurrence Relation of the Poisson Distribution
Let X be a Poisson variate with parameter λ; then
$$f(x; \lambda) = P[x] = \frac{e^{-\lambda}\lambda^x}{x!},$$
which is the probability that X takes the value x.
The probability that X takes the value x + 1 is
$$P[x+1] = f(x+1; \lambda) = \frac{e^{-\lambda}\lambda^{x+1}}{(x+1)!}.$$
Thus
$$\frac{P[x+1]}{P[x]} = \frac{\lambda}{x+1}.$$
Hence
$$P[x+1] = \frac{\lambda}{x+1}\,P[x]. \quad (11.4.6)$$
The relation (11.4.6) is known as the recurrence relation of the Poisson distribution. This relation is very important for graduating a Poisson distribution from an observed distribution. First, we have to find the value of P[0], which is equal to $e^{-\lambda}$.
If the value of λ is not given, it is estimated from the observed data by the method of moments as
$$\hat{\lambda} = \bar{x},$$
where $\bar{x}$ is the mean of the observed distribution.
The other probabilities for the different values of X can be obtained as follows:
$$P[1] = \lambda P[0], \quad P[2] = \frac{\lambda}{2}P[1], \quad P[3] = \frac{\lambda}{3}P[2],$$
and so on.
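The recurrence gives a numerically convenient way to tabulate Poisson probabilities without evaluating any factorials; a minimal sketch:

```python
import math

def poisson_pmf_table(lam: float, x_max: int) -> list[float]:
    """Tabulate P[0..x_max] using P[x+1] = lam/(x+1) * P[x]."""
    probs = [math.exp(-lam)]              # P[0] = e^{-lam}
    for x in range(x_max):
        probs.append(probs[-1] * lam / (x + 1))
    return probs

print(poisson_pmf_table(1.197, 6))        # probabilities used in Example 11.4.3
```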
11.4.4. Fitting a Poisson Distribution to an observed distribution
The first step in fitting the theoretical distribution is to estimate the parameter λ of the distribution by the method of moments from the observed data.
The mean of the Poisson distribution is λ, while the mean of the observed distribution is $\bar{x}$. Hence the estimate of λ, if it is not known, is
$$\hat{\lambda} = \bar{x}.$$
The expected frequencies corresponding to the observed frequencies will be obtained as
$$n\,f(x; \hat{\lambda}) = n\,\frac{e^{-\hat{\lambda}}\hat{\lambda}^x}{x!}, \quad x = 0, 1, 2, \ldots \quad (11.4.7)$$
where n is the total number of observations.
Remarks: If it is asked to fit a suitable distribution to an observed distribution, one can verify whether the sample mean and variance are approximately equal or not, since the mean and variance of a Poisson distribution are equal.
Example 11.4.3. The following data give the number of printing mistakes in a book of three hundred pages. Fit a Poisson distribution to the data.

| Number of printing mistakes | 0   | 1  | 2  | 3  | 4  | 5 | 6 and above |
| Number of pages             | 130 | 72 | 40 | 35 | 15 | 6 | 2           |

Solution. In order to fit a Poisson distribution to the data, we have to estimate the parameter λ from the relation
$$\hat{\lambda} = \bar{x} = \frac{\sum fx}{\sum f} = \frac{359}{300} = 1.197.$$
The probability function of the Poisson distribution is
$$f(x; \lambda) = \frac{e^{-\lambda}\lambda^x}{x!}, \quad x = 0, 1, 2, \ldots$$
Here $\hat{\lambda} = 1.197$; then $P[0] = e^{-1.197}$. That is,
$$\log_{10}P[0] = -1.197\log_{10}e = -1.197 \times 0.43429 = -0.519845 = \bar{1}.480155.$$
Thus P[0] = 0.302099.
Now, by using the recurrence relation (11.4.6) and relation (11.4.7), we find the probabilities and expected frequencies corresponding to the different values of the variable X, which are shown in Table 11.4.1.

Table 11.4.1
| Number of printing mistakes x | Number of pages f | f(x; λ̂)  | Expected frequency 300 f(x; λ̂) |
| 0                             | 130               | 0.302099 | 90.63                           |
| 1                             | 72                | 0.361613 | 108.48                          |
| 2                             | 40                | 0.216425 | 64.93                           |
| 3                             | 35                | 0.086354 | 25.91                           |
| 4                             | 15                | 0.025841 | 7.75                            |
| 5                             | 6                 | 0.006186 | 1.86                            |
| 6                             | 2                 | 0.001234 | 0.37                            |

Since frequencies are always integers, by converting them to the nearest integer we get

| Observed frequency | 130 | 72  | 40 | 35 | 15 | 6 | 2 |
| Expected frequency | 91  | 108 | 65 | 26 | 8  | 2 | 0 |

It is to be noted here that the sum of the expected frequencies is equal to the sum of the observed frequencies.
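The whole fit can be reproduced in a few lines; a sketch using numpy and scipy, which (as in the example) treats the "6 and above" cell as x = 6 when computing the mean:

```python
import numpy as np
from scipy.stats import poisson

x = np.arange(7)                       # 0..5 and "6 and above"
f = np.array([130, 72, 40, 35, 15, 6, 2])
n = f.sum()                            # 300 pages

lam_hat = (x * f).sum() / n            # method-of-moments estimate: 359/300
expected = n * poisson.pmf(x, lam_hat)
print(lam_hat)                         # 1.1967 (the text rounds to 1.197)
print(np.round(expected, 2))           # close to 90.63, 108.48, 64.93, ...
```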
Theorem 11.4.8. If X is a Poisson variate with parameter λ, then the distribution function of X is
$$F(x) = \frac{1}{\Gamma(x+1)}\int_{\lambda}^{\infty}e^{-t}t^x\,dt, \quad x = 0, 1, 2, \ldots$$

Proof. The probability function of a Poisson variate with parameter λ is
$$f(x; \lambda) = \frac{e^{-\lambda}\lambda^x}{x!}, \quad x = 0, 1, 2, \ldots$$
Now, consider the incomplete gamma integral
$$I_x = \int_{\lambda}^{\infty}e^{-t}t^x\,dt \quad (x \text{ a positive integer}).$$
Integrating by parts,
$$I_x = \left[-t^x e^{-t}\right]_{\lambda}^{\infty} + x\int_{\lambda}^{\infty}e^{-t}t^{x-1}\,dt = \lambda^x e^{-\lambda} + x\,I_{x-1}, \quad (11.4.8)$$
which is a reduction formula for $I_x$. By repeated applications of (11.4.8),
$$\frac{I_x}{x!} = \frac{e^{-\lambda}\lambda^x}{x!} + \frac{I_{x-1}}{(x-1)!} = \frac{e^{-\lambda}\lambda^x}{x!} + \frac{e^{-\lambda}\lambda^{x-1}}{(x-1)!} + \cdots + \frac{e^{-\lambda}\lambda}{1!} + I_0.$$
But
$$I_0 = \int_{\lambda}^{\infty}e^{-t}\,dt = e^{-\lambda}.$$
Hence
$$\frac{I_x}{x!} = P[X = 0] + P[X = 1] + P[X = 2] + \cdots + P[X = x] = P[X \le x] = F(x).$$
Here F(x) is the distribution function of the random variable X. Hence
$$F(x) = \frac{1}{x!}\int_{\lambda}^{\infty}e^{-t}t^x\,dt = \frac{1}{\Gamma(x+1)}\int_{\lambda}^{\infty}e^{-t}t^x\,dt,$$
since $\Gamma(x+1) = x!$.
This result is of great practical utility. It enables us to represent the cumulative Poisson probability in terms of the incomplete gamma integral, the values of which are tabulated for different values of λ by Karl Pearson in his table of incomplete Γ-functions.
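In scipy the regularized upper incomplete gamma function $Q(a, z) = \frac{1}{\Gamma(a)}\int_z^\infty e^{-t}t^{a-1}\,dt$ is available as scipy.special.gammaincc, so the theorem reads F(x) = Q(x+1, λ); a quick check:

```python
from scipy.special import gammaincc
from scipy.stats import poisson

lam = 2.5                                # arbitrary parameter
for x in range(8):
    # F(x) = P[X <= x] equals the regularized upper incomplete gamma Q(x+1, lam)
    print(x, poisson.cdf(x, lam), gammaincc(x + 1, lam))
```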
Mode of the Poisson Distribution
The mode of the distribution is clearly that value of r for which $\dfrac{e^{-\lambda}\lambda^r}{r!}$ is greater than the term that precedes it and the term that follows it. That is,
$$\frac{e^{-\lambda}\lambda^{r-1}}{(r-1)!} \le \frac{e^{-\lambda}\lambda^r}{r!} \quad \text{and} \quad \frac{e^{-\lambda}\lambda^{r+1}}{(r+1)!} \le \frac{e^{-\lambda}\lambda^r}{r!}.$$
These give $r \le \lambda$ and $r \ge \lambda - 1$, i.e.
$$\lambda - 1 \le r \le \lambda.$$
Hence, if λ is not an integer, the mode is the integral part of λ; if λ is an integer, the distribution has two modes, λ - 1 and λ.
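A numeric check of the mode rule (the λ values are arbitrary; for integer λ both λ - 1 and λ attain the maximum):

```python
import math
from scipy.stats import poisson

for lam in (2.5, 4.0):
    xs = range(20)
    pmf = [poisson.pmf(x, lam) for x in xs]
    modes = [x for x in xs if math.isclose(pmf[x], max(pmf))]
    print(lam, modes)     # 2.5 -> [2]; 4.0 -> [3, 4]
```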