Exam P Formula Sheet
Discrete Distributions

• Discrete Uniform on {a, ..., b}
  – PMF: p(x) = 1/(b − a + 1), for x = a, ..., b
  – Mean: (a + b)/2
  – Variance: [(b − a + 1)² − 1]/12
  – MGF: (e^(at) − e^((b+1)t)) / [(1 − e^t)(b − a + 1)]

• Binomial(n, p)
  – PMF: p(x) = C(n, x) p^x (1 − p)^(n−x), for x = 0, 1, ..., n
  – Mean: np
  – Variance: np(1 − p)
  – MGF: (pe^t + q)^n, where q = 1 − p
  – PGF: (pt + q)^n

• Hypergeometric(N, m, n)
  – PMF: p(x) = C(m, x) C(N − m, n − x) / C(N, n)
  – Mean: n · (m/N)

• Geometric(p)
  – PMF (number of trials X): p(x) = p(1 − p)^(x−1), for x = 1, 2, ...
  – Mean: 1/p
  – Variance: (1 − p)/p²
  – MGF: pe^t / [1 − (1 − p)e^t]
  – PGF: pt / [1 − (1 − p)t]
  – Special properties: memoryless property; X = Y + 1, where X counts trials and Y counts failures

• Negative Binomial(r, p)
  – PMF (number of trials X): p(x) = C(x − 1, r − 1) p^r (1 − p)^(x−r), for x = r, r + 1, ...
  – Mean: r/p
  – Variance: r(1 − p)/p²
  – MGF: [pe^t / (1 − (1 − p)e^t)]^r
  – PGF: [pt / (1 − (1 − p)t)]^r
  – Special properties: Neg Bin(r = 1, p) ~ Geometric(p); X = Y + r, where X counts trials and Y counts failures

• Poisson(λ)
  – PMF: p(x) = e^(−λ) λ^x / x!
  – Mean: λ
  – Variance: λ
  – MGF: e^(λ(e^t − 1))
  – PGF: e^(λ(t − 1))
  – Special property: sum of independent Poissons ~ Poisson(λ = Σ λ_i)
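The moments listed above are easy to spot-check numerically. Below is a minimal sketch assuming Python with scipy is available (the parameter values are arbitrary illustrations); note that scipy's geom counts trials while its nbinom counts failures, so the X = Y + r shift from the table applies.

```python
# Spot-check of the tabulated means/variances with scipy.stats (assumed available).
from scipy import stats

n, p, lam, r = 10, 0.3, 2.5, 4

# Binomial: mean np, variance np(1-p)
b = stats.binom(n, p)
assert abs(b.mean() - n * p) < 1e-12
assert abs(b.var() - n * p * (1 - p)) < 1e-12

# Geometric (trials version, X = Y + 1): mean 1/p, variance (1-p)/p^2
g = stats.geom(p)
assert abs(g.mean() - 1 / p) < 1e-9
assert abs(g.var() - (1 - p) / p**2) < 1e-9

# Negative binomial: scipy's nbinom counts failures Y; trials X = Y + r
nb = stats.nbinom(r, p)
assert abs((nb.mean() + r) - r / p) < 1e-9          # E[X] = r/p
assert abs(nb.var() - r * (1 - p) / p**2) < 1e-9    # Var[X] = Var[Y] = r(1-p)/p^2

# Poisson: mean = variance = lambda
po = stats.poisson(lam)
assert abs(po.mean() - lam) < 1e-12 and abs(po.var() - lam) < 1e-12
```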
Continuous Distributions

• Uniform(a, b)
  – PDF: f(x) = 1/(b − a), for a ≤ x ≤ b
  – Mean: (a + b)/2
  – Variance: (b − a)²/12
  – MGF: (e^(bt) − e^(at)) / [t(b − a)]
  – Special properties: (X | X > c) ~ Uniform(c, b); (X − c | X > c) ~ Uniform(0, b − c)

• Exponential(θ)
  – PDF: f(x) = (1/θ) e^(−x/θ)
  – CDF: F(x) = 1 − e^(−x/θ)
  – Mean: θ
  – Variance: θ²
  – MGF: 1/(1 − θt)
  – Special property (memoryless): (X − a | X > a) ~ X

• Gamma(α, θ)
  – PDF: f(x) = x^(α−1) e^(−x/θ) / [Γ(α) θ^α]
  – CDF (integer α): F(x) = 1 − Σ_{k=0}^{α−1} Pr(Y = k), where Y ~ Poisson(λ = x/θ)
  – Mean: αθ
  – Variance: αθ²
  – MGF: [1/(1 − θt)]^α
  – Special property: sum of α independent exponentials(θ) ~ Gamma(α, θ)

• Normal(μ, σ²)
  – PDF: f(x) = 1/(σ√(2π)) · e^(−(x−μ)²/(2σ²))
  – Mean: μ
  – Variance: σ²
  – MGF: e^(μt + σ²t²/2)
  – Standardization: Z = (X − μ)/σ, Pr(Z ≤ z) = Φ(z)
  – Symmetry: Pr(Z ≤ z) = Pr(Z ≥ −z); Pr(Z ≤ −z) = Pr(Z ≥ z)
  – Special property: sum of independent normals ~ Normal(μ = Σ μ_i, σ² = Σ σ_i²)
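The conditional-uniform property above is easy to see by simulation. The sketch below (assuming numpy; a, b, c are arbitrary illustrative values) conditions Uniform(a, b) draws on X > c and compares the resulting mean and variance with those of Uniform(c, b).

```python
# Illustration (not a proof): (X | X > c) for X ~ Uniform(a, b) behaves like Uniform(c, b).
import numpy as np

rng = np.random.default_rng(0)
a, b, c = 0.0, 10.0, 4.0

x = rng.uniform(a, b, size=1_000_000)
conditional = x[x > c]                      # draws of X given X > c

print(conditional.mean(), (c + b) / 2)      # both close to 7.0
print(conditional.var(), (b - c)**2 / 12)   # both close to 3.0
```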
MULTIVARIATE PROBABILITY DISTRIBUTIONS
Var[aX + bY] = a²·Var[X] + b²·Var[Y] + 2ab·Cov[X, Y]

ρ_{X,Y} = Corr[X, Y] = Cov[X, Y] / (√Var[X] · √Var[Y])

Multinomial(n; p₁, ..., p_k)
  – PMF: n! / (x₁! ⋯ x_k!) · p₁^(x₁) ⋯ p_k^(x_k)
  – E[X_i] = n p_i
  – Var[X_i] = n p_i (1 − p_i)
  – Cov[X_i, X_j] = −n p_i p_j, for i ≠ j

For i.i.d. random variables X₁, ..., X_n:
  – Minimum: S_{X(1)}(x) = [S_X(x)]^n
  – Maximum: F_{X(n)}(x) = [F_X(x)]^n
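The variance identity for a linear combination can be checked against simulated, dependent data. A minimal sketch, assuming numpy (the coefficients and the bivariate normal used to generate dependence are arbitrary choices):

```python
# Check Var[aX + bY] = a^2 Var[X] + b^2 Var[Y] + 2ab Cov[X, Y] on simulated data.
import numpy as np

rng = np.random.default_rng(1)
a_coef, b_coef = 2.0, -3.0

# Correlated pairs (X, Y); any joint distribution works, a bivariate normal is convenient.
cov = np.array([[1.0, 0.6], [0.6, 2.0]])
xy = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=500_000)
X, Y = xy[:, 0], xy[:, 1]

lhs = np.var(a_coef * X + b_coef * Y)
rhs = (a_coef**2 * np.var(X) + b_coef**2 * np.var(Y)
       + 2 * a_coef * b_coef * np.cov(X, Y, ddof=0)[0, 1])
print(lhs, rhs)   # agree up to floating-point error
```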
INSURANCE AND RISK MANAGEMENT
• Deductible, d
  – Payment: Y = 0 if X ≤ d; Y = X − d if X > d
  – E[Y] = ∫_d^∞ (x − d) · f_X(x) dx = ∫_d^∞ S_X(x) dx
  – For exponential: E[Y] = θ · Pr(X > d)

• Policy Limit, u
  – Payment: Y = X if X < u; Y = u if X ≥ u
  – E[Y] = ∫_0^u x · f_X(x) dx + u · S_X(u) = ∫_0^u S_X(x) dx
  – For exponential: E[Y] = θ · Pr(X < u)
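For the exponential shortcuts above, the sketch below (assuming scipy; θ, d, and u are arbitrary) compares the closed forms θ·Pr(X > d) and θ·Pr(X < u) with direct numerical integration of the E[Y] integrals.

```python
# Expected payment with a deductible d and with a policy limit u, X ~ Exponential(theta).
from scipy import stats, integrate

theta, d, u = 5.0, 2.0, 8.0
X = stats.expon(scale=theta)

# Deductible: E[Y] = integral_d^inf S_X(x) dx  vs  theta * Pr(X > d)
ded_integral, _ = integrate.quad(X.sf, d, float("inf"))
print(ded_integral, theta * X.sf(d))

# Policy limit: E[Y] = integral_0^u S_X(x) dx  vs  theta * Pr(X < u)
lim_integral, _ = integrate.quad(X.sf, 0, u)
print(lim_integral, theta * X.cdf(u))
```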
(A, B) indep. pair ⟹ (A, B^c), (A^c, B), (A^c, B^c) also indep. pairs

RANDOM VARIABLES
A random variable, X, is a function from the sample space S to R.

Cumulative Distribution Function
F(x) = P(X ≤ x)
A valid CDF is nondecreasing, right-continuous, and satisfies lim_{x→−∞} F(x) = 0, lim_{x→∞} F(x) = 1.

SUMMARY STATISTICS
Expected Value
Discrete: E[X] = Σ_x x · p(x)        Continuous: E[X] = ∫_{−∞}^{∞} x · f(x) dx

Law of the Unconscious Statistician (LOTUS)
Discrete: E[g(X)] = Σ_x g(x) · p(x)  Continuous: E[g(X)] = ∫_{−∞}^{∞} g(x) · f(x) dx

Expected Value Linearity
E[aX + b] = a · E[X] + b             E[X + Y] = E[X] + E[Y]

Survival Shortcut
If X is nonnegative integer-valued, then E[X] = Σ_{k=0}^{∞} P(X > k).
If X is nonnegative continuous, then E[X] = ∫_0^∞ [1 − F(x)] dx.

Mode
A mode is an x value which maximizes the PMF/PDF. It is possible to have 0, 1, 2, ... or infinitely many modes.
Discrete: any values with the largest probability.
Continuous: check end points of the interval and where f′(x) = 0.

Percentile
c is a (100p)th percentile of X if P(X ≤ c) ≥ p and P(X ≥ c) ≥ 1 − p. A 50th percentile is called a median.
Discrete: look for the smallest c with F(c) ≥ p.
Continuous: solve for c in F(c) = p.
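The survival shortcut is easy to sanity-check numerically. A small sketch assuming scipy, with a Poisson for the integer-valued case and an exponential for the continuous case (both arbitrary choices):

```python
# Survival shortcut: E[X] = sum_k P(X > k) (nonneg. integer X) or integral of 1 - F(x) (nonneg. continuous X).
from scipy import stats, integrate

# Integer-valued example: Poisson(3); truncate the sum where the tail is negligible.
X = stats.poisson(3)
print(sum(X.sf(k) for k in range(200)), X.mean())   # both ~ 3.0

# Continuous example: exponential with mean 4.
Y = stats.expon(scale=4)
val, _ = integrate.quad(Y.sf, 0, float("inf"))
print(val, Y.mean())                                 # both ~ 4.0
```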
• DUniform({a, ..., b}): equally likely values a, ..., b
  – PMF: 1/(b − a + 1)
  – Mean: (a + b)/2,  Variance: [(b − a + 1)² − 1]/12
  – MGF: (e^(at) − e^((b+1)t)) / [(b − a + 1)(1 − e^t)]

• Bernoulli(p): 1 trial w/ success chance p
  – PMF: P(X = 1) = p, P(X = 0) = 1 − p
  – Mean: p,  Variance: p(1 − p)
  – MGF: 1 − p + pe^t

• Binomial(n, p): # of successes in n indep. Bernoulli(p) trials
  – PMF: C(n, x) p^x (1 − p)^(n−x)
  – Mean: np,  Variance: np(1 − p)
  – MGF: (1 − p + pe^t)^n
  – Note: np ∈ N ⟹ np = mode = median

• HyperGeom(N, K, n): # w/ property chosen w/out replacement from N, where K have the property
  – PMF: C(K, x) C(N − K, n − x) / C(N, n)
  – Mean: n(K/N),  Variance: n(K/N)(1 − K/N)(N − n)/(N − 1)
  – Note: resembles Binomial(n, K/N) with large N relative to n

• Exp(λ)
  – PDF: λe^(−λx),  CDF: 1 − e^(−λx)
  – Mean: 1/λ,  Variance: 1/λ²
  – MGF: λ/(λ − t)
  – Note: only memoryless continuous distribution

• Gamma(α, λ)
  – PDF: λe^(−λx) (λx)^(α−1) / Γ(α)
  – Mean: α/λ,  Variance: α/λ²
  – MGF: [λ/(λ − t)]^α
  – Note: sum of α independent Exp(λ) for integer α > 0
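The remark that the hypergeometric resembles Binomial(n, K/N) when N is large relative to n can be seen directly by comparing PMFs. A quick sketch assuming scipy, with illustrative values:

```python
# HyperGeom resembles Binomial(n, K/N) when the population N is large relative to the sample n.
from scipy import stats

pop, K, n = 10_000, 3_000, 10               # population size, # with property, sample size
hg = stats.hypergeom(pop, K, n)             # scipy order: (M, n, N) = (pop. size, # with property, sample size)
bi = stats.binom(n, K / pop)

for x in range(n + 1):
    print(x, round(hg.pmf(x), 5), round(bi.pmf(x), 5))   # PMFs nearly identical
```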
• Uniform, U(m)
  – PMF: f(x) = 1/m, for x = 1, 2, ..., m
  – µ = (m + 1)/2 and σ² = (m² − 1)/12
• Hypergeometric
  – PMF: f(x) = C(N₁, x) C(N₂, n − x) / C(N, n)
  – x is the number of items from the sample of n items that are from group/type 1.
  – µ = n(N₁/N) and σ² = n(N₁/N)(N₂/N)[(N − n)/(N − 1)]
• Binomial, b(n, p)
  – PMF: f(x) = C(n, x) p^x (1 − p)^(n−x), for x = 0, 1, ..., n
  – x is the number of successes in n trials.
  – µ = np and σ² = np(1 − p) = npq
  – MGF: M(t) = [(1 − p) + pe^t]^n = (q + pe^t)^n
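The binomial MGF can be confirmed by computing E[e^(tX)] directly from the PMF. A small sketch assuming scipy (n, p, t are arbitrary):

```python
# Binomial MGF check: E[e^{tX}] computed from the PMF equals (q + p e^t)^n.
import math
from scipy import stats

n, p, t = 12, 0.35, 0.4
q = 1 - p

direct = sum(math.exp(t * x) * stats.binom.pmf(x, n, p) for x in range(n + 1))
closed_form = (q + p * math.exp(t)) ** n
print(direct, closed_form)   # equal up to floating-point error
```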
Continuous Distributions
• Uniform, U(a, b)
  – PDF: f(x) = 1/(b − a), for a ≤ x ≤ b
  – CDF: P(X ≤ x) = (x − a)/(b − a), for a ≤ x ≤ b
  – µ = (a + b)/2 and σ² = (b − a)²/12
  – MGF: M(t) = (e^(tb) − e^(ta)) / [t(b − a)], for t ≠ 0, and M(0) = 1
• Exponential
  – PDF: f(x) = (1/θ) e^(−x/θ), for x ≥ 0
  – x is the waiting time we are experiencing to see one change occur.
  – θ is the average waiting time between changes in a Poisson process (its reciprocal, 1/θ, is sometimes called the "hazard rate").
  – CDF: P(X ≤ x) = 1 − e^(−x/θ), for x ≥ 0.
  – µ = θ and σ² = θ²
  – MGF: M(t) = 1/(1 − θt)
  – The distribution is said to be "memoryless", because P(X ≥ x₁ + x₂ | X ≥ x₁) = P(X ≥ x₂).
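The memoryless property in the last bullet can be checked with survival probabilities. A small sketch assuming scipy (θ, x₁, x₂ arbitrary):

```python
# Memoryless check: P(X >= x1 + x2 | X >= x1) = P(X >= x2) for X ~ Exponential(theta).
from scipy import stats

theta, x1, x2 = 3.0, 2.0, 5.0
X = stats.expon(scale=theta)

conditional = X.sf(x1 + x2) / X.sf(x1)   # P(X >= x1 + x2) / P(X >= x1)
print(conditional, X.sf(x2))             # both equal e^(-x2/theta)
```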
• Gamma
  – PDF: f(x) = x^(α−1) e^(−x/θ) / [Γ(α) θ^α] = x^(α−1) e^(−x/θ) / [(α − 1)! θ^α], for x ≥ 0 (the second form applies for integer α)
  – x is the waiting time we are experiencing to see α changes.
  – θ is the average waiting time between changes in a Poisson process and α is the number of changes that we are waiting to see.
  – µ = αθ and σ² = αθ²
  – MGF: M(t) = 1/(1 − θt)^α
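The waiting-time interpretation (a sum of α independent exponentials for integer α) can be illustrated by simulation. A sketch assuming numpy and scipy, with arbitrary α and θ:

```python
# Sum of alpha independent Exponential(theta) waiting times ~ Gamma(alpha, theta) for integer alpha.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
alpha, theta = 4, 2.5

sums = rng.exponential(scale=theta, size=(500_000, alpha)).sum(axis=1)
gam = stats.gamma(alpha, scale=theta)

print(sums.mean(), gam.mean())               # ~ alpha * theta = 10
print(sums.var(), gam.var())                 # ~ alpha * theta^2 = 25
print(np.quantile(sums, 0.9), gam.ppf(0.9))  # simulated vs exact 90th percentile
```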
• Chi-square (Gamma with θ = 2 and α = r/2)
  – PDF: f(x) = x^(r/2 − 1) e^(−x/2) / [Γ(r/2) 2^(r/2)], for x ≥ 0
  – µ = r and σ² = 2r
  – MGF: M(t) = 1/(1 − 2t)^(r/2)
• Normal, N(µ, σ²)
  – PDF: f(x) = 1/(σ√(2π)) · e^(−(x−µ)²/(2σ²))
  – MGF: M(t) = e^(µt + σ²t²/2)
Integration formulas
• ∫ p(x)e^(ax) dx = (1/a) p(x)e^(ax) − (1/a²) p′(x)e^(ax) + (1/a³) p″(x)e^(ax) − ...
• ∫_a^∞ (1/θ) x e^(−x/θ) dx = (a + θ) e^(−a/θ)
• ∫_a^∞ (1/θ) x² e^(−x/θ) dx = [(a + θ)² + θ²] e^(−a/θ)
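These shortcuts are easy to verify numerically. A minimal sketch assuming scipy, with arbitrary a and θ:

```python
# Verify: integral_a^inf (1/theta) x e^{-x/theta} dx = (a + theta) e^{-a/theta}
#         integral_a^inf (1/theta) x^2 e^{-x/theta} dx = ((a + theta)^2 + theta^2) e^{-a/theta}
import math
from scipy import integrate

a, theta = 1.5, 4.0

i1, _ = integrate.quad(lambda x: x / theta * math.exp(-x / theta), a, float("inf"))
i2, _ = integrate.quad(lambda x: x**2 / theta * math.exp(-x / theta), a, float("inf"))

print(i1, (a + theta) * math.exp(-a / theta))
print(i2, ((a + theta)**2 + theta**2) * math.exp(-a / theta))
```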
Other Useful Facts
• When X depends upon Y, Var(X) = E[Var(X|Y)] + Var(E[X|Y]). (Called the "Total Variance" of X.)
• Chebyshev's Inequality: For a random variable X having any distribution with finite mean µ and variance σ², P(|X − µ| ≥ kσ) ≤ 1/k².
• For the variables X and Y having the joint PMF/PDF f(x, y), the moment generating function for this distribution is
  M(t₁, t₂) = E[e^(t₁X + t₂Y)] = E[e^(t₁X) e^(t₂Y)] = Σ_x Σ_y e^(t₁x) e^(t₂y) f(x, y)
  – µ_X = M_{t₁}(0, 0) and µ_Y = M_{t₂}(0, 0) (These are the first partial derivatives.)
  – E[X²] = M_{t₁t₁}(0, 0) and E[Y²] = M_{t₂t₂}(0, 0) (These are the "pure" second partial derivatives.)
  – E[XY] = M_{t₁t₂}(0, 0) = M_{t₂t₁}(0, 0) (These are the "mixed" second partial derivatives.)
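The total-variance decomposition is worth seeing on a concrete two-stage model. The sketch below (assuming numpy, and using an illustrative mixture Y ~ Gamma, X | Y ~ Poisson(Y)) compares a simulated Var(X) with E[Var(X|Y)] + Var(E[X|Y]).

```python
# Total variance: Var(X) = E[Var(X|Y)] + Var(E[X|Y]), illustrated with
# Y ~ Gamma(alpha, theta) and X | Y ~ Poisson(Y) (an arbitrary two-stage example).
import numpy as np

rng = np.random.default_rng(3)
alpha, theta = 3.0, 2.0

y = rng.gamma(shape=alpha, scale=theta, size=1_000_000)
x = rng.poisson(y)

# For this model: E[Var(X|Y)] = E[Y] = alpha*theta, Var(E[X|Y]) = Var(Y) = alpha*theta^2
print(x.var(), alpha * theta + alpha * theta**2)   # simulated vs formula (~ 18)
```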
Discrete Distributions:
If X ∼ Binomial(x; n, p), where p is the prob. of success and n is the number of tries, then
  P(X = x) = C(n, x) p^x (1 − p)^(n−x) for x ∈ {0, 1, ..., n}, and 0 otherwise;
  E[X] = np, Var[X] = np(1 − p).
If X ∼ NegativeBinomial(x; r, p), where p is the prob. of success and r is the number of successes required, then
  P(X = x) = C(x − 1, r − 1) p^r (1 − p)^(x−r) for x ∈ {r, r + 1, ...}, and 0 otherwise;
  E[X] = r/p.
If X ∼ Poisson(x; λ), then
  P(X = x) = e^(−λ) λ^x / x! for x ∈ {0, 1, 2, ...}, and 0 otherwise;
  E[X] = λ, Var[X] = λ.
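Note that this negative binomial PMF counts total trials x until the r-th success; libraries often count failures instead. A small sketch (scipy assumed, arbitrary r and p) showing the shift between this convention and scipy's nbinom:

```python
# The PMF above is for X = total trials to the r-th success; scipy's nbinom counts
# failures Y = X - r, so nbinom.pmf(x - r, r, p) matches P(X = x).
from math import comb
from scipy import stats

r, p = 3, 0.4
for x in range(r, r + 6):
    manual = comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)
    print(x, manual, stats.nbinom.pmf(x - r, r, p))   # the two columns agree
```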
Expectation: Continuous case. Suppose f_X(x) is the probability density function of X. Then
  E[X] = ∫_{−∞}^{∞} x f_X(x) dx
  E[g(X)] = ∫_{−∞}^{∞} g(x) f_X(x) dx
  Var[X] = E[(X − µ)²] = E[X²] − (E[X])², where µ = E[X]
Continuous distributions:
Uniform on [a, b]:
  f_X(x) = 1/(b − a) for x ∈ [a, b], and 0 otherwise;  E[X] = (b + a)/2, Var[X] = (b − a)²/12
Normal(µ, σ²):
  f_X(x) = 1/(√(2π) σ) e^(−(x−µ)²/(2σ²));  E[X] = µ, Var[X] = σ²
Poisson Approximation Theorem: Let X ∼ Binomial (x; n, p). For np small relative to n,
P(X = x) ≃ P(Y = x), where Y ∼ Poisson(y; λ = np).
DeMoivre-Laplace Approximation Theorem: Let X ∼ Binomial(x; n, p). For np(1 − p) sufficiently large,
  P( (X − np)/√(np(1 − p)) ≤ z ) ≃ P(Z ≤ z) = Φ(z),
where Z ∼ N(z; 0, 1).
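Both approximation theorems can be examined numerically. A minimal sketch assuming scipy, with illustrative n, p, and z values:

```python
# Poisson and normal (DeMoivre-Laplace) approximations to the binomial.
import math
from scipy import stats

# Poisson approximation: n large, p small (lambda = np).
n, p = 1000, 0.003
for x in range(6):
    print(x, stats.binom.pmf(x, n, p), stats.poisson.pmf(x, n * p))   # nearly equal

# Normal approximation: np(1-p) large; compare P(X <= k) with Phi(z) at k = np + z*sigma.
n, p = 400, 0.3
z = 1.2
k = math.floor(n * p + z * math.sqrt(n * p * (1 - p)))
print(stats.binom.cdf(k, n, p), stats.norm.cdf(z))   # close for large np(1-p)
```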