PNS Complete Flow Chart

The document discusses different approaches and concepts related to probability including classical, statistical, and personal approaches. It also covers conditional probability, independent events, total probability law, and Bayes' theorem.


Probability

~ Made by Tanish Desai

Approaches: Classical, Statistical, Personal.

Classical approach: P(A) = n(A)/n(S)

Tip: first think using the complementary method, then try the forward method.

Conditional Probability
P(A|B) = P(A∩B)/P(B)

Independent Events
P(A∩B) = P(A)·P(B)   (can be extended for n events)
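The two formulas above can be checked by brute-force enumeration. A minimal sketch in Python (the document names no language), using a hypothetical two-dice experiment:

```python
# Enumerate the sample space of two fair dice (36 equally likely outcomes)
# and compute conditional probability and an independence check.
from itertools import product

space = list(product(range(1, 7), repeat=2))

def prob(event):
    # classical approach: P(A) = n(A) / n(S)
    return sum(1 for w in space if event(w)) / len(space)

A = lambda w: w[0] + w[1] == 7   # event: the sum is 7
B = lambda w: w[0] == 3          # event: the first die shows 3

p_a_and_b = prob(lambda w: A(w) and B(w))
p_a_given_b = p_a_and_b / prob(B)   # P(A|B) = P(A∩B)/P(B)

print(round(p_a_given_b, 4))        # 0.1667 (= 1/6)
# independence: P(A∩B) equals P(A)·P(B) here
print(abs(p_a_and_b - prob(A) * prob(B)) < 1e-12)   # True
```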

Total Probability Law
P(B) = Σ P(Ai)·P(B|Ai)

If, in the above case, it is given that B has occurred and we have to find the probability that the outcome lies in Ai, then we use Bayes' Theorem.

Bayes' Theorem
P(Ai|B) = P(B|Ai)·P(Ai) / P(B)
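A small sketch combining both formulas, with hypothetical numbers (a test with a 1% base rate, 95% sensitivity, 90% specificity; these values are illustrative, not from the document):

```python
# A1 = condition present, A2 = condition absent; B = positive test.
priors = [0.01, 0.99]        # P(Ai)
likelihoods = [0.95, 0.10]   # P(B|Ai)

# Total probability law: P(B) = sum of P(Ai) * P(B|Ai)
p_b = sum(p * l for p, l in zip(priors, likelihoods))

# Bayes' theorem: P(Ai|B) = P(B|Ai) * P(Ai) / P(B)
posteriors = [p * l / p_b for p, l in zip(priors, likelihoods)]

print(round(p_b, 4))            # 0.1085
print(round(posteriors[0], 4))  # 0.0876
```

Note how a positive result raises the probability from 1% to only about 9%, because the prior is small.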

Discrete Distributions

Discrete Random Variable: for a given sample space S of some experiment, a random variable (rv) is any rule that associates a real number with each outcome in S.

Independent Random Variables (X, Y): P(X = x, Y = y) = P(X = x)·P(Y = y)

Moments:
Expected Value (µ): the mean, E(X); E(X^k) is the kth ordinary moment.
Variance (σ²): Var(X) = E((X − µ)²), the second moment around µ.
Standard Deviation (σ): the square root of the variance.
The pmf (probability mass function) and cdf (cumulative distribution function) describe the distribution.

Named discrete distributions: discrete uniform, binomial, Poisson, geometric, hypergeometric.

Hypergeometric distribution: similar to the binomial but without replacement. X ~ Hypergeometric(n, M, N) is used when there are M successes and N − M failures and we need to choose n items without replacement. It can be approximated by the binomial when N >> n, but for small populations we need to use the hypergeometric distribution.

Poisson distribution: used when n >> 1 and p << 1; then we use the Poisson in place of the binomial:
P(X = x) = λ^x e^(−λ) / x!, where λ = n·p
E(X) = mean = λ, Var(X) = λ, mgf: m_X(t) = e^(λ(e^t − 1))

Geometric distribution: similar to Bernoulli trials, but here the experiment stops after the first success; X ~ G(p). Conditions:
1. The experiment consists of independent trials.
2. The outcome of each trial is success or failure.
3. The probability of success is constant from trial to trial and is denoted by p.
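A minimal sketch of the Poisson approximation with hypothetical values of n and p, comparing exact binomial probabilities against the Poisson pmf with λ = np:

```python
import math

n, p = 1000, 0.003   # hypothetical: n >> 1 and p << 1
lam = n * p          # λ = np = 3

def binom_pmf(x):
    # exact binomial probability: C(n, x) p^x (1-p)^(n-x)
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(x):
    # Poisson approximation: λ^x e^(−λ) / x!
    return lam**x * math.exp(-lam) / math.factorial(x)

for x in range(5):
    print(x, round(binom_pmf(x), 5), round(poisson_pmf(x), 5))
```

The two columns agree to about three decimal places, which is the point of the approximation.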

Binomial distribution — conditions:
1. The experiment consists of a fixed number, n, of Bernoulli trials: trials that result in success or failure.
2. The trials are identical and independent, and therefore the probability of success, p, remains the same from trial to trial.
3. The random variable X denotes the number of successes obtained in the n trials.

If we have two or more interlinked experiments/processes, start from the first, find its 'p', then use it as a parameter for the second binomial trial, and so on.

Discrete uniform distribution — condition: each value is equally likely.

Moment Generating Function (mgf): highly useful in solving questions involving proofs or involving two rvs.

Poisson assumptions:
1. The events occur with a known constant mean rate.
2. The probability of occurrence of an event in a small interval is proportional to the size of the interval.
3. The probability of two events occurring in the same narrow interval is negligible.
4. The probability of an event in one interval is independent of the probability of an event in another interval.

If X ~ Bin(n, p) and Y ~ Bin(m, p) are two independent random variables, the distribution of X + Y follows from the mgf:

M_{X+Y}(t) = M_X(t)·M_Y(t)   (as X and Y are independent)
⇒ X + Y ~ Bin(n + m, p)

Poisson Process: a model for a series of discrete events where the rate of occurrence of events is known, but the exact timing of events is random. The arrival of an event is independent of the event before.

Rule of thumb for approximating the hypergeometric by the binomial: N > 20n.
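The N > 20n rule of thumb can be checked directly. A sketch with hypothetical population values, comparing the hypergeometric pmf against its binomial approximation with p = M/N:

```python
import math

N, M, n = 500, 100, 10   # hypothetical; N > 20n holds (500 > 200)
p = M / N                # success probability for the binomial approximation

def hyper_pmf(x):
    # C(M, x) C(N-M, n-x) / C(N, n): x successes in n draws without replacement
    return math.comb(M, x) * math.comb(N - M, n - x) / math.comb(N, n)

def binom_pmf(x):
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

for x in range(4):
    print(x, round(hyper_pmf(x), 4), round(binom_pmf(x), 4))
```

With N this large relative to n, drawing without replacement barely changes p between draws, so the columns nearly match.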

Inequalities in Probability: Markov's Inequality and Chebyshev's Inequality.

Continuous Random Variables

Unlike the discrete random variable, where we derived the formula for the pmf, here we will focus on studying the named continuous distributions with known pdf. For discrete distributions, names go with situations describing what is being counted. For continuous distributions, names go with shapes.

Let X be a continuous random variable. The probability density function (pdf) of X is an integrable function f: R → [0, ∞) such that for any two chosen a and b, P(a ≤ X ≤ b) is the integral of f(x) from a to b. The graph of f(x) is also known as the probability density curve.

The cdf of a continuous random variable is the area under the pdf. In other words, the integral of f(x) is equal to F(x).
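The "cdf is the area under the pdf" relation can be illustrated numerically. A sketch using the exponential pdf f(x) = λe^(−λx) as a hypothetical example, checking a Riemann-sum integral against the closed-form cdf F(x) = 1 − e^(−λx):

```python
import math

lam = 2.0   # hypothetical rate parameter

def pdf(x):
    # exponential density f(x) = λ e^(−λx) for x >= 0
    return lam * math.exp(-lam * x)

def cdf_numeric(x, steps=100_000):
    # area under the pdf from 0 to x via a left Riemann sum
    dx = x / steps
    return sum(pdf(i * dx) * dx for i in range(steps))

x = 1.5
print(round(cdf_numeric(x), 4))           # numeric area: 0.9502
print(round(1 - math.exp(-lam * x), 4))   # closed-form cdf: 0.9502
```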

Joint Distributions

Discrete case: Let X and Y be discrete random variables. The ordered pair (X, Y) is called a two-dimensional discrete random variable. A function fXY(x, y) such that fXY(x, y) = P[X = x and Y = y] is called the joint density for (X, Y).

Marginal Distribution: the marginal distribution of X is given by fX(x) = Σ over all y of fXY(x, y); similarly we can define it for Y. In the continuous case, put the limits in the integral with respect to x.

Independent random variables: X and Y are independent when the joint density factors into the product of the marginals.

Extras:
The Γ (gamma) function is defined as Γ(x) = ∫ from 0 to ∞ of t^(x−1) e^(−t) dt.
25th percentile = first quartile, and so on; similarly we can find the variance and SD.
If the probability of all points is the same (the point is chosen totally at random), it is a uniform distribution.
The Γ (gamma) distribution gives the probability of occurrence in a given time period. Here be very careful: check whether we have to find the probability that the event will occur in the given time period or the probability that it will not occur.
The normal distribution has a bell-shaped curve.
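A minimal sketch of the marginal-distribution formula on a hypothetical 2×2 joint pmf table (the probabilities are illustrative):

```python
# joint pmf fXY(x, y) as a dict keyed by (x, y); values sum to 1
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# marginals: fX(x) = sum over all y of fXY(x, y), and similarly for Y
fx, fy = {}, {}
for (x, y), p in joint.items():
    fx[x] = fx.get(x, 0) + p
    fy[y] = fy.get(y, 0) + p

print({x: round(p, 2) for x, p in fx.items()})   # {0: 0.3, 1: 0.7}
print({y: round(p, 2) for y, p in fy.items()})   # {0: 0.4, 1: 0.6}
```

Here fXY(0, 0) = 0.10 but fX(0)·fY(0) = 0.12, so these X and Y are not independent.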

Continuous Random Variables: Uniform, Gamma, Exponential, Chi-squared, and Standard Normal distributions. Conditional density is also defined for joint continuous distributions.

Standard Normal Distribution

Ex. A random variable X is normally distributed with mean 9 and standard deviation 3. Find P(X ≥ 15), P(X ≤ 15) and P(0 ≤ X ≤ 9).

We can find the value of φ(z) = P(Z < z) using the standard normal table. For a given tail area α, the critical value z is the point with area α under the curve to its right.

The chi-squared distribution is related to the normal distribution: the square of a standard normal variable is chi-squared with one degree of freedom.
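The worked example above can be computed without a printed table. A sketch using the standard library's error function, where φ(z) = ½(1 + erf(z/√2)):

```python
import math

def phi(z):
    # standard normal cdf via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mean, sd = 9, 3   # values from the document's example

p_ge_15 = 1 - phi((15 - mean) / sd)                     # P(X >= 15) = 1 - φ(2)
p_le_15 = phi((15 - mean) / sd)                         # P(X <= 15) = φ(2)
p_0_to_9 = phi((9 - mean) / sd) - phi((0 - mean) / sd)  # φ(0) - φ(-3)

print(round(p_ge_15, 4))   # 0.0228
print(round(p_le_15, 4))   # 0.9772
print(round(p_0_to_9, 4))  # 0.4987
```

Standardizing with z = (x − µ)/σ is what lets one table (or one function) serve every normal distribution.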

Covariance

A faster way to calculate covariance: Cov(X, Y) = E(XY) − E(X)·E(Y).

The covariance of two independent variables is zero.
Cov(X, X) = E((X − µX)²) = V(X)
Cov(Y + Z, X) = Cov(Y, X) + Cov(Z, X)
Cov(aX + b, Y) = a·Cov(X, Y)
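The shortcut and the linearity properties can be verified numerically. A sketch on hypothetical sample data, using population formulas (dividing by n):

```python
xs = [1.0, 2.0, 3.0, 4.0]   # hypothetical data
ys = [2.0, 1.0, 4.0, 3.0]

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    # shortcut: Cov(A, B) = E(AB) - E(A)E(B)
    return mean([ai * bi for ai, bi in zip(a, b)]) - mean(a) * mean(b)

def var(a):
    # Cov(X, X) = Var(X)
    return cov(a, a)

print(cov(xs, xs) == var(xs))   # True

# Cov(aX + b, Y) = a * Cov(X, Y)
a, b = 3.0, 5.0
lhs = cov([a * x + b for x in xs], ys)
rhs = a * cov(xs, ys)
print(abs(lhs - rhs) < 1e-9)    # True
```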

Correlation: can only be between −1 and 1. Why? Because ρ(X, Y) = Cov(X, Y)/(σX·σY), and by the Cauchy–Schwarz inequality |Cov(X, Y)| ≤ σX·σY.

These properties generalise to n variables, e.g. Var(X1 + … + Xn) = Σ Var(Xi) + 2 Σ over i<j of Cov(Xi, Xj).
