
02-11-2020

Financial Analytics
Sessions 1

Dilip Kumar 1

Definition of Probability
• Experiment: Any procedure that can be repeated indefinitely and has a well-defined
set of outcomes. For example: up and down movements in prices
over n steps, or tossing a coin twice.
• Sample space: the set of all possible outcomes of an experiment
– S = {HH, HT, TH, TT}
– For up and down shift in price (one step binomial model): S= {u, d}
– For two step binomial model: S= {uu, ud, du, dd}
– For three step binomial model: S= {uuu, uud, udu, duu, udd, dud, ddu, ddd}
• Event: a subset of possible outcomes (e.g. A={HH}, B={HT, TH})
• Probability of an event:
– Axiom 1: Pr(A) ≥ 0
– Axiom 2: Pr(S) = 1
– Axiom 3: For every sequence of disjoint events A_1, A_2, …: Pr(∪_i A_i) = Σ_i Pr(A_i)
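The binomial-model sample spaces and the axioms above can be checked with a short Python sketch (the function name `sample_space` and the equal-weight probabilities are illustrative assumptions, not part of the notes):

```python
from itertools import product

def sample_space(n_steps, moves=("u", "d")):
    """Enumerate all price paths in an n-step binomial model."""
    return ["".join(path) for path in product(moves, repeat=n_steps)]

S2 = sample_space(2)
assert S2 == ["uu", "ud", "du", "dd"]    # matches the two-step model above
assert len(sample_space(3)) == 8         # three-step model has 2^3 outcomes

# Assuming equally likely outcomes, the probabilities satisfy the axioms:
p = {outcome: 1 / len(S2) for outcome in S2}
assert all(prob >= 0 for prob in p.values())   # Axiom 1: Pr(A) >= 0
assert abs(sum(p.values()) - 1) < 1e-12        # Axiom 2: Pr(S) = 1
```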


Laws of Probability (Law 1 and 2)


Law 1: The probability of an event A is a number between 0 and 1:

0 ≤ Pr(A) ≤ 1

If an event has probability 1, it is certain to occur.


Law 2: Suppose A and B are two mutually exclusive events, meaning
that they cannot both occur at the same time. Then the probability of
either A occurring or B occurring is the sum of their probabilities. That is:

Pr(A ∪ B) = Pr(A) + Pr(B)

The event that A does not occur is called the complement of A; its probability is Pr(Aᶜ) = 1 − Pr(A).


Joint Probability (Law 3)


Law 3: The joint probability of two events A and B both occurring is:

Pr(AB) = Pr(A|B) Pr(B)

For events A and B, the joint probability Pr(AB) stands for the probability that both
events happen.
Pr(A|B) = the conditional probability of A occurring, given that B occurs.
• It is 0 if the events are mutually exclusive.
Pr(B) = the marginal probability of B (the prior probability)
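Law 3 can be verified on the two-coin-toss sample space from the earlier slide; a minimal sketch, with the events A and B chosen for illustration:

```python
# Two tosses of a fair coin: the sample space from the earlier slide
S = ["HH", "HT", "TH", "TT"]
A = {"HH"}              # both tosses are heads
B = {"HH", "HT"}        # the first toss is heads

def pr(event):
    """Probability of an event when all outcomes are equally likely."""
    return len(event) / len(S)

pr_joint = pr(A & B)                 # Pr(AB)
pr_a_given_b = pr_joint / pr(B)      # Pr(A|B) = Pr(AB) / Pr(B)

# Law 3: Pr(AB) = Pr(A|B) * Pr(B)
assert abs(pr_joint - pr_a_given_b * pr(B)) < 1e-12
```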


Independence (Law 3)
Two events A and B are independent if
Pr(AB) = Pr(A) Pr(B)
A set of events {A_i} is independent if
Pr(∩_i A_i) = Π_i Pr(A_i)


Law 4
Law 4: Let A and B be any two events. Then the probability of
either A occurring or B occurring is the sum of their probabilities
less the probability that they both occur. That is:

Pr(A ∪ B) = Pr(A) + Pr(B) − Pr(AB)
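Law 4 can be checked on a single roll of a fair die; the events below are illustrative:

```python
# One roll of a fair die
S = set(range(1, 7))
A = {2, 4, 6}        # the roll is even
B = {4, 5, 6}        # the roll is at least four

def pr(event):
    return len(event) / len(S)

# Law 4: Pr(A or B) = Pr(A) + Pr(B) - Pr(AB)
assert abs(pr(A | B) - (pr(A) + pr(B) - pr(A & B))) < 1e-12
```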


Random variable, density and distribution


A random variable is a variable whose values are uncertain (stochastic).
A deterministic variable, by contrast, is one whose values are completely
determined by the information we currently hold.
A realization (also called an observation) of a random variable X may be
thought of as a number that is associated with a chance outcome. Since
every outcome is determined by a chance event, every outcome has a
measure of probability associated with it. The set of all outcomes and their
associated probabilities is called a probability measure.
There are two ways to represent a probability measure:
– Probability distribution function
– Probability density function

Expectation
Let X be a discrete random variable with distribution Pr(X = x). Then its expectation is

E[X] = Σ_x x Pr(X = x)

In an empirical sample x_1, x_2, …, x_N, it is estimated by the sample mean:

E[X] ≈ (1/N) Σ_{i=1}^{N} x_i

Continuous case:

E[X] = ∫ x p(x) dx

Expectation of a sum of random variables:

E[X_1 + X_2] = E[X_1] + E[X_2]


Variance
The variance of a random variable X is the expectation of
(X − E[X])²:

Var(X) = E[(X − E[X])²]
       = E[X² − 2X E[X] + E[X]²]
       = E[X²] − 2E[X]² + E[X]²
       = E[X²] − E[X]²
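The two forms of the variance agree, which can be checked numerically on a small illustrative distribution:

```python
# A small discrete distribution (illustrative values)
values = [1, 2, 3]
probs = [0.2, 0.5, 0.3]

e_x = sum(x * p for x, p in zip(values, probs))        # E[X]
e_x2 = sum(x * x * p for x, p in zip(values, probs))   # E[X^2]

# Definition: Var(X) = E[(X - E[X])^2]
var_def = sum((x - e_x) ** 2 * p for x, p in zip(values, probs))
# Shortcut from the derivation above: Var(X) = E[X^2] - E[X]^2
var_short = e_x2 - e_x ** 2

assert abs(var_def - var_short) < 1e-12
```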


Probability density function


Let X be a continuous random variable, and let x denote a value
that the random variable X takes. We use f(x) to denote the
probability density function.
Properties:
f(x) ≥ 0 for any x
The total area under f(x) is 1; that is, the density function integrates to 1 over the
entire real line.
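Both properties can be checked numerically; the exponential density used here is an assumed example (the rate 1.5 is arbitrary), and the integral is approximated by a Riemann sum:

```python
import math

# An illustrative density: exponential with rate lam on [0, inf)
lam = 1.5

def f(x):
    return lam * math.exp(-lam * x)

# Property 1: the density is non-negative
assert f(0.7) >= 0

# Property 2: the total area under f is 1 (Riemann-sum approximation;
# the tail beyond x = 20 is negligible for this rate)
dx = 1e-4
area = sum(f(i * dx) * dx for i in range(int(20 / dx)))
assert abs(area - 1) < 1e-3
```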


Cumulative distribution function


The cumulative distribution function, F(x), for a continuous
random variable X expresses the probability that X does not
exceed the value of x, as a function of x
F(x) = P(X ≤ x)

P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a) = F(b) − F(a)

Let X be a continuous random variable. Then the following
relationship holds between the probability density function and the
cumulative distribution function:

P(a < X ≤ b) = F(b) − F(a) = ∫_a^b f(u) du
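The density–distribution relationship can be checked on a distribution with a known closed-form CDF; the exponential below (with assumed rate 2.0 and interval endpoints) is illustrative:

```python
import math

lam = 2.0  # rate of an illustrative exponential distribution

def f(x):                      # probability density function
    return lam * math.exp(-lam * x)

def F(x):                      # closed-form cumulative distribution function
    return 1 - math.exp(-lam * x)

a, b = 0.5, 1.5
# Numerically integrate the density over (a, b] ...
dx = 1e-5
integral = sum(f(a + i * dx) * dx for i in range(int(round((b - a) / dx))))
# ... and compare with F(b) - F(a)
assert abs(integral - (F(b) - F(a))) < 1e-4
```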


Cumulative distribution function


In other words, the cumulative distribution function F(x) is given
by the shaded area.

[Figure: the density curve f(x), with the shaded area under the curve to the left of x representing F(x) = P(X ≤ x).]
