probability lecture notes
Lecture 1
Scarlett [email protected]
De Morgan's Laws
$(A \cup B)^c = A^c \cap B^c, \qquad (A \cap B)^c = A^c \cup B^c$
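De Morgan's laws can be sanity-checked on a small finite sample space; the space and the events $A$, $B$ below are illustrative choices, not from the notes.

```python
# Sanity check of De Morgan's laws on a small finite sample space.
omega = set(range(10))           # sample space {0, ..., 9} (illustrative)
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

complement = lambda S: omega - S

# (A ∪ B)^c == A^c ∩ B^c
assert complement(A | B) == complement(A) & complement(B)
# (A ∩ B)^c == A^c ∪ B^c
assert complement(A & B) == complement(A) | complement(B)
print("De Morgan's laws hold on this example")
```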
Axioms of probability
1. $P(A) \ge 0$ for every event $A$, and $P(\Omega) = 1$
2. For mutually exclusive events $A_1, A_2, \dots$: $P\left(\bigcup_i A_i\right) = \sum_i P(A_i)$
Partition
Events $A_1, \dots, A_n$ form a partition of $\Omega$ if they are mutually exclusive and $\bigcup_{i=1}^{n} A_i = \Omega$. For any event $B$, the law of total probability then gives $P(B) = \sum_{i=1}^{n} P(B \mid A_i) P(A_i)$.
example 1.
An urn contains $b$ black balls and $w$ white balls. Drawing twice without replacement, the probability of "first pick black, second pick white" is
$P(B_1 W_2) = P(W_2 \mid B_1) P(B_1)$
where
$P(B_1) = \frac{b}{b + w}, \qquad P(W_2 \mid B_1) = \frac{w}{b + w - 1}$
therefore
$P(B_1 W_2) = \frac{b}{b + w} \cdot \frac{w}{b + w - 1}$
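A quick Monte Carlo check of the urn calculation; the counts $b = 3$, $w = 2$ are illustrative (the original example's numbers were not preserved in these notes).

```python
import random

random.seed(0)

# Illustrative counts, not from the notes.
b, w = 3, 2                      # black and white balls

# Exact value by conditioning: P(B1) * P(W2 | B1)
exact = (b / (b + w)) * (w / (b + w - 1))

# Monte Carlo: draw two balls without replacement many times.
trials = 200_000
hits = 0
for _ in range(trials):
    urn = ["B"] * b + ["W"] * w
    random.shuffle(urn)
    if urn[0] == "B" and urn[1] == "W":
        hits += 1

print(exact, hits / trials)      # the two values should be close
```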
Lecture 2
Review on .
When ,
Negative-Binomial
The probability that the $k$-th success occurs at trial $n$, with success probability $p$ per trial, is
$P(k\text{-th success at trial } n) = \binom{n-1}{k-1} p^k (1-p)^{n-k}, \quad n \ge k$
Team $A$ and team $B$ play a series of games; the team that first wins 4 of the 7 games wins the series. If $A$ wins each game independently with probability $p$, the negative binomial gives
$P(A \text{ wins the series}) = \sum_{n=4}^{7} \binom{n-1}{3} p^4 (1-p)^{n-4}$
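The best-of-7 series probability can be computed directly from the negative binomial; $p = 0.5$ below is an illustrative value (evenly matched teams), not from the notes.

```python
from math import comb

# Probability that team A wins a best-of-7 series, via the negative binomial:
# A wins the series at game n iff A's 4th win occurs exactly at trial n (n = 4..7).
def series_win_prob(p):
    return sum(comb(n - 1, 3) * p**4 * (1 - p)**(n - 4) for n in range(4, 8))

# For evenly matched teams the series is fair.
print(series_win_prob(0.5))   # 0.5
```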
Random variables
A random variable $X$ is described by its CDF $F_X(x) = P(X \le x)$, which satisfies:
1. $F_X(-\infty) = 0$ and $F_X(+\infty) = 1$
2. Non-decreasing: $x_1 < x_2 \Rightarrow F_X(x_1) \le F_X(x_2)$
3. Right-continuous: $F_X(x^+) = F_X(x)$
Actually, the CDF also gives
1. $P(a < X \le b) = F_X(b) - F_X(a)$
2. $P(X = x) = F_X(x) - F_X(x^-)$
Linear Filter
Lecture 3
Calculate the conditional probability $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$, for $P(B) > 0$.
Since $P(A \cap B) = P(B \mid A) P(A)$,
Then $P(A \mid B) = \frac{P(B \mid A) P(A)}{P(B)}$ (Bayes' rule).
a-posteriori pdf: the same pattern holds for densities, $f_{X \mid Y}(x \mid y) = \frac{f_{Y \mid X}(y \mid x)\, f_X(x)}{f_Y(y)}$.
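Bayes' rule can be verified exactly on a small discrete example; the fair die and the events $A$, $B$ below are illustrative choices, not from the notes.

```python
from fractions import Fraction

# Bayes' rule on a single fair die: A = "roll is even", B = "roll >= 4".
omega = range(1, 7)
P = lambda event: Fraction(sum(1 for x in omega if event(x)), 6)

A = lambda x: x % 2 == 0
B = lambda x: x >= 4

P_A = P(A)                                        # 1/2
P_B = P(B)                                        # 1/2
P_B_given_A = P(lambda x: A(x) and B(x)) / P_A    # P(B|A) = P(AB)/P(A)

# Bayes: P(A|B) = P(B|A) P(A) / P(B)
P_A_given_B = P_B_given_A * P_A / P_B
print(P_A_given_B)               # 2/3, i.e. events {4, 6} out of {4, 5, 6}
```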
Expected value of $X$: $E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\,dx$.
The variance $\sigma_X^2 = E[(X - E[X])^2]$ is the dispersion:
a measure of the spread of the observations around the mean.
Characteristic Function
$\Phi_X(\omega) = E[e^{j\omega X}] = \int_{-\infty}^{\infty} f_X(x)\, e^{j\omega x}\,dx$
Which means the moments can be obtained by differentiation: $E[X^n] = (-j)^n \left.\frac{d^n \Phi_X(\omega)}{d\omega^n}\right|_{\omega = 0}$.
Another example
We can deduce Chebyshev's inequality:
$P(|X - \mu| \ge \varepsilon) \le \frac{\sigma^2}{\varepsilon^2}$
Take the variance and keep only the tail of the integral:
$\sigma^2 = \int_{-\infty}^{\infty} (x - \mu)^2 f_X(x)\,dx \ge \int_{|x - \mu| \ge \varepsilon} (x - \mu)^2 f_X(x)\,dx \ge \varepsilon^2\, P(|X - \mu| \ge \varepsilon)$
Then the inequality follows after dividing by $\varepsilon^2$.
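Chebyshev's inequality can be checked empirically; the choice $X \sim \mathrm{Uniform}(0, 1)$ and the thresholds below are illustrative.

```python
import random

random.seed(1)

# Empirical check of Chebyshev: P(|X - mu| >= eps) <= sigma^2 / eps^2
# for X ~ Uniform(0, 1): mu = 1/2, sigma^2 = 1/12.
mu, var = 0.5, 1 / 12
samples = [random.random() for _ in range(100_000)]

for eps in (0.3, 0.4, 0.45):
    freq = sum(abs(x - mu) >= eps for x in samples) / len(samples)
    bound = var / eps**2
    print(f"eps={eps}: empirical {freq:.4f} <= bound {bound:.4f}")
    assert freq <= bound
```

The bound is loose here (e.g. the true tail probability at $\varepsilon = 0.3$ is $0.4$, against a bound of about $0.93$), which is typical: Chebyshev trades tightness for complete generality.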
Lecture 4
Characteristic Function
For Gaussian $X \sim N(\mu, \sigma^2)$:
$\Phi_X(\omega) = e^{j\mu\omega - \frac{1}{2}\sigma^2\omega^2}$
Specifically, for $\mu = 0$:
$\Phi_X(\omega) = e^{-\frac{1}{2}\sigma^2\omega^2}$
Then the moments of the Gaussian follow by differentiating at $\omega = 0$.
Leibnitz's Rule
$\frac{d}{dx} \int_{a(x)}^{b(x)} f(x, t)\,dt = f(x, b(x))\,b'(x) - f(x, a(x))\,a'(x) + \int_{a(x)}^{b(x)} \frac{\partial f(x, t)}{\partial x}\,dt$
Properties:
1.
2.
Obviously, there is a character
Lecture 5
A Function of Two Random Variables
Given $Z = g(X, Y)$ and the joint pdf $f_{XY}(x, y)$, find $f_Z(z)$.
If $Z = X + Y$:
$f_Z(z) = \int_{-\infty}^{\infty} f_{XY}(z - y, y)\,dy$
(differentiating the inner integral uses the Leibnitz rule).
Proof: $F_Z(z) = P(X + Y \le z) = \int_{-\infty}^{\infty} \int_{-\infty}^{z - y} f_{XY}(x, y)\,dx\,dy$; differentiating with respect to $z$ gives the formula above.
When the two random variables are independent, this becomes a convolution:
$f_Z(z) = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\,dy = (f_X * f_Y)(z)$
For example,
Another example, .
, assume as , as ,
If $X$ and $Y$ are jointly Gaussian, then $Z = aX + bY$ is again Gaussian.
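The convolution formula can be sketched numerically; taking $X, Y \sim \mathrm{Uniform}(0, 1)$ independent (an illustrative choice), $f_X * f_Y$ should reproduce the triangular density on $[0, 2]$.

```python
# Numerical check: for independent X, Y ~ Uniform(0, 1), the pdf of Z = X + Y
# is (f_X * f_Y)(z), the triangular density: z on [0,1], 2 - z on [1,2].
dt = 0.001
grid = [i * dt for i in range(int(1 / dt) + 1)]    # sample points on [0, 1]
f = [1.0 for _ in grid]                            # uniform pdf equals 1 there

def conv_at(z):
    # (f_X * f_Y)(z) = ∫ f_X(z - y) f_Y(y) dy, approximated by a Riemann sum
    total = 0.0
    for i, y in enumerate(grid):
        if 0 <= z - y <= 1:                        # f_X(z - y) = 1 inside [0, 1]
            total += f[i]
    return total * dt

for z in (0.25, 0.5, 1.0, 1.5):
    print(z, conv_at(z))       # should be close to the triangular density
```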
Lecture 6
Review
$f_{XY}(x, y)$ given; find $f_Z(z)$ for $Z = g(X, Y)$.
Lecture 7
Gaussian
$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x - \mu)^2}{2\sigma^2}}$
Joint Gaussian
For jointly Gaussian $X, Y$ with means $\mu_X, \mu_Y$, variances $\sigma_X^2, \sigma_Y^2$ and correlation coefficient $\rho$:
$f_{XY}(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \frac{(x - \mu_X)^2}{\sigma_X^2} - \frac{2\rho (x - \mu_X)(y - \mu_Y)}{\sigma_X \sigma_Y} + \frac{(y - \mu_Y)^2}{\sigma_Y^2} \right] \right\}$
If $\rho = 0$,
Since the exponent then separates into an $x$-part and a $y$-part,
then $f_{XY}(x, y) = f_X(x)\, f_Y(y)$: uncorrelated jointly Gaussian random variables are independent.
If $X$ and $Y$ are independent, then $E[XY] = E[X]E[Y]$,
Also, $\mathrm{Cov}(X, Y) = E[XY] - E[X]E[Y] = 0$,
then independence always implies uncorrelatedness.
But reversely, it doesn't apply, except when $X, Y$ are jointly Gaussian random variables.
Example:
Let $U = X + Y$ and $V = X - Y$, and assume $\sigma_X = \sigma_Y$; then $\mathrm{Cov}(U, V) = \sigma_X^2 - \sigma_Y^2 = 0$, so $U$ and $V$ are uncorrelated, hence independent.
Theorem: Any two linear combinations of two jointly Gaussian random variables are still jointly Gaussian random variables.
This is really strange, since $X, Y$ are not independent, but they are able to generate $U, V$ that are independent.
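This can be observed in simulation: build correlated, equal-variance jointly Gaussian $X, Y$ and check that $U = X + Y$ and $V = X - Y$ come out uncorrelated. The correlation $\rho = 0.7$ is an illustrative value.

```python
import random, math

random.seed(2)

# Correlated, equal-variance jointly Gaussian X, Y: Cov(U, V) = Var(X) - Var(Y) = 0,
# so U = X + Y and V = X - Y are independent (being jointly Gaussian).
rho, n = 0.7, 200_000
us, vs = [], []
for _ in range(n):
    g1, g2 = random.gauss(0, 1), random.gauss(0, 1)
    x = g1
    y = rho * g1 + math.sqrt(1 - rho**2) * g2   # Var(Y) = 1, Corr(X, Y) = rho
    us.append(x + y)
    vs.append(x - y)

mean = lambda s: sum(s) / len(s)
mu, mv = mean(us), mean(vs)
cov_uv = mean([(u - mu) * (v - mv) for u, v in zip(us, vs)])
print(f"Corr(X,Y) = {rho}, sample Cov(U,V) = {cov_uv:.4f}")   # near 0
```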
Assume
Let
Take as , as . Then
Joint Poisson
If $X \sim \mathrm{Poisson}(\lambda_1)$ and $Y \sim \mathrm{Poisson}(\lambda_2)$ are independent, consider $Z = X + Y$.
For $Z = n$,
$P(Z = n) = \sum_{k=0}^{n} P(X = k)\, P(Y = n - k) = \sum_{k=0}^{n} \frac{e^{-\lambda_1} \lambda_1^k}{k!} \cdot \frac{e^{-\lambda_2} \lambda_2^{n-k}}{(n-k)!} = e^{-(\lambda_1 + \lambda_2)} \frac{(\lambda_1 + \lambda_2)^n}{n!}$
Let $\lambda = \lambda_1 + \lambda_2$; then $Z \sim \mathrm{Poisson}(\lambda)$: the sum of independent Poisson random variables is Poisson.
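The fact that a sum of independent Poisson variables is again Poisson can be checked by convolving the pmfs numerically; the rates $\lambda_1 = 1.5$, $\lambda_2 = 2.5$ are illustrative.

```python
from math import exp, factorial

# Check that the convolution of Poisson(l1) and Poisson(l2) pmfs
# equals the Poisson(l1 + l2) pmf term by term.
def pois(lam, k):
    return exp(-lam) * lam**k / factorial(k)

l1, l2 = 1.5, 2.5
for n in range(10):
    conv = sum(pois(l1, k) * pois(l2, n - k) for k in range(n + 1))
    direct = pois(l1 + l2, n)
    assert abs(conv - direct) < 1e-12
print("P(X + Y = n) matches Poisson(l1 + l2) for n = 0..9")
```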
Central Limit Theorem: a large sum of independent random variables (with finite variance) behaves like a Gaussian.
Let $X_1, \dots, X_n$ be i.i.d. with mean $\mu$ and variance $\sigma^2$, and define
$Z_n = \frac{1}{\sqrt{n}\,\sigma} \sum_{i=1}^{n} (X_i - \mu)$
As $n \to \infty$, $Z_n \to N(0, 1)$ in distribution.
Proof: the characteristic function factors, $\Phi_{Z_n}(\omega) = \left[ \Phi_Y\!\left( \frac{\omega}{\sqrt{n}} \right) \right]^n$ with $Y = (X_i - \mu)/\sigma$.
When $n$ is large, a Taylor expansion gives $\Phi_Y(\omega/\sqrt{n}) \approx 1 - \frac{\omega^2}{2n}$.
Accordingly, we have $\Phi_{Z_n}(\omega) \to e^{-\omega^2/2}$ as $n \to \infty$.
Since $e^{-\omega^2/2}$ is the characteristic function of the standard Gaussian, $Z_n$ converges to a Gaussian distribution, which proves the central limit theorem.
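The CLT can be demonstrated with standardized sums of uniforms ($\mu = 1/2$, $\sigma^2 = 1/12$); $n = 30$ and the evaluation points are illustrative choices.

```python
import random, math

random.seed(3)

# CLT demo: standardized sums of i.i.d. Uniform(0, 1) variables,
# compared against the standard normal CDF.
def z_sample(n):
    s = sum(random.random() for _ in range(n))
    return (s - n * 0.5) / math.sqrt(n / 12)

normal_cdf = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))

n, trials = 30, 50_000
zs = [z_sample(n) for _ in range(trials)]
for z in (-1.0, 0.0, 1.0):
    emp = sum(zi <= z for zi in zs) / trials
    print(f"P(Z <= {z:+.1f}): empirical {emp:.3f}, Gaussian {normal_cdf(z):.3f}")
```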
Lecture 8
Conditional distributions and conditional p.d.f.s
$F_{X \mid B}(x) = P(X \le x \mid B) = \frac{P(\{X \le x\} \cap B)}{P(B)}$
Leads to the conditional pdf
$f_{X \mid Y}(x \mid y) = \frac{f_{XY}(x, y)}{f_Y(y)}$
Since $\int_{-\infty}^{\infty} f_{X \mid Y}(x \mid y)\,dx = 1$, this is a valid pdf.
Conditional expectation is
$E[X \mid Y = y] = \int_{-\infty}^{\infty} x\, f_{X \mid Y}(x \mid y)\,dx$
Which proves that the two random variables are actually independent, contrary to the intuitive feeling that they are dependent.
Two experiments:
What is the best estimator for the unknown parameter $\theta$? Collect data (design experiments): obtain $n$ independent observations $X_1, \dots, X_n$.
Assume the $X_i$ are i.i.d. with pdf $f(x; \theta)$.
Likelihood function:
$L(\theta) = \prod_{i=1}^{n} f(x_i; \theta)$
$\hat{\theta}$ is an unbiased estimate if $E[\hat{\theta}] = \theta$.
We can find $\hat{\theta}$ by maximizing the likelihood, i.e. solving $\frac{\partial \ln L(\theta)}{\partial \theta} = 0$.
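As a sketch of these ideas: for Gaussian data with known variance, the sample mean maximizes the likelihood and is unbiased, which a repeated-experiment simulation makes visible. The true $\theta = 2$, $\sigma = 1$, $n = 50$ are illustrative values.

```python
import random, statistics

random.seed(4)

# For Gaussian data with known variance, the MLE of the mean is the sample
# mean, and it is unbiased: E[theta_hat] = theta.
theta, sigma, n = 2.0, 1.0, 50    # illustrative true parameter and design

def mle_mean():
    xs = [random.gauss(theta, sigma) for _ in range(n)]
    return statistics.fmean(xs)   # maximizes the Gaussian likelihood

# Average the estimator over many repeated experiments.
estimates = [mle_mean() for _ in range(2000)]
print(f"theta = {theta}, mean of estimates = {statistics.fmean(estimates):.3f}")
```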