2021 Week 5 Chapter 3 Control
Introduction
$$z = x + v$$
where $z$ := the measurement.
- Known facts: the densities $f(z)$, $f(x, z)$, $f(x \mid z)$, ...
- Given a measurement $z$, what is an estimator $\hat{x}$ of $x$? For example, $\hat{x} = E[z]$, $\hat{x} = E[x \mid z]$, or others.
Kim: Example
If you measure your temperature, there are two random variables: $x$ = temperature, $v$ = measurement-sensor noise.
If you assume they are Gaussian, then $E[z] = E[x] + E[v]$. Or you may take the statistical (conditional) average $E[x \mid z]$.
How do we get this? Later we will show that, if they are Gaussian, then
$$\hat{x}_{MV} = \arg\min_{\hat{x}} E[(x - \hat{x})^2 \mid z].$$
https://en.wikipedia.org/wiki/Minimum_mean_square_error#:~:targetText=In%20statistics%20and%20signal%20processing,values%20of%20a%20dependent%20variable.
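For orientation, if $x$ and $v$ are independent Gaussians, the conditional mean has a well-known closed form (a standard result, stated here as a sketch; the means $\mu_x$, $\mu_v$ and variances $\sigma_x^2$, $\sigma_v^2$ are our notation, not the text's):
$$E[x \mid z] = \mu_x + \frac{\sigma_x^2}{\sigma_x^2 + \sigma_v^2}\,(z - \mu_x - \mu_v).$$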
Bayesian probability:
$$f(x \mid z) = \frac{f(x, z)}{f(z)} = \frac{f(z \mid x)\, f(x)}{f(z)}$$
$f(x \mid z)$ := the posterior PDF.
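As a concrete illustration of the Bayes rule above, here is a minimal grid-based sketch for the temperature example. The prior and noise parameters are illustrative assumptions, not values from the text, and it assumes `scipy` is available.

```python
import numpy as np
from scipy.stats import norm

# Grid-based Bayes rule f(x|z) = f(z|x) f(x) / f(z) for the model z = x + v.
# Assumed (illustrative): prior x ~ N(37.0, 0.5^2), noise v ~ N(0, 0.3^2).
xs = np.linspace(34.0, 40.0, 2001)
dx = xs[1] - xs[0]
prior = norm.pdf(xs, loc=37.0, scale=0.5)        # f(x)
z_obs = 37.8                                     # one observed measurement
like = norm.pdf(z_obs, loc=xs, scale=0.3)        # f(z|x): z - x = v ~ N(0, 0.3^2)

unnorm = like * prior
evidence = unnorm.sum() * dx                     # f(z) = integral of f(z|x) f(x) dx
posterior = unnorm / evidence                    # f(x|z), the posterior PDF

x_mean = (xs * posterior).sum() * dx             # conditional mean E[x|z]
print(f"E[x|z] ~= {x_mean:.3f}")                 # pulled from 37.0 toward 37.8
```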
%% Kim
%%%%
$$z = h(x) + \nu \qquad (3.1)$$
where we estimate the state $x \in \mathbb{R}^n$, $z$ is the measurement, $h$ is known, and $\nu$ is a noise random vector.
$$\hat{x}_{MV} = \arg\min_{\hat{x}} E[(x - \hat{x})^2 \mid z]$$
Remark 3.1 (argmin). The argmin returns the minimizing argument, not the minimum value: e.g., $\min_x (x + 2)^2 = 0$, while $\arg\min_x (x + 2)^2 = -2$.
The loss function $L$ is built from a function $\rho$ satisfying, for $0 \le \alpha \le 1$:
- $\rho(\alpha x + (1 - \alpha) y) \le \alpha \rho(x) + (1 - \alpha) \rho(y)$ (convexity)
- $L(0) = 0$
- $\rho(x_1) \ge \rho(x_2) \ge 0 \implies L(x_1) \ge L(x_2) \ge 0$
(Skip) Theorem 3.2 (Sherman's Theorem). Let $x$ be a random vector with mean $\mu$ and density $f_x(\cdot)$. Let $L(e)$, $e = x - \hat{x}$, be a loss function as defined above. If $f_x(\cdot)$ is symmetric about $\mu$ and unimodal (i.e., has only one peak), then $\hat{x} = \mu$ minimizes $E[L(e)]$.
Without an observation, the minimum variance estimator minimizes $A(a) = E[(x - a)^2]$ over the constant $a$.
Sol:
$$\frac{\partial A}{\partial a} = -2 E[x] + 2a = 0 \implies a = E[x]$$
Remark: Sherman's theorem is a generalization of the above estimator. There are two MVEs, with and without an observation; without an observation, the MVE is the plain expectation $E[x]$.
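A minimal numerical check of the derivative condition above, using an arbitrary (illustrative) non-Gaussian distribution to show the result does not depend on Gaussianity:

```python
import numpy as np

rng = np.random.default_rng(1)

# Without an observation, minimize A(a) = E[(x - a)^2] over the constant a.
# The condition dA/da = -2 E[x] + 2a = 0 gives a = E[x].
x = rng.exponential(scale=2.0, size=100_000)     # illustrative sample, E[x] = 2

a_grid = np.linspace(0.0, 5.0, 501)
mse = np.array([np.mean((x - a) ** 2) for a in a_grid])
a_best = a_grid[np.argmin(mse)]

print(f"sample mean of x    : {x.mean():.3f}")
print(f"argmin_a E[(x-a)^2] : {a_best:.3f}")     # agrees with E[x] up to grid step
```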
Theorem 3.6. Given equation (3.1), if the estimate is a function of $z$, then the minimum variance estimate is the conditional mean.
Then
$$\min_a E[(x - a)^2 \mid z] = \min_a \big\{ E[x^2 \mid z] - 2a\, E[x \mid z] + a^2 \big\},$$
and setting the derivative with respect to $a$ to zero gives $a = E[x \mid z]$.
Remarks: For $z = x + w$:
1) It means
$$\hat{x}_{MV} = \arg\min_{\hat{x}} E[(x - \hat{x})^2 \mid z] = E[x \mid z].$$
2) It is unbiased: $E[\hat{x}_{MV}] = E\big[\,E[x \mid z]\,\big] = E[x]$.
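A Monte Carlo sketch of both remarks, assuming independent Gaussians with illustrative parameters so that $E[x \mid z]$ has the linear closed form shown earlier:

```python
import numpy as np

rng = np.random.default_rng(2)

# For z = x + w with independent Gaussians (illustrative parameters), the
# conditional mean is linear: E[x|z] = mu_x + k (z - mu_x), k = sx^2/(sx^2+sw^2).
mu_x, sx, sw = 1.0, 2.0, 1.0
n = 500_000
x = rng.normal(mu_x, sx, n)
z = x + rng.normal(0.0, sw, n)

k = sx**2 / (sx**2 + sw**2)
x_hat = mu_x + k * (z - mu_x)                    # E[x|z] for this Gaussian model

# 1) Minimum variance: E[x|z] beats the other candidate estimators in MSE.
for name, est in [("E[x|z]", x_hat), ("z itself", z), ("E[x]", np.full(n, mu_x))]:
    print(f"MSE of {name:8s}: {np.mean((x - est) ** 2):.4f}")

# 2) Unbiased: the sample mean of x_hat matches the sample mean of x.
print(f"mean of x_hat = {x_hat.mean():.4f}, mean of x = {x.mean():.4f}")
```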
%% Kim, comment
It is important to see that the solution of a differential equation with a random initial condition is itself a random variable.
For example,
$$\frac{dx}{dt} = -2x, \qquad x(0) \text{ is a random variable.}$$
The solution is
$$x(t) = e^{-2t} x(0),$$
which is a random variable dependent on the initial random variable $x(0)$.
What is the MVE? Yes:
$$\hat{x}_{MV} = E[x(t) \mid x(0)] = e^{-2t} x(0) = x(t).$$
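A short sketch of this example; the initial distribution $x(0) \sim N(5, 1)$ is an illustrative assumption. Given $x(0)$, the trajectory is deterministic, so the conditional mean is the trajectory itself:

```python
import numpy as np

rng = np.random.default_rng(3)

# dx/dt = -2 x with a random initial condition x(0) ~ N(5, 1) (assumed).
# Exact solution: x(t) = exp(-2 t) x(0), one random value per draw of x(0).
x0 = rng.normal(5.0, 1.0, 10_000)
t = 0.7
x_exact = np.exp(-2.0 * t) * x0

# Cross-check the closed form with forward-Euler integration of the ODE.
dt = 1e-4
x_num = x0.copy()
for _ in range(int(t / dt)):
    x_num += dt * (-2.0 * x_num)

print(f"max |exact - Euler| = {np.max(np.abs(x_exact - x_num)):.2e}")
print(f"mean of x(t) = {x_exact.mean():.4f} vs exp(-2t) E[x(0)] = {np.exp(-2*t)*5.0:.4f}")
```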
The following example by M. Idan should be carefully understood. The main goal is to verify the convolution formula for the PDF of a sum of independent random variables.
Let two random variables $X$, $Y$ be independent with PDFs $f(x)$, $g(y)$. Define $Z = X + Y$. Find the PDF of $Z$.
Solution:
$$P(Z \le z) = \int_{-\infty}^{z} f_Z(u)\, du = \iint_{x + y \le z} f(x, y)\, dy\, dx = \int_{-\infty}^{\infty} \int_{-\infty}^{z - x} f(x, y)\, dy\, dx$$
$$= \int_{-\infty}^{\infty} f(x) \int_{-\infty}^{z - x} g(y)\, dy\, dx \qquad \because f(x, y) = f(x)\, g(y)$$
Now
$$f_Z(z) = \frac{d\, P(Z \le z)}{dz} = \frac{d}{dz} \int_{-\infty}^{\infty} f(x) \int_{-\infty}^{z - x} g(y)\, dy\, dx = \int_{-\infty}^{\infty} f(x)\, \frac{d}{dz} \int_{-\infty}^{z - x} g(y)\, dy\, dx$$
$$= \int_{-\infty}^{\infty} f(x)\, g(z - x)\, dx = \int_{-\infty}^{\infty} f(z - y)\, g(y)\, dy.$$
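The convolution formula can be checked numerically by a discrete Riemann-sum convolution. This sketch uses the uniform densities of the example below (an illustrative choice):

```python
import numpy as np

# Numerical check of f_Z(z) = integral of f(x) g(z - x) dx via np.convolve,
# with f and g both the Uniform(0, 1) density.
dx = 1e-3
grid = np.arange(0.0, 1.0, dx)
f = np.ones_like(grid)                 # f(x) = 1 on [0, 1)
g = np.ones_like(grid)                 # g(y) = 1 on [0, 1)

f_z = np.convolve(f, g) * dx           # Riemann-sum approximation of the integral
zs = np.arange(f_z.size) * dx          # support of Z = X + Y starts at 0

for z0 in (0.5, 1.0, 1.5):
    i = int(round(z0 / dx))
    print(f"f_Z({z0}) ~= {f_z[i]:.3f}")  # expect 0.5, 1.0, 0.5 (triangular density)
```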
Example:
$$f(x) = g(y) = \begin{cases} 1, & 0 \le x \le 1 \ (\text{resp. } 0 \le y \le 1) \\ 0, & \text{otherwise} \end{cases} \qquad (a.1)$$
$$f_Z(z) = \int_{-\infty}^{\infty} f(x)\, g(z - x)\, dx$$
Graphical evaluation of
$$f_Z(z) = \int_{-\infty}^{\infty} f(x)\, g(z - x)\, dx:$$
[Figure: sketches of $f(x)$, $g(x)$, the reflection $g(-x)$, and the shifted window $g(-x + z)$, which occupies $-1 + z \le x \le z$.]
1. Sketch $g(-x)$ by reflecting $g(x)$ about the vertical axis.
2. Shift the reflection by $z$ to obtain the sliding window $g(-x + z) = g(z - x)$.
3. Increase $z$ from negative infinity to positive infinity to integrate the convolution integral:
- For $z \le 0$ or $z \ge 2$ the window does not overlap $f(x)$, so $f_Z(z) = \int 0\, dx = 0$.
- For $0 \le z \le 1$: $f_Z(z) = \int_0^z 1\, dx = z$.
- For $1 \le z \le 2$: $f_Z(z) = \int_{-1+z}^{1} 1\, dx = 2 - z$.
In conclusion,
$$f_Z(z) = \begin{cases} z, & 0 \le z \le 1 \\ 2 - z, & 1 \le z \le 2 \\ 0, & \text{otherwise.} \end{cases}$$
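A Monte Carlo sketch confirming the conclusion: a histogram of sampled sums should match the triangular density derived above.

```python
import numpy as np

rng = np.random.default_rng(4)

# Z = X + Y with X, Y i.i.d. Uniform(0, 1) has the triangular density
# f_Z(z) = z on [0,1], 2 - z on [1,2], 0 otherwise.
n = 1_000_000
z = rng.uniform(0.0, 1.0, n) + rng.uniform(0.0, 1.0, n)

edges = np.linspace(0.0, 2.0, 41)
hist, _ = np.histogram(z, bins=edges, density=True)   # empirical density
centers = 0.5 * (edges[:-1] + edges[1:])
f_true = np.where(centers <= 1.0, centers, 2.0 - centers)

print(f"max |histogram - triangle| = {np.max(np.abs(hist - f_true)):.3f}")
```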
%%%%%%