Chapter 6
For a random sample from a normal distribution, the sample mean $\hat{\mu} = \bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ and $\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2$ are the maximum likelihood estimators of μ and σ², respectively. A natural question, then, is whether or not these estimators are "good" in any sense. One measure of "good" is "unbiasedness": an estimator $\hat{\theta}$ is unbiased for θ if $E(\hat{\theta}) = \theta$.

Example No. 1: If $X_i$ is a Bernoulli random variable with parameter p, then the maximum likelihood estimator of p is the sample mean, $\hat{p} = \frac{1}{n}\sum_{i=1}^{n} X_i$, and

$$E(\hat{p}) \;=\; E\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right) \;=\; \frac{1}{n}\sum_{i=1}^{n} E(X_i) \;=\; \frac{1}{n}(np) \;=\; p,$$

so $\hat{p}$ is an unbiased estimator of p.
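As a quick numerical check of this unbiasedness claim, the following is a minimal simulation sketch; it assumes NumPy is available, and the sample size, parameter value, and replication count are illustrative choices rather than values from the text.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p, reps = 25, 0.3, 100_000   # illustrative sample size, parameter, and replication count

    # Each row is one sample of n Bernoulli(p) observations.
    samples = rng.binomial(1, p, size=(reps, n))

    # The maximum likelihood estimator of p for each sample is the sample mean.
    p_hat = samples.mean(axis=1)

    # Averaging the estimator across many samples approximates E(p_hat);
    # unbiasedness predicts this is close to the true p.
    print(p_hat.mean(), "vs true p =", p)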
What about the maximum likelihood estimator of σ²? Recall two facts: if the $X_i$ are normally distributed, then $(n-1)S^2/\sigma^2$, where $S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \bar{X})^2$ is the sample variance, follows a chi-square distribution with $n-1$ degrees of freedom; and if X follows a chi-square distribution with r degrees of freedom, then E(X) = r. Therefore:

$$E(\hat{\sigma}^2) \;=\; E\!\left[\frac{\sigma^2}{n}\cdot\frac{(n-1)S^2}{\sigma^2}\right] \;=\; \frac{\sigma^2}{n}\,E\!\left[\frac{(n-1)S^2}{\sigma^2}\right] \;=\; \frac{\sigma^2}{n}\,(n-1).$$

The first equality holds because we effectively multiplied the sample variance by 1. The second equality holds by the law of expectation that tells us we can pull a constant through the expectation. The third equality holds because of the two facts we recalled above. That is:

$$E(\hat{\sigma}^2) \;=\; \frac{n-1}{n}\,\sigma^2 \;\neq\; \sigma^2,$$

so the maximum likelihood estimator of σ² is a biased estimator.
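The bias factor of (n − 1)/n can likewise be seen in a small simulation sketch; NumPy is again assumed, and the parameter values below are illustrative only.

    import numpy as np

    rng = np.random.default_rng(0)
    n, mu, sigma, reps = 10, 0.0, 2.0, 100_000   # illustrative values

    samples = rng.normal(mu, sigma, size=(reps, n))

    # MLE of sigma^2 divides by n (ddof=0); the sample variance S^2 divides by n - 1 (ddof=1).
    sigma2_mle = samples.var(axis=1, ddof=0)
    s2 = samples.var(axis=1, ddof=1)

    print(sigma2_mle.mean())           # close to (n - 1)/n * sigma^2 = 3.6 (biased)
    print(s2.mean())                   # close to sigma^2 = 4.0 (unbiased)
    print((n - 1) / n * sigma**2)      # the predicted value of E(sigma2_mle)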
Let’s calculate the standard error of the sample mean estimator. Since the $X_i$ are independent with common variance σ²,

$$\mathrm{Var}(\bar{X}) \;=\; \mathrm{Var}\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right) \;=\; \frac{1}{n^2}\sum_{i=1}^{n}\mathrm{Var}(X_i) \;=\; \frac{\sigma^2}{n},$$

so the standard error of $\bar{X}$ is $\sigma/\sqrt{n}$.

Exhibit 4.2: PDFs are indicated for two estimators of a parameter θ. One is unbiased. The other is biased but has lower standard error.
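Before turning to mean squared error, here is a simulation sketch that checks the σ/√n formula for the standard error of the sample mean; NumPy is assumed, and the chosen values of n, μ, σ, and the replication count are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    n, mu, sigma, reps = 30, 5.0, 2.0, 100_000   # illustrative values

    samples = rng.normal(mu, sigma, size=(reps, n))
    xbar = samples.mean(axis=1)          # one sample mean per replication

    # The spread of the sample mean across replications approximates its standard error,
    # which should match sigma / sqrt(n).
    print(xbar.std(ddof=1))
    print(sigma / np.sqrt(n))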
Mean squared error (MSE) combines the
notions of bias and standard error. It is defined
as: