Lehmann–Scheffé theorem
In statistics, the Lehmann–Scheffé theorem is a prominent statement tying together the ideas of completeness, sufficiency, uniqueness, and best unbiased estimation.[1] The theorem states that any estimator that is unbiased for a given unknown quantity and depends on the data only through a complete, sufficient statistic is the unique best unbiased estimator of that quantity. The theorem is named after Erich Leo Lehmann and Henry Scheffé, who established it in two early papers.[2][3]
If T is a complete sufficient statistic for θ and E(g(T)) = τ(θ) then g(T) is the uniformly minimum-variance unbiased estimator (UMVUE) of τ(θ).
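As a standard textbook illustration (not drawn from the cited papers), take an i.i.d. Poisson sample; the following LaTeX sketch instantiates T, g, and τ:

```latex
% Standard textbook illustration (assumed example):
% X_1, \dots, X_n \ \text{i.i.d.} \sim \operatorname{Poisson}(\theta).
\[
  T = \sum_{i=1}^{n} X_i \ \text{is complete and sufficient for } \theta,
  \qquad
  \operatorname{E}\!\left[ g(T) \right]
  = \operatorname{E}\!\left[ \tfrac{T}{n} \right]
  = \frac{n\theta}{n} = \theta = \tau(\theta),
\]
\[
  \text{so } g(T) = \bar{X} \text{ is the UMVUE of } \theta
  \text{ by the Lehmann--Scheff\'e theorem.}
\]
```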
Statement
Let $\vec{X} = (X_1, X_2, \dots, X_n)$ be a random sample from a distribution that has p.d.f. (or p.m.f. in the discrete case) $f(x; \theta)$, where $\theta \in \Omega$ is a parameter in the parameter space. Suppose $Y = u(\vec{X})$ is a sufficient statistic for $\theta$, and let $\{ f_Y(y; \theta) : \theta \in \Omega \}$ be a complete family. If $\operatorname{E}[\varphi(Y)] = \theta$, then $\varphi(Y)$ is the unique MVUE of $\theta$.
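To make the statement concrete, here is a minimal worked instance in the notation above (a standard Bernoulli example, assumed rather than taken from the sources):

```latex
% Minimal worked instance (assumed example):
% X_1, \dots, X_n \ \text{i.i.d.} \sim \operatorname{Bernoulli}(\theta),
% \ \theta \in \Omega = (0, 1).
\[
  Y = u(\vec{X}) = \sum_{i=1}^{n} X_i \sim \operatorname{Binomial}(n, \theta),
\]
% Y is sufficient for \theta and the binomial family is complete, so with
\[
  \varphi(Y) = \frac{Y}{n}, \qquad \operatorname{E}[\varphi(Y)] = \theta,
\]
% the theorem gives that \varphi(Y) = \bar{X} is the unique MVUE of \theta.
```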
Proof
By the Rao–Blackwell theorem, if $Z$ is an unbiased estimator of $\theta$, then $\varphi(Y) := \operatorname{E}[Z \mid Y]$ defines an unbiased estimator of $\theta$ with the property that its variance is not greater than that of $Z$.

Now we show that this function is unique. Suppose $W$ is another candidate MVUE of $\theta$. Then again $\psi(Y) := \operatorname{E}[W \mid Y]$ defines an unbiased estimator of $\theta$ with the property that its variance is not greater than that of $W$. Then

$$\operatorname{E}_\theta[\varphi(Y) - \psi(Y)] = 0, \qquad \theta \in \Omega.$$

Since $\{ f_Y(y; \theta) : \theta \in \Omega \}$ is a complete family,

$$\operatorname{E}_\theta[\varphi(Y) - \psi(Y)] = 0 \implies \varphi(y) = \psi(y) \ \text{for almost all } y, \qquad \theta \in \Omega,$$

and therefore $\varphi$ is the unique function of $Y$ with variance not greater than that of any other unbiased estimator. We conclude that $\varphi(Y)$ is the MVUE.
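The conditioning step $\operatorname{E}[Z \mid Y]$ in the proof can also be seen numerically. Below is a minimal Monte Carlo sketch in Python (a standard Poisson example, assumed for illustration): the naive unbiased estimator 1{X₁ = 0} of τ(θ) = e^{−θ} is conditioned on the complete sufficient statistic T = ΣXᵢ, giving φ(T) = (1 − 1/n)^T, which by the theorem is the UMVUE.

```python
import numpy as np

# Monte Carlo illustration of the Rao-Blackwell step in the proof
# (an assumed example, not from the source): estimate tau(theta) = exp(-theta)
# = P(X = 0) from an i.i.d. Poisson(theta) sample.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 100_000

x = rng.poisson(theta, size=(reps, n))
t = x.sum(axis=1)                  # T = sum of X_i: complete sufficient statistic

z = (x[:, 0] == 0).astype(float)   # naive unbiased estimator Z = 1{X_1 = 0}
phi = (1.0 - 1.0 / n) ** t         # phi(T) = E[Z | T]: the Rao-Blackwellized
                                   # estimator, hence the UMVUE by Lehmann-Scheffe

print(f"target tau(theta): {np.exp(-theta):.4f}")
print(f"mean of Z        : {z.mean():.4f}  (variance {z.var():.5f})")
print(f"mean of phi(T)   : {phi.mean():.4f}  (variance {phi.var():.5f})")
# Both estimators are unbiased; phi(T) has strictly smaller variance.
```

On a run of this sketch, both sample means agree with e^{−θ} up to Monte Carlo error, while the variance of φ(T) is markedly smaller than that of Z, as the Rao–Blackwell theorem guarantees.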
See also
Rao–Blackwell theorem
References
1. Casella, George; Berger, Roger L. (2001). Statistical Inference (2nd ed.). Duxbury Press.
2. Lehmann, E. L.; Scheffé, H. (1950). "Completeness, similar regions, and unbiased estimation. I". Sankhyā. 10: 305–340.
3. Lehmann, E. L.; Scheffé, H. (1955). "Completeness, similar regions, and unbiased estimation. II". Sankhyā. 15: 219–236.