Basu's theorem

In statistics, Basu's theorem states that any boundedly complete sufficient statistic is independent of any ancillary statistic. This is a 1955 result of Debabrata Basu.[1]

It is often used in statistics as a tool for proving that two statistics are independent: one first shows that one of them is complete sufficient and the other is ancillary, and then appeals to the theorem.[2] An example is the proof that the sample mean and sample variance of a normal distribution are independent statistics, given in the Example section below. This property (independence of sample mean and sample variance) in fact characterizes normal distributions.

Statement

Let P_θ, θ ∈ Θ, be a family of distributions on a measurable space (X, Σ), and let T and A be statistics on X. If T is a boundedly complete sufficient statistic for θ, and A is ancillary to θ, then T is independent of A under every P_θ.
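Here independence means that, under every P_θ, the joint distribution of (T, A) factors into the product of the marginal distributions. In the notation of the proof below (with P_θ^T the distribution of T and P^A the θ-free distribution of A), the conclusion can be written as

P_\theta(T \in C, \, A \in B) = P_\theta^T(C) \, P^A(B) \quad \text{for all measurable sets } B, C \text{ and all } \theta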

Proof

Let P_θ^T and P_θ^A denote the marginal distributions of T and A respectively, and let B be a measurable set in the range space of A. Then

P_\theta^A(B) = P_\theta(A^{-1}B) = \int_{T(X)} P_\theta(A^{-1}B \mid T=t) \, P_\theta^T(dt)

The marginal distribution P_θ^A does not depend on θ because A is ancillary; likewise, the conditional distribution P_θ(· | T = t) does not depend on θ because T is sufficient, so both can be written without the subscript, as P^A and P(· | T = t). Since P_θ^T has total mass one, P^A(B) also equals the integral of the constant P^A(B) against P_θ^T; subtracting this from the identity above gives, for every θ:

\int_{T(X)} \big[ P(A^{-1}B \mid T=t) - P^A(B) \big] \, P_\theta^T(dt) = 0

The integrand (the bracketed expression) is a bounded measurable function of t that does not depend on θ, and its integral against P_θ^T is zero for every θ. Since T is boundedly complete, the integrand must therefore vanish almost surely:

P(A^{-1}B \mid T=t) = P^A(B) \quad \text{for } P_\theta^T\text{-almost all } t

Therefore, A is independent of T.
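Spelled out, the displayed identity gives the factorization directly: for any measurable set C in the range of T and any θ,

P_\theta\big(T^{-1}C \cap A^{-1}B\big) = \int_{C} P(A^{-1}B \mid T=t) \, P_\theta^T(dt) = P_\theta^T(C) \, P^A(B)

so the joint probability of any pair of events determined by T and by A factors into the product of their marginal probabilities under every P_θ.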

Example

Independence of sample mean and sample variance of a normal distribution

Let X_1, X_2, ..., X_n be independent, identically distributed normal random variables with mean μ and variance σ².

Then, with σ² treated as known and μ as the unknown parameter, one can show that

\widehat{\mu}=\frac{\sum X_i}{n},

the sample mean, is a complete and sufficient statistic for μ – it carries all of the information about μ that the sample contains, and no more – and

\widehat{\sigma}^2=\frac{\sum \left(X_i-\widehat{\mu}\right)^2}{n-1},

the sample variance, is an ancillary statistic – its distribution does not depend on μ.
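A quick way to see the ancillarity (a standard location-shift argument, with σ² held fixed as above) is to write X_i = μ + σZ_i, where Z_1, ..., Z_n are independent standard normal variables and Z̄ denotes their average. The sample mean is then μ + σZ̄, and

\widehat{\sigma}^2 = \frac{\sum \left( (\mu + \sigma Z_i) - (\mu + \sigma \bar{Z}) \right)^2}{n-1} = \sigma^2 \, \frac{\sum \left( Z_i - \bar{Z} \right)^2}{n-1},

which involves only σ and the Z_i, so its distribution does not depend on μ.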

Therefore, from Basu's theorem it follows that these statistics are independent.

This independence result can also be proven by Cochran's theorem.
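As an informal numerical check, the independence can also be illustrated by simulation. The following sketch (in Python with NumPy, using the illustrative values μ = 1, σ = 2 and n = 10) computes the sample mean and sample variance of many simulated normal samples and estimates their correlation, which, as one consequence of independence, should be close to zero up to Monte Carlo error.

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma = 1.0, 2.0    # illustrative parameter values
    n, reps = 10, 100000    # sample size and number of simulated samples

    # Draw `reps` independent samples of size n from N(mu, sigma^2).
    samples = rng.normal(mu, sigma, size=(reps, n))

    # Sample mean and unbiased sample variance of each simulated sample.
    means = samples.mean(axis=1)
    variances = samples.var(axis=1, ddof=1)

    # Basu's theorem implies the two statistics are independent, so their
    # sample correlation should be near zero (up to Monte Carlo error).
    print(np.corrcoef(means, variances)[0, 1])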

Further, this property (that the sample mean and sample variance of the normal distribution are independent) characterizes the normal distribution – no other distribution has this property.[3]

Notes

  1. Basu (1955)

References

  • Basu, D. (1955). "On Statistics Independent of a Complete Sufficient Statistic". Sankhyā: The Indian Journal of Statistics. 15: 377–380.
  • Mukhopadhyay, Nitis (2000). Probability and Statistical Inference. Statistics: A Series of Textbooks and Monographs. 162. Florida: CRC Press USA. ISBN 0-8247-0379-0.