Observed information
In statistics, the observed information, or observed Fisher information, is the negative of the second derivative (the Hessian matrix) of the log-likelihood (the logarithm of the likelihood function). It is a sample-based version of the Fisher information.
Definition
Suppose we observe random variables $X_1, \ldots, X_n$, independent and identically distributed with density $f(X; \theta)$, where $\theta$ is a (possibly unknown) vector. Then the log-likelihood of the parameters $\theta$ given the data $X_1, \ldots, X_n$ is
- $\ell(\theta \mid X_1, \ldots, X_n) = \sum_{i=1}^{n} \log f(X_i ; \theta)$.
We define the observed information matrix at $\theta^{*}$ as
- $\mathcal{J}(\theta^{*}) = -\left.\nabla\nabla^{\top}\ell(\theta)\right|_{\theta=\theta^{*}}$,
the negative of the Hessian of the log-likelihood evaluated at $\theta^{*}$.
In many instances, the observed information is evaluated at the maximum-likelihood estimate.[1]
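As a concrete illustration (an assumed exponential example, not taken from the sources above), the observed information can be computed either from the closed-form second derivative or numerically; the following Python sketch does both and checks that they agree.

```python
# Minimal sketch, assuming an i.i.d. Exponential(rate = lam) sample:
# the observed information is minus the second derivative of the
# log-likelihood, computed here in closed form and by finite differences.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=500)   # simulated data, true rate 0.5

def loglik(lam):
    # log-likelihood: n log(lam) - lam * sum(x)
    return x.size * np.log(lam) - lam * x.sum()

lam_hat = 1.0 / x.mean()                   # maximum-likelihood estimate

# Central finite difference for the second derivative at the MLE.
h = 1e-4
J_numeric = -(loglik(lam_hat + h) - 2 * loglik(lam_hat) + loglik(lam_hat - h)) / h**2

# Closed form: -l''(lam) = n / lam^2.
J_exact = x.size / lam_hat**2

print(J_numeric, J_exact)                  # should agree to several digits
```

For a vector parameter the same idea applies to the full Hessian; many optimization routines can return an approximate Hessian at the optimum, which (negated, when a log-likelihood is maximized) plays the role of $\mathcal{J}$.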
Fisher information
The Fisher information $\mathcal{I}(\theta)$ is the expected value of the observed information given a single observation $X$ distributed according to the hypothetical model with parameter $\theta$:
- $\mathcal{I}(\theta) = \mathrm{E}\left(\mathcal{J}(\theta)\right)$.
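To see this definition in action, consider a hypothetical Bernoulli($p$) model (an assumption for this sketch, not an example from the sources): the observed information of a single observation $x$ is $x/p^2 + (1-x)/(1-p)^2$, and averaging it over many draws recovers the Fisher information $1/(p(1-p))$.

```python
# Minimal sketch, assuming a Bernoulli(p) model: the Fisher information
# is the expectation of the observed information for one observation,
# estimated here by Monte Carlo averaging.
import numpy as np

p = 0.3
rng = np.random.default_rng(1)
x = rng.binomial(1, p, size=200_000)       # many single observations

# Observed information of one observation x at p:
# -(d^2/dp^2)[x log p + (1 - x) log(1 - p)] = x/p^2 + (1 - x)/(1 - p)^2
J = x / p**2 + (1 - x) / (1 - p) ** 2

print(J.mean())            # Monte Carlo estimate, about 4.76
print(1 / (p * (1 - p)))   # exact Fisher information 1/(p(1-p))
```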
Applications
In a notable article, Bradley Efron and David V. Hinkley[2] argued that the observed information should be used in preference to the expected information when employing normal approximations for the distribution of maximum-likelihood estimates.
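As a sketch of what this means in practice (using a hypothetical Cauchy location model, not an example from the article itself): in the canonical parametrization of a full exponential family the observed information does not depend on the data, so the distinction only arises in models such as the Cauchy, where Efron and Hinkley's recommendation amounts to using $1/\sqrt{\mathcal{J}(\hat\theta)}$ rather than $1/\sqrt{\mathcal{I}(\hat\theta)}$ as the standard error.

```python
# Minimal sketch, assuming a Cauchy location model with n observations:
# the observed information J(theta_hat) is data-dependent and generally
# differs from the expected (Fisher) information n/2.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
x = rng.standard_cauchy(size=200) + 1.0      # true location = 1

def negloglik(theta):
    # negative log-likelihood, up to an additive constant
    return np.log1p((x - theta) ** 2).sum()

theta_hat = minimize_scalar(negloglik, bracket=(0.0, 2.0)).x

# Closed-form observed information for the Cauchy location family:
# J(theta) = sum 2 (1 - r^2) / (1 + r^2)^2, with r = x - theta.
r = x - theta_hat
J = (2 * (1 - r**2) / (1 + r**2) ** 2).sum()

se_observed = 1 / np.sqrt(J)                 # standard error from J(theta_hat)
se_expected = 1 / np.sqrt(x.size / 2)        # from Fisher information n/2
print(se_observed, se_expected)              # close, but generally unequal
```

The first of these standard errors is the one Efron and Hinkley argue better reflects the precision of the estimate for the data actually observed.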
References
- ↑ Dodge, Y. (2003). The Oxford Dictionary of Statistical Terms. OUP. ISBN 0-19-920613-9.
- ↑ Efron, B.; Hinkley, D. V. (1978). "Assessing the accuracy of the maximum likelihood estimator: Observed versus expected Fisher information". Biometrika. 65 (3): 457–487. doi:10.1093/biomet/65.3.457.