In mathematical statistics, the Fisher information (sometimes simply called information[1]) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information.
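As an illustrative sketch (not part of the standard exposition), the two characterizations above can be checked numerically for a Bernoulli(p) model, where the Fisher information is known in closed form as I(p) = 1/(p(1 − p)); the parameter value p = 0.3 below is an arbitrary choice:

```python
import numpy as np

# Bernoulli(p) model: score is d/dp log f(x; p) = x/p - (1-x)/(1-p),
# and the closed-form Fisher information is I(p) = 1 / (p (1 - p)).
p = 0.3
x = np.array([0.0, 1.0])            # support of a Bernoulli variable
pmf = np.array([1 - p, p])          # probability of each outcome

# Fisher information as the variance of the score
score = x / p - (1 - x) / (1 - p)
var_score = np.sum(pmf * score**2) - np.sum(pmf * score) ** 2

# Fisher information as the expected observed information,
# i.e. minus the second derivative of the log-likelihood:
# -d^2/dp^2 log f = x/p^2 + (1-x)/(1-p)^2
obs_info = x / p**2 + (1 - x) / (1 - p) ** 2
expected_obs_info = np.sum(pmf * obs_info)

closed_form = 1.0 / (p * (1 - p))
print(var_score, expected_obs_info, closed_form)  # all equal 1/0.21
```

All three quantities agree, confirming that the variance of the score and the expected observed information coincide for this model.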
The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Sir Ronald Fisher (following some initial results by Francis Ysidro Edgeworth). The Fisher information matrix is used to calculate the covariance matrices associated with maximum-likelihood estimates. It can also be used in the formulation of test statistics, such as the Wald test.
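The asymptotic role mentioned above can also be sketched by simulation: for n i.i.d. Bernoulli(p) observations the maximum-likelihood estimate is the sample mean, and its variance is approximately I(p)⁻¹/n = p(1 − p)/n. The sample sizes and seed below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 1000, 20000

# MLE of a Bernoulli parameter is the sample mean; simulate it many times
p_hat = rng.binomial(n, p, size=reps) / n

empirical_var = p_hat.var()
asymptotic_var = p * (1 - p) / n     # inverse Fisher information over n

print(empirical_var, asymptotic_var)  # close agreement
```

The empirical variance of the estimator closely matches the inverse Fisher information divided by the sample size, as the asymptotic theory predicts.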
In Bayesian statistics, the Fisher information plays a role in the derivation of non-informative prior distributions according to Jeffreys' rule.[2] It also appears as the large-sample covariance of the posterior distribution, provided that the prior is sufficiently smooth (a result known as the Bernstein–von Mises theorem, which was anticipated by Laplace for exponential families).[3] The same result is used when approximating the posterior with Laplace's approximation, where the Fisher information appears as the covariance of the fitted Gaussian.[4]
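As a hedged sketch of Jeffreys' rule: the prior is taken proportional to the square root of (the determinant of) the Fisher information. For a Bernoulli(p) model this gives p^(−1/2)(1 − p)^(−1/2), i.e. the Beta(1/2, 1/2) distribution up to its normalizing constant, which the check below verifies at an arbitrary point:

```python
import numpy as np
from math import gamma

# Jeffreys' rule: prior proportional to sqrt(I(p)).
# For Bernoulli(p), I(p) = 1/(p(1-p)), so the unnormalized Jeffreys
# prior is p^(-1/2) (1-p)^(-1/2).
def jeffreys_unnormalized(p):
    return np.sqrt(1.0 / (p * (1.0 - p)))

# Normalized Beta(1/2, 1/2) density; its constant is 1/pi.
def beta_half_half(p):
    const = gamma(1.0) / (gamma(0.5) * gamma(0.5))
    return const * p**-0.5 * (1.0 - p)**-0.5

p = 0.25
ratio = jeffreys_unnormalized(p) / beta_half_half(p)
print(ratio)  # equals pi for every p, confirming proportionality
```

The ratio is constant in p, so the Jeffreys prior for a Bernoulli parameter is exactly the Beta(1/2, 1/2) distribution.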
Statistical systems of a scientific nature (physical, biological, etc.) whose likelihood functions obey shift invariance have been shown to obey the principle of maximum Fisher information.[5] The level of the maximum depends upon the nature of the system constraints.
[1] Lehmann & Casella, p. 115.
[2] Robert, Christian (2007). "Noninformative prior distributions". The Bayesian Choice (2nd ed.). Springer. pp. 127–141. ISBN 978-0-387-71598-8.
[3] Le Cam, Lucien (1986). Asymptotic Methods in Statistical Decision Theory. New York: Springer. pp. 618–621. ISBN 0-387-96307-3.
[4] Kass, Robert E.; Tierney, Luke; Kadane, Joseph B. (1990). "The Validity of Posterior Expansions Based on Laplace's Method". In Geisser, S.; Hodges, J. S.; Press, S. J.; Zellner, A. (eds.). Bayesian and Likelihood Methods in Statistics and Econometrics. Elsevier. pp. 473–488. ISBN 0-444-88376-2.
[5] Frieden & Gatenby (2013).