In information geometry, the Fisher information metric[1] is a particular Riemannian metric which can be defined on a smooth statistical manifold, i.e., a smooth manifold whose points are probability measures defined on a common probability space. It can be used to quantify the informational difference between nearby probability distributions.
The metric is interesting in several respects. By Chentsov's theorem, the Fisher information metric on statistical models is the only Riemannian metric (up to rescaling) that is invariant under sufficient statistics.[2][3]
It can also be understood as the infinitesimal form of the relative entropy (i.e., the Kullback–Leibler divergence); specifically, it is the Hessian of the divergence. Alternatively, it can be understood as the metric induced by the flat-space Euclidean metric, after appropriate changes of variable. When extended to complex projective Hilbert space, it becomes the Fubini–Study metric; when written in terms of mixed states, it is the quantum Bures metric.
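The relationship to relative entropy can be checked directly in the one-parameter Bernoulli family, where the Fisher information has the closed form 1/(p(1 − p)). The sketch below (the function names and the test point p = 0.3 are illustrative choices, not from the source) compares that closed form against a finite-difference Hessian of the KL divergence:

```python
import math

def kl_bernoulli(p, q):
    """Kullback-Leibler divergence D(Bern(p) || Bern(q))."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def fisher_bernoulli(p):
    """Closed-form Fisher information of the Bernoulli family: 1 / (p(1-p))."""
    return 1.0 / (p * (1.0 - p))

# Second derivative of D(p || q) in q, evaluated at q = p,
# approximated by a central difference:
p, h = 0.3, 1e-4
hessian = (kl_bernoulli(p, p + h) - 2 * kl_bernoulli(p, p)
           + kl_bernoulli(p, p - h)) / h**2

print(fisher_bernoulli(p))  # 1 / 0.21 = 4.7619...
print(hessian)              # numerically close to the same value
```

Because D(p ‖ q) vanishes to second order at q = p, its Hessian there is exactly the Fisher information, which is what the two printed values illustrate.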
Considered purely as a matrix, it is known as the Fisher information matrix. Considered as a measurement technique, where it is used to estimate hidden parameters in terms of observed random variables, it is known as the observed information.
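For a concrete sense of the matrix and its estimation-theoretic role, the sketch below (sample data and helper name are hypothetical, not from the source) computes the observed information for an i.i.d. Bernoulli sample as the negative second derivative of the log-likelihood; at the maximum-likelihood estimate this coincides with n times the Fisher information of a single observation:

```python
def observed_information(xs, p):
    """Observed information: negative second derivative of the
    log-likelihood of an i.i.d. Bernoulli(p) sample, at parameter p.

    log L(p) = sum_i [x_i log p + (1 - x_i) log(1 - p)]
    -d^2/dp^2 log L(p) = sum_i [x_i / p^2 + (1 - x_i) / (1 - p)^2]
    """
    return sum(x / p**2 + (1 - x) / (1 - p)**2 for x in xs)

xs = [1, 0, 0, 1, 1, 0, 1, 0, 0, 0]   # hypothetical sample of size 10
p_hat = sum(xs) / len(xs)             # maximum-likelihood estimate, 0.4

# At the MLE this equals n / (p_hat (1 - p_hat)) = 10 / 0.24 = 41.67
print(observed_information(xs, p_hat))
```

In the one-parameter case the "matrix" is a scalar; for a multi-parameter family the same construction yields the full Fisher information matrix of second-order partial derivatives.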
^Nielsen, Frank (2023). "A Simple Approximation Method for the Fisher–Rao Distance between Multivariate Normal Distributions". Entropy. 25 (4): 654. arXiv:2302.08175. Bibcode:2023Entrp..25..654N. doi:10.3390/e25040654. PMC 10137715. PMID 37190442.
^Amari, Shun-ichi; Nagaoka, Hiroshi (2000). "Chentsov's theorem and some historical remarks". Methods of Information Geometry. New York: Oxford University Press. pp. 37–40. ISBN 0-8218-0531-2.
^Dowty, James G. (2018). "Chentsov's theorem for exponential families". Information Geometry. 1 (1): 117–135. arXiv:1701.08895. doi:10.1007/s41884-018-0006-4. S2CID 5954036.