In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats, or hartleys) obtained about one random variable by observing the other. The concept of mutual information is intimately linked to that of the entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
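The link to entropy can be made explicit. Writing H(X) for the entropy of X, H(X | Y) for conditional entropy, and H(X, Y) for joint entropy, the mutual information satisfies the standard identities below (a reference sketch in conventional notation, not formulas quoted from this article):

```latex
\begin{align}
I(X;Y) &= H(X) - H(X \mid Y) \\
       &= H(Y) - H(Y \mid X) \\
       &= H(X) + H(Y) - H(X,Y)
\end{align}
```

Read this way, I(X;Y) is the reduction in uncertainty about X obtained by observing Y (and vice versa, since MI is symmetric).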
Not limited to real-valued random variables and linear dependence like the correlation coefficient, MI is more general: it determines how different the joint distribution of the pair (X, Y) is from the product of the marginal distributions of X and Y. MI is the expected value of the pointwise mutual information (PMI).
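To make "expected value of the PMI" concrete, here is a minimal sketch in Python with NumPy. The 2×2 joint table and the variable names (p_xy, p_x, p_y) are hypothetical assumptions of this sketch, chosen only for illustration:

```python
import numpy as np

# Hypothetical joint distribution of two binary variables X and Y
# (rows index x, columns index y); the values are illustrative only.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1, keepdims=True)  # marginal distribution of X
p_y = p_xy.sum(axis=0, keepdims=True)  # marginal distribution of Y

# Pointwise mutual information of each outcome pair, in bits
# (log base 2); no zero entries here, so the logs are well defined.
pmi = np.log2(p_xy / (p_x * p_y))

# Mutual information = expectation of the PMI under the joint.
mi = np.sum(p_xy * pmi)
print(mi)  # ~0.278 bits for this joint table
```

Because the joint table differs from the product of its marginals, the result is strictly positive; this expected-PMI computation is exactly the Kullback–Leibler divergence of the joint distribution from the product of the marginals.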
The quantity was defined and analyzed by Claude Shannon in his landmark 1948 paper "A Mathematical Theory of Communication", although he did not call it "mutual information". The term was coined later by Robert Fano.[2] Mutual information is also known as information gain.
^ Cover, Thomas M.; Thomas, Joy A. (2005). Elements of Information Theory (PDF). John Wiley & Sons, Ltd. pp. 13–55. ISBN 9780471748823.
^ Kreer, J. G. (1957). "A question of terminology". IRE Transactions on Information Theory. 3 (3): 208. doi:10.1109/TIT.1957.1057418.