Mutual information


Venn diagram showing additive and subtractive relationships among various information measures associated with correlated variables X and Y.[1] The area contained by both circles is the joint entropy H(X,Y). The circle on the left (red and violet) is the individual entropy H(X), with the red being the conditional entropy H(X|Y). The circle on the right (blue and violet) is H(Y), with the blue being H(Y|X). The violet is the mutual information I(X;Y).

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats, or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of the entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
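
In terms of the entropies pictured in the Venn diagram above, this link can be stated explicitly through the standard identities:

I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X,Y)

so the information gained about X by observing Y is exactly the reduction in uncertainty from H(X) to H(X|Y).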

Not limited to real-valued random variables and linear dependence like the correlation coefficient, MI is more general and determines how different the joint distribution of the pair (X, Y) is from the product of the marginal distributions of X and Y. MI is the expected value of the pointwise mutual information (PMI).
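
To make the definition concrete, here is a minimal Python/NumPy sketch (the joint table is invented purely for illustration) that computes MI as the expected value of the PMI, equivalently the Kullback–Leibler divergence of the joint distribution from the product of the marginals:

```python
import numpy as np

# Joint distribution of two binary variables X and Y (rows: x, columns: y).
# These probabilities are made up for illustration.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1, keepdims=True)  # marginal distribution of X
p_y = p_xy.sum(axis=0, keepdims=True)  # marginal distribution of Y

# Pointwise mutual information of each outcome pair, in bits.
pmi = np.log2(p_xy / (p_x * p_y))

# MI is the expectation of the PMI under the joint distribution.
mi = np.sum(p_xy * pmi)
print(f"I(X;Y) = {mi:.4f} bits")  # about 0.278 bits for this table
```

A table containing zeros would need the usual 0 log 0 = 0 convention; this one avoids zeros to keep the sketch short.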

The quantity was defined and analyzed by Claude Shannon in his landmark paper "A Mathematical Theory of Communication", although he did not call it "mutual information". This term was coined later by Robert Fano.[2] Mutual information is also known as information gain.

  1. ^ Cover, Thomas M.; Thomas, Joy A. (2005). Elements of information theory (PDF). John Wiley & Sons, Ltd. pp. 13–55. ISBN 9780471748823.
  2. ^ Kreer, J. G. (1957). "A question of terminology". IRE Transactions on Information Theory. 3 (3): 208. doi:10.1109/TIT.1957.1057418.

23 related results for: Mutual information

Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two...

Word Count : 8693

Pointwise mutual information

statistics, probability theory and information theory, pointwise mutual information (PMI), or point mutual information, is a measure of association. It...

Word Count : 1692
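
A minimal sketch of the pointwise quantity this entry refers to, with invented word-pair probabilities: PMI compares the observed co-occurrence probability of one specific outcome pair with what independence would predict.

```python
import math

# Hypothetical corpus statistics, invented for illustration.
p_xy = 0.01    # probability of seeing the two words together
p_x = 0.02     # probability of the first word
p_y = 0.012    # probability of the second word

# PMI is the log ratio of observed co-occurrence to the independence baseline.
pmi = math.log2(p_xy / (p_x * p_y))
print(f"PMI = {pmi:.2f} bits")  # positive: the pair co-occurs more than chance
```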

Conditional mutual information

particularly information theory, the conditional mutual information is, in its most basic form, the expected value of the mutual information of two random...

Word Count : 2385
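
A sketch of that expectation for discrete variables, assuming the full joint table p(x, y, z) is available (the toy distribution is invented: X = Y = Z, a single fair bit copied three times):

```python
import numpy as np

def conditional_mutual_information(p_xyz):
    """I(X;Y|Z) in bits for a joint table indexed as p_xyz[x, y, z]."""
    p_z = p_xyz.sum(axis=(0, 1))
    cmi = 0.0
    for z in range(p_xyz.shape[2]):
        if p_z[z] == 0:
            continue
        p_xy = p_xyz[:, :, z] / p_z[z]        # P(X, Y | Z = z)
        p_x = p_xy.sum(axis=1, keepdims=True)
        p_y = p_xy.sum(axis=0, keepdims=True)
        nz = p_xy > 0
        # I(X;Y|Z=z), weighted by P(Z=z): the expected value described above.
        cmi += p_z[z] * np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x * p_y)[nz]))
    return cmi

# Toy joint table: X = Y = Z is one fair bit, so I(X;Y) = 1 bit,
# but given Z there is nothing left for X to say about Y.
p = np.zeros((2, 2, 2))
p[0, 0, 0] = p[1, 1, 1] = 0.5
print(conditional_mutual_information(p))  # 0.0
```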

Quantum mutual information

In quantum information theory, quantum mutual information, or von Neumann mutual information, after John von Neumann, is a measure of correlation between...

Word Count : 1391
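
A small NumPy sketch of the quantity for the maximally entangled Bell state, using the standard definition I(A:B) = S(A) + S(B) - S(AB) with von Neumann entropies (the partial-trace indexing below assumes exactly two qubits):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # 0 log 0 = 0 by convention
    return -np.sum(evals * np.log2(evals))

# Density matrix of the Bell state (|00> + |11>) / sqrt(2).
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi)

# Partial traces over each qubit.
rho4 = rho_ab.reshape(2, 2, 2, 2)         # indices: [a, b, a', b']
rho_a = np.trace(rho4, axis1=1, axis2=3)  # trace out B
rho_b = np.trace(rho4, axis1=0, axis2=2)  # trace out A

i_ab = (von_neumann_entropy(rho_a) + von_neumann_entropy(rho_b)
        - von_neumann_entropy(rho_ab))
print(f"I(A:B) = {i_ab:.1f} bits")  # 2.0 for a maximally entangled pair
```

The value of 2 bits, twice the classical maximum for a single bit, is a standard signature of entanglement.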

Adjusted mutual information

In probability theory and information theory, adjusted mutual information, a variation of mutual information, may be used for comparing clusterings. It...

Word Count : 1115
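
scikit-learn ships an implementation; assuming it is installed, a minimal usage sketch with invented labels shows the chance correction at work:

```python
from sklearn.metrics import adjusted_mutual_info_score, mutual_info_score

# Two clusterings of the same six points; the label names differ but the
# grouping is identical (labels invented for illustration).
labels_a = [0, 0, 1, 1, 2, 2]
labels_b = [1, 1, 0, 0, 2, 2]

print(mutual_info_score(labels_a, labels_b))           # raw MI, in nats
print(adjusted_mutual_info_score(labels_a, labels_b))  # 1.0: identical up to relabeling
```

AMI is 0 in expectation for independent random labelings, which is the point of the adjustment.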

Mutual

derive a right to profits and votes; Mutual information, the intersection of multiple information sets; Mutual insurance, where policyholders have certain...

Word Count : 198

Information theory

measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory...

Word Count : 7088

Interaction information

of information, information correlation, co-information, and simply mutual information. Interaction information expresses the amount of information (redundancy...

Word Count : 2417
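
A sketch of one common sign convention, I(X;Y;Z) = I(X;Y) - I(X;Y|Z), evaluated on the classic XOR example (conventions differ between authors, so the sign choice here is an assumption to note):

```python
import numpy as np

def mi_from_joint(p):
    """I(X;Y) in bits for a 2-D joint table p[x, y]."""
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / (px * py)[nz]))

# X and Y are independent fair bits, Z = X xor Y; joint table p[x, y, z].
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p[x, y, x ^ y] = 0.25

i_xy = mi_from_joint(p.sum(axis=2))   # I(X;Y) = 0
p_z = p.sum(axis=(0, 1))
i_xy_given_z = sum(p_z[z] * mi_from_joint(p[:, :, z] / p_z[z]) for z in (0, 1))

print(i_xy - i_xy_given_z)  # -1.0 bit: purely synergistic information
```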

Feature selection

of the feature set. Common measures include the mutual information, the pointwise mutual information, Pearson product-moment correlation coefficient,...

Word Count : 6933
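
Assuming scikit-learn is available, its mutual_info_classif estimator illustrates MI-based feature scoring on synthetic data (feature 0 is relevant by construction, the others are noise):

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = (X[:, 0] > 0).astype(int)  # the label depends on feature 0 only

# Estimated MI between each feature and the label; higher = more relevant.
print(mutual_info_classif(X, y, random_state=0))
# Feature 0 scores well above zero; the noise features score near zero.
```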

Variation of information

closely related to mutual information; indeed, it is a simple linear expression involving the mutual information. Unlike the mutual information, however, the...

Word Count : 1446
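
The linear expression the snippet mentions is VI(X;Y) = H(X) + H(Y) - 2 I(X;Y); unlike MI it is a metric. A minimal sketch with an invented joint table:

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution of two labelings, invented for illustration.
p_xy = np.array([[0.3, 0.1],
                 [0.0, 0.6]])

h_x = entropy(p_xy.sum(axis=1))
h_y = entropy(p_xy.sum(axis=0))
h_xy = entropy(p_xy.ravel())
i_xy = h_x + h_y - h_xy

vi = h_x + h_y - 2 * i_xy     # equivalently H(X,Y) - I(X;Y)
print(f"VI = {vi:.4f} bits")  # 0 iff each variable determines the other
```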

Multivariate normal distribution

The mutual information of a distribution is a special case of the Kullback–Leibler divergence...

Word Count : 9474
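
For the bivariate normal special case, the MI has a closed form in the correlation coefficient, I(X;Y) = -(1/2) ln(1 - rho^2), which a one-liner can evaluate:

```python
import numpy as np

def gaussian_mi_nats(rho):
    """MI of a bivariate normal with correlation rho, in nats."""
    return -0.5 * np.log(1.0 - rho**2)

for rho in (0.0, 0.5, 0.9):
    print(rho, gaussian_mi_nats(rho))
# 0 nats when independent; the MI diverges as |rho| -> 1
```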

Information

measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory...

Word Count : 5067

Information gain ratio

into account when choosing an attribute. Information gain is also known as mutual information. Information gain is the reduction in entropy produced...

Word Count : 1102
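
A toy sketch of both quantities on an invented dataset: information gain is H(class) - H(class | attribute), i.e. the mutual information between attribute and class, and the gain ratio divides it by the attribute's own entropy (the "split information") to penalize many-valued attributes:

```python
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Invented dataset: six examples, binary class, one two-valued attribute.
y = np.array([0, 0, 1, 1, 1, 1])
attr = np.array(['a', 'a', 'a', 'b', 'b', 'b'])

h_y = entropy(y)
h_y_given_attr = sum((attr == v).mean() * entropy(y[attr == v])
                     for v in np.unique(attr))
gain = h_y - h_y_given_attr     # information gain = I(attribute; class)
split_info = entropy(attr)      # entropy of the attribute itself
print(gain, gain / split_info)  # gain and gain ratio
```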

Channel capacity

of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization...

Word Count : 4751
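
For most channels that maximization must be carried out numerically, but for the binary symmetric channel with crossover probability p the maximizing input is uniform and the capacity reduces to the closed form C = 1 - H2(p):

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

for p in (0.0, 0.11, 0.5):
    print(p, 1 - h2(p))  # capacity of the binary symmetric channel
# p = 0.5 gives C = 0: the output is then independent of the input
```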

Mutual fund

A mutual fund is an investment fund that pools money from many investors to purchase securities. The term is typically used in the United States, Canada...

Word Count : 5807

Information bottleneck method

condition to capture some fraction of the mutual information with the relevant variable Y. The information bottleneck can also be viewed as a rate distortion...

Word Count : 3658

Correlation

than Pearson's, that is, more sensitive to nonlinear relationships. Mutual information can also be applied to measure dependence between two variables. The...

Word Count : 5183
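
A quick demonstration of that contrast, assuming NumPy and scikit-learn are available: on a symmetric nonlinear relationship the Pearson coefficient is near zero while an MI estimate clearly detects the dependence.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=2000)
y = x**2 + 0.05 * rng.normal(size=x.size)  # strong but nonlinear dependence

print(np.corrcoef(x, y)[0, 1])  # near 0: Pearson misses the relationship
print(mutual_info_regression(x.reshape(-1, 1), y, random_state=0)[0])  # well above 0
```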

Liberty Mutual

Liberty Mutual Insurance Company is an American diversified global insurer and the sixth-largest property and casualty insurer in the world. It ranks 71st...

Word Count : 2261

Coherent information

quantum information in the state will remain after the state goes through the channel. In this sense, it is intuitively similar to the mutual information of...

Word Count : 310

Conditional entropy

classical counterpart. Entropy (information theory); Mutual information; Conditional quantum entropy; Variation of information; Entropy power inequality; Likelihood...

Word Count : 2071

Information theory and measure theory

of the information content of random variables and a measure over sets. Namely the joint entropy, conditional entropy, and mutual information can be considered...

Word Count : 1754

Quantities of information

and important measures of information is the mutual information, or transinformation. This is a measure of how much information can be obtained about one...

Word Count : 2183

Mutual intelligibility

In linguistics, mutual intelligibility is a relationship between languages or dialects in which speakers of different but related varieties can readily...

Word Count : 4747
