In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable $Y$ given that the value of another random variable $X$ is known. Here, information is measured in shannons, nats, or hartleys. The entropy of $Y$ conditioned on $X$ is written as $\mathrm{H}(Y\mid X)$.
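To make the definition concrete, here is a minimal sketch in Python (the function name `conditional_entropy` and the example joint distribution `p_xy` are illustrative assumptions, not from the article) that computes $\mathrm{H}(Y\mid X) = -\sum_{x,y} p(x,y)\log_2 p(y\mid x)$ for a discrete joint distribution, in bits:

```python
import numpy as np

def conditional_entropy(p_xy: np.ndarray) -> float:
    """H(Y|X) in bits, given a joint distribution with p_xy[x, y] = p(x, y)."""
    p_x = p_xy.sum(axis=1, keepdims=True)    # marginal p(x)
    with np.errstate(divide="ignore", invalid="ignore"):
        log_cond = np.log2(p_xy / p_x)       # log2 p(y|x)
    mask = p_xy > 0                          # convention: 0 * log 0 = 0
    return float(-np.sum(p_xy[mask] * log_cond[mask]))

# Example: X is a fair bit, Y is a copy of X flipped with probability 0.2.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print(conditional_entropy(p_xy))  # ~0.722 bits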
Conditional entropy appears throughout information theory and its quantum generalization, as the related results below illustrate.
The conditional quantum entropy is an entropy measure used in quantum information theory; it is a generalization of the conditional entropy of classical information theory.
Despite similar notation, joint entropy should not be confused with cross-entropy. The conditional entropy (or conditional uncertainty) of $X$ given a random variable $Y$ is the average, over the values of $Y$, of the entropy of $X$ given each value of $Y$.
The entropy rate extends these ideas to a stochastic process: for a strongly stationary process, the conditional entropy of the latest random variable, given all the preceding ones, eventually tends towards this rate value.
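For a stationary Markov chain, this limiting conditional entropy reduces to $\mathrm{H}(X_n \mid X_{n-1}) = -\sum_i \mu_i \sum_j P_{ij}\log_2 P_{ij}$, where $P$ is the transition matrix and $\mu$ its stationary distribution. A short sketch (the two-state matrix `P` below is made up for illustration):

```python
import numpy as np

def entropy_rate(P: np.ndarray) -> float:
    """Entropy rate (bits/step) of a stationary Markov chain with transition matrix P."""
    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    evals, evecs = np.linalg.eig(P.T)
    mu = np.real(evecs[:, np.argmax(np.real(evals))])
    mu = mu / mu.sum()
    mask = P > 0                              # skip zero-probability transitions
    return float(-np.sum((mu[:, None] * P)[mask] * np.log2(P[mask])))

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(entropy_rate(P))  # H(X_n | X_{n-1}) under stationarity, ~0.558 bits/step
```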
Care is needed in the continuous case; for example, differential entropy may be negative. The differential analogues of entropy, joint entropy, conditional entropy, and mutual information are defined by replacing sums over probability masses with integrals over densities.
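A standard worked example (included here only for illustration) shows how negativity arises: if $X$ is uniform on $[0, \tfrac{1}{2}]$, its density is $f(x) = 2$ on that interval, and

$$h(X) = -\int_0^{1/2} f(x)\,\log_2 f(x)\,dx = -\int_0^{1/2} 2\log_2 2\,dx = \log_2 \tfrac{1}{2} = -1 \text{ bit},$$

a negative value with no counterpart in the discrete Shannon entropy.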
Joint entropy is subadditive, $\mathrm{H}(X_1,\ldots,X_n) \le \mathrm{H}(X_1) + \ldots + \mathrm{H}(X_n)$, and is used in the definition of conditional entropy: $\mathrm{H}(X\mid Y) = \mathrm{H}(X,Y) - \mathrm{H}(Y)$.
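This chain rule is easy to check numerically. A minimal sketch, reusing the joint distribution from the example above (the helper `entropy`, which accepts any discrete distribution and ignores zero entries, is an assumption of this sketch):

```python
import numpy as np

def entropy(p: np.ndarray) -> float:
    """Shannon entropy in bits of a (possibly joint) distribution."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_y = p_xy.sum(axis=0)                    # marginal of Y

# Chain rule: H(X|Y) = H(X,Y) - H(Y)
h_x_given_y = entropy(p_xy) - entropy(p_y)
print(h_x_given_y)                        # ~0.722 bits, matching the direct computation
```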
Conditional mutual information can likewise be written in terms of joint entropies: $\mathrm{I}(X;Y\mid Z) = \mathrm{H}(X,Z) + \mathrm{H}(Y,Z) - \mathrm{H}(X,Y,Z) - \mathrm{H}(Z)$.
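The identity follows by expanding $\mathrm{I}(X;Y\mid Z) = \mathrm{H}(X\mid Z) - \mathrm{H}(X\mid Y,Z)$ with the chain rule above. A sketch of the computation (the three-variable joint `p_xyz` is a made-up example):

```python
import numpy as np

def entropy(p: np.ndarray) -> float:
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical joint distribution p(x, y, z) for binary X, Y, Z.
p_xyz = np.array([[[0.15, 0.05], [0.05, 0.10]],
                  [[0.05, 0.10], [0.10, 0.40]]])

h_xz  = entropy(p_xyz.sum(axis=1))        # H(X,Z): marginalize out Y
h_yz  = entropy(p_xyz.sum(axis=0))        # H(Y,Z): marginalize out X
h_xyz = entropy(p_xyz)                    # H(X,Y,Z)
h_z   = entropy(p_xyz.sum(axis=(0, 1)))   # H(Z)

print(h_xz + h_yz - h_xyz - h_z)          # I(X;Y|Z), always >= 0
```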
The von Neumann entropy is the quantum counterpart of the information-theoretic Shannon entropy, and it is likewise used in different forms (conditional entropies, relative entropies, etc.) in the framework of quantum information theory.
In the standard decomposition of mutual information, $\mathrm{H}(X)$ and $\mathrm{H}(Y)$ are the marginal entropies, $\mathrm{H}(X\mid Y)$ and $\mathrm{H}(Y\mid X)$ are the conditional entropies, and $\mathrm{H}(X,Y)$ is the joint entropy: $\mathrm{I}(X;Y) = \mathrm{H}(X) + \mathrm{H}(Y) - \mathrm{H}(X,Y)$.
For entangled quantum states, a marginal entropy can exceed the joint entropy. This is equivalent to the fact that the conditional quantum entropy may be negative, while the classical conditional entropy may never be.
Joint and conditional differential entropy and relative entropy are defined in a similar fashion to their discrete counterparts. Unlike the discrete analog, however, the differential entropy does not share all the properties of Shannon entropy; for example, it can be negative and is not invariant under a change of variables.
In the equivalent expressions for mutual information, such as $\mathrm{I}(X;Y) = \mathrm{H}(Y) - \mathrm{H}(Y\mid X) = D_{\text{KL}}\!\left(p_{(X,Y)} \parallel p_X\,p_Y\right)$, $\mathrm{H}(Y\mid X)$ is the conditional entropy and $D_{\text{KL}}$ is the Kullback–Leibler divergence.
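The two expressions can be checked against each other numerically. A sketch under the same made-up joint distribution used above:

```python
import numpy as np

def entropy(p: np.ndarray) -> float:
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

# I(X;Y) = H(Y) - H(Y|X), with H(Y|X) = H(X,Y) - H(X)
mi_entropy = entropy(p_y) - (entropy(p_xy) - entropy(p_x))

# I(X;Y) = D_KL( p(x,y) || p(x) p(y) )
prod = np.outer(p_x, p_y)
mask = p_xy > 0
mi_kl = float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

print(mi_entropy, mi_kl)   # both ~0.278 bits
```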
Quantum relative entropy is a measure of distinguishability between two quantum states; it is the quantum mechanical analog of relative entropy.
There is a formal correspondence between Shannon's information measures of random variables and a signed measure over sets: the joint entropy, conditional entropy, and mutual information can be considered as the measure of a set union, set difference, and set intersection, respectively.
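In symbols, writing $\mu$ for the signed measure and $\tilde X$, $\tilde Y$ for the sets associated with $X$ and $Y$ (notation chosen here purely for illustration):

$$\mathrm{H}(X,Y) = \mu(\tilde X \cup \tilde Y), \qquad \mathrm{H}(X\mid Y) = \mu(\tilde X \setminus \tilde Y), \qquad \mathrm{I}(X;Y) = \mu(\tilde X \cap \tilde Y).$$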
Transfer entropy is built from conditional entropies of one process given the past of another; its definition has been extended to other types of entropy measures, such as Rényi entropy.
In some applications, one quantifies the information content of a variable $e$ in terms of its entropy, and then subtracts the content of $e$ that is irrelevant to $h$ (given by the conditional entropy of $e$ conditioned on $h$) from that total.
A distinctive feature of the quantum theory is that quantum conditional entropies can be negative, and quantum mutual informations can exceed the classical bound of the marginal entropy.
Information diagrams depict the relationships among Shannon's basic measures of information: entropy, joint entropy, conditional entropy, and mutual information. They are a useful pedagogical tool for visualizing these identities.
Finally, the likelihood function can also be interpreted within the context of information theory; related topics there include the Bayes factor, conditional entropy, conditional probability, empirical likelihood, the likelihood principle, and the likelihood-ratio test.