Differential entropy


Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of (Shannon) entropy, a measure of the average surprisal of a random variable, to continuous probability distributions. Shannon did not derive this formula; he assumed it was the correct continuous analogue of discrete entropy, but it is not.[1]: 181–218  The actual continuous version of discrete entropy is the limiting density of discrete points (LDDP). Differential entropy (described here) is commonly encountered in the literature, but it is a limiting case of the LDDP, one that loses its fundamental association with discrete entropy.

In terms of measure theory, the differential entropy of a probability measure is the negative relative entropy from that measure to the Lebesgue measure, where the latter is treated as if it were a probability measure, despite being unnormalized.
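
For reference, the standard definition behind both paragraphs: for a random variable X with probability density f,

    h(X) = -\int f(x) \ln f(x) \, dx ,

equivalently h(X) = −D_KL(f ∥ λ), the negative relative entropy from f to the Lebesgue measure λ.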

  1. ^ Jaynes, E.T. (1963). "Information Theory And Statistical Mechanics" (PDF). Brandeis University Summer Institute Lectures in Theoretical Physics. 3 (sect. 4b).

23 related entries for: Differential entropy

Differential entropy

Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the...

Word count: 2728

Conditional entropy

In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable Y...

Word count: 2071
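
For reference, the standard conditional differential entropy for jointly continuous X and Y with joint density f(x, y):

    h(Y \mid X) = -\iint f(x, y) \ln f(y \mid x) \, dx \, dy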

Joint entropy

in which case we say that the differential entropy is not defined. As in the discrete case, the joint differential entropy of a set of random variables...

Word count: 952
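
The corresponding two-variable definition and chain rule, for joint density f(x, y):

    h(X, Y) = -\iint f(x, y) \ln f(x, y) \, dx \, dy = h(X) + h(Y \mid X)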

Maximum entropy probability distribution

with probability density p(x), then the differential entropy of X is defined as H(X) = −∫...

Word count: 4530
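
A canonical instance: among all densities on the real line with variance σ², the normal distribution attains the largest differential entropy,

    h = \tfrac{1}{2} \ln(2 \pi e \sigma^{2}) .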

Beta distribution

the discrete entropy. It has been known since then that the differential entropy may differ from the infinitesimal limit of the discrete entropy by an infinite...

Word count: 40369
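
The discrepancy alluded to is the standard quantization relation: binning a density into cells of width Δ gives a discrete entropy

    H(X^{\Delta}) \approx h(X) - \ln \Delta ,

which diverges as Δ → 0; the divergent −ln Δ term is exactly what differential entropy discards.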

Negentropy

J(p_x) = S(φ_x) − S(p_x), where S(φ_x) is the differential entropy of the Gaussian density with the same mean and variance as p_x...

Word count: 1106
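
Because the Gaussian maximizes differential entropy for a given variance, J(p_x) ≥ 0, with equality exactly when p_x is Gaussian; in one dimension, S(φ_x) = ½ ln(2πeσ²) for variance σ².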

Quantities of information

properties; for example, differential entropy may be negative. The differential analogues of entropy, joint entropy, conditional entropy, and mutual information...

Word count: 2183
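
A one-line example of the negativity: for X uniform on [0, a], h(X) = ln a, which is negative for any a < 1; a = 1/2 gives h(X) = −ln 2 ≈ −0.693 nats.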

Information dimension

d-dimensional entropy does not necessarily exist there. Finally, dimensional-rate bias generalizes Shannon entropy and differential entropy, as one could...

Word count: 3105

Limiting density of discrete points

Shannon for differential entropy. It was formulated by Edwin Thompson Jaynes to address defects in the initial definition of differential entropy. Shannon...

Word count: 971
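
Jaynes' corrected quantity replaces the implicit Lebesgue reference with an invariant measure m(x), the limiting density of the discretization points:

    H(X) = -\int p(x) \ln \frac{p(x)}{m(x)} \, dx ,

which, unlike differential entropy, is invariant under a change of variables.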

Multivariate normal distribution

vector, it is distributed as a generalized chi-squared variable. The differential entropy of the multivariate normal distribution is h(f) = −∫_{−∞}^{∞}∫...

Word count: 9474
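
The truncated integral evaluates to the standard closed form, in nats, for dimension n and covariance matrix Σ:

    h(f) = \tfrac{1}{2} \ln\!\big( (2 \pi e)^{n} \det \Sigma \big) .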

Information theory

information theoretic quantities include Rényi entropy (a generalization of entropy), differential entropy (a generalization of quantities of information...

Word count: 7095

Configuration entropy

V, Gilson MK (March 2010). "Thermodynamic and Differential Entropy under a Change of Variables". Entropy. 12 (3): 578–590. Bibcode:2010Entrp..12..578H...

Word count: 408

Rayleigh distribution

erf(z) is the error function. The differential entropy is given by H = 1 + ln(σ/√2) + γ/2...

Word count: 2183

Entropy

Clausius named the concept of "the differential of a quantity which depends on the configuration of the system", entropy (Entropie) after the Greek word...

Word count: 13924

Principle of maximum entropy

entropy states that the probability distribution which best represents the current state of knowledge about a system is the one with largest entropy,...

Word count: 4218

Entropy rate

mathematical theory of probability, the entropy rate or source information rate is a function assigning an entropy to a stochastic process. For a strongly...

Word count: 781
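
For a stationary process, the standard definition is

    H(\mathcal{X}) = \lim_{n \to \infty} \tfrac{1}{n} H(X_1, \ldots, X_n) ,

with differential entropy taking the place of H for continuous-valued processes.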

Entropy estimation

learning, and time delay estimation it is useful to estimate the differential entropy of a system or process, given some observations. The simplest and...

Word count: 1407
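
A minimal sketch of the simplest such approach, a histogram (plug-in) estimator; the function name, bin count, and sanity check below are illustrative choices, not from the article:

    import numpy as np

    def differential_entropy_hist(samples, bins=30):
        # Bin the samples and treat each bin as a constant-density region:
        # density_i = count_i / (n * width_i).
        counts, edges = np.histogram(samples, bins=bins)
        widths = np.diff(edges)
        density = counts / (samples.size * widths)
        # Plug-in estimate: -sum density_i * ln(density_i) * width_i,
        # skipping empty bins (0 * ln 0 is taken as 0).
        nz = density > 0
        return -np.sum(density[nz] * np.log(density[nz]) * widths[nz])

    # Sanity check: for N(0, 1) the closed form is 0.5 * ln(2*pi*e) ~ 1.4189 nats.
    rng = np.random.default_rng(0)
    print(differential_entropy_hist(rng.normal(size=100_000)))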

Mutual information

variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies...

Word count: 8693
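
For jointly continuous variables, the link is the identity

    I(X; Y) = h(X) + h(Y) - h(X, Y) = h(X) - h(X \mid Y) ;

unlike differential entropy itself, I(X; Y) is non-negative and invariant under smooth invertible reparametrizations of X and Y.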

Additive white Gaussian noise

I(X; Y), writing it in terms of the differential entropy: I(X; Y) = h(Y) − h(Y ∣ X) = h(Y) − h(X + Z ∣ X)...

Word count: 2962
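
Completing the standard derivation: since the noise Z is independent of X, h(X + Z ∣ X) = h(Z) = ½ ln(2πeN) for noise power N, and a Gaussian input of power P maximizes h(Y), giving the familiar capacity

    C = \tfrac{1}{2} \ln\!\left( 1 + \frac{P}{N} \right)

in nats per channel use.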

Exponential distribution

distribution with λ = 1/μ has the largest differential entropy. In other words, it is the maximum entropy probability distribution for a random variate...

Word count: 6567
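
Concretely, X ~ Exp(λ) has h(X) = 1 − ln λ, and among all densities on [0, ∞) with mean μ this is maximized at λ = 1/μ, giving h = 1 + ln μ.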

Dirichlet distribution

Dir(α) random variable, the differential entropy of X (in nat units) is h(X) = E[−ln f(X)] = ln B...

Word count: 6539
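
For reference, the closed form this snippet leads into is the standard Dirichlet differential entropy, with K components, α₀ = Σ_j α_j, and ψ the digamma function:

    h(X) = \ln B(\boldsymbol{\alpha}) + (\alpha_0 - K) \psi(\alpha_0) - \sum_{j=1}^{K} (\alpha_j - 1) \psi(\alpha_j) .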

Entropy as an arrow of time

Entropy is one of the few quantities in the physical sciences that require a particular direction for time, sometimes called an arrow of time. As one...

Word count: 5020

Asymptotic equipartition property

H is simply the entropy of a symbol) and the continuous-valued case (where H is the differential entropy instead). The definition...

Word count: 3951
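
In the continuous-valued case the property states: for X₁, X₂, … i.i.d. with density f,

    -\tfrac{1}{n} \ln f(X_1, \ldots, X_n) \to h(X)

in probability as n → ∞.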
