Differential entropy

The differential entropy $H(X)$ of a continuous random variable $X$ with values in $\mathbb{R}^n$ is defined by

$$H(X) = -\int_{\mathbb{R}^n} f(x) \log(f(x)) \, dx,$$

where $f$ is the probability density function of $X$.
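
To make the definition concrete, the following minimal Python sketch (an illustration only, not part of TIM) evaluates the integral numerically for a normal density and compares it with the well-known closed form $\tfrac{1}{2} \log(2 \pi e \sigma^2)$:

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    sigma = 2.0
    X = norm(scale=sigma)

    # H(X) = -integral of f(x) log f(x) dx over the real line.
    h_numeric, _ = quad(lambda x: -X.pdf(x) * X.logpdf(x), -np.inf, np.inf)

    # Closed form for a normal distribution: (1/2) log(2 pi e sigma^2), in nats.
    h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

    print(h_numeric, h_closed)  # both are approximately 2.1121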

Entropy vs differential entropy

There is no generalization of Shannon entropy to continuous variables, and differential entropy is not such a generalization. Therefore, while the definition of differential entropy looks similar to the definition of discrete entropy, it does not inherit its properties; for example, differential entropy can be negative. The role of differential entropy is two-fold. First, it is a syntactic device for describing other information-theoretic concepts, which are defined as combinations of differential entropies. Second, when transforming data to minimize mutual information, it is equivalent to minimize differential entropies instead, which can be somewhat more efficient. This makes the estimation of differential entropy useful in itself.
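
For example, mutual information can be written as the entropy combination $I(X;Y) = H(X) + H(Y) - H(X,Y)$. As a sketch of how differential entropy can be estimated from a sample, here is the classic Kozachenko-Leonenko nearest-neighbor estimator in Python; this illustrates the general technique and is not TIM's implementation or API:

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def kl_entropy(x, k=4):
        """Kozachenko-Leonenko k-nearest-neighbor estimate of differential
        entropy, in nats; assumes the sample points are distinct."""
        n, d = x.shape
        # Distance from each point to its k-th nearest neighbor
        # (k + 1 because the nearest neighbor of a point is itself).
        eps = cKDTree(x).query(x, k=k + 1)[0][:, k]
        # Log-volume of the d-dimensional unit ball.
        log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
        return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

    rng = np.random.default_rng(0)
    sample = rng.normal(scale=2.0, size=(5000, 1))
    print(kl_entropy(sample))  # close to 0.5 * log(2 pi e * 4) = 2.1121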

Differential entropy on differentiable manifolds

The definition of differential entropy can be generalized by assuming that $X$ is distributed on an $m$-dimensional differentiable manifold $M$ in $\mathbb{R}^n$, $m \leq n$, with a probability density function $f : M \to \mathbb{R}$. Then the differential entropy of $X$ is defined by:

$$H(X) = -\int_M f(x) \log(f(x)) \, d\mu(x),$$

where $\mu$ is the measure induced on $M$ by the Lebesgue measure of $\mathbb{R}^n$.

TIM implements estimators for both of these definitions.
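
As a small illustration of the manifold case (again a sketch, not TIM's API): the uniform distribution on the unit circle, a 1-dimensional manifold in $\mathbb{R}^2$, has differential entropy $\log(2\pi) \approx 1.8379$ nats. A nearest-neighbor estimate still applies if the intrinsic dimension $m = 1$ is supplied explicitly, since for close neighbors the Euclidean chord length approximates the geodesic arc length; kl_entropy_on_manifold below is a hypothetical helper written for this example:

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def kl_entropy_on_manifold(x, m, k=4):
        """Kozachenko-Leonenko estimate (in nats) with the intrinsic dimension
        m given explicitly instead of being read off the ambient space."""
        n = x.shape[0]
        eps = cKDTree(x).query(x, k=k + 1)[0][:, k]
        log_vm = (m / 2) * np.log(np.pi) - gammaln(m / 2 + 1)
        return digamma(n) - digamma(k) + log_vm + m * np.mean(np.log(eps))

    rng = np.random.default_rng(0)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=20000)
    circle = np.column_stack([np.cos(theta), np.sin(theta)])  # points on S^1 in R^2
    print(kl_entropy_on_manifold(circle, m=1))  # close to log(2 pi) = 1.8379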

Files

An aggregate file for differential entropy.

Differential entropy of Y = sin(2 * pi * X + beta), X ~ Uniform(0, 1).

Differential entropy of a generalized normal distribution.

Differential entropy of a normal distribution.

Differential entropy of a uniform distribution.