The differential entropy $H(X)$ of a continuous random variable $X \in \mathbb{R}^n$ is defined by

$$H(X) = -\int_{\mathbb{R}^n} f(x) \log(f(x))\, dx,$$

where $f : \mathbb{R}^n \to [0, \infty)$ is the probability density function of $X$.
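For example, if $X$ is uniformly distributed on the interval $[0, a]$, so that $f(x) = 1/a$ on $[0, a]$, then

$$H(X) = -\int_0^a \frac{1}{a} \log\!\left(\frac{1}{a}\right) dx = \log(a),$$

which is negative whenever $a < 1$.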
There is no generalization of Shannon entropy to continuous variables that preserves its properties, and differential entropy is not such a generalization. Therefore, while the definition of differential entropy looks similar to the definition of discrete entropy, it does not share its properties; for example, differential entropy can be negative, as the uniform example above shows. The role of differential entropy is two-fold. First, it is a syntactic device for describing other information-theoretic concepts, which are defined as combinations of differential entropies. Second, when transforming data to minimize mutual information, it is equivalent to minimize a sum of differential entropies instead, which can be a bit more efficient. This makes the estimation of differential entropy useful in itself.
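To make the estimation side concrete, the following is a minimal sketch of a Kozachenko-Leonenko $k$-nearest-neighbor estimator of differential entropy, written in Python with NumPy/SciPy. It illustrates the general nearest-neighbor technique only; it is not TIM's implementation or API, and the function name `kl_entropy` is chosen here for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(samples, k=1):
    """Kozachenko-Leonenko estimate of differential entropy (in nats).

    samples: (n, d) array of n i.i.d. points in R^d (assumed distinct,
             since a zero nearest-neighbor distance would give log(0)).
    k: which nearest neighbor to use (k >= 1).
    """
    n, d = samples.shape
    tree = cKDTree(samples)
    # Query k+1 neighbors, because each point is returned
    # as its own 0th nearest neighbor.
    distances, _ = tree.query(samples, k=k + 1)
    r = distances[:, k]  # Euclidean distance to the k-th nearest neighbor
    # Log-volume of the d-dimensional Euclidean unit ball.
    log_unit_ball = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    # H(X) ~ psi(n) - psi(k) + log(V_d) + (d / n) * sum_i log(r_i)
    return digamma(n) - digamma(k) + log_unit_ball + d * np.mean(np.log(r))
```

For instance, on 10,000 samples from a one-dimensional standard normal distribution, this estimator typically returns a value close to the true differential entropy $\frac{1}{2}\log(2\pi e) \approx 1.419$ nats.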
The definition of differential entropy can be generalized by assuming that $X$ is distributed on an $m$-dimensional differentiable manifold $M \subset \mathbb{R}^n$, $m \le n$, with a probability density function $f : M \to [0, \infty)$. Then differential entropy is defined by

$$H(X) = -\int_M f(x) \log(f(x))\, dx,$$

where the integral is taken with respect to the natural $m$-dimensional volume measure on $M$.
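For instance, if $X$ is distributed uniformly on the unit circle $S^1 \subset \mathbb{R}^2$ (so $m = 1$, $n = 2$, and $f \equiv 1/(2\pi)$), then

$$H(X) = -\int_{S^1} \frac{1}{2\pi} \log\!\left(\frac{1}{2\pi}\right) dx = \log(2\pi).$$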
TIM implements estimators for both of these definitions.