Mutual information

The mutual information between random variables X and Y is defined by:

I(X; Y) = H(X) + H(Y) - H(X, Y),

where H is the differential entropy. Mutual information measures the amount of information shared by X and Y. Its importance lies in the fact that

I(X; Y) = 0 if and only if X and Y are independent.
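The definition above can be checked numerically for a bivariate normal distribution, whose differential entropies have closed forms. A minimal Python sketch, with illustrative function names that are not part of this library:

```python
import math

def normal_entropy(variance):
    # Differential entropy of a 1-D normal distribution, in nats.
    return 0.5 * math.log(2 * math.pi * math.e * variance)

def bivariate_normal_entropy(cov):
    # Differential entropy of a 2-D normal with covariance matrix
    # cov = [[s_xx, s_xy], [s_xy, s_yy]], in nats.
    det = cov[0][0] * cov[1][1] - cov[0][1] ** 2
    return 0.5 * math.log((2 * math.pi * math.e) ** 2 * det)

def mutual_information_normal(cov):
    # The entropy combination I(X; Y) = H(X) + H(Y) - H(X, Y).
    return (normal_entropy(cov[0][0]) + normal_entropy(cov[1][1])
            - bivariate_normal_entropy(cov))

rho = 0.5
mi = mutual_information_normal([[1.0, rho], [rho, 1.0]])
```

For unit-variance marginals with correlation rho, this agrees with the closed form -0.5 * log(1 - rho^2).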

See also

Differential entropy

Partial mutual information

Files

An aggregate file for analytic mutual information.

An aggregate file for mutual information

Mutual information between marginals of a normal distribution

Mutual information between sinusoids

Mutual information estimation

Mutual information estimation using naive algorithms

Includes computation from differential entropies and from a binning.
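A binning estimator of this kind can be sketched as follows. This is an illustrative plug-in estimate over a 2-D histogram, not the library's implementation:

```python
import math
import random
from collections import Counter

def mutual_information_binned(xs, ys, bins=8):
    # Plug-in estimate of I(X; Y) in nats from a 2-D histogram.
    # Biased for finite samples; the bias grows with the bin count.
    def index(v, lo, hi):
        # Map a value to a bin, clamping the maximum onto the last bin.
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)

    n = len(xs)
    x_lo, x_hi = min(xs), max(xs)
    y_lo, y_hi = min(ys), max(ys)
    joint = Counter(
        (index(x, x_lo, x_hi), index(y, y_lo, y_hi))
        for x, y in zip(xs, ys))
    # Marginal bin counts, accumulated from the joint histogram.
    px, py = Counter(), Counter()
    for (i, j), c in joint.items():
        px[i] += c
        py[j] += c
    # I = sum_ij p_ij * log(p_ij / (p_i * p_j)), with p_ij = c / n.
    return sum(
        (c / n) * math.log(c * n / (px[i] * py[j]))
        for (i, j), c in joint.items())

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(5000)]
noise = [random.gauss(0.0, 1.0) for _ in range(5000)]
ys = [x + 0.5 * e for x, e in zip(xs, noise)]  # dependent on xs
mi_dependent = mutual_information_binned(xs, ys)
mi_independent = mutual_information_binned(xs, noise)
```

The dependent pair yields a clearly larger estimate than the independent pair, whose estimate stays near zero (it is never negative, since the plug-in value is the mutual information of the empirical histogram).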

Mutual information estimation via entropy combination

Naive mutual information estimation

Partial mutual information estimation

Temporal mutual information estimation

Temporal partial mutual information estimation

Testing for mutual_information

mutual_information_naive