The mutual information between random variables $X$ and $Y$ is defined by:

$$I(X; Y) = h(X) + h(Y) - h(X, Y),$$

where $h$ is the differential entropy. Mutual information measures the amount of information shared by $X$ and $Y$. Its importance lies in the fact that $I(X; Y) = 0$ if and only if $X$ and $Y$ are independent.
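As a concrete check of the definition and the independence property, consider jointly Gaussian $X$ and $Y$ with unit variances and correlation $\rho$ (a standard worked case, not from the original text):

$$h(X) = h(Y) = \tfrac{1}{2}\ln(2\pi e), \qquad h(X, Y) = \tfrac{1}{2}\ln\!\left((2\pi e)^2 (1 - \rho^2)\right),$$

so that

$$I(X; Y) = \tfrac{1}{2}\ln(2\pi e) + \tfrac{1}{2}\ln(2\pi e) - \tfrac{1}{2}\ln\!\left((2\pi e)^2 (1 - \rho^2)\right) = -\tfrac{1}{2}\ln(1 - \rho^2),$$

which is $0$ exactly when $\rho = 0$, i.e. when $X$ and $Y$ are independent.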
Includes computation of mutual information both from differential entropies and from a binning (histogram) of the data.
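A minimal sketch of the binning approach, assuming NumPy and a plug-in estimate over a 2D histogram; the function name `mutual_information_binned` and the bin count are illustrative, not from the original:

```python
import numpy as np

def mutual_information_binned(x, y, bins=32):
    """Plug-in estimate of I(X; Y) in nats from a 2D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()               # joint probabilities p(x, y)
    px = pxy.sum(axis=1, keepdims=True)     # marginal p(x), shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)     # marginal p(y), shape (1, bins)
    nz = pxy > 0                            # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Correlated Gaussian pair: the analytic value is -0.5 * ln(1 - rho**2).
rng = np.random.default_rng(0)
rho = 0.8
x = rng.normal(size=100_000)
y = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=100_000)
print(mutual_information_binned(x, y))     # close to 0.51 nats
```

The plug-in estimate is biased upward when the binning is fine relative to the sample size and washes out structure when it is coarse, so in practice the bin count is worth varying.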