Partial mutual information

The partial mutual information between random variables $X$, $Y$, and $Z$ is defined by

$$I(X; Y \mid Z) = H(X, Z) + H(Y, Z) - H(Z) - H(X, Y, Z),$$

where $H$ denotes the Shannon differential entropy. It measures the amount of information shared by $X$ and $Y$ while discounting the possibility that $Z$ drives both $X$ and $Y$. If $Z$ is independent of the pair $(X, Y)$, then partial mutual information reduces to the mutual information $I(X; Y)$.
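The identity above turns into an estimator once each of the four entropies can be estimated from data. Below is a minimal Python sketch that plugs closed-form Gaussian entropy estimates into the identity; note that Frenzel and Pompe's paper uses a nearest-neighbor entropy estimator rather than this Gaussian plug-in, and the function names here are illustrative, not part of any library.

```python
import numpy as np

def gaussian_entropy(*columns):
    # Differential entropy of a multivariate Gaussian fitted to the
    # joint samples: H = (1/2) * log((2*pi*e)^d * det(Sigma)).
    # Each argument is a 1-D array of n samples; d = number of arguments.
    # Exact only under the Gaussian assumption.
    data = np.column_stack(columns)                 # shape (n, d)
    d = data.shape[1]
    cov = np.cov(data, rowvar=False).reshape(d, d)
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (d * np.log(2.0 * np.pi * np.e) + logdet)

def partial_mutual_information(x, y, z):
    # I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(Z) - H(X, Y, Z)
    return (gaussian_entropy(x, z) + gaussian_entropy(y, z)
            - gaussian_entropy(z) - gaussian_entropy(x, y, z))

# Example: Z drives both X and Y. Conditioning on Z removes the shared
# information, so the estimate should be close to zero, even though the
# ordinary mutual information I(X; Y) is clearly positive here.
rng = np.random.default_rng(0)
n = 10_000
z = rng.standard_normal(n)
x = z + 0.5 * rng.standard_normal(n)
y = z + 0.5 * rng.standard_normal(n)
print(partial_mutual_information(x, y, z))   # ~ 0
```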

References

Partial Mutual Information for Coupling Analysis of Multivariate Time Series,
Stefan Frenzel and Bernd Pompe,
Physical Review Letters, 2007.

See also

Differential entropy

Mutual information

Files

Partial mutual information estimation.