The partial mutual information I(X; Y | Z) between random variables X, Y, and Z is defined by

    I(X; Y | Z) = H(X, Z) + H(Y, Z) − H(Z) − H(X, Y, Z),

where H denotes the Shannon differential entropy. It measures the amount of information shared by X and Y while discounting the possibility that Z drives both X and Y. If Z is independent of both X and Y, then partial mutual information degenerates to the mutual information I(X; Y).
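As a minimal sketch of the definition above, the four entropy terms can be estimated empirically for discrete-valued samples by counting joint outcomes (the original paper uses nearest-neighbor estimators for continuous data; the plug-in estimator here is an assumption for illustration):

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Plug-in Shannon entropy (in bits) of an empirical distribution.

    `samples` is a sequence of hashable outcomes (values or tuples).
    """
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def partial_mutual_information(x, y, z):
    """PMI(X; Y | Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z)."""
    xz = list(zip(x, z))
    yz = list(zip(y, z))
    xyz = list(zip(x, y, z))
    return entropy(xz) + entropy(yz) - entropy(list(z)) - entropy(xyz)

# If Z drives both X and Y (here X = Y = Z), PMI is 0:
z = [0, 0, 1, 1]
print(partial_mutual_information(z, z, z))  # -> 0.0

# If Z is independent of X and Y, PMI equals the mutual information
# I(X; Y); with X = Y uniform binary, that is 1 bit:
x = [0, 1, 0, 1]
print(partial_mutual_information(x, x, z))  # -> 1.0
```

The two checks illustrate the properties stated above: conditioning on a common driver removes the apparent dependence, while an irrelevant Z leaves the ordinary mutual information.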
Stefan Frenzel and Bernd Pompe, "Partial Mutual Information for Coupling Analysis of Multivariate Time Series", Physical Review Letters, 2007.