Virgo, Nathaniel, and Daniel Polani. “Decomposing multivariate information.” (2017).
Pairwise relationships between random variables are well understood in information theory, but a number of important open questions remain about how to understand and quantify the relationships among three or more random variables. The intricacies of the multivariate case have been lucidly highlighted in the literature. James and Crutchfield [4] give another useful framing, raising the following questions: how much of the information in a three-variable system is in the form of three-way interactions, as opposed to pairwise ones? More generally, how can we decompose the joint entropy of n variables so as to properly understand all of the k-way interactions (with 1 ≤ k ≤ n) and their relationships?
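A minimal sketch of why three-way interactions are not captured by pairwise measures is the classic XOR example (an illustration, not from the paper itself): with X and Y independent fair bits and Z = X XOR Y, every pairwise mutual information is zero, yet the co-information over all three variables is −1 bit, signalling a purely three-way interaction.

```python
import math

# Joint distribution of (X, Y, Z) with Z = X XOR Y, X and Y fair coins.
p = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

def H(idx):
    """Shannon entropy (bits) of the marginal over the given variable indices."""
    marg = {}
    for outcome, prob in p.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + prob
    return -sum(q * math.log2(q) for q in marg.values() if q > 0)

# Pairwise mutual informations: all zero for the XOR triple.
I_XY = H([0]) + H([1]) - H([0, 1])
I_XZ = H([0]) + H([2]) - H([0, 2])
I_YZ = H([1]) + H([2]) - H([1, 2])

# Co-information (interaction information) via inclusion-exclusion over
# marginal entropies; -1 bit here, a purely synergistic interaction.
coI = (H([0]) + H([1]) + H([2])
       - H([0, 1]) - H([0, 2]) - H([1, 2])
       + H([0, 1, 2]))

print(I_XY, I_XZ, I_YZ, coI)  # 0.0 0.0 0.0 -1.0
```

Any decomposition of the joint entropy into k-way terms must account for examples like this, where all the information lives at the three-way level.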