Information-theoretic inference of common ancestors
Publication Type
Journal Article
Publication Date
2015-04-16
Author
Contained in
Entropy
Volume
17
Issue
4
Start Page
2304
End Page
2327
Citation
Entropy 17 (4): 2304-2327 (2015)
Publisher DOI
Scopus ID
Publisher
MDPI
A directed acyclic graph (DAG) partially represents the conditional independence structure among observations of a system if the local Markov condition holds, that is if every variable is independent of its non-descendants given its parents. In general, there is a whole class of DAGs that represents a given set of conditional independence relations. We are interested in properties of this class that can be derived from observations of a subsystem only. To this end, we prove an information-theoretic inequality that allows for the inference of common ancestors of observed parts in any DAG representing some unknown larger system. More explicitly, we show that a large amount of dependence in terms of mutual information among the observations implies the existence of a common ancestor that distributes this information. Within the causal interpretation of DAGs, our result can be seen as a quantitative extension of Reichenbach's principle of common cause to more than two variables. Our conclusions are valid also for non-probabilistic observations, such as binary strings, since we state the proof for an axiomatized notion of "mutual information" that includes the stochastic as well as the algorithmic version.
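The abstract's core claim, that strong dependence among observed variables implies a common ancestor distributing that information, can be illustrated numerically. The sketch below is not the paper's inequality; it is a minimal toy simulation (all variable names and the flip-noise model are assumptions for illustration) contrasting the empirical mutual information of two noisy copies of a hidden common ancestor with that of two independent variables, which by Reichenbach's principle need no common cause.

```python
import math
import random

def mutual_information(pairs):
    """Empirical mutual information I(X;Y) in bits from (x, y) samples."""
    n = len(pairs)
    joint, px, py = {}, {}, {}
    for x, y in pairs:
        joint[(x, y)] = joint.get((x, y), 0) + 1
        px[x] = px.get(x, 0) + 1
        py[y] = py.get(y, 0) + 1
    mi = 0.0
    for (x, y), c in joint.items():
        pxy = c / n
        mi += pxy * math.log2(pxy / ((px[x] / n) * (py[y] / n)))
    return mi

random.seed(0)
N = 100_000
FLIP = 0.1  # each observation disagrees with the ancestor with prob. 0.1

# X and Y are noisy copies of a hidden common ancestor Z.
common = []
for _ in range(N):
    z = random.randint(0, 1)
    x = z ^ (random.random() < FLIP)  # bool xors with int as 0/1
    y = z ^ (random.random() < FLIP)
    common.append((x, y))

# X' and Y' are independent fair coins: no common ancestor needed.
indep = [(random.randint(0, 1), random.randint(0, 1)) for _ in range(N)]

print(f"I(X;Y) with common ancestor: {mutual_information(common):.3f} bits")
print(f"I(X;Y) independent:          {mutual_information(indep):.3f} bits")
```

With these parameters the common-ancestor pair shows roughly 0.3 bits of mutual information while the independent pair shows essentially none; the paper's result runs this logic in reverse, inferring a common ancestor in any DAG consistent with observations whenever the measured dependence is large enough.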
Keywords
Bayesian nets
Causality
Common cause principle
Directed acyclic graphs
Information theory
Kolmogorov complexity
Mutual information
DDC Class
004: Computer Science
510: Mathematics