Potential conditional mutual information: Estimators and properties

Department of Electrical Engineering, University of Washington, Seattle, United States
DOI
10.7287/peerj.preprints.3345v1
Subject Areas
Data Mining and Machine Learning, Data Science
Keywords
information theory, directed information, estimators, consistency, potential influence, causal influence
Copyright
© 2017 Rahimzamani et al.
Licence
This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ Preprints) and either DOI or URL of the article must be cited.
Cite this article
Rahimzamani A, Kannan S. 2017. Potential conditional mutual information: Estimators and properties. PeerJ Preprints 5:e3345v1

Abstract

The conditional mutual information \(I(X;Y|Z)\) measures the average information that X and Y contain about each other given Z. It is an important primitive in many learning problems, including conditional independence testing, graphical model inference, causal strength estimation, and time-series problems. In several applications, it is desirable to have a functional purely of the conditional distribution \(p_{Y|X,Z}\) rather than of the joint distribution \(p_{X,Y,Z}\). We define the potential conditional mutual information as the conditional mutual information calculated with a modified joint distribution \(p_{Y|X,Z}\, q_{X,Z}\), where \(q_{X,Z}\) is a potential distribution fixed a priori. We develop \(k\)-nearest-neighbor based estimators for this functional, employing importance sampling and a coupling trick, and prove the finite-\(k\) consistency of such an estimator. We demonstrate that the estimator has excellent practical performance and show an application in dynamical system inference.
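To make the definition concrete, the following is a minimal plug-in sketch for finite alphabets: it computes the conditional mutual information under the modified joint distribution \(p_{Y|X,Z}\, q_{X,Z}\) by direct summation. This is only an illustration of the functional itself, not the kNN-based estimator developed in the paper; the function name `potential_cmi` and the array layout are assumptions made for this sketch.

```python
import numpy as np

def potential_cmi(p_y_given_xz, q_xz, eps=1e-12):
    """Plug-in potential conditional mutual information for finite alphabets.

    p_y_given_xz : array of shape (|X|, |Z|, |Y|); entry [x, z, :] is p(y | x, z)
    q_xz         : array of shape (|X|, |Z|); the potential distribution q(x, z)

    Returns I_q(X; Y | Z), the CMI computed under the modified joint
    p(y | x, z) * q(x, z).
    """
    # Modified joint distribution: tilde_p(x, z, y) = q(x, z) * p(y | x, z)
    joint = q_xz[:, :, None] * p_y_given_xz

    # Induced conditional tilde_p(y | z) = sum_x p(y | x, z) * q(x | z)
    q_z = q_xz.sum(axis=0, keepdims=True)          # q(z)
    q_x_given_z = q_xz / np.clip(q_z, eps, None)   # q(x | z)
    p_y_given_z = np.einsum('xzy,xz->zy', p_y_given_xz, q_x_given_z)

    # I_q(X;Y|Z) = E_tilde[ log p(y | x, z) - log tilde_p(y | z) ]
    log_ratio = (np.log(np.clip(p_y_given_xz, eps, None))
                 - np.log(np.clip(p_y_given_z[None, :, :], eps, None)))
    return float((joint * log_ratio).sum())

# Sanity check: if p(y | x, z) does not depend on x, the potential CMI
# is 0 for any choice of potential q(x, z).
p = np.tile(np.array([[[0.3, 0.7], [0.5, 0.5]]]), (2, 1, 1))  # shape (2, 2, 2)
q = np.full((2, 2), 0.25)                                     # uniform potential
print(potential_cmi(p, q))  # ~0.0
```

Note that replacing `q_xz` with the true marginal \(p_{X,Z}\) recovers the ordinary conditional mutual information, which is why the quantity depends on the data-generating process only through \(p_{Y|X,Z}\).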

Author Comment

This is a preprint submission to PeerJ Preprints.