Rapid Communication

Lower bounds on mutual information

David V. Foster and Peter Grassberger
Phys. Rev. E 83, 010101(R) – Published 20 January 2011

Abstract

We correct claims about lower bounds on mutual information (MI) between real-valued random variables made by Kraskov et al., Phys. Rev. E 69, 066138 (2004). We show that non-trivial lower bounds on MI in terms of linear correlations depend on the marginal (single variable) distributions. This is so in spite of the invariance of MI under reparametrizations, because linear correlations are not invariant under them. The simplest bounds are obtained for Gaussians, but the most interesting ones for practical purposes are obtained for uniform marginal distributions. The latter can be enforced in general by using the ranks of the individual variables instead of their actual values, in which case one obtains bounds on MI in terms of Spearman correlation coefficients. We show with gene expression data that these bounds are in general nontrivial, and the degree of their (non)saturation yields valuable insight.
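The abstract's recipe can be sketched in code: replace each variable by its rank (which enforces uniform marginals), compute the Spearman correlation coefficient as the Pearson correlation of the ranks, and compare against a correlation-based MI bound. The full text is not available here, so the sketch below uses the classical Gaussian identity instead of the paper's uniform-marginal constants: a bivariate Gaussian with linear correlation ρ has MI equal to −½ ln(1 − ρ²) nats, and for Gaussian marginals this is the minimum MI at fixed ρ, hence a lower bound. The function names and example data are illustrative, not from the paper.

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks.

    Ranking maps each variable onto a uniform marginal distribution,
    which is the transformation described in the abstract.
    Assumes no ties in the samples.
    """
    rx = np.argsort(np.argsort(x))  # ranks 0..n-1
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

def gaussian_mi_bound(rho):
    """MI of a bivariate Gaussian with linear correlation rho (in nats).

    Among distributions with Gaussian marginals and fixed rho, the joint
    Gaussian maximizes joint entropy and therefore minimizes MI, so this
    value is a lower bound on MI in the Gaussian-marginal case.
    """
    return -0.5 * np.log(1.0 - rho ** 2)

# Illustrative example: a nonlinear but monotone relationship, for which
# rank-based correlation captures the dependence well.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = x ** 3 + 0.1 * rng.normal(size=2000)
rho_s = spearman_rho(x, y)
print(f"Spearman rho = {rho_s:.3f}, "
      f"Gaussian-form MI bound = {gaussian_mi_bound(rho_s):.3f} nats")
```

A large gap between such a bound and an MI estimate (e.g. from the Kraskov et al. estimator cited above) would indicate dependence that is not monotone, which is the kind of diagnostic use of (non)saturation the abstract describes.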

  • Received 6 August 2010

DOI:https://doi.org/10.1103/PhysRevE.83.010101

© 2011 American Physical Society

Authors & Affiliations

David V. Foster1 and Peter Grassberger2,3

  • 1Complexity Science Group, University of Calgary, Calgary, Canada T2N 1N4
  • 2Complexity Science Group, University of Calgary, Calgary, Canada
  • 3John von Neumann Institut für Computing, FZ Jülich, D-52425 Jülich, Germany

Issue

Vol. 83, Iss. 1 — January 2011
