Mutual information and conditional mean prediction error
Bowsher, CG; Voliotis, M
Date: 26 July 2014
Type: Article
Journal: arXiv
Publisher: arXiv.org
Abstract
Mutual information is fundamentally important for measuring statistical dependence between variables and for quantifying information transfer by signaling and communication mechanisms. It can, however, be challenging to evaluate for physical models of such mechanisms and to estimate reliably from data. Furthermore, its relationship to better known statistical procedures is still poorly understood. Here we explore new connections between mutual information and regression-based dependence measures, $\nu^{-1}$, that utilise the determinant of the second-moment matrix of the conditional mean prediction error. We examine convergence properties as $\nu\rightarrow0$ and establish sharp lower bounds on mutual information and capacity of the form $\log(\nu^{-1/2})$. The bounds are tighter than lower bounds based on the Pearson correlation and ones derived using average mean-square-error rate-distortion arguments. Furthermore, their estimation is feasible using techniques from nonparametric regression. As an illustration we provide bootstrap confidence intervals for the lower bounds which, through use of a composite estimator, substantially improve upon inference about mutual information based on $k$-nearest neighbour estimators alone.
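As a hedged illustration of the bound (the exact normalisation of $\nu$ is an assumption here; the abstract specifies only that $\nu^{-1}$ uses the determinant of the second-moment matrix of the conditional mean prediction error): writing $\tilde{X} = X - \mathbb{E}[X]$ for the centred response and $E = X - \mathbb{E}[X\mid Y]$ for the conditional mean prediction error, one natural form is

$$ \nu \;=\; \frac{\det \mathbb{E}\left[E E^{\top}\right]}{\det \mathbb{E}\left[\tilde{X}\tilde{X}^{\top}\right]}, \qquad I(X;Y) \;\geq\; \log\left(\nu^{-1/2}\right) \;=\; -\tfrac{1}{2}\log\nu. $$

For jointly Gaussian scalar $(X,Y)$ with correlation $\rho$ this gives $\nu = 1-\rho^{2}$, so the bound equals the exact mutual information $-\tfrac{1}{2}\log(1-\rho^{2})$, consistent with the claim that it is at least as tight as Pearson-correlation-based bounds.

The sketch below estimates this bound from samples, with $k$-nearest-neighbour regression standing in for the nonparametric conditional-mean estimator, plus a percentile bootstrap interval. The function name, the assumed normalisation of $\nu$, and the choice of regressor are illustrative; this is not the paper's composite estimator.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def mi_lower_bound(x, y, n_neighbors=10):
    """Estimate log(nu^{-1/2}) = -0.5*log(nu) in nats from paired samples."""
    x = x.reshape(len(x), -1)          # responses,  shape (n, dx)
    y = y.reshape(len(y), -1)          # predictors, shape (n, dy)
    # Nonparametric regression estimate of the conditional mean E[X | Y]
    # (in-sample prediction, so slightly optimistic for small samples).
    cond_mean = KNeighborsRegressor(n_neighbors=n_neighbors).fit(y, x).predict(y)
    resid = x - cond_mean              # conditional mean prediction error
    centred = x - x.mean(axis=0)
    # Determinants of the second-moment matrices define nu (assumed form).
    nu = (np.linalg.det(resid.T @ resid / len(x))
          / np.linalg.det(centred.T @ centred / len(x)))
    return -0.5 * np.log(nu)

# Bivariate Gaussian check: true I(X;Y) = -0.5*log(1 - rho^2).
rng = np.random.default_rng(0)
rho, n = 0.8, 5000
y = rng.standard_normal(n)
x = rho * y + np.sqrt(1 - rho**2) * rng.standard_normal(n)
print(f"bound   = {mi_lower_bound(x, y):.3f} nats")
print(f"true MI = {-0.5 * np.log(1 - rho**2):.3f} nats")

# Percentile bootstrap confidence interval for the bound.
boot = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    boot.append(mi_lower_bound(x[idx], y[idx]))
print("95% CI:", np.percentile(boot, [2.5, 97.5]))
```

Estimated this way the bound is a conservative estimate of $I(X;Y)$: it is tight in the Gaussian case above and, unlike plug-in $k$-NN mutual-information estimators, it reduces the problem to a nonparametric regression.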
Mathematics and Statistics
Faculty of Environment, Science and Economy