Pointwise relations between information and estimation in Gaussian noise
Kartik Venkat and Tsachy Weissman
Proceedings of the IEEE International Symposium on Information Theory, Cambridge, MA, USA, July 2012
Abstract

Many of the classical and recent relations between information and estimation in the presence of Gaussian noise can be viewed as identities between expectations of random quantities. These include the I-MMSE relationship of Guo et al.; the relative entropy and mismatched estimation relationship of Verdú; the relationship between causal estimation and mutual information of Duncan, and its extension to the presence of feedback by Kadota et al.; and the relationship between causal and non-causal estimation of Guo et al., along with its mismatched version due to Weissman. We dispense with the expectations and explore the nature of the pointwise relations between the respective random quantities. The pointwise relations that we find are stated as succinctly as the original expectation identities, and give considerable insight into them. As an illustration of our results, consider Duncan's 1970 discovery that the mutual information is equal to the causal MMSE in the AWGN channel, which can equivalently be expressed by saying that the difference between the input-output information density and half the causal estimation error is a zero-mean random variable (regardless of the distribution of the channel input). We characterize this random variable explicitly, rather than merely its expectation. Classical estimation and information-theoretic quantities emerge with new and surprising roles. For example, the variance of this random variable turns out to be given by the causal MMSE (which, in turn, is equal to the mutual information by Duncan's result).
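To make the illustration concrete, here is a sketch of the continuous-time setting behind Duncan's identity, under a unit-SNR normalization chosen for this note (the paper's own constants and notation may differ). For the channel $Y_t = \int_0^t X_s\,ds + W_t$, $0 \le t \le T$, with $W$ a standard Brownian motion and $\hat{X}_t = \mathbb{E}[X_t \mid Y_0^t]$ the causal MMSE estimate, Duncan's identity reads
\[
I(X_0^T; Y_0^T) \;=\; \frac{1}{2}\,\mathbb{E}\!\left[\int_0^T \big(X_t - \hat{X}_t\big)^2\, dt\right].
\]
The pointwise refinement characterizes the random difference behind this expectation identity; expanding the information density via Kailath's likelihood-ratio formula yields
\[
\imath(X_0^T; Y_0^T) \;-\; \frac{1}{2}\int_0^T \big(X_t - \hat{X}_t\big)^2\, dt \;=\; \int_0^T \big(X_t - \hat{X}_t\big)\, dW_t,
\]
an Itô integral against the channel noise, which has zero mean and whose variance equals $\mathbb{E}[\int_0^T (X_t - \hat{X}_t)^2\, dt]$, the causal MMSE.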
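As a purely illustrative numerical check (not part of the paper), the following Python sketch simulates the simplest instance: a constant Gaussian input $X \sim \mathcal{N}(0, \sigma^2)$ at unit SNR, for which the causal estimate has the closed form $\hat{X}_t = \sigma^2 Y_t / (1 + \sigma^2 t)$. It estimates by Monte Carlo the mean and variance of the pointwise difference above; all parameter choices here are arbitrary, and constants depend on the SNR normalization.

import numpy as np

# Illustrative Monte Carlo check (not from the paper) of the pointwise
# refinement of Duncan's identity at unit SNR: dY_t = X dt + dW_t with a
# constant Gaussian input X ~ N(0, sigma2), whose causal MMSE estimate
# has the closed form Xhat_t = sigma2 * Y_t / (1 + sigma2 * t).
rng = np.random.default_rng(0)
sigma2, T, n_steps, n_paths = 1.0, 1.0, 500, 5000
dt = T / n_steps
t = np.arange(n_steps) * dt                                  # left endpoints (Ito convention)

X = rng.normal(0.0, np.sqrt(sigma2), size=n_paths)           # channel input, one per path
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))   # Brownian increments
dY = X[:, None] * dt + dW                                    # channel output increments

# Output path sampled at left endpoints, and the causal estimate along it.
Y_left = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dY, axis=1)[:, :-1]])
Xhat = sigma2 * Y_left / (1.0 + sigma2 * t)

err2 = (X[:, None] - Xhat) ** 2
cee = err2.sum(axis=1) * dt                                  # causal squared error per path

# Information density via Kailath's likelihood-ratio formula:
#   i = int (X - Xhat) dY - (1/2) int (X^2 - Xhat^2) dt
info_dens = ((X[:, None] - Xhat) * dY).sum(axis=1) \
            - 0.5 * (X[:, None] ** 2 - Xhat ** 2).sum(axis=1) * dt

D = info_dens - 0.5 * cee                                    # the pointwise difference
print("mean of D  :", D.mean(), "(expected ~ 0)")
print("var of D   :", D.var(), "(expected ~ cmmse)")
print("cmmse      :", cee.mean(), "(closed form: log(1 + sigma2*T) =", np.log1p(sigma2 * T), ")")
print("mutual info:", 0.5 * np.log1p(sigma2 * T))

The per-path difference D collapses, after substituting dY = X dt + dW, to the discretized Itô sum of (X - Xhat) against dW, so its sample mean should vanish and its sample variance should track the empirical causal MMSE.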