Information Theory from a Functional Viewpoint
Ph.D. Dissertation, Princeton University, January 2018
Abstract

A perennial theme of information theory is to find new methods for determining the fundamental limits of various communication systems, which potentially helps engineers find better designs by eliminating deficient ones. Traditional methods have focused on the notion of “sets”: the method of types concerns the cardinality of subsets of typical sets; the blowing-up lemma bounds the probability of the neighborhood of decoding sets; the single-shot (information-spectrum) approach uses a likelihood threshold to define sets. This thesis promotes the idea of deriving fundamental limits using functional inequalities, where the central notion is “functions” instead of “sets”. A functional inequality follows from the entropic definition of an information measure by convex duality; for example, the Gibbs variational formula arises as the Legendre transform of the relative entropy.
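Concretely, in one standard formulation (the notation here is chosen for illustration and may differ from the body of the thesis): for a reference distribution $Q$ and any bounded measurable function $f$,
\[
  \log \mathbb{E}_Q\big[e^{f}\big] \;=\; \sup_{P \ll Q} \Big\{ \mathbb{E}_P[f] - D(P\,\|\,Q) \Big\},
\]
with the supremum attained by the tilted distribution $\mathrm{d}P^{*} \propto e^{f}\,\mathrm{d}Q$. In other words, the relative entropy $D(\cdot\,\|\,Q)$ and the log-moment-generating functional are convex conjugates; this is the prototype of the dualities exploited throughout the thesis.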

As a first example, we propose a new methodology for deriving converse (i.e. impossibility) bounds based on convex duality and the reverse hypercontractivity of Markov semigroups. This methodology is broadly applicable to network information theory, and in particular resolves the optimal scaling of the second-order rate for the previously open “side-information problems”. As a second example, we use the functional inequality for the so-called Eγ metric to prove non-asymptotic achievability (i.e. existence) bounds for several problems, including source coding, wiretap channels, and mutual covering.
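For reference, the Eγ metric admits the following standard definition (stated here in one common form; the thesis may use a different normalization): for distributions $P$, $Q$ and $\gamma \ge 1$,
\[
  E_\gamma(P\,\|\,Q) \;=\; \sup_{A} \big\{ P(A) - \gamma\, Q(A) \big\} \;=\; \int \big(\mathrm{d}P - \gamma\,\mathrm{d}Q\big)^{+},
\]
which reduces to the total variation distance at $\gamma = 1$. Its dual (functional) form bounds expectations of test functions rather than probabilities of sets, which is what makes it amenable to the functional viewpoint.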

Along the way, we derive general convex duality results leading to a unified treatment of many inequalities and information measures, including the Brascamp-Lieb inequality and its reverse, strong data processing inequalities, hypercontractivity and its reverse, transportation-cost inequalities, and Rényi divergences. Capitalizing on such dualities, we demonstrate information-theoretic approaches to certain properties of functional inequalities, such as Gaussian optimality. This direction is the antithesis of the main thesis: information-theoretic approaches to functional inequalities, rather than functional approaches to information theory.
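One representative instance of such a duality, in a standard formulation from the literature (stated here for orientation; the thesis treats more general settings and reverse versions): the best constant $C$ in the Euclidean Brascamp-Lieb inequality
\[
  \int_{\mathbb{R}^n} \prod_{j=1}^{m} f_j(B_j x)^{c_j}\,\mathrm{d}x \;\le\; C \prod_{j=1}^{m} \Big(\int f_j\Big)^{c_j}
\]
is characterized entropically by
\[
  \log C \;=\; \sup_{X} \Big\{ h(X) - \sum_{j=1}^{m} c_j\, h(B_j X) \Big\},
\]
where $h$ denotes differential entropy and the supremum runs over random vectors $X$ on $\mathbb{R}^n$ with density. Dualities of this type are what the thesis unifies and extends.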