
Abbas El Gamal
Hitachi America Chair in the School of Engineering and Fortinet Founders Chair, Department of Electrical Engineering
Stanford University

Entropy

Definitions

Let \(X\) be a discrete random variable defined on a finite alphabet \(\mathcal{X}\) and with probability mass function \(p_X\). The entropy of \(X\) is the random variable \(H(X)\) defined by

\[ H(X)=\log\frac{1}{p_X(X)}.\] 

The base of the logarithm defines the unit of entropy. If the logarithm is to the base 2, the unit of entropy is the bit. If the logarithm is to the base \(e\), the unit of entropy is the nat.
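Since \(H(X)\) is defined here as a random variable, its expectation \(\mathrm{E}[H(X)]=\sum_{x\in\mathcal{X}}p_X(x)\log\frac{1}{p_X(x)}\) is the familiar average entropy. The following is a minimal sketch in Python (the pmf p and the function names are illustrative assumptions, not part of these notes) that evaluates one realization of \(H(X)\) and its expectation.

import math
import random

# Illustrative pmf over a finite alphabet, given as {symbol: probability}.
p = {"a": 0.5, "b": 0.25, "c": 0.25}

def entropy_rv(x, p, base=2):
    # Realization of the random variable H(X) = log 1/p_X(X) at outcome x.
    # base=2 gives bits; base=math.e gives nats.
    return math.log(1.0 / p[x], base)

def expected_entropy(p, base=2):
    # Expectation E[H(X)] = sum_x p(x) log 1/p(x), the average entropy.
    return sum(px * math.log(1.0 / px, base) for px in p.values() if px > 0)

x = random.choices(list(p), weights=list(p.values()))[0]  # draw X ~ p
print(entropy_rv(x, p))     # 1.0 or 2.0 bits, depending on the draw
print(expected_entropy(p))  # 1.5 bits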

Mutual Information

Definitions

Let \(X\) and \(Y\) be discrete random variables defined on finite alphabets \(\mathcal{X}\) and \(\mathcal{Y}\), respectively, and with joint probability mass function \(p_{X,Y}\). The mutual information of \(X\) and \(Y\) is the random variable \(I(X,Y)\) defined by

\[ I(X,Y) = \log\frac{p_{X,Y}(X,Y)}{p_X(X)p_Y(Y)}.\]

As with entropy, the base of the logarithm defines the unit of mutual information. If the logarithm is to the base 2, the unit is the bit; if the logarithm is to the base \(e\), the unit is the nat.
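Again, the expectation \(\mathrm{E}[I(X,Y)]=\sum_{x,y}p_{X,Y}(x,y)\log\frac{p_{X,Y}(x,y)}{p_X(x)p_Y(y)}\) is the familiar average mutual information, which is nonnegative even though individual realizations of \(I(X,Y)\) can be negative. Below is a minimal sketch in Python (the joint pmf p_xy and the helper names are illustrative assumptions, not part of these notes) that evaluates a realization of \(I(X,Y)\) and its expectation.

import math

# Illustrative joint pmf over finite alphabets, given as {(x, y): probability}.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def marginals(p_xy):
    # Marginal pmfs p_X and p_Y computed from the joint pmf.
    px, py = {}, {}
    for (x, y), p in p_xy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return px, py

def mi_rv(x, y, p_xy, base=2):
    # Realization of I(X, Y) = log p_{X,Y}(X, Y) / (p_X(X) p_Y(Y)) at (x, y).
    px, py = marginals(p_xy)
    return math.log(p_xy[(x, y)] / (px[x] * py[y]), base)

def expected_mi(p_xy, base=2):
    # Expectation E[I(X, Y)], the average mutual information.
    px, py = marginals(p_xy)
    return sum(p * math.log(p / (px[x] * py[y]), base)
               for (x, y), p in p_xy.items() if p > 0)

print(mi_rv(0, 0, p_xy))   # log2(0.4 / 0.25), about 0.678 bits
print(expected_mi(p_xy))   # about 0.278 bits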

Online Committee Report, ITA 2013

Summary

The website ran smoothly and consistently until February 5th, when it began experiencing significant slowdowns and HTTP 504 errors. The problem was fixed on February 8th and the website is up and running again, but the root cause has not yet been precisely identified. The developers have scheduled additional time next week to address the issue, and the Online Committee will provide more information then.


The main topics covered in this report are the following.