B2.14

Information Theory | Part II, 2004

For integer-valued random variables $X$ and $Y$, define the relative entropy $h_Y(X)$ of $X$ relative to $Y$.
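For reference, the standard definition (writing $p_x = \mathbb{P}(X=x)$ and $q_x = \mathbb{P}(Y=x)$, and assuming $q_x > 0$ whenever $p_x > 0$) is

$$h_Y(X) = \sum_x \mathbb{P}(X=x) \log \frac{\mathbb{P}(X=x)}{\mathbb{P}(Y=x)},$$

i.e. the Kullback–Leibler divergence of the law of $X$ from that of $Y$; the course's own conventions (e.g. the base of the logarithm) may differ.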

Prove that $h_Y(X) \geqslant 0$, with equality if and only if $\mathbb{P}(X=x) = \mathbb{P}(Y=x)$ for all $x$.
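A sketch of the standard argument, via Jensen's inequality for the strictly concave function $\log$:

$$-h_Y(X) = \sum_x p_x \log\frac{q_x}{p_x} \leqslant \log\!\left(\sum_x p_x \,\frac{q_x}{p_x}\right) = \log \sum_x q_x \leqslant \log 1 = 0,$$

where the sums run over those $x$ with $p_x > 0$. By strict concavity, the first inequality is an equality only if $q_x/p_x$ is constant on $\{x : p_x > 0\}$, and the second only if $\sum_{x : p_x > 0} q_x = 1$; together these force $p_x = q_x$ for all $x$.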

By considering $Y$, a geometric random variable with parameter chosen appropriately, show that if the mean $\mathbb{E}X = \mu < \infty$, then

$$h(X) \leqslant (\mu+1)\log(\mu+1) - \mu\log\mu,$$

with equality if $X$ is geometric.
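One way the intended computation can go (a sketch, taking $Y$ geometric on $\{0, 1, 2, \dots\}$ with $\mathbb{P}(Y=x) = \frac{1}{\mu+1}\big(\frac{\mu}{\mu+1}\big)^x$, so that $\mathbb{E}Y = \mu$; here $h(X) = -\sum_x p_x \log p_x$ denotes the entropy of $X$): expanding the definition,

$$0 \leqslant h_Y(X) = -h(X) - \sum_x p_x \log \mathbb{P}(Y=x) = -h(X) + \log(\mu+1) + \mathbb{E}X\,\big(\log(\mu+1) - \log\mu\big),$$

and since $\mathbb{E}X = \mu$ the right-hand side equals $-h(X) + (\mu+1)\log(\mu+1) - \mu\log\mu$, which rearranges to the stated bound. If $X$ is itself geometric with mean $\mu$, then $X$ and $Y$ have the same law, so $h_Y(X) = 0$ and equality holds.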
