1.II.12G

Coding and Cryptography | Part II, 2008

State Shannon's Noisy Coding Theorem for a binary symmetric channel.
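For reference, a standard formulation (a sketch, not necessarily the examiners' preferred wording): for a binary symmetric channel with crossover probability $p$, writing $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$, the capacity is

$$C = 1 - H(p),$$

and the theorem asserts that for every rate $R < C$ and every $\varepsilon > 0$, for all sufficiently large $n$ there exist codes of length $n$ and rate at least $R$ whose probability of decoding error is less than $\varepsilon$.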

Define the mutual information of two discrete random variables $X$ and $Y$. Prove that the mutual information is symmetric and non-negative. Define also the information capacity of a channel.
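As a reminder, the usual definition (a sketch in standard notation, which may differ from the course's) is

$$I(X;Y) = \sum_{x,y} \mathbb{P}(X=x,\,Y=y)\,\log \frac{\mathbb{P}(X=x,\,Y=y)}{\mathbb{P}(X=x)\,\mathbb{P}(Y=y)} = H(X) + H(Y) - H(X,Y),$$

from which symmetry in $X$ and $Y$ is immediate; non-negativity follows from Gibbs' inequality. The information capacity is $C = \max_{p_X} I(X;Y)$, the maximum taken over all input distributions.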

A channel transmits numbers chosen from the alphabet $\mathcal{A}=\{0,1,2\}$ and has transition matrix

$$\left(\begin{array}{ccc} 1-2\beta & \beta & \beta \\ \beta & 1-2\beta & \beta \\ \beta & \beta & 1-2\beta \end{array}\right)$$

for a number $\beta$ with $0 \leqslant \beta \leqslant \frac{1}{3}$. Calculate the information capacity of the channel.
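One standard route to the answer (a sketch, assuming logarithms to base 2): the matrix is symmetric, with each row and column a permutation of $(1-2\beta, \beta, \beta)$, so a uniform input distribution attains the maximum of $I(X;Y)$ and makes $Y$ uniform. Then

$$C = H(Y) - H(Y \mid X) = \log_2 3 + (1-2\beta)\log_2(1-2\beta) + 2\beta\log_2\beta.$$

As a sanity check, this gives $C = \log_2 3$ at $\beta = 0$ (a noiseless channel) and $C = 0$ at $\beta = \frac{1}{3}$ (output independent of input).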
