1.II.12G
State Shannon's Noisy Coding Theorem for a binary symmetric channel.
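For reference, a standard formulation of the requested statement (the exact phrasing and error-probability convention expected by the examiners may differ) is:

```latex
% Shannon's Noisy Coding Theorem for a binary symmetric channel
% with error probability p, 0 <= p < 1/2 (standard formulation).
\begin{theorem}[Shannon's Noisy Coding Theorem, BSC]
Let $C = 1 - H(p)$, where
$H(p) = -p\log_2 p - (1-p)\log_2(1-p)$.
Then for every rate $R < C$ and every $\varepsilon > 0$, for all
sufficiently large $n$ there exists a binary code of length $n$ and
rate at least $R$ whose error probability is less than $\varepsilon$.
Conversely, reliable transmission at any rate $R > C$ is impossible.
\end{theorem}
```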
Define the mutual information of two discrete random variables X and Y. Prove that the mutual information is symmetric and non-negative. Define also the information capacity of a channel.
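A standard set of definitions and a proof sketch for this part (the notation X, Y is assumed; the course may state these slightly differently) runs as follows:

```latex
% Mutual information of discrete random variables X, Y:
I(X;Y) \;=\; \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}
       \;=\; H(X) - H(X\mid Y).
% Symmetry: the double sum is unchanged under swapping the roles of
% X and Y, so I(X;Y) = I(Y;X).
% Non-negativity: Gibbs' inequality states that for probability
% distributions (p_i), (q_i),
%   \sum_i p_i \log\frac{p_i}{q_i} \;\ge\; 0,
% with equality iff p_i = q_i for all i. Applying it with
% p_i = p(x,y) and q_i = p(x)p(y) gives I(X;Y) >= 0, with equality
% iff X and Y are independent.
% Information capacity of a channel:
C \;=\; \max_{p_X} I(X;Y),
% the maximum of the mutual information over all input distributions.
```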
A channel transmits numbers chosen from the alphabet and has transition matrix
for a number with . Calculate the information capacity of the channel.