Paper 2, Section I, K

Coding and Cryptography | Part II, 2021

State Shannon's noisy coding theorem for a binary symmetric channel, defining the terms involved.

Suppose a channel matrix, with output alphabet of size $n$, is such that the entries in each row are the elements of the set $\{p_1, \ldots, p_n\}$ in some order. Further suppose that all columns are permutations of one another. Show that the channel's information capacity $C$ is given by

$$C = \log n + \sum_{i=1}^{n} p_{i} \log p_{i}$$
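
As an informal aside (not part of the question), the closed-form expression can be evaluated directly. The sketch below is a minimal illustration, assuming base-2 logarithms (the question writes $\log$ without a base, but the stated answers are consistent with bits); the function name `closed_form_capacity` is ours, not from the source.

```python
import numpy as np

def closed_form_capacity(row):
    """Capacity in bits of a channel whose rows are permutations of `row`
    and whose columns are permutations of one another:
    C = log2(n) + sum_i p_i log2(p_i).
    (Base-2 logs are an assumption; the question just writes `log`.)"""
    p = np.asarray(row, dtype=float)
    return np.log2(p.size) + np.sum(p * np.log2(p))

# Example: binary symmetric channel with error probability 0.1,
# which recovers the familiar value 1 - H(0.1).
print(closed_form_capacity([0.9, 0.1]))  # ≈ 0.531 bits
```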

Show that the information capacity of the channel matrix

$$\left(\begin{array}{cccc} \frac{1}{3} & \frac{1}{3} & \frac{1}{6} & \frac{1}{6} \\ \frac{1}{6} & \frac{1}{6} & \frac{1}{3} & \frac{1}{3} \end{array}\right)$$

is given by $C = \frac{5}{3} - \log 3$.
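
As a quick numerical sanity check (again not part of the examined solution), the sketch below applies the closed-form expression to this matrix and also brute-forces $\max I(X;Y)$ over input distributions $(a, 1-a)$; both agree with $\frac{5}{3} - \log_2 3 \approx 0.0817$ bits. Base-2 logarithms and the helper name `H` are our assumptions.

```python
import numpy as np

def H(p):
    """Shannon entropy in bits, ignoring zero entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rows = np.array([[1/3, 1/3, 1/6, 1/6],
                 [1/6, 1/6, 1/3, 1/3]])

# Closed-form value: C = log n + sum_i p_i log p_i, with n = 4 here.
closed_form = np.log2(4) + np.sum(rows[0] * np.log2(rows[0]))

# Brute force: maximise I(X;Y) = H(Y) - H(Y|X) over inputs (a, 1-a).
best = max(
    H(a * rows[0] + (1 - a) * rows[1])
    - (a * H(rows[0]) + (1 - a) * H(rows[1]))
    for a in np.linspace(0, 1, 10001)
)

print(closed_form, best, 5/3 - np.log2(3))  # all ≈ 0.0817 bits
```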
