Information theory and its relationship to probability, statistics, and data compression;
entropy, relative entropy and mutual information;
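The three core quantities above can be illustrated with a short sketch. This is not course material, just a minimal illustration; the function names and the 2-D-list representation of a joint distribution are our own choices.

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a probability distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; requires q[i] > 0 wherever p[i] > 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

def mutual_information(joint):
    """I(X;Y) from a joint distribution given as a 2-D list of probabilities."""
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    return sum(
        joint[i][j] * math.log2(joint[i][j] / (px[i] * py[j]))
        for i in range(len(joint))
        for j in range(len(joint[0]))
        if joint[i][j] > 0
    )
```

For example, a fair coin has entropy 1 bit, and mutual information vanishes exactly when X and Y are independent.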
Huffman coding, arithmetic coding and Lempel-Ziv coding;
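Of the three source-coding schemes listed, Huffman coding is the easiest to sketch: repeatedly merge the two least-frequent subtrees. The following is an illustrative implementation under our own conventions (symbols are assumed not to be tuples, and ties are broken by insertion order), not the course's reference code.

```python
import heapq

def huffman_code(freqs):
    """Build a binary prefix code from symbol frequencies {symbol: weight}."""
    # Heap entries: (weight, tiebreak, tree); a tree is a symbol or a (left, right) pair.
    heap = [(w, i, s) for i, (s, w) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol alphabet
        return {heap[0][2]: "0"}
    count = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)  # two lightest subtrees
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):      # internal node: branch 0 / 1
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                            # leaf: record the codeword
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes
```

The resulting code is prefix-free, and more frequent symbols receive shorter codewords.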
channel capacity; group codes; generator and parity check matrices;
Hamming codes and the Hamming bound; bounds on the dimension of a linear code; random coding bounds; code construction.
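Generator and parity-check matrices and Hamming codes come together in the classic (7,4) Hamming code, sketched below over GF(2). This is an illustrative example in our own systematic form (G = [I | P], H = [P^T | I]), not material supplied by the course.

```python
# Systematic (7,4) Hamming code over GF(2); G and H satisfy G H^T = 0 (mod 2).
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(m):
    """Encode 4 message bits into a 7-bit codeword: c = m G (mod 2)."""
    return [sum(m[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

def decode(r):
    """Correct up to one bit error using the syndrome s = H r^T (mod 2)."""
    s = [sum(H[i][j] * r[j] for j in range(7)) % 2 for i in range(3)]
    if any(s):
        # A nonzero syndrome equals the column of H at the error position.
        for j in range(7):
            if [H[i][j] for i in range(3)] == s:
                r = r[:]
                r[j] ^= 1
                break
    return r[:4]  # systematic code: the first 4 bits are the message
```

Because the columns of H are exactly the seven nonzero binary triples, every single-bit error produces a distinct syndrome, so the code corrects any one error, meeting the Hamming bound with equality (a perfect code).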