General Prerequisites: Part A Probability would be helpful, but not essential.
Course Overview: The aim of the course is to investigate methods for the communication of information from a sender, along a channel of some kind, to a receiver. If errors are not a concern we are interested in codes that yield fast communication; if the channel is noisy we are interested in achieving both speed and reliability. A key concept is that of information as reduction in uncertainty. The highlight of the course is Shannon's Noisy Coding Theorem.
Lecturer(s):
Prof. Harald Oberhauser
Learning Outcomes: (i) Know what the various forms of entropy are, and be able to manipulate them.
(ii) Know what data compression and source coding are, and be able to carry them out.
(iii) Know what channel coding and channel capacity are, and be able to apply them.
Course Synopsis: Uncertainty (entropy); conditional uncertainty; information. Chain rules; relative entropy; Gibbs' inequality; asymptotic equipartition and typical sequences. Instantaneous and uniquely decipherable codes; the noiseless coding theorem for discrete memoryless sources; constructing compact codes.
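The link between entropy and compact codes listed above can be illustrated with a small sketch (not part of the course materials): computing the Shannon entropy of a source and the expected length of a binary Huffman code for it, which the noiseless coding theorem bounds by H <= L < H + 1. The function names and the example distribution are illustrative choices, not anything fixed by the synopsis.

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(p) in bits of a probability distribution p."""
    return -sum(x * log2(x) for x in p if x > 0)

def huffman_lengths(p):
    """Codeword lengths of a binary Huffman (compact) code for distribution p."""
    # Heap of (probability, tie-breaking id, list of symbol indices in the node).
    heap = [(pi, i, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # merge the two least likely nodes
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1               # every symbol in the merge gains one bit
        heapq.heappush(heap, (p1 + p2, len(p) + len(heap), s1 + s2))
    return lengths

# A dyadic source: Huffman coding achieves the entropy exactly here.
p = [0.5, 0.25, 0.125, 0.125]
H = entropy(p)                                                  # 1.75 bits
L = sum(pi * li for pi, li in zip(p, huffman_lengths(p)))       # 1.75 bits
```

For non-dyadic distributions L is strictly between H and H + 1, which is the content of the noiseless coding theorem for a discrete memoryless source.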
The discrete memoryless channel; decoding rules; the capacity of a channel. The noisy coding theorem for discrete memoryless channels, in particular the binary symmetric channel.
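For the binary symmetric channel mentioned above, the capacity has the closed form C = 1 - H(p), where H is the binary entropy function and p the crossover probability. A minimal sketch (function names are illustrative, not from the course):

```python
from math import log2

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p), in bits per use, of a binary symmetric
    channel with crossover probability p."""
    return 1 - h2(p)

# p = 0 gives a noiseless channel (1 bit per use); p = 1 flips every bit,
# which is just as good after relabelling; p = 0.5 destroys all information.
```

The noisy coding theorem then says that any rate below C is achievable with arbitrarily small error probability, and no rate above C is.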
Extensions to more general sources and channels.