General Prerequisites:
Part A Probability would be very helpful, but not essential.
Course Term: Hilary
Course Lecture Information: 16 lectures
Course Weight: 1
Course Level: H
Assessment Type: Written Examination
Course Overview:
Information theory is a relatively young subject. It played an important role in the rise of the current information/digital/computer age and still motivates much research in diverse fields such as statistics and machine learning, physics, computer science and engineering. Every time you make a phone call, store a file on your computer, query an internet search engine, watch a DVD, stream a movie, or listen to a CD or mp3 file, algorithms based on the topics discussed in this course are at work. Independently of such applications, however, the underlying mathematical objects arise naturally as soon as one starts to think about "information" in a mathematically rigorous way. In fact, a large part of the course deals with two fundamental questions:

How much information is contained in a signal/data/message? (source coding)
What are the limits to information transfer over a channel that is subject to noisy perturbations? (channel coding)
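Shannon's answer to the first question is the entropy of the source. As a minimal sketch (the helper below is illustrative, not part of the course materials), the entropy in bits of a discrete distribution is:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))  # 1.0: a fair coin carries one bit per toss
print(entropy([0.9, 0.1]))  # ~0.469: a biased coin carries less
```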
Learning Outcomes:
The student will have learned about entropy, mutual information and divergence, their basic properties, and how they relate to information transmission; the fundamentals of block, symbol and channel coding; and the theoretical limits on transmitting information imposed by noise.
Course Synopsis:
(Conditional) entropy, mutual information, divergence, and their basic properties and inequalities (Fano's inequality, Gibbs' inequality).
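Several of these quantities and inequalities are easy to check numerically. The sketch below (helper names are my own, not course code) computes divergence, verifies Gibbs' inequality D(p||q) >= 0 with equality iff p = q, and uses the identity I(X;Y) = D(p_XY || p_X p_Y) to compute mutual information:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(pxy):
    """I(X;Y) = D(p_XY || p_X p_Y) for a joint distribution given as a matrix."""
    px = [sum(row) for row in pxy]              # marginal of X
    py = [sum(col) for col in zip(*pxy)]        # marginal of Y
    joint = [p for row in pxy for p in row]     # row-major flattening
    product = [a * b for a in px for b in py]   # same row-major order
    return kl_divergence(joint, product)

# Gibbs' inequality: D(p || q) >= 0, with equality iff p == q.
p, q = [0.7, 0.3], [0.5, 0.5]
assert kl_divergence(p, q) >= 0
assert abs(kl_divergence(p, p)) < 1e-12

# A noiseless binary channel: Y = X, so I(X;Y) = H(X) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0
```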
(Strong and weak) typical sequences: the asymptotic equipartition property, and applications to block coding.
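The asymptotic equipartition property says that for an i.i.d. source, -(1/n) log2 p(X_1, ..., X_n) converges in probability to the entropy H(X), so roughly 2^(nH) "typical" sequences carry almost all the probability. A small empirical sketch (parameters chosen arbitrarily):

```python
import math
import random

random.seed(0)
p = 0.2                                               # Bernoulli(p) source
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # ~0.722 bits/symbol

def per_symbol_log_prob(n):
    """-(1/n) log2 p(x^n) for one random length-n sequence from the source."""
    x = [random.random() < p for _ in range(n)]
    logp = sum(math.log2(p) if xi else math.log2(1 - p) for xi in x)
    return -logp / n

# By the AEP, the per-symbol log-probability concentrates around H(X).
for n in (10, 100, 10_000):
    print(n, round(per_symbol_log_prob(n), 3), "vs H =", round(H, 3))
```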
Symbol codes: the Kraft-McMillan inequality, optimality, various symbol codes and their construction and complexity.
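Huffman's construction, one standard symbol code, produces an optimal prefix code whose lengths satisfy the Kraft inequality sum_i 2^(-l_i) <= 1 and whose expected length L satisfies H <= L < H + 1. A minimal sketch (my own code, computing only the codeword lengths):

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    # Heap entries: (subtree probability, unique tie-breaker, symbols in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tie = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)  # merge the two least likely subtrees;
        p2, _, s2 = heapq.heappop(heap)  # each merge adds one bit to their codewords
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
        tie += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)           # [1, 2, 3, 3] for this dyadic source
assert sum(2 ** -l for l in lengths) <= 1  # Kraft inequality
H = -sum(p * math.log2(p) for p in probs)
L = sum(p * l for p, l in zip(probs, lengths))
print(L, H)  # here L == H == 1.75, since the probabilities are powers of 2
```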
Channel coding: discrete memoryless channels, channel codes/rates/errors, Shannon's noisy channel coding theorem.
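Shannon's theorem guarantees reliable communication at any rate below the channel capacity C = max I(X;Y), the maximum taken over input distributions. For the binary symmetric channel with crossover probability p this maximum has the closed form C = 1 - H_2(p), which the sketch below (my own helpers) evaluates:

```python
import math

def binary_entropy(p):
    """H_2(p) = -p log2(p) - (1-p) log2(1-p), the entropy of a biased coin."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0: a noiseless channel carries one bit per use
print(bsc_capacity(0.1))  # ~0.531 bits per use
print(bsc_capacity(0.5))  # 0.0: pure noise carries nothing
```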