B8.4 Information Theory - Material for the year 2019-2020

Lecturer(s): 
Dr Robin Stephenson
Dr Christoph Koch
General Prerequisites: 

Part A Probability would be very helpful, but not essential.

Course Term: 
Michaelmas
Course Lecture Information: 

16 lectures

Course Weight: 
1.00 unit(s)
Course Level: 
H

Assessment type:

Course Overview: 

Information theory is a relatively young subject. It played an important role in the rise of the current information/digital/computer age and still motivates much research in diverse fields such as statistics and machine learning, physics, computer science and engineering. Every time you make a phone call, store a file on your computer, query an internet search engine, watch a DVD, stream a movie, or listen to a CD or mp3 file, you rely on algorithms based on topics discussed in this course. However, independent of such applications, the underlying mathematical objects arise naturally as soon as one starts to think about "information" in a mathematically rigorous way. In fact, a large part of the course deals with two fundamental questions:

  1. How much information is contained in a signal/data/message? (source coding)
  2. What are the limits to information transfer over a channel that is subject to noisy perturbations? (channel coding)
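The first question is answered quantitatively by Shannon entropy: the average number of bits needed per symbol to describe a source. As a minimal illustration (a sketch, not part of the course materials), the following computes the entropy of a discrete distribution:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a discrete distribution given as a list of probabilities."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# A fair coin carries one bit of information per toss:
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so it carries less information per toss:
print(entropy([0.9, 0.1]))   # ≈ 0.469
# A deterministic source carries none:
print(entropy([1.0]))        # 0.0
```

Source coding (question 1) makes this precise: entropy is the fundamental limit on lossless compression of the source.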
Learning Outcomes: 

Students will have learned about entropy, mutual information and divergence, their basic properties, and how they relate to information transmission; will know the fundamentals of block, symbol and channel coding; and will understand the theoretical limits that noise places on the transmission of information.

Course Synopsis: 

(Conditional) entropy, mutual information, divergence, and their basic properties and inequalities (Fano's, Gibbs').
(Strong and weak) typical sequences: the asymptotic equipartition property, and applications to block coding.
Symbol codes: the Kraft-McMillan inequality, optimality, and various symbol codes, their construction and complexity.
Channel coding: discrete memoryless channels, channel codes/rates/errors, Shannon's noisy channel coding theorem.
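For a concrete instance of the channel coding theorem's content, the capacity of the binary symmetric channel with flip probability f is C = 1 - H(f), where H is the binary entropy function; Shannon's theorem says reliable communication is possible at any rate below C and at no rate above it. A minimal sketch (illustrative, not from the course notes):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(f):
    """Capacity C = 1 - H(f) of a binary symmetric channel with flip probability f."""
    return 1.0 - h2(f)

print(bsc_capacity(0.0))   # 1.0: a noiseless channel carries one bit per use
print(bsc_capacity(0.11))  # ≈ 0.5: about half a bit per use survives the noise
print(bsc_capacity(0.5))   # 0.0: the output is independent of the input
```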

Reading List: 
  1. H. Oberhauser, B8.4 Information theory (online lecture notes)
  2. T. Cover and J. Thomas, Elements of Information Theory (Wiley, 1991), Chapters 1-8, 11.
  3. D. MacKay, Information Theory, Inference, and Learning Algorithms (Cambridge, 2003)
Further Reading: 
  1. R. B. Ash, Information Theory (Dover, 1990).
  2. D. J. A. Welsh, Codes and Cryptography (Oxford University Press, 1988), Chapters 1-3, 5.
  3. G. Jones and J. M. Jones, Information and Coding Theory (Springer, 2000), Chapters 1-5.
  4. Y. Suhov & M. Kelbert, Information Theory and Coding by Example (Cambridge University Press, 2013), Relevant examples.