Mathematical Institute
Part A Probability would be helpful, but not essential.
The aim of the course is to investigate methods for the communication of information from a sender, along a channel of some kind, to a receiver. If errors are not a concern we are interested in codes that yield fast communication; if the channel is noisy we are interested in achieving both speed and reliability. A key concept is that of information as reduction in uncertainty. The highlight of the course is Shannon's Noisy Coding Theorem.
Prof. Harald Oberhauser
(i) Know what the various forms of entropy are, and be able to manipulate them.
(ii) Know what data compression and source coding are, and be able to carry them out.
(iii) Know what channel coding and channel capacity are, and be able to use them.
Uncertainty (entropy); conditional uncertainty; information. Chain rules; relative entropy; Gibbs' inequality; asymptotic equipartition and typical sequences. Instantaneous and uniquely decipherable codes; the noiseless coding theorem for discrete memoryless sources; constructing compact codes.
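As a small illustration of these topics (the function names and the example distribution below are our own, not part of the course materials), the following Python sketch computes the entropy of a discrete source and the expected length of a binary Huffman code for it; the noiseless coding theorem guarantees that this expected length lies between H and H + 1, and for a dyadic source it equals H exactly:

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution given as a dict."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def huffman_lengths(p):
    """Codeword lengths of a binary Huffman (compact) code for p."""
    # Heap items: (probability, tiebreak index, symbols in this subtree).
    heap = [(q, i, [s]) for i, (s, q) in enumerate(sorted(p.items()))]
    heapq.heapify(heap)
    lengths = {s: 0 for s in p}
    counter = len(heap)
    while len(heap) > 1:
        q1, _, syms1 = heapq.heappop(heap)
        q2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:  # each merge adds one bit to these codewords
            lengths[s] += 1
        heapq.heappush(heap, (q1 + q2, counter, syms1 + syms2))
        counter += 1
    return lengths

source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
H = entropy(source)
L = sum(source[s] * l for s, l in huffman_lengths(source).items())
print(H, L)  # dyadic source, so the Huffman code achieves L = H
```

For this source H = 1.75 bits, and the Huffman lengths (1, 2, 3, 3) give an expected length of exactly 1.75 bits per symbol.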
The discrete memoryless channel; decoding rules; the capacity of a channel. The noisy coding theorem for discrete memoryless sources and binary symmetric channels.
Extensions to more general sources and channels.
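A concrete instance of channel capacity covered by the course is the binary symmetric channel with crossover probability p, whose capacity is C = 1 - H(p) bits per use. A minimal sketch (function names are our own):

```python
from math import log2

def h2(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of the binary symmetric channel with
    crossover probability p; the noisy coding theorem says reliable
    communication is possible at any rate below C."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))   # pure noise: capacity 0
print(bsc_capacity(0.11))  # roughly 0.5 bits per use
```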
Part A Probability would be very helpful, but not essential.
Information theory is a relatively young subject. It played an important role in the rise of the current information/digital/computer age and still motivates much research in diverse fields such as statistics and machine learning, physics, computer science and engineering. Every time you make a phone call, store a file on your computer, query an internet search engine, watch a DVD, stream a movie, listen to a CD or mp3 file, etc., algorithms run that are based on topics we discuss in this course. However, independent of such applications, the underlying mathematical objects arise naturally as soon as one starts to think about "information" in a mathematically rigorous way. In fact, a large part of the course deals with two fundamental questions:
- How much information is contained in a signal/data/message? (source coding)
- What are the limits to information transfer over a channel that is subject to noisy perturbations? (channel coding)
Prof. Harald Oberhauser
The student will have learned about entropy, mutual information and divergence, their basic properties, and how they relate to information transmission; will know the fundamentals of block, symbol and channel coding; and will understand the theoretical limits on transmitting information imposed by noise.
(Conditional) entropy, mutual information, divergence and their basic properties and inequalities (Fano, Gibbs').
(Strong and weak) typical sequences: the asymptotic equipartition property, and applications to block coding.
Symbol codes: Kraft-McMillan, optimality, various symbol codes and their construction and complexity.
Channel coding: discrete memoryless channels, channel codes/rates/errors, Shannon's noisy channel coding theorem.
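The Kraft-McMillan inequality mentioned above states that codeword lengths l_1, ..., l_m over a size-D alphabet are achievable by a uniquely decipherable (indeed prefix) code if and only if the sum of D^(-l_i) is at most 1. A minimal sketch of the check (function name is our own):

```python
def kraft_sum(lengths, D=2):
    """Kraft sum for proposed codeword lengths over a size-D alphabet.
    A prefix (hence uniquely decipherable) code with these lengths
    exists iff the sum is at most 1 (Kraft-McMillan)."""
    return sum(D ** (-l) for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0: realisable, e.g. 0, 10, 110, 111
print(kraft_sum([1, 1, 2]))     # 1.25 > 1: no uniquely decipherable code
```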
Dr Robin Stephenson
Dr Christoph Koch
Prof. Hanqing Jin
- Lecturer: Samuel Cohen