General Prerequisites:
Part A Probability and Part A Integration are required.
Course Term: Hilary
Course Lecture Information: 16 lectures
Course Weight: 1
Course Level: H
Assessment Type: Written Examination
Course Overview:
High-dimensional probability and high-dimensional statistics have
emerged in recent years as increasingly important topics, driven by the need
to analyse vast amounts of complex data. The ideas and methods developed
for handling probability distributions on high-dimensional spaces
(such as distributions of randomly sampled data with many attributes)
are used not only in pure mathematics but also in applications ranging
from stochastic simulation and statistics to data science and statistical mechanics.
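One of the phenomena alluded to here can be previewed with a short simulation. The following sketch (an illustrative example, not part of the course material) samples standard Gaussian vectors in R^d and observes that their Euclidean norms concentrate tightly around sqrt(d), even though each individual coordinate fluctuates with standard deviation 1:

```python
import math
import random

random.seed(0)

def gaussian_norm(d):
    """Euclidean norm of a standard Gaussian vector in R^d."""
    return math.sqrt(sum(random.gauss(0.0, 1.0) ** 2 for _ in range(d)))

d = 1000
norms = [gaussian_norm(d) for _ in range(200)]
mean_norm = sum(norms) / len(norms)
spread = max(norms) - min(norms)

# The norms cluster around sqrt(1000) ~ 31.6: the Gaussian measure
# concentrates near a thin spherical shell, a typical high-dimensional effect.
print(mean_norm, spread)
```

Increasing d makes the relative spread shrink further, since the standard deviation of the norm stays of order 1 while the norm itself grows like sqrt(d).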

This course focuses on the development of basic ideas and techniques
such as elementary dimension-free tail estimates, concentration
bounds, the metric entropy method, the Poincaré and logarithmic
Sobolev inequalities, and large deviation principles for rare events.
Learning Outcomes:
Students will learn the fundamental ideas and modern tools for
handling distributions on high-dimensional spaces, and will understand
some special features of probability distributions on such spaces, for example
the concentration of probability laws on small regions of low-dimensional
subspaces.
Course Synopsis:
  • (2 lectures) Elementary but important tail estimates for distributions in terms of moments, variances and other statistical characteristics.
  • (3 lectures) Strong law of large numbers, Cramér's large deviation principle.
  • (3 lectures) Elementary results on concentration of probabilities,
    concentration functions, Wasserstein distance and information inequality.
  • (4 lectures) Heat semigroup, Gaussian measures, the Poincaré inequality, logarithmic Sobolev inequalities for Gaussian measures and some other distributions. Concentration for Gaussian measures.
  • (2 lectures) Lasso performance bound, and the largest eigenvalue of random matrices.
  • (3 lectures) Developing the entropy method by means of examples -- bounded difference inequalities, concentration of convex Lipschitz functions and modified logarithmic Sobolev inequalities.
  • (3 lectures) Building the connection between concentration estimates
    and isoperimetric inequalities.
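The contrast between the moment-based tail estimates of the first item and the exponential concentration bounds of the later items can be seen in a small experiment. The sketch below (a hedged illustration, not course material) compares the polynomial Chebyshev bound with the exponential Hoeffding bound for the sample mean of fair coin flips:

```python
import math
import random

random.seed(1)

n, t = 100, 0.2  # n coin flips, deviation threshold t for the sample mean

# Empirical tail probability P(|S_n/n - 1/2| >= t) by simulation.
trials = 20000
hits = 0
for _ in range(trials):
    mean = sum(random.random() < 0.5 for _ in range(n)) / n
    if abs(mean - 0.5) >= t:
        hits += 1
emp = hits / trials

# Chebyshev (second moment): Var(S_n/n) = 1/(4n), so the bound is 1/(4 n t^2).
chebyshev = 1.0 / (4 * n * t * t)

# Hoeffding (dimension-free, exponential): 2 exp(-2 n t^2).
hoeffding = 2 * math.exp(-2 * n * t * t)

print(emp, chebyshev, hoeffding)
```

For these parameters the exponential bound is orders of magnitude sharper than the variance-based one, which is the basic message behind the concentration inequalities developed in the course.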