B8.6 High Dimensional Probability (2024-25)
- Lecturer: Zhongmin Qian
High dimensional probability has emerged in recent years as an ever more
important topic, driven by the need to analyse vast amounts of complex data.
The ideas and methods developed for dealing with probability distributions
on high-dimensional spaces
(such as distributions of randomly sampled data with multiple attributes)
have been used not only in pure mathematics but also in applications
ranging from stochastic simulation to statistics, and from data science to
statistical mechanics.
This course will focus on the development of basic ideas and techniques,
such as elementary dimension-free tail estimates, concentration
bounds, the metric entropy method, the Poincaré and logarithmic
Sobolev inequalities, and large deviation principles for rare events,
for handling distributions on high-dimensional spaces, and on understanding
some special features of probability distributions on such spaces, for example
the concentration of probability laws on small regions of low-dimensional
subspaces.
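The thin-shell phenomenon alluded to above can be seen in a few lines of simulation. The sketch below is an illustration only, not part of the course materials, with sample sizes chosen arbitrarily: for a standard Gaussian vector in R^n, the Euclidean norm concentrates sharply around sqrt(n).

```python
import numpy as np

# Illustrative sketch (not from the lecture notes): for X ~ N(0, I_n),
# the norm ||X|| concentrates around sqrt(n), so almost all of the
# Gaussian mass sits in a thin spherical shell.
rng = np.random.default_rng(0)

for n in (10, 100, 10_000):
    X = rng.standard_normal((2000, n))      # 2000 samples in R^n
    norms = np.linalg.norm(X, axis=1)
    # the fluctuation of ||X|| / sqrt(n) around 1 shrinks as n grows
    print(n, round(float(np.std(norms / np.sqrt(n))), 4))
```

Printing the standard deviation of ||X||/sqrt(n) for increasing n makes the dimension dependence visible: the relative fluctuation decays on the order of 1/sqrt(n).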
- (2 lectures) Derive a few elementary but important tail estimates for distributions in terms of moments, variances and other statistical characteristics.
- (3 lectures) The strong law of large numbers and Cramér's large deviation principle.
- (3 lectures) Elementary results on concentration of probabilities,
concentration functions, Wasserstein distance and the information inequality.
- (4 lectures) Heat semigroup, Gaussian measures, the Poincaré inequality, logarithmic Sobolev inequalities for Gaussian measures and some other distributions. Concentration for Gaussian measures.
- (2 lectures) The Lasso performance bound, and the largest eigenvalue of random matrices.
- (3 lectures) Developing the entropy method by means of examples -- bounded difference inequalities, concentration of convex Lipschitz functions and modified logarithmic Sobolev inequalities.
- (3 lectures) Building the connection between concentration estimates
and isoperimetric inequalities.
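As a taste of the first topic in the list above, elementary dimension-free tail estimates can be checked empirically. The following sketch (an illustration only, with arbitrarily chosen parameters, not taken from the course) compares the empirical upper tail of a Bernoulli sample mean with Hoeffding's bound exp(-2nt^2).

```python
import numpy as np

# Hedged illustration (not from the course notes): Hoeffding's inequality
# gives P(S_n/n - p >= t) <= exp(-2 n t^2) for S_n a sum of n independent
# variables taking values in [0, 1]; here we check it for Bernoulli(1/2).
rng = np.random.default_rng(1)
n, t, trials = 200, 0.1, 50_000

means = rng.integers(0, 2, size=(trials, n)).mean(axis=1)
empirical_tail = float(np.mean(means - 0.5 >= t))
hoeffding_bound = float(np.exp(-2 * n * t ** 2))

# empirically the tail should fall well below Hoeffding's bound
print(empirical_tail, hoeffding_bound)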
Section outline
12 March 2025 (last version for year 2025): Small modifications in Sections 5.1 and 5.2.
6 March 2025: Added two consequences of the Gaussian isoperimetric inequality, Proposition 4.33 and Theorem 4.34. Made modifications in Section 5.1 (mainly presentation of the material).
24 Feb 2025: Modifications (mainly presentation) were made in Sections 4.4 - 4.6, in particular to the proof of the Gaussian isoperimetric inequality.
19 Feb 2025: Substantial modifications made in Section 4.1 (added the domination inequality), Section 4.2, Section 4.3 and Section 4.6.
9 Feb 2025: Substantial modifications were made in Section 4.1.
21 Jan 2025: Modifications were made in Section 1, Section 2 and sub-Section 3.1.
This course should appeal to those interested in Probability, Stochastic Analysis, Analysis, and Data Science.
This sheet covers a small number of important elementary probability estimates, for the first 4 lectures.
For the material covered in Sections 4.1 - 4.3.
Sheet 4 covers the material in Sections 4.4 - 4.6, together with additional material on the curvature operator.
Registration start: Monday, 13 January 2025, 12:00 PM
Registration end: Friday, 14 February 2025, 12:00 PM
Class Tutor's Comments Assignment
Class tutors will use this activity to provide overall feedback to students at the end of the course.