Mathematical Institute

C6.5 Theories of Deep Learning (2021-22)


  • Lecturer: Jared Tanner

Course information

General Prerequisites:
Only elementary linear algebra and probability are assumed in this course; knowledge from the following Prelims courses is also helpful: linear algebra, probability, analysis, constructive mathematics, and statistics and data analysis. Familiarity with some of the following is recommended: more advanced statistics, optimisation (B6.3, C6.2), networks (C5.4), and numerical linear algebra (C6.1), though none of these courses is required, as the material is self-contained.
Course Term: Michaelmas
Course Lecture Information: 16 lectures
Course Weight: 1
Course Level: M
Course Overview:
A course on theories of deep learning.
Learning Outcomes:
Students will become familiar with the variety of architectures for deep nets, including the scattering transform and ingredients such as types of nonlinear transforms, pooling, convolutional structure, and how nets are trained. Students will then focus on a variety of theoretical perspectives on why deep networks perform as observed, with examples such as: dictionary learning and the transferability of early layers, energy decay with depth, Lipschitz continuity of the net, how depth overcomes the curse of dimensionality, constructing adversarial examples, the geometry of nets viewed through random matrix theory, and the learning of invariance.
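The ingredients named above (a convolution, a pointwise nonlinearity, and pooling) compose into a single layer of a convolutional net. As a minimal sketch, not taken from the course materials, the following NumPy code applies one such layer to a 1D signal; the signal, filter, and pooling width are made-up illustrative values.

```python
import numpy as np

def conv1d(x, w):
    """Valid cross-correlation of signal x with filter w."""
    n, k = len(x), len(w)
    return np.array([np.dot(x[i:i + k], w) for i in range(n - k + 1)])

def relu(z):
    """Pointwise nonlinearity: rectified linear unit."""
    return np.maximum(z, 0.0)

def max_pool(z, width=2):
    """Non-overlapping max pooling; drops a trailing remainder."""
    m = (len(z) // width) * width
    return z[:m].reshape(-1, width).max(axis=1)

# Illustrative input: a rising-then-falling signal and a simple
# difference (edge-detecting) filter.
x = np.array([0.0, 1.0, 2.0, 1.0, 0.0, -1.0, -2.0, -1.0])
w = np.array([1.0, -1.0])

h = max_pool(relu(conv1d(x, w)))   # conv -> nonlinearity -> pool
print(h)                            # a shorter, nonnegative feature map
```

Each stage shrinks or rectifies the signal: the convolution extracts local structure, the ReLU discards negative responses, and pooling summarises neighbourhoods, giving a coarser invariant representation.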
Course Synopsis:
Deep learning is the dominant method for machines to perform classification tasks at reliability rates exceeding those of humans, as well as for outperforming world champions in games such as Go. Alongside the proliferating application of these techniques, practitioners have developed a good understanding of the properties that make deep nets effective, such as initial layers learning weights similar to those in dictionary learning, while deeper layers instantiate invariance to transforms such as dilation, rotation, and modest diffeomorphisms. A number of mathematical theories are now being developed to accompany these observations; this course will explore these varying perspectives.
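One topic from the learning outcomes, constructing adversarial examples, can be sketched in a few lines. The following is a hedged illustration (not the course's own code) of a gradient-sign perturbation in the style of the fast gradient sign method, applied to a linear classifier with logistic loss; the weights and input are made-up values.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm(x, y, w, eps):
    """Move x by eps (per coordinate) in the direction that
    increases the logistic loss -log sigmoid(y * w.x)."""
    margin = y * np.dot(w, x)
    grad_x = -y * sigmoid(-margin) * w   # gradient of the loss in x
    return x + eps * np.sign(grad_x)

# Illustrative linear classifier and a correctly classified point.
w = np.array([2.0, -1.0, 0.5])
x = np.array([1.0, 1.0, 1.0])           # w.x = 1.5 > 0, so label +1

x_adv = fgsm(x, +1, w, eps=1.0)
print(np.dot(w, x), np.dot(w, x_adv))   # the perturbation flips the sign
```

Even this toy case shows the mechanism: a small, sign-aligned perturbation of the input can cross the decision boundary while each coordinate moves by at most eps.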


Topic outline

  • General

    • Announcements Forum
    • Discussion Forum
  • Course Materials

    • Sheet 1 Assignment
    • Sheet 2 Assignment
    • NeurIPS 2018 example file
    • NeurIPS 2018 style file
    • Sheet 3 Assignment
    • Sheet 4 Assignment
    • Lecture 1 slides File
    • Lecture 2 slides File
    • Lecture 3 slides File
    • Lecture 4 slides File
    • Lecture 5 slides File
    • Lecture 6 slides File
    • Lecture 7 slides File
    • Lecture 8 slides File
    • Lecture 9 slides File
    • Lecture 10 slides File
    • Lecture 11 slides File
    • Lecture 12 slides File
    • Lecture 13 slides File
    • Lecture 14 slides File
    • Lecture 15 slides File
    • Lecture 16 slides File