Spring 2024 (weeks 6 - 21). Lectures on Monday 10:00-12:45 at the UvA, room SP C0.110.
Machine learning is one of the fastest growing areas of science, with far-reaching applications. In this course we focus on the fundamental ideas, theoretical frameworks, and rich array of mathematical tools and techniques that power machine learning. The course covers the core paradigms and results in machine learning theory with a mix of probability and statistics, combinatorics, information theory, optimization and game theory.
During the course you will learn to
This course focuses strongly on theory. (Good applied master-level courses on machine learning are widely available, for example here, here and here.) We will cover statistical learning theory, including PAC learning, VC dimension, Rademacher complexity and Boosting, as well as online learning, including prediction with expert advice, online convex optimisation, bandits and reinforcement learning.
This course is offered as part of the MasterMath program. To participate for credit, sign up here. We use the MasterMath ELO for submitting homework, receiving grades, announcements, and as our student forum.
The prerequisites are
as covered e.g. in any bachelor mathematics program in the Netherlands, and as reviewed in the Appendix of the book [1]. The course does require general 'mathematical maturity', in particular the ability to combine insights from all three fields when proving theorems.
We offer weekly homework sets whose solutions require constructing proofs. This course will not include any programming or data.
The lectures will be held on location at the UvA, room SP C0.110. To view them remotely, join the zoom live stream. Mastermath has kindly offered us recording support. Recorded lectures will be posted to our vimeo archive.
The Monday 3h slot will consist of 2h of lectures followed by a 1h TA session discussing the homework.
We provide weekly homework sets. Odd sets are for practice. Even sets are graded and must be handed in before the next lecture.
The grade will be composed as follows: the two exams (midterm and final) together count for 70%, and the homework for the remaining 30%. The average of the midterm and final exam grades must be at least 5.0 to pass the course. There will be a retake possibility for either or both exams. The homework still counts as part of the grade after the retake.
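The grading rules above can be sketched in code. This is a hypothetical illustration only: the 30% homework weight and the equal 35%/35% split between midterm and final are assumptions (the text only fixes the combined 70% exam weight), and the function name `course_grade` is made up for this sketch.

```python
def course_grade(homework, midterm, final):
    """Compute the course grade on a 0-10 scale.

    Assumed weights (not official): homework 30%, midterm 35%, final 35%.
    Returns None when the exam average is below the 5.0 pass threshold.
    """
    exam_avg = (midterm + final) / 2
    if exam_avg < 5.0:
        return None  # average of midterm and final must be at least 5.0
    return 0.3 * homework + 0.35 * midterm + 0.35 * final
```

For example, homework 8.0, midterm 6.0 and final 7.0 would give a grade of 6.95 under these assumed weights, while exam grades of 4.0 and 5.0 would fail regardless of the homework grade.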
It is allowed and strongly encouraged to solve and submit the homework in small teams. Exams are personal.
NB Collaboration on homework is only allowed within a team. In particular, solutions may not be shared between teams.
When using any source that is not on the official literature list, always cite the source.
Wk | When | What | Lect. | TA
---|------|------|-------|---
6 | Mon 5 Feb | Introduction. Statistical learning. Halfspaces. PAC learnability for finite hyp. classes, realizable case. Chapters 1 and 2 in [1]. Slides | Tim | N/A
7 | Mon 12 Feb | PAC learnability for finite hyp. classes, agnostic case. Uniform convergence. Chapters 3 and 4 in [1]. Slides | Tim | Hidde
8 | Mon 19 Feb | Infinite classes. VC Dimension part 1. Sections 6.1-6.3 in [1]. Slides | Tim | Jack
9 | Mon 26 Feb | VC Dimension part 2. Fundamental theorem of PAC learning. Sauer's Lemma. Chapter 6 in [1]. | Tim | Hidde
10 | Mon 4 Mar | Proof of the Fundamental Theorem of PAC Learning. Rademacher Complexity part 1. Section 28.1 in [1]. | Tim | Jack
11 | Mon 11 Mar | Nonuniform Learnability, SRM, Other notions of Learning. Sections 7.1 and 7.2 in [1] (note the errata about this chapter). Rademacher Complexity part 2. Chapter 26 in [1]. | Tim | Hidde
12 | Mon 18 Mar | Generalisation of deep neural networks. Double descent. This material is not part of either exam. | Tim | Jack
13 | Mon 25 Mar | Midterm exam on material covered in lectures 1-6. | |
14 | Mon 1 Apr | Holiday (Easter) | |
15 | Mon 8 Apr | Full Information Online Learning (Experts). | Wouter | Hidde
16 | Mon 15 Apr | Bandits. UCB and EXP3. | Wouter | Jack
17 | Mon 22 Apr | Online Convex Optimization. | Wouter | Hidde
18 | Mon 29 Apr | Exp-concavity. Online Newton Step. | Wouter | Jack
19 | Mon 6 May | Boosting. AdaBoost. Chapter 10 in [1]. | Wouter | Hidde
20 | Mon 13 May | Log-loss prediction. Normalised Maximum Likelihood. Book chapter on ELO. | Wouter |
21 | Mon 20 May | Holiday (Pentecost) | |
24 | Mon 10 Jun | Final exam | |
27 | Mon 1 Jul | Retake exam(s) | |
The first half of the course will use the book [1]: Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press, 2014.
The second half of the course will use slides posted above and on the ELO.
Consult these references to learn more