Machine Learning Theory

Spring 2023 (weeks 6 - 21). Lectures on Thursday 10:00-12:45 at the UvA, room SP C0.110.


Aim

Machine learning is one of the fastest-growing areas of science, with far-reaching applications. In this course we focus on the fundamental ideas, theoretical frameworks, and rich array of mathematical tools and techniques that power machine learning. The course covers the core paradigms and results in machine learning theory, drawing on probability and statistics, combinatorics, information theory, optimisation and game theory.

During the course you will learn to

This course focuses strongly on theory. (Good applied master-level courses on machine learning are widely available, for example here, here and here.) We will cover statistical learning theory, including PAC learning, VC dimension, Rademacher complexity and boosting, as well as online learning, including prediction with expert advice, online convex optimisation, bandits and reinforcement learning.

MasterMath Website

This course is offered as part of the MasterMath program. To participate for credit, sign up here. We use the MasterMath ELO for submitting homework and receiving grades.

We use the MasterMath Zulip MLTs23 stream as our forum. Sign-up instructions are here.

Prerequisites

The prerequisites are

as covered, e.g., in any bachelor's programme in mathematics in the Netherlands, and as reviewed in the Appendix of the book [1]. The course does require general 'mathematical maturity', in particular the ability to combine insights from all three fields when proving theorems.

We offer weekly homework sets whose solutions require constructing proofs. This course will not include any programming or data.

Lecturers

The lectures will be held on location at the UvA, room SP C0.110. To view them remotely, join the Zoom live stream. MasterMath has kindly offered us recording support. Recorded lectures will be posted to our Vimeo archive.

The Thursday three-hour slot will consist of two hours of lectures followed by a one-hour TA session discussing the homework.

Mode

The grade will be composed as follows.

The average of midterm and final exam grades has to be at least 5.0 to pass the course.

You are strongly encouraged to solve and submit your weekly homework in small teams. Exams are individual.

There will be a retake opportunity for either or both exams; the exams make up 60% of the grade.

Schedule

When What Lect. TA
Thu 9 Feb Introduction. Statistical learning. Halfspaces. PAC learnability for finite hyp. classes, realizable case. Chapters 1 and 2 in [1]. Slides Tim N/A
Thu 16 Feb PAC learnability for finite hyp. classes, agnostic case. Uniform convergence. Chapters 3 and 4 in [1]. Slides Tim Sarah
Thu 23 Feb Infinite classes. VC Dimension part 1. Chapter 6.1-6.3 in [1]. Slides Tim Sarah
Thu 2 Mar VC Dimension part 2. Fundamental theorem of PAC learning. Sauer's Lemma. Chapter 6 in [1]. Slides Tim Hidde
Thu 9 Mar Proof of the Fundamental Theorem of PAC Learning. Rademacher Complexity part 1. Section 28.1 in [1]. Slides Tim Sarah
Thu 16 Mar Nonuniform Learnability, SRM, Other notions of Learning. Sections 7.1 and 7.2 in [1] (note the errata about this chapter). Rademacher Complexity part 2. Chapter 26 in [1]. Slides Tim Hidde
Thu 23 Mar Full Information Online Learning (Experts). Slides Wouter Sarah
Thu 30 Mar Midterm exam on material covered in lectures 1-6.
Thu 6 Apr Bandits. UCB and EXP3. Slides Wouter Hidde
Thu 13 Apr Online Convex Optimization. Slides Wouter Sarah
Thu 20 Apr Holiday
Thu 27 Apr Holiday (Kings Day)
Thu 4 May Exp-concavity. Online Newton Step. Slides Wouter Hidde
Thu 11 May Boosting. AdaBoost. Slides and Chapter 10 in [1]. Wouter Sarah
Thu 18 May Holiday (Ascension Day)
Thu 25 May Log-loss prediction. Normalised Maximum Likelihood. Slides and book chapter on the ELO. Wouter Hidde
Thu 22 Jun Final exam
Thu 13 Jul Retake exam(s)

Course Material

The first half of the course will use the book

The second half of the course will use slides posted above and on the ELO.

For going beyond the course

Consult these references to learn more.