
Research Semester Programme: Machine Learning Theory
Mid-Semester Lecture

Overview

This is the second lecture event of the Machine Learning Theory semester programme. We are delighted to host an afternoon with two distinguished researchers speaking about their advances in our understanding of machine learning. These research-level lectures are aimed at a non-specialist audience and are freely accessible.

The overarching theme of the afternoon is "Learning to Model the Environment". Our two speakers will discuss performing causal inference and acting optimally in largely unknown environments, focusing on both philosophical and technical challenges. For more details, see the abstracts below. We look forward to a strong programme and hope to welcome you on April 12.

Schedule

Date | Time | Event
April 12, 2023 | 14:00 | Jonas Peters (remote): Recent Advances in Causal Inference
April 12, 2023 | 15:00 | Break
April 12, 2023 | 15:30 | Tor Lattimore (live): The Curse and Blessing of Curvature for Zeroth-order Convex Optimisation
April 12, 2023 | 16:30 | Reception

Speakers

Jonas Peters

Jonas Peters is a professor of statistics at the Department of Mathematical Sciences, University of Copenhagen. Previously, he was a group leader at the Max Planck Institute for Intelligent Systems in Tübingen and a Marie Curie fellow at the Seminar for Statistics, ETH Zurich. He studied mathematics at the University of Heidelberg and the University of Cambridge and obtained his PhD jointly from MPI and ETH. He is interested in inferring causal relationships from different types of data and in building statistical methods that are robust to distributional shifts. In his research, Jonas seeks to combine theory, methodology, and applications. His work relates to areas such as computational statistics, causal inference, graphical models, independence testing, and high-dimensional statistics.

Recent Advances in Causal Inference

Abstract: TBD


Tor Lattimore

Tor Lattimore (DeepMind) is a research scientist at DeepMind in London, working mostly on algorithms for sequential decision making.

The Curse and Blessing of Curvature for Zeroth-order Convex Optimisation

Abstract: Zeroth-order convex optimisation is still quite poorly understood. I will tell a story about how to use gradient-based methods without access to gradients and explain how curvature of the loss function plays the roles of both devil and angel. The main result is a near-optimal sample-complexity analysis of a simple and computationally efficient second-order method applied to a quadratic surrogate loss.

The talk is based on a recent paper with András György.
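Although the talk analyses a more sophisticated second-order method, the basic premise of zeroth-order optimisation, estimating gradients from function evaluations alone, can be illustrated in a few lines. The sketch below uses the classical two-point random-direction gradient estimator on a toy convex quadratic; it is a generic illustration of the setting, not the algorithm from the paper.

import numpy as np

def two_point_grad(f, x, delta=1e-4, rng=None):
    """Classical two-point zeroth-order gradient estimator:
    probe f along a random unit direction and scale the finite
    difference back up to a full-gradient estimate."""
    rng = rng if rng is not None else np.random.default_rng()
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)  # uniform random direction on the sphere
    return x.size * (f(x + delta * u) - f(x - delta * u)) / (2 * delta) * u

# Toy usage: minimise a convex quadratic using only function values.
f = lambda x: 0.5 * x @ x + x[0]  # minimiser x* = (-1, 0, ..., 0), f(x*) = -0.5
x = np.ones(5)
for _ in range(5000):
    x -= 0.02 * two_point_grad(f, x)
print(f(x))  # settles near the optimal value -0.5

The estimator is unbiased in expectation over the random direction, but its noise scales with the dimension, which is one reason the curvature-aware methods discussed in the talk are needed to obtain near-optimal sample complexity.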