Information Theory 2017

University of Amsterdam course, Nov/Dec 2017
Master of Logic
Lecturer: Christian Schaffner (UvA / email)
Teaching assistants: Yfke Dulek (email)
Alvaro Piedrafita (email)


6 Feb 2017: first update of page
See here for the Spring 2014, Fall 2014, Fall 2015, and Fall 2016 editions of this course.


Content of the course

Information theory was developed by Claude E. Shannon in the late 1940s to investigate the fundamental limits on signal-processing operations such as compressing data and on reliably storing and communicating data. These tasks have turned out to be fundamental for all of computer science.

In this course, we quickly review the basics of probability theory and introduce concepts such as (conditional) Shannon entropy, mutual information and entropy diagrams. Then, we prove Shannon's theorems about data compression and channel coding. An interesting connection with graph theory is made in the setting of zero-error information theory. We also cover some aspects of information-theoretic security such as perfectly secure encryption.
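As a small taste of the quantities mentioned above, here is a sketch (not part of the course material) of how Shannon entropy and mutual information can be computed for finite distributions, using the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution given as a list of probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution as a 2D list."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    h_xy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - h_xy

# A fair coin carries exactly one bit of entropy:
print(entropy([0.5, 0.5]))  # 1.0
# Two independent fair coins share no information:
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

The convention 0 log 0 = 0 is handled by skipping zero-probability outcomes in the sum.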


Students are required to know the (theory) contents of the course Basic Probability: Theory in the Master of Logic (no programming will be required for this course). Study the script and the theory homework exercises.

Intended Learning Outcomes

At the end of the course, you will be able to solve problems of the following kinds:

- Entropy and related concepts
- Data compression
- Noisy-channel coding

Course website

Updated information about the course can be found on the course website.

Study Material

The material will be presented in blackboard lectures. The following are good references:

Lectures and Exercise sessions

Please check Datanose for the definitive times and locations.

Homework, exam, and grading

This is a 6 ECTS course, which comes to roughly 20 hours of work per week.

There will be homework exercises every week, to be handed in one week later. The answers should be in English. Feel free to use LaTeX; here is a template to get you started, but readable handwritten solutions are fine, too. Cooperation while solving the exercises is allowed and encouraged, but everyone has to hand in their own solution set in their own words.

There will be a final written exam. The exam is open-book, meaning that you can bring the study material [CF], [CT], [MacKay] mentioned above as well as any notes you made, but no electronic devices are allowed. Here are the written exams from Spring 2014 and Fall 2015.

The final grade for the course consists of: 10% the average grade of the online multiple-choice concept questions (asked after every lecture), 40% the average homework grade (ignoring the worst grade), and 50% the grade obtained at the final exam.
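For concreteness, the weighting described above can be sketched as follows (the function name and the example grades are illustrative, not official):

```python
def final_grade(concept_avg, homework_grades, exam_grade):
    """Weighted course grade: 10% concept questions, 40% homework
    (with the worst homework grade ignored), 50% final exam."""
    kept = sorted(homework_grades)[1:]          # drop the single worst grade
    hw_avg = sum(kept) / len(kept)
    return 0.1 * concept_avg + 0.4 * hw_avg + 0.5 * exam_grade

# Hypothetical example: concept average 8.0, homework grades [6, 7, 8, 9, 10]
# (the 6 is dropped, so the homework average is 8.5), exam grade 7.5:
print(round(final_grade(8.0, [6, 7, 8, 9, 10], 7.5), 2))  # 7.95
```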

Life after "Information Theory"

If you got hooked on the world of entropies, you have several options after the course to pursue the topics of information theory and cryptography: