Information Theory 2018
University of Amsterdam course, Nov/Dec 2018
Master of Logic, Master of Computational Science, elective of Master of AI
Lecturer: Christian Schaffner (UvA / email)
Teaching assistants: Yfke Dulek (email)
Alvaro Piedrafita (email)
News:
23 March 2018: first update of this page
See here for the Spring 2014, Fall 2014, Fall 2015, Fall 2016, and Fall 2017 editions of this course.
Sign up!
- The course will be run on Canvas — sign up there!
- Attend the first lecture at Science Park.
Content of the course
Information theory was developed by Claude E. Shannon in the late 1940s to investigate the fundamental limits on signal-processing operations such as compressing data and on reliably storing and communicating data. These tasks have turned out to be fundamental for all of computer science.

In this course, we quickly review the basics of probability theory and introduce concepts such as (conditional) Shannon entropy, mutual information, and entropy diagrams. Then, we prove Shannon's theorems about data compression and channel coding. An interesting connection with graph theory is made in the setting of zero-error information theory. We also cover some aspects of information-theoretic security, such as perfectly secure encryption.
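To give a flavour of the central quantity of the course, here is a small sketch (in Python, chosen for illustration; the function name `entropy` and the example distributions are our own) computing the Shannon entropy H(X) = -Σ p(x) log₂ p(x) of a discrete distribution:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum_x p(x) * log2 p(x).

    Terms with p(x) = 0 contribute nothing, by the convention 0 log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty:
print(entropy([0.5, 0.5]))        # 1.0

# A heavily biased coin carries less (about 0.47 bits):
print(entropy([0.9, 0.1]))

# A uniform distribution over 4 outcomes carries log2(4) = 2 bits:
print(entropy([0.25] * 4))        # 2.0
```

As the examples suggest, entropy is maximized by the uniform distribution and shrinks as the distribution becomes more predictable — the intuition behind Shannon's data-compression theorem covered in the course.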