Information Theory

Content of the course

Information theory was developed by Claude E. Shannon in the late 1940s to investigate the fundamental limits of signal-processing operations such as compressing data and of reliably storing and communicating data. These tasks have turned out to be fundamental to all of computer science.

In this course, we quickly review the basics of probability theory and introduce concepts such as (conditional) Shannon entropy, mutual information, and Rényi entropy. Then, we prove Shannon's theorems about data compression and channel coding. Later in the course, we also cover some aspects of information-theoretic security, such as randomness extraction and privacy amplification.

Intended Learning Outcomes

At the end of the course, you will be able to:

Course website

Updated information about the course can be found on

Study Material

The material will be presented in blackboard lectures. The following are good references:

Lectures and Exercise sessions (2 x 45min each)

Please check Datanose for the definitive times and locations.
Lectures (Hoorcollege)
Times: Tuesdays, 11-13, location: Science Park G0.05
Thursdays, 11-13, location: check Datanose
starting 4 February 2014

Exercises (Werkcollege)
Time: Fridays 9-11, location: check Datanose
first exercises: 7 February 2014

Homework, exam, and grading

This is a 6 ECTS course, which comes to roughly 20 hours of work per week.

There will be homework exercises every week, to be handed in one week later. The answers should be in English. Feel free to use LaTeX (here is a template to get you started), but readable handwritten solutions are fine, too. Cooperation while solving the exercises is allowed and encouraged, but everyone has to hand in their own solution set in their own words.

There will be a final written exam on Friday, March 28, 2014, from 9:00-12:00 in SP, G2.02. The exam is open-book, meaning that you can bring the study material [CF], [CT], [MacKay] mentioned above as well as any notes you made, but no electronic devices are allowed.

The final grade for the course consists of 1/2 the average homework grade (ignoring the lowest homework grade) and 1/2 the grade obtained on the final exam.

Preliminary course schedule Spring 2014

Tue, 4 Feb 2014

Overview of the course, Probability Theory

Section 2.1 of [CF]

Thu, 6 Feb 2014

Shannon Entropy, Jensen's inequality, Properties of Shannon entropy

Sections 3.1-3.3 of [CF]

Slides Homework #1 (updated) 

6 Feb 14: There was a typo in the definition of the random variable Z in homework exercise 4, it is corrected now.
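As a quick illustration of this lecture's main definition (not part of the course material), here is a short Python sketch that computes the Shannon entropy of a finite distribution, using the convention 0 log 0 = 0:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.
    Terms with p(x) = 0 contribute 0 (convention: 0 log 0 = 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # about 0.47
```

The entropy is maximal for the uniform distribution and drops as the distribution becomes more predictable.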

Tue, 11 Feb 2014

Chain Rule, Mutual Information, Entropy Diagrams, Markov Chains

Sections 3.3, 3.4 of [CF], Sections 2.5, 2.8 of [CT]
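The following small Python sketch (an illustration, not course material) computes the mutual information I(X;Y) directly from a joint probability table, which can help when checking entropy-diagram exercises by hand:

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) ), in bits.
    `joint` is a 2D list of joint probabilities p(x,y)."""
    px = [sum(row) for row in joint]            # marginal of X (rows)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (columns)
    return sum(pxy * math.log2(pxy / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, pxy in enumerate(row) if pxy > 0)

# X = Y, uniform on {0,1}: I(X;Y) = H(X) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
# X and Y independent: I(X;Y) = 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```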

Thu, 13 Feb 2014

Data-Processing Inequality, Sufficient Statistics, Fano's inequality, Perfectly Secure Encryption: One-time pad

Sections 2.8-2.10 of [CT], Section 4 of [CF]

Homework #2 (updated)  Homework Latex Template (updated) 
Tue, 18 Feb 2014

Perfectly Secure Encryption: Shannon's theorem. Data compression / Source coding, Asymptotic Equipartition Property

Section 4 of [CF], Section 3.1 of [CT]

Entropy of Alice in Wonderland  Hex Editor with statistics  Slides
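To accompany the encryption part of this lecture, here is a minimal Python sketch of the one-time pad (illustrative only; the key-handling requirements are exactly the conditions in Shannon's theorem):

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each plaintext byte with a key byte.
    Perfect secrecy requires a uniformly random key at least as long
    as the message, used only once."""
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # fresh uniform key
ciphertext = otp_encrypt(message, key)
# Decryption is the same XOR, since (p ^ k) ^ k = p.
assert otp_encrypt(ciphertext, key) == message
```

Reusing the key breaks the scheme: XORing two ciphertexts that share a key cancels the key and leaks the XOR of the two messages.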
Thu, 20 Feb 2014

Data Compression: Typical set, Source-coding Theorem, high-probability set, symbol codes

Section 3.2+3.3 of [CT]

Homework #3  Table for #3, Exercise 3  Entropy Diagram Latex Template (thanks to Yfke!) 
25 Feb 14: Added PDF file of the probability table for ex 3.
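The concentration behind typical sets can be seen numerically. This Python sketch (not course material) estimates the per-symbol information rate -(1/n) log2 p(X_1..X_n) for i.i.d. Bernoulli sequences and compares it with the entropy, as the AEP predicts:

```python
import math
import random

def empirical_rate(p, n, trials=200, seed=1):
    """Average of -(1/n) log2 p(X_1..X_n) over i.i.d. Bernoulli(p)
    sequences of length n; by the AEP this concentrates around H(p)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xs = [rng.random() < p for _ in range(n)]
        logp = sum(math.log2(p) if x else math.log2(1 - p) for x in xs)
        total += -logp / n
    return total / trials

h = -(0.2 * math.log2(0.2) + 0.8 * math.log2(0.8))  # H(0.2), about 0.722 bits
print(h, empirical_rate(0.2, n=2000))  # the two numbers are close
```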
Tue, 25 Feb 2014

Data Compression: symbol codes, properties, source-coding theorem reloaded, Kraft's inequality, Huffman codes

Section 5 of [CF], Chapter 5 of [CT], Chapter 5 of [MacKay]
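As an illustration of this lecture (a standard heap-based sketch, not the course's own implementation), the following Python code builds a binary Huffman code and checks Kraft's inequality on the resulting codeword lengths:

```python
import heapq

def huffman_code(probs):
    """Binary Huffman code for symbols 0..len(probs)-1.
    Returns a dict mapping each symbol to its codeword string."""
    heap = [(p, i, {i: ""}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    tick = len(probs)  # tie-breaker so dicts are never compared
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least likely subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, tick, merged))
        tick += 1
    return heap[0][2]

probs = [0.5, 0.25, 0.125, 0.125]
code = huffman_code(probs)
lengths = {s: len(w) for s, w in code.items()}
# Kraft's inequality: sum_i 2^(-l_i) <= 1 holds for every prefix code.
assert sum(2 ** -l for l in lengths.values()) <= 1
# For dyadic probabilities the expected length equals the entropy.
print(sum(p * lengths[s] for s, p in enumerate(probs)))  # 1.75
```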

Thu, 27 Feb 2014

Optimality of Huffman Codes, Arithmetic Codes

Section 5.4+5.5 of [CF], Section 6.2 of [MacKay]

Homework #4 
Tue, 4 Mar 2014

Guest lecture by Teresa Piovesan

Zero-error information theory

Pages 4+5 of this survey paper

Thu, 6 Mar 2014

Guest lecture by Teresa Piovesan

Zero-error information theory

Homework #5 (updated, thanks to Teresa!) 
Clarification added in Exercise 2(f).  
Tue, 11 Mar 2014

Noisy-channel coding: capacity, set-up and proof of converse

Sections 7.1, 7.4, 7.5, 7.9, 7.12 of [CT]
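For the standard example of the binary symmetric channel, the capacity has the closed form C = 1 - H(p). This short Python sketch (illustrative only) evaluates it:

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover
    probability p: C = 1 - H(p) bits per channel use."""
    return 1 - h2(p)

print(bsc_capacity(0.0))   # 1.0  (noiseless channel)
print(bsc_capacity(0.5))   # 0.0  (output independent of input)
print(bsc_capacity(0.11))  # about 0.5
```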

Thu, 13 Mar 2014

Shannon's theorem about noisy-channel coding

Sections 7.6, 7.7 of [CT], Chapter 10 of [MacKay]

Slides   Homework #6
Tue, 18 Mar 2014

Source-channel Separation, Error-correcting codes

Section 7.13 of [CT], Chapter 1 of [MacKay]

Thu, 20 Mar 2014

No lecture; instead, an exercise session replacing the one of Fri, 21 March

Homework #7
Fri, 21 Mar 2014

No Exercise Session

Tue, 25 Mar 2014

No Lecture (exam week)

Thu, 27 Mar 2014

No Lecture (exam week)

Fri, 28 Mar 2014

Final Written exam

9:00-12:00 in SP, G2.02