Information and Communication

News:

20 Jan 2016: added the last pictures and updated presentation times
5 March 2015: It is important to show up personally for the first lecture on Monday, 4 January 2016 because the details of the course will be further organised then.

Content of the course

Information theory was developed by Claude E. Shannon in the late 1940s to investigate the fundamental limits of signal-processing operations such as compressing data and of reliably storing and communicating data. These tasks have turned out to be fundamental for all of computer science.

In this course, we introduce the basics of probability theory and then study concepts such as (conditional) Shannon entropy and mutual information. Then, we treat Shannon's theorems about data compression and channel coding. We will also cover some aspects of information-theoretic security for encryption.
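To give a first flavour of the quantities studied in the course, here is a small Python sketch of Shannon entropy for a biased coin (the function name and examples are illustrative, not part of the course material):

```python
from math import log2

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) * log2(p(x)) of a distribution,
    given as a list of probabilities. Terms with p(x) = 0 contribute 0."""
    return -sum(px * log2(px) for px in p if px > 0)

print(entropy([0.5, 0.5]))  # 1.0: a fair coin carries exactly 1 bit per toss
print(entropy([0.9, 0.1]))  # ~0.469 bits: a biased coin is more predictable
```

The second value being below 1 bit is exactly the slack that data compression exploits.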

Intended Learning Outcomes

The contents of the course might vary depending on the previous knowledge of the participants; the concrete learning outcomes will be discussed in the first lecture.

Course website

Updated information about the course can be found on http://homepages.cwi.nl/~schaffne/courses/infcom/2015/

Prerequisites

Basic calculus, e.g. working with logarithms. Basic notions of discrete probability (as taught, e.g., in Stochastiek 1) are helpful, but not a strict requirement. This course is well suited for students pursuing a double Bachelor's in mathematics and computer science.

Study Material

The material will be presented in slide and blackboard lectures. The following are good references:

Schedule

Please check Datanose for the definite times and locations.
It is important to show up in person for the first lecture on Monday, 4 January, because the details of the course will be further organised then. If you cannot make it on Monday but still want to attend the course, please send me an email.

Language

The lectures will be given in English. The homework and final report might be written in Dutch, and the presentation can be delivered in Dutch, but the use of English is encouraged.

Credits, homework, final presentation, report

This is a 6 ECTS course, which will keep you busy full-time (40h/week) for the month of January 2016. There will be lectures in the first two weeks (4-15 January) and homework exercises to solve and hand in. In the third week of the course, you choose a topic from this list and study it. In the final week, you present the topic to the class and write a final report about this topic.

Grades

Your grade for the final presentation will be determined by the quality of the presentation and your ability to answer questions about the subject (we will use this list for the evaluation).

The final presentation counts for 1/3 of your final grade for the course; the report counts for another 1/3; and the remaining 1/3 is determined by the average of the three homework exercises.
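The weighting above can be written out as a one-line computation. This is only a sketch of the arithmetic; the function name and any rounding conventions are assumptions, not official policy:

```python
def final_grade(presentation, report, homeworks):
    """Each component counts for 1/3 of the final grade; the homework
    component is the average of the three homework grades."""
    homework_avg = sum(homeworks) / len(homeworks)
    return (presentation + report + homework_avg) / 3

print(final_grade(8.0, 7.0, [6.0, 7.0, 8.0]))  # 7.333...
```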

Course schedule for January 2016

(preliminary version)

Day

Contents

[CF]

[CT]

[MacKay]

Exercises

Mon, 4 Jan 2016, 9:00-10:00

Overview and organisation of the course

It is essential to attend this first lecture if you want to follow the course.

Slides #1

Mon, 4 Jan 2016, 10:00-12:00

Discrete Probability Theory

Blackboard Photo 1   Photo 2   Photo 3  

2.1

2.1, 2.2
Mon, 4 Jan 2016, 12:00-13:00

Exercise session (on Discrete Probability Theory)

Homework #1
Tue, 5 Jan 2016, 9:00-11:00

Jensen's inequality, Entropy

Handout Jensen's inequality

Slides #2

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5   Photo 6   Photo 7  

[CF] 2.2, 3.1, 3.2; [CT] 2.1, 2.6; [MacKay] 2.7
Tue, 5 Jan 2016, 11:00-13:00

Exercise Session on Probability Theory

Wed, 6 Jan 2016, 11:00-13:00

Exercise Session on Entropy

Blackboard Photo 1   Photo 2

3

Homework #2
Thu, 7 Jan 2016, 9:00-11:00

Data Compression: symbol codes, Kraft's inequality, source-coding theorem (symbol-code version), Huffman codes

Slides #3

Entropy of Alice in Wonderland  Hex Editor with statistics 

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5   Photo 6   Photo 7  

5.1, 5.2

5

5, L4
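The symbol-code lecture above can be illustrated concretely: Kraft's inequality says a prefix-free binary code with codeword lengths l_1, ..., l_n exists if and only if the sum of 2^(-l_i) is at most 1. A small sketch (not from the course materials) for checking candidate length profiles:

```python
def kraft_sum(lengths):
    """Left-hand side of Kraft's inequality: sum of 2^(-l_i) over all
    codeword lengths. A prefix-free binary code with these lengths
    exists if and only if the sum is <= 1."""
    return sum(2 ** -l for l in lengths)

# Lengths 1, 2, 3, 3 (e.g. codewords 0, 10, 110, 111) exactly fill the budget:
print(kraft_sum([1, 2, 3, 3]))  # 1.0  -> a complete prefix-free code exists
# Lengths 1, 1, 2 over-fill it: no prefix-free code can have these lengths.
print(kraft_sum([1, 1, 2]))     # 1.25 -> impossible
```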

Thu, 7 Jan 2016, 11:00-13:00

Exercise Session

Fri, 8 Jan 2016, 11:00-13:00

Huffman coding

Preparation Homework: Figure out how to construct a Huffman code

the online game of 20 questions

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5   Photo 6  
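The preparation homework asks you to figure out Huffman's construction yourself, so treat the following only as a sketch to check your answer against afterwards. It is a standard heap-based implementation, not the course's reference code:

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code for a dict {symbol: frequency}.
    Repeatedly merge the two least frequent subtrees, prepending
    '0' and '1' to the codewords inside them."""
    # Heap entries: (frequency, tiebreaker, {symbol: codeword-so-far})
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f0, _, c0 = heapq.heappop(heap)
        f1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (f0 + f1, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
```

For these dyadic frequencies the codeword lengths 1, 2, 3, 3 match the entropy exactly, which ties the construction back to the source-coding theorem from Thursday's lecture.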

Fri, 8 Jan 2016, 13:00-15:00

Exercise Session

Blackboard Photo 1   Photo 2   Photo 3   Photo 4  

Mon, 11 Jan 2016, 9:15-11:00

Entropy Diagrams, Markov chains, Data-Processing Inequality, Fano's inequality

ILLC meeting room F1.15

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5   Photo 6   Photo 7   Photo 8   Photo 9  

[CF] 3.4; [CT] 4
Mon, 11 Jan 2016, 11:00-13:00

Exercise Session

Photo 1   Photo 2  

Tue, 12 Jan 2016, 9:15-11:00

Sufficient Statistic, Perfectly Secure Encryption: Definitions, One-time Pad and Shannon's theorem

Insecurity of Key Reuse in OTP

Try it yourself by solving this crypto challenge!

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5   Photo 6  

4
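The one-time pad, and the danger of key reuse linked above, can both be demonstrated in a few lines. This is a generic illustration with made-up messages, not the actual crypto challenge:

```python
import os

def otp(message: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each message byte with a uniformly random key byte.
    Encryption and decryption are the same operation."""
    assert len(key) == len(message)
    return bytes(m ^ k for m, k in zip(message, key))

key = os.urandom(11)
c = otp(b"attack dawn", key)
assert otp(c, key) == b"attack dawn"  # decrypting recovers the message

# Key reuse is insecure: XORing two ciphertexts cancels the key,
# leaking the XOR of the two plaintexts (as in the challenge above).
c2 = otp(b"attack dusk", key)
leak = bytes(a ^ b for a, b in zip(c, c2))
print(leak)  # equals b"attack dawn" XOR b"attack dusk": the key has vanished
```

Shannon's theorem from this lecture explains why the fresh-key requirement cannot be relaxed: perfect secrecy forces the key to be at least as long as the total message material.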
Tue, 12 Jan 2016, 11:00-13:00

Exercise Session

Photo  

Homework #3
Thu, 14 Jan 2016, 9:15-12:00

Topic selection for continuation of the course, Exercise session

ILLC meeting room F1.15

Tue, 26 Jan 2016, 10:30-13:00

Student presentations

at ILLC meeting room F1.15

10:30 - 11:30  Amon: Shannon's noisy-channel coding theorem. Slides (PDF), Report (PDF)
12:00 - 13:00  Michael: Error-correcting codes. Slides (PDF), Report (PDF)
Wed, 27 Jan 2016, 13:30-16:00

Student presentations

at CWI L017

13:30 - 14:30  Abe: Code breaking or practical data compression. Slides (PDF), Report (PDF)
15:00 - 16:00  Govert: Gambling. Slides (PDF), Report (PDF)

Life after "Information & Communication"

If you got hooked on the world of entropies, you have several options after the course to pursue the topics of information theory and cryptography: