## Information Theory

### Content of the course

Information theory was developed by Claude E. Shannon in the late 1940s to investigate the fundamental limits of signal-processing operations such as compressing, storing, and reliably communicating data. These tasks have turned out to be fundamental throughout computer science.

In this course, we quickly review the basics of probability theory and introduce concepts such as (conditional) Shannon entropy, mutual information, and Rényi entropy. We then prove Shannon's theorems about data compression and channel coding. Later in the course, we also cover some aspects of information-theoretic security, such as randomness extraction and privacy amplification.
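As a small taste of the first of these concepts, the following Python sketch (not part of the course material; the function names are our own) computes Shannon entropy and mutual information for finite distributions:

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution given as a list of probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution given as a 2D list."""
    px = [sum(row) for row in joint]                  # marginal of X
    py = [sum(col) for col in zip(*joint)]            # marginal of Y
    hxy = entropy([p for row in joint for p in row])  # joint entropy H(X,Y)
    return entropy(px) + entropy(py) - hxy

print(entropy([0.5, 0.5]))                            # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))                            # biased coin: about 0.469 bits
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # X = Y uniform: 1.0 bit
```

For independent X and Y the mutual information is 0; for perfectly correlated bits, as in the last line, it equals H(X).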

### Intended Learning Outcomes

At the end of the course, you will be able to:
• Define Shannon entropy and mutual information, and compute these quantities in examples
• Work with joint discrete random variables (conditioning, Bayes' rule)
• Define basic discrete probability distributions (Bernoulli, binomial, geometric) and compute their expected value and variance
• State Jensen's inequality for convex and concave functions and use it in proofs
• Use entropy diagrams to read off and discover relations between entropic quantities
• Prove Shannon's theorem about perfectly secure encryption (e.g. by using an entropy diagram)
• Define typical and jointly typical sets, prove properties about their size and probability mass, and explain how they are used in source coding and channel coding
• Use Kraft's inequality, e.g. to check whether a prefix-free code exists for given codeword lengths
• Compute a d-ary Huffman code
• Describe how much a given source can be compressed and give a way to achieve this compression
• Prove properties of arithmetic codes
• Find the confusability graph of a given channel, and find channels for a given confusability graph
• Compute the independence number and (zero-error) Shannon capacity of a given confusability graph
• Compute the capacity and a maximizing input distribution of a given channel
• Define basic channels (binary symmetric channel, erasure channel)
• State Shannon's noisy channel-coding theorem and understand the key ingredients of its proof
• Understand definitions of entropy measures other than Shannon entropy, such as Rényi entropy
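To illustrate two of the outcomes above concretely, here is a small Python sketch (our own illustration, not course material) that checks Kraft's inequality and computes the codeword lengths of a binary Huffman code — the d = 2 case of the d-ary construction:

```python
import heapq

def kraft_ok(lengths, D=2):
    """Kraft's inequality: a prefix-free D-ary code with these codeword
    lengths exists iff the sum of D**(-l) over the lengths is at most 1."""
    return sum(D ** (-l) for l in lengths) <= 1

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    # Each heap entry: (probability, tiebreaker, symbol indices in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # merge the two least likely subtrees;
        p2, _, s2 = heapq.heappop(heap)   # this adds one bit to every symbol in them
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

print(kraft_ok([1, 2, 3, 3]))  # True:  1/2 + 1/4 + 1/8 + 1/8 = 1
print(kraft_ok([1, 1, 2]))     # False: 1/2 + 1/2 + 1/4 > 1
print(huffman_lengths([0.5, 0.25, 0.125, 0.125]))  # [1, 2, 3, 3]
```

Note that the Huffman lengths in the last line match an optimal code for a dyadic source, and they satisfy Kraft's inequality with equality.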
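The capacities of the two basic channels listed above have well-known closed forms that are easy to evaluate; a minimal sketch (the function names are illustrative):

```python
import math

def h2(p):
    """Binary entropy function h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p:
    C = 1 - h(p), achieved by the uniform input distribution."""
    return 1 - h2(p)

def bec_capacity(e):
    """Capacity of the binary erasure channel with erasure probability e:
    C = 1 - e, again achieved by the uniform input distribution."""
    return 1 - e

print(bsc_capacity(0.11))  # roughly 0.5 bits per channel use
print(bec_capacity(0.25))  # 0.75 bits per channel use
```

Sanity checks: a noiseless BSC (p = 0) has capacity 1, a fully noisy one (p = 1/2) has capacity 0, and the erasure channel's capacity degrades linearly in the erasure probability.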

### Course website

Updated information about the course can be found on http://homepages.cwi.nl/~schaffne/courses/inftheory/2014/

### Study Material

The material will be presented in blackboard lectures. The following are good references:

### Lectures and Exercise sessions (2 x 45min each)

Please check Datanose for the definitive times and locations.

Lectures (Hoorcollege)
• Tuesdays, 11-13, location: Science Park G0.05
• Thursdays, 11-13, location: check Datanose
• First lecture: 4 February 2014

Exercises (Werkcollege)
• Fridays, 9-11, location: check Datanose
• First exercise session: 7 February 2014