This course introduces students to information theory, a mathematical formalism for quantifying and reasoning about communication. While traditionally a part of electrical engineering, it has found several powerful applications in the theory of algorithms and complexity, and in adjacent fields such as combinatorics and game theory. The first third of the course will teach students the basics of information theory (Shannon entropy, mutual information, Kullback-Leibler divergence). The rest of the course will sample topics from error-correcting codes, communication complexity, data structures, and optimization, in each case highlighting applications of information theory.
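For a concrete taste of these three quantities, here is a minimal Python sketch (illustrative only, not part of the course materials) that computes Shannon entropy, Kullback-Leibler divergence, and mutual information for a small joint distribution; the toy numbers are arbitrary and chosen just for this example.

```python
# Illustrative sketch of the three basic quantities named above,
# computed in bits (log base 2, the usual information-theory convention).
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum_x p(x) log2 p(x), skipping zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_x p(x) log2(p(x)/q(x))."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# A toy joint distribution p(x, y) over two binary variables (made up here).
pxy = np.array([[0.25, 0.25],
                [0.40, 0.10]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)  # marginals of X and Y

# Mutual information via the identity I(X;Y) = D( p(x,y) || p(x) p(y) ).
mi = kl_divergence(pxy.ravel(), np.outer(px, py).ravel())
print(f"H(X)   = {entropy(px):.4f} bits")
print(f"I(X;Y) = {mi:.4f} bits")
```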
This course requires a good background in mathematics (especially beginning probability theory) and exposure to formal analysis of algorithms (Dartmouth's CS 31 or an equivalent course). Preparedness will be assessed in Week 1 using a diagnostic test; see the schedule for details.
According to the ORC, the prerequisites are "COSC 31 or COSC 30 plus permission of the instructor."
Professor: Amit Chakrabarti
There is no teaching assistant for this course.
The book Elements of Information Theory, 2nd edition, by Cover and Thomas is recommended (though not required) for the first half of the course. No existing textbook covers all the material of the second half. Other useful online resources include:
- Information Theory, Inference, and Learning Algorithms, by David MacKay, is another very nice textbook for the first half of the course.
- Lecture Notes on Information Theory, by Polyanskiy and Wu, is an excellent modern resource; it is mathematically deeper than what we'll do and has a statistics slant.
Work and Assessment
The assessed work in the course consists of (a) solving several homework problems, spaced throughout the term; (b) scribing notes for up to two lectures; and (c) completing a final project. Many homework problems will be challenging. For details, please see the assessment tab.
There are no exams.