CS 49.08/249.8: Information Theory in Computer Science
Winter 2021 | Professor: Amit Chakrabarti

Course Description

This course introduces students to information theory, a mathematical formalism for quantifying and reasoning about communication. While traditionally a part of electrical engineering, information theory has found several powerful applications in the theory of algorithms and complexity, as well as adjacent fields such as combinatorics and game theory. The first third of the course will teach students the basics of information theory (Shannon entropy, mutual information, Kullback-Leibler divergence). The rest of the course will sample topics from error-correcting codes, communication complexity, data structures, and optimization, in each case highlighting applications of information theory.
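To give a flavor of the basic quantities named above, here is a minimal sketch (not part of the course materials) that computes Shannon entropy and Kullback-Leibler divergence for small discrete distributions, following the standard definitions:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.

    Assumes p is absolutely continuous with respect to q
    (i.e., q[i] > 0 wherever p[i] > 0)."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

fair = [0.5, 0.5]      # a fair coin
biased = [0.9, 0.1]    # a biased coin

print(entropy(fair))                 # 1.0 bit: a fair coin is maximally uncertain
print(entropy(biased))               # ~0.469 bits: a biased coin is more predictable
print(kl_divergence(biased, fair))   # ~0.531 bits: cost of modeling biased as fair
```

Note the identity visible in the output: for a uniform reference distribution on two outcomes, D(p || uniform) = 1 - H(p).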

Prerequisites

This course requires a good background in mathematics (especially beginning probability theory) and exposure to formal analysis of algorithms (Dartmouth's CS 31 or an equivalent course). Preparedness will be assessed in Week 1 using a diagnostic test; see the schedule for details.

According to the ORC, the prerequisites are "COSC 31 or COSC 30 plus permission of the instructor."

Course Staff

Professor: Amit Chakrabarti

There is no teaching assistant for this course.

Textbook

The book Elements of Information Theory, 2nd edition, by Cover and Thomas is recommended (though not required) for the first half of the course. No existing textbook covers all the material of the second half, so we will also draw on other useful online resources.

As we head into specialized topics, registered students will scribe lecture notes as the course runs; these notes will form our own in-house resource.

Work and Assessment

The assessed work in the course consists of (a) solving several homework problems, spaced throughout the term; (b) scribing notes for up to two lectures; and (c) a final project. Many homework problems will be challenging. For details, please see the assessment tab.

There are no exams.