Computer Science 85/185
The goal of this course is to study some recent, and some not-so-recent, significant research papers in which techniques of information theory are used to prove theorems in computational complexity theory.
For the first phase of the course, we will learn the basics of information theory, using the excellent textbook by Cover and Thomas. This phase will last about 1/3 of the term. After that, we will move on to reading the papers themselves. The work for a registered student will be: one homework set during the initial phase, a presentation of a paper (or part of one) during the second phase, and scribing notes for a few lectures and typesetting them in LaTeX.
|Lectures | Sudikoff 214 | 2A hour | TTh 14:00-15:50, X-hr W 16:15-17:05|
|Instructor | Amit Chakrabarti | Sudikoff 107 | 6-1710 | Office hours: TBA|
CS 39 (Theory of Computation) or equivalent
Strong mathematics background
One homework set.
Scribing and preparing lecture notes.
Informational Complexity and the Direct Sum Problem for Simultaneous Message Complexity. Amit Chakrabarti, Yaoyun Shi, Anthony Wirth and Andrew Yao. Proc. 42nd FOCS, 270--278, 2001.
An Information Statistics Approach to Data Stream and Communication Complexity. Ziv Bar-Yossef, T. S. Jayram, Ravi Kumar and D. Sivakumar. Journal of Computer and System Sciences, 68(4): 702--732, 2004.
Two Applications of Information Complexity. T. S. Jayram, Ravi Kumar and D. Sivakumar. Proc. 35th STOC, 673--682, 2003.
An Optimal Randomised Cell Probe Lower Bound for Approximate Nearest Neighbour Searching. Amit Chakrabarti and Oded Regev. Proc. 45th FOCS, 473--482, 2004.
A Spectral Approach to Lower Bounds with Applications to Geometric Searching. Bernard Chazelle. SIAM Journal on Computing, 27(2): 545--556, 1998.
Entropy and Sorting. Jeff Kahn and Jeong Han Kim. Journal of Computer and System Sciences, 51(3): 390--399, 1995.
Optimal Separations between Concurrent-Write Parallel Machines. Ravi Boppana. Proc. 21st STOC, 320--326, 1989.
|1||Jan 5||Probability distributions; the definition of entropy|
|2||Jan 10||Conditional entropy; mutual information; KL-divergence; basic properties (some proofs via Jensen's inequality)|
|3||Jan 12||Entropy and data compression; uniquely decodable codes and prefix-free codes; Kraft inequality; Shannon codes|
|4||Jan 18||Arithmetic coding and its analysis, without regard to implementation issues|
|5||Jan 19||Communication protocols; deterministic and randomized (private- as well as public-coin) communication complexity|
|6||Jan 24||Information complexity; the cost of transmitting one bit and its consequences|
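The early lectures in the schedule above cover entropy, the Kraft inequality, and Shannon codes. As a quick illustrative sketch of how these fit together (the function names here are our own, not from the course or the textbook): Shannon code lengths ⌈log₂(1/pᵢ)⌉ always satisfy Kraft's inequality, so a prefix-free code with those lengths exists, and its expected length lies between H(p) and H(p) + 1.

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def shannon_code_lengths(p):
    """Shannon code: assign symbol i a codeword of length ceil(log2(1/p_i))."""
    return [math.ceil(math.log2(1 / x)) for x in p if x > 0]

p = [0.5, 0.25, 0.125, 0.125]
H = entropy(p)                       # 1.75 bits (dyadic distribution)
lengths = shannon_code_lengths(p)    # [1, 2, 3, 3]

# Kraft inequality: sum_i 2^{-l_i} <= 1, hence a prefix-free code
# with these lengths exists.
kraft_sum = sum(2 ** -l for l in lengths)
assert kraft_sum <= 1

# Expected codeword length is within one bit of the entropy.
expected_len = sum(x * l for x, l in zip(p, lengths))
assert H <= expected_len < H + 1
```

For a dyadic distribution like the one above the Shannon code is exactly optimal (expected length equals the entropy); in general it can lose up to one bit, which is the gap arithmetic coding (lecture 4) addresses.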