Computer Science 49/149 (Winter 2019)

Communication Complexity

Computer Science
Dartmouth College

Who: Taught by Amit Chakrabarti
When: 2A hour, TTh 14:25-16:15, X-hr W 16:35-17:25
Where: Sudikoff 214

This course will introduce you to Communication Complexity, a subject that deals with how to perform a computation collaboratively when the input is not in one place but is split amongst two or more parties. The main objects of study will be communication protocols, which are efficient algorithms to solve such split-input problems. As computation has moved away from the basic isolated-computer setting to a complex, networked, cloud-enhanced setting, understanding the possibilities and limitations of such communication protocols has become increasingly important.
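As a first taste of such a protocol, here is a minimal sketch (not the in-class presentation) of the classic randomized fingerprinting protocol for equality testing, the subject of Lecture 1: Alice, holding an n-bit string x, sends a random prime p of magnitude at most n^2 together with x mod p, costing only O(log n) bits; Bob accepts if his own string y satisfies y mod p = x mod p. The function and variable names below are illustrative choices, not fixed notation from the course.

```python
import random

def is_prime(m):
    """Trial division; fine for the small primes this sketch needs."""
    if m < 2:
        return False
    d = 2
    while d * d <= m:
        if m % d == 0:
            return False
        d += 1
    return True

def random_prime(upper):
    """Rejection-sample a uniformly random prime in [2, upper)."""
    while True:
        p = random.randrange(2, upper)
        if is_prime(p):
            return p

def equality_protocol(x_bits, y_bits):
    """Alice holds x_bits, Bob holds y_bits (binary strings).
    Alice sends (p, x mod p) for a random prime p <= n^2; Bob compares."""
    n = max(len(x_bits), len(y_bits), 2)
    x = int(x_bits, 2)
    y = int(y_bits, 2)
    p = random_prime(n * n + 1)
    return x % p == y % p  # Bob's verdict
```

If x = y the protocol always accepts; if x != y, then |x - y| < 2^n has at most n prime factors, while the prime number theorem guarantees roughly n^2 / (2 ln n) primes below n^2, so a random prime exposes the difference with high probability.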

The course will first cover a number of key algorithms for basic communication problems. These algorithms will often feature a beautiful interplay between probability theory, graph theory, combinatorics, and algebra.

We will then move on to studying communication lower bounds, i.e., the limits of our ability to communicate efficiently. We will see that certain problems inherently require the parties holding the input to communicate very inefficiently, and we will understand in a mathematically precise way why this must be. This part of the course will use further probability theory, information theory, and linear algebra.
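As one concrete example of such a lower bound (a classical fact, which the course will prove): the communication matrix of the n-bit EQUALITY function is the 2^n-by-2^n identity matrix, which has full rank, so the log-rank lower bound for deterministic protocols gives

\[
D(\mathrm{EQ}_n) \;\ge\; \log_2 \operatorname{rank}\!\big(M_{\mathrm{EQ}_n}\big) \;=\; \log_2 2^n \;=\; n .
\]

In other words, without randomness, no protocol for EQUALITY beats the trivial "send your whole input" strategy, which makes the O(log n)-bit randomized fingerprinting protocol all the more striking.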

All advanced mathematical topics required for our study will be developed or reviewed within the course. The student is expected to have a basic background in probability theory and linear algebra, and to be very comfortable writing mathematical proofs. Specific background in either Algorithms (CS 31) or the Theory of Computation (CS 39) is not an absolute requirement, but will be a plus.

1. Jan 3: Equality testing with fingerprints // basic probability, prime number theorem [KN: 1.1, 3.1]
2. Jan 8: Algebraic fingerprints // finite fields, zeroes of polynomials
3. Jan 10: Comparison and median-finding // binary search, Chernoff bounds [KN: exercises]
4. Jan 15: Private vs. public randomness // the probabilistic method [KN: 3.3], [BCKWY]
5. Jan 17: INDEX lower bound; Yao's minimax principle // Hamming balls, entropy function [KN: 3.2, 3.3]
6. Jan 22: Simultaneous messages; Discrepancy, INNER-PRODUCT // codes, matrix norm [BK], [FKNN]
7. Jan 24: DISJOINTNESS, corruption; Introduction to information theory
8. Jan 29: More information theory; tight lower bound for augmented indexing
9. Jan 31: Information complexity; disjointness revisited
10. Feb 5: Information complexity of 1-bit AND // probability "metrics"
11. Feb 7: DISJOINTNESS for small sets; Multi-party communication
12. Feb 12: Multi-party UNIQUE-DISJOINTNESS // negative-type distances
13. Feb 14: The GAP-HAMMING-DISTANCE problem; Intro to round elimination
14. Feb 19: (cancelled due to illness)
15. Feb 21: GREATER-THAN with bounded rounds; Tree Pointer Jumping [KN: 12.2]
16. Feb 26: Circuit size and depth lower bounds via communication complexity [KN: 9.2]
17. Feb 27: (x-hour) Number-on-Forehead communication [KN: 9.2]
18. Feb 28: Karchmer-Wigderson games; Monotone depth lower bound for STCONN [KN: 6, 10.2, 10.3]
19. Mar 5: Student presentations 1 and 2; Lower bound for FORK [KN: 5.3]
20. Mar 6: (x-hour) Student presentations 3 and 4

Textbooks and Such

There is no set textbook for the course, so it is vital to attend class. However, much of the material we shall cover can be found in the following sources.

Administrative Details

Homework: Here is a master list of homework problems for the entire term, ordered roughly according to the sequence in which I expect to cover topics in class. This document may evolve during the term and a few more problems may be added. There will be three homework checkpoints during the term.

  • By Jan 31, you should obtain 30 points.
  • By Feb 14, you should obtain 60 points.
  • By Mar 4, you should obtain 90 points.

Research Presentation: In lieu of a final exam, students will read, understand, and present (using slides) a paper chosen from a small list. Details will be announced in the third week of the term.

List of Papers
  1. Braverman, Garg, Pankratov, and Weinstein. From Information to Exact Communication.
  2. Braverman and Moitra. An Information Complexity Approach to Extended Formulations.
  3. Barak, Braverman, Chen, and Rao. How to Compress Interactive Communication.
  4. Chattopadhyay, Radhakrishnan, and Rudra. Topology Matters in Communication.
  5. Harsha, Jain, McAllester, and Radhakrishnan. The Communication Complexity of Correlation.
  6. Huang and Yi. The Communication Complexity of Distributed ε-Approximations.
  7. Lovett. Communication is Bounded by Root of Rank.
  8. Phillips, Verbin, and Zhang. Lower Bounds for Number-in-Hand Multiparty Communication Complexity, Made Easy.
  9. Sherstov. The Communication Complexity of Gap Hamming Distance.
