Math 254A: Entropy and Ergodic Theory (UCLA, Fall 2017)

Summary: An introduction to entropy and its many roles in different branches of mathematics, especially information theory, probability, combinatorics and ergodic theory. The aim is to give a quick overview of many topics, emphasizing a few basic combinatorial problems that these areas have in common and that are responsible for the ubiquity of entropy.

Provisionally, the course will be divided into three parts:

  1. Shannon's entropy and related notions, and some of the original applications to information theory;
  2. Related applications in probability (especially large deviations theory) and combinatorics;
  3. The entropy rate of a stationary stochastic process, and its consequences for abstract ergodic theory.

I will emphasize the basic mathematical questions that these areas have in common and how entropy appears in the answers.
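
For reference, the central quantity throughout is Shannon's entropy of a probability vector p = (p_1, ..., p_n) (a standard definition, stated here in notation that may differ slightly from the notes):

    H(p) = - \sum_{i=1}^n p_i \log p_i,

with the usual convention 0 \log 0 = 0.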

In order to cover a wide range of subjects, I will have to sacrifice a lot of generality, and sometimes omit technical details. To get the most out of the course, students will need to support the lectures with reading from the notes and a variety of other sources.

Lectures: Mondays, Wednesdays and Fridays, 2--2.50, in Math Sciences 5148.

Office Hours: Mondays 10.30--11.30 and Wednesdays 3--4. These times may change in some weeks if I have to travel. I can usually also accommodate requests for appointments at other times.

Communication: Email is preferred. I will try to answer emails within 24 hours, excluding weekends. I prefer the pronouns he/him/his.

Grading: Enrolled students will be assigned letter grades, based on class participation and a small number of homeworks. Merely attending most classes will guarantee a B. An A grade also requires work on some of the homework problems; details TBA. Some homework questions will be deliberately quite challenging, so a good but incomplete attempt at one of these will be enough to contribute to an A grade.

Lecture notes (quite rough, probably still plenty of mistakes --- please beware):

  1. Some basic questions about counting sequences
  2. Approximate typicality and the asymptotic equipartition property
  3. The meaning of entropy in information theory
  4. Conditional entropy and mutual information
  5. Joint typicality and the conditional AEP
  6. Channel coding
  7. Rate-distortion theory
  8. Fano's inequality and two applications
  9. Some applications in combinatorics. I won't produce notes for this lecture. Instead I will take material from two nice surveys, by Radhakrishnan and Galvin.
  10. Large deviations, I
  11. Large deviations, II
  12. Introduction to statistical mechanics, I
  13. Introduction to statistical mechanics, II
  14. Introduction to statistical mechanics, III. I may also refer to Section IV.4 of the book by Ellis, Entropy, Large Deviations, and Statistical Mechanics.
  15. A first look at concentration
  16. Transportation metrics
  17. Transportation and concentration
  18. Stationary processes and ergodic theory
  19. The ergodic theorems
  20. Factors, isomorphisms and disintegrations
  21. The entropy rate of a stationary process
  22. The Kolmogorov--Sinai entropy of a measure-preserving system
  23. Some properties and applications of entropy rate
  24. Rokhlin's lemma
  25. Weak containment
  26. Weak containment with retention of entropy
  27. Sinai's factor theorem
  28. Sinai's and Ornstein's theorems, II

Homeworks:

Homework 1.
Homework 2.
Homework 3.
Homework 4.
Homework 5.

My notes draw on several other references. Here are links to some of the main ones (they should work from within UCLA); I may add to this list during the fall quarter.

  1. Cover and Thomas, Elements of Information Theory, 2nd ed.
  2. Shannon, A Mathematical Theory of Communication
  3. MacKay, Information Theory, Inference and Learning Algorithms
  4. Einsiedler and Ward, Ergodic Theory with a view towards Number Theory


Tim Austin,
tim AT math DOT ucla DOT edu