Math 254A: Entropy and Ergodic Theory (UCLA, Fall 2017)
Summary: An introduction to entropy and its many roles in different branches of mathematics, especially information theory, probability, combinatorics and ergodic theory.
The aim is to give a quick overview of many topics, emphasizing a few basic combinatorial problems that these areas have in common and that are responsible for the ubiquity of entropy.
Provisionally, the course will divide into three parts. I will emphasize the basic mathematical questions that these areas have in common and how entropy appears in the answers:
- Shannon's entropy and related notions, and some of the original applications to information theory;
- Related applications in probability (especially large deviations theory) and combinatorics;
- The entropy rate of a stationary stochastic process, and its consequences for abstract ergodic theory.
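For orientation, the central object running through all three parts is Shannon's entropy and its extension to stationary processes. A minimal statement of the standard definitions (the usual conventions, not taken from the course notes themselves) is:

```latex
% Shannon entropy of a probability distribution p = (p_1, \ldots, p_n),
% with the convention 0 \log 0 = 0:
H(p) = -\sum_{i=1}^{n} p_i \log p_i .

% Entropy rate of a stationary process (X_n)_{n \ge 1},
% the subject of the third part of the course:
h = \lim_{n \to \infty} \frac{1}{n}\, H(X_1, X_2, \ldots, X_n),
% where the limit exists by subadditivity of joint entropy
% (Fekete's lemma).
```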
In order to cover a wide range of subjects, I will have to sacrifice a lot of generality, and sometimes omit technical details. To get the most out of the course, students will need to support the lectures with reading from the notes and a variety of other sources.
Lectures: Mondays, Wednesdays and Fridays, 2--2.50, in Math Sciences 5148.
Office Hours: Mondays 10.30--11.30 and Wednesdays 3--4. These times may change in some weeks if I have to travel. I can usually also accommodate requests for appointments at other times.
Communication: Email is preferred. I will try to answer emails within 24 hours, excluding weekends. I prefer the pronouns he/him/his.
Grading: Enrolled students will be assigned letter grades, based on class participation and a small number of homework assignments. Merely attending most classes will guarantee a B.
An A grade also requires work on some homework problems. Details of the latter TBA.
Some homework questions will be deliberately quite challenging, so a good but incomplete attempt at one of these will be enough to contribute to an A grade.
Lecture notes (quite rough, probably still plenty of mistakes --- please beware):
- Some basic questions about counting sequences
- Approximate typicality and the asymptotic equipartition property
- The meaning of entropy in information theory
- Conditional entropy and mutual information
- Joint typicality and the conditional AEP
- Channel coding
- Rate-distortion theory
- Fano's inequality and two applications
- Some applications in combinatorics. I won't produce notes for this lecture. Instead I will take material from two nice surveys, by Radhakrishnan and Galvin.
- Large deviations, I
- Large deviations, II
- Introduction to statistical mechanics, I
- Introduction to statistical mechanics, II
- Introduction to statistical mechanics, III. I may also refer to Section IV.4 of the book Entropy, Large Deviations, and Statistical Mechanics by Ellis.
- A first look at concentration
- Transportation metrics
- Transportation and concentration
- Stationary processes and ergodic theory
- The ergodic theorems
- Factors, isomorphisms and disintegrations
- The entropy rate of a stationary process
- The Kolmogorov--Sinai entropy of a measure-preserving system
- Some properties and applications of entropy rate
- Rokhlin's lemma
- Weak containment
- Weak containment with retention of entropy
- Sinai's factor theorem
- Sinai's and Ornstein's theorems, II
My notes draw on several other references. Here are links to some of the main ones (they should work from within the UCLA network). I may add to this list during the fall quarter.
- Cover and Thomas, Elements of Information Theory, 2nd Ed
- Shannon, A Mathematical Theory of Communication
- MacKay, Information Theory, Inference and Learning Algorithms
- Einsiedler and Ward, Ergodic Theory with a view towards Number Theory
tim AT math DOT ucla DOT edu