- Block Diagram of a Communications System, Entropy, Relative Entropy
- Concave Functions, Jensen's Inequality, Conditional Entropy, Mutual Information
- Fano's Inequality, Non-Singular Codes, Uniquely Decodable Codes, Prefix-Free Codes
- Kraft's Inequality, Shannon and Huffman Codes
- Shannon's Source Coding Theorem
- Data Processing Inequality, Log-Sum Inequality, Convexity of Relative Entropy, Typical Sequences, AEP
- Data Compression and Typicality, Channel Capacity
- Concavity/Convexity of Mutual Information, Kuhn-Tucker Conditions, (Weakly) Symmetric Channels
- Block Codes/Encoder/Decoder, Converse of the Channel Coding Theorem, Joint Typicality
- Channel Coding Theorem (direct part), Source-Channel Separation Theorem
- Channels with Feedback
- Strong Typicality
- Rate-Distortion Theory
- A Glimpse at Multi-Terminal Information Theory
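Several of the topics above (entropy, Huffman codes, Kraft's inequality, and the source coding bound) can be illustrated with a minimal Python sketch. The distribution below is an arbitrary example chosen for illustration; the code builds a binary Huffman code and checks that the Kraft sum is at most 1 and that the expected length satisfies H(X) ≤ L < H(X) + 1.

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def huffman_lengths(p):
    """Codeword lengths of a binary Huffman code for distribution p."""
    # Heap items: (probability, unique id, list of symbol indices).
    # The unique id breaks ties so the lists are never compared.
    heap = [(pi, i, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    uid = len(p)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:  # every merge adds one bit to these symbols' codewords
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

p = [0.5, 0.25, 0.15, 0.1]          # example source distribution (assumed)
lengths = huffman_lengths(p)
H = entropy(p)
avg_len = sum(pi * li for pi, li in zip(p, lengths))
kraft = sum(2 ** -li for li in lengths)  # Kraft's inequality: sum 2^{-l_i} <= 1
print(H, avg_len, kraft)
```

For this distribution the Huffman lengths are (1, 2, 3, 3), so the average length is 1.75 bits, within one bit of the entropy, and the Kraft sum is exactly 1, as it is for any Huffman code.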
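On the channel side, the capacity of the binary symmetric channel is the standard closed-form example of the topics on channel capacity and symmetric channels: C = 1 − H_b(ε), where H_b is the binary entropy function. A short sketch:

```python
from math import log2

def h2(p):
    """Binary entropy function H_b(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps."""
    return 1 - h2(eps)

print(bsc_capacity(0.0))   # noiseless channel: 1 bit per channel use
print(bsc_capacity(0.5))   # output independent of input: capacity 0
```

Note the symmetry C(ε) = C(1 − ε): a channel that flips every bit is as good as one that flips none, since the flips can be undone deterministically.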

© Signal and Information Processing Laboratory (ISI), ETH Zurich