Information Theory
Lecture Notes
| LEC # | TOPICS | LECTURE NOTES |
| --- | --- | --- |
| 1 | Introduction, entropy | (PDF) |
| 2 | Jensen's inequality, data processing theorem, Fano's inequality | (PDF) |
| 3 | Different types of convergence, asymptotic equipartition property (AEP), typical set, joint typicality | (PDF) |
| 4 | Entropies of stochastic processes | (PDF) |
| 5 | Data compression, Kraft inequality, optimal codes | (PDF) |
| 6 | Huffman codes | (PDF) |
| 7 | Shannon-Fano-Elias codes, Slepian-Wolf | |
| 8 | Channel capacity, binary symmetric and erasure channels | (PDF) |
| 9 | Maximizing capacity, Blahut-Arimoto (see the sketch after this table) | (PDF) |
| 10 | The channel coding theorem | (PDF) |
| 11 | Strong coding theorem, types of errors | (PDF) |
| 12 | Strong coding theorem, error exponents | (PDF) |
| 13 | Fano's inequality and the converse to the coding theorem | (PDF) |
| 14 | Feedback capacity | (PDF) |
| 15 | Joint source-channel coding | (PDF) |
| 16 | Differential entropy, maximizing entropy | (PDF) |
| 17 | Additive Gaussian noise channel | (PDF) |
| 18 | Gaussian channels: parallel, colored noise, inter-symbol interference | (PDF) |
| 19 | Gaussian channels with feedback | (PDF) |
| 20 | Multiple access channels | (PDF) |
| 21 | Broadcast channels | (PDF) |
| 22 | Finite state Markov channels | (PDF) |
| 23 | Channel side information, wide-band channels | (PDF) |
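As an illustrative companion to Lectures 8 and 9, the following is a minimal sketch (not part of the posted course materials) of the Blahut-Arimoto iteration for computing the capacity of a discrete memoryless channel. It is applied to a binary symmetric channel, where the result can be checked against the closed form C = 1 - H(p). Function names and tolerances here are illustrative choices, not taken from the lecture notes.

```python
# Minimal Blahut-Arimoto sketch: capacity of a discrete memoryless channel.
import numpy as np

def blahut_arimoto(P, n_iter=200, tol=1e-12):
    """Estimate channel capacity in bits.

    P[x, y] is the probability of output y given input x.
    Returns the capacity estimate and the maximizing input distribution.
    """
    n_inputs, _ = P.shape
    r = np.full(n_inputs, 1.0 / n_inputs)      # start from the uniform input distribution
    for _ in range(n_iter):
        # q[x, y]: posterior probability of input x given output y under current r
        q = r[:, None] * P
        q /= q.sum(axis=0, keepdims=True)
        # update r(x) proportional to prod_y q(x|y)^{P(y|x)}
        r_new = np.exp(np.sum(P * np.log(q + 1e-300), axis=1))
        r_new /= r_new.sum()
        if np.max(np.abs(r_new - r)) < tol:
            r = r_new
            break
        r = r_new
    # capacity = sum_{x,y} r(x) P(y|x) log2( q(x|y) / r(x) )
    q = r[:, None] * P
    q /= q.sum(axis=0, keepdims=True)
    C = np.sum(r[:, None] * P * np.log2(q / r[:, None] + 1e-300))
    return C, r

if __name__ == "__main__":
    p = 0.1                                     # crossover probability
    bsc = np.array([[1 - p, p], [p, 1 - p]])    # binary symmetric channel
    C, r = blahut_arimoto(bsc)
    h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    print(f"Blahut-Arimoto: {C:.6f} bits; closed form 1 - H(p): {1 - h:.6f}")
```

For the binary symmetric channel the uniform input distribution is already optimal, so the iteration converges immediately and both printed values agree (about 0.531 bits for p = 0.1).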
Assignments
| PROBLEM SETS |
| --- |
| Problem set 1 (PDF) |
| Problem set 2 (PDF) |
| Problem set 3 (PDF) |
| Problem set 4 (PDF) |
| Problem set 5 (PDF) |
| Problem set 6 (PDF) |
| Problem set 7 (PDF) |
| Problem set 8 (PDF) |
| Problem set 9 (PDF) |