ECE1502 Information Theory (Fall 2025)
Instructor:
· Prof. Wei Yu < weiyu@ece.utoronto.ca >
Teaching Assistant:
· TBD < @mail.utoronto.ca >
· TBD < @mail.utoronto.ca >
Lectures:
· Location: BA2195
· Time: Thursdays, 9:10 am - 12 noon (first lecture: Sept 4).
· No lecture on Oct 2 (professor away); Extra Lecture on Dec 4 (makeup day)
· No lecture on Oct 30 (study break)
· Last Lecture: Dec 4
Course Description:
Information theory answers two fundamental questions in communication theory: what is the ultimate limit of data compression (answer: the entropy H), and what is the ultimate limit of the transmission rate of communication (answer: the channel capacity C). (Cover & Thomas)
This course is a one-term introduction to information theory at a first-year graduate level. The aim is to provide a comprehensive coverage of the following major areas of information theory:
• Entropy and mutual information
• Data compression
• Channel capacity
This is a fundamental course for students interested in digital communications, data compression, and signal processing. It is also of interest to students in Computer Science, Machine Learning, Statistics, and related disciplines.
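The two limits named in the quotation above are easy to illustrate numerically. The following short Python sketch (an illustration only, not part of the course materials; the source distribution and the channel crossover probability are arbitrary choices) computes the entropy H of a small discrete source and the capacity C = 1 - H(eps) of a binary symmetric channel:

    import math

    def entropy(p):
        """Shannon entropy (in bits) of a discrete probability distribution p."""
        return -sum(x * math.log2(x) for x in p if x > 0)

    # Limit of compression: a source with these symbol probabilities cannot be
    # compressed below H bits per symbol on average (illustrative distribution).
    source = [0.5, 0.25, 0.125, 0.125]
    print(f"H(source) = {entropy(source):.3f} bits/symbol")   # 1.750

    # Limit of transmission: a binary symmetric channel with crossover
    # probability eps has capacity C = 1 - H(eps) bits per channel use
    # (the standard BSC capacity formula, covered in Ch. 7 of the textbook).
    eps = 0.1
    print(f"C(BSC) = {1 - entropy([eps, 1 - eps]):.3f} bits/use")  # about 0.531

Both quantities are developed from first principles in the course: entropy and data compression in the first half of the term, channel capacity in the second.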
Prerequisite: an undergraduate course in probability. A fair level of mathematical maturity is expected.
Textbook: Elements of Information Theory
by Thomas M. Cover and Joy A. Thomas, John Wiley, 2nd Edition, 2006.
Course Schedule:
Date | Topics | Text References | Notes
Sept 4 | Introduction. Entropy. | Ch. 1.1 | Probability refresher
Sept 11 | Joint Entropy. Conditional Entropy. Relative Entropy. | Ch. 2.1-2.3 |
Sept 18 | Mutual Information. Jensen's Inequality. Conditional Mutual Information. Data Processing Inequality. | Ch. 2.5-2.6, 2.8 | HW #1 (due Oct 2)
Sept 25 | Entropy Rate of a Stochastic Process. Asymptotic Equipartition Property (AEP). | Ch. 4.1-4.2, Ch. 3 |
Oct 2 | No lecture | |
Oct 9 | Data Compression. Kraft's Inequality. Shannon Code. Huffman Code. | Ch. 5.1-5.9 |
Oct 16 | Arithmetic Code. Lempel-Ziv Code. | Ch. 13.3-13.4 | HW #2 (due Nov 6)
Oct 23 | Gambling on Horse Races. | Ch. 6.1-6.3 |
Oct 30 | Study break | |
Nov 6 | Midterm | |
Nov 13 | Discrete Memoryless Channel. Channel Capacity Theorem. Joint Typicality. | Ch. 7.1-7.7 |
Nov 20 | Achievability Proof. Converse of the Channel Capacity Theorem. Fano's Inequality. Source-Channel Separation. | Ch. 7.9-7.10, 7.13 | HW #3 (due Nov 27)
Nov 27 | Differential Entropy. Gaussian Channel. Maximum Entropy Distribution. | Ch. 8.1-8.6, Ch. 12.1-12.2 |
Dec 4 | Discrete-Time Gaussian Channel. Gaussian Vector Channels. Water-filling. | Ch. 9.1-9.4 | HW #4 (due Dec 9)
TBD (late December) | Final Exam | |
Grades:
· Homework: 20%
· Midterm: 30%
· Final Exam: 50%
Both the midterm and the final exam are open-book, open-notes.
References:
• C. E. Shannon: A Mathematical Theory of Communication, Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, 1948.
• R. G. Gallager: Information Theory and Reliable Communication, John Wiley, 1968.
• I. Csiszár and J. Körner: Information Theory: Coding Theorems for Discrete Memoryless Systems, Academic Press, New York, 1981.
• Review articles in IEEE Transactions on Information Theory, vol. 44, 1998.