ECE1502 Information Theory (Fall 2022)
Instructor:
· Prof. Wei Yu < weiyu@ece.utoronto.ca >
Teaching Assistant:
· Ryan Song < r.song@mail.utoronto.ca >
· Tao Jiang < taoca.jiang@mail.utoronto.ca >
Lectures:
· Tuesdays 9–11am and Thursdays 9–10am, BA 1200 (First lecture: Sept 8).
· No lectures/tutorials during Fall study break: Nov 7–11. (Last lecture: Dec 1.)
Course Description:
"Information theory answers two fundamental questions in communication theory: What is the ultimate limit of data compression? (Answer: the entropy H.) And what is the ultimate limit of the transmission rate of communication? (Answer: the channel capacity C.)" (Cover & Thomas)
This course is a one-term introduction to information theory at a first-year graduate level. The aim is to provide a comprehensive coverage of the following major areas of information theory:
• Entropy and mutual information
• Data compression
• Channel capacity
This is a fundamental course for students interested in digital communications, data compression, and signal processing. It is also of interest to students in Computer Science, Statistics, and related disciplines.
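As a small taste of the two central quantities named in the quotation above, the entropy H and the channel capacity C, here is a short Python sketch (not part of the course materials) that computes the entropy of a discrete distribution and the capacity of a binary symmetric channel, C = 1 − H(ε):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def bsc_capacity(eps):
    """Capacity in bits/use of a binary symmetric channel
    with crossover probability eps: C = 1 - H(eps)."""
    return 1 - entropy([eps, 1 - eps])

print(entropy([0.5, 0.5]))   # a fair coin carries 1 bit
print(bsc_capacity(0.5))     # a completely noisy channel has capacity 0
```

The course makes these operational claims precise: H is the best achievable compression rate (Ch. 3, 5) and C the best achievable reliable transmission rate (Ch. 7).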
Prerequisite: An undergraduate course in Probability. A fair level of mathematical maturity is expected.
Textbook: Elements of Information Theory
by Thomas M. Cover and Joy A. Thomas, John Wiley, 2nd Edition, 2006.
Course Schedule:
Date     | Topics                                                                                     | Text References             | Notes
---------|--------------------------------------------------------------------------------------------|-----------------------------|----------------------
Sept 8   | Introduction. Entropy.                                                                     | Ch. 1.1                     | Probability refresher
Sept 13  | Joint Entropy. Conditional Entropy. Relative Entropy.                                      | Ch. 2.1–2.3                 |
Sept 20  | Mutual Information. Jensen's Inequality. Conditional Mutual Information. Data Processing Inequality. | Ch. 2.5–2.6, 2.8    | HW #1
Sept 27  | Entropy Rate of a Stochastic Process. Asymptotic Equipartition Property (AEP).             | Ch. 4.1–4.2, Ch. 3          |
Oct 4    | Data Compression. Kraft's Inequality. Shannon Code. Huffman Code.                          | Ch. 5.1–5.9                 |
Oct 11   | Arithmetic Code. Lempel–Ziv Code.                                                          | Ch. 13.3–13.4               | HW #2
Oct 18   | Gambling on Horse Races.                                                                   | Ch. 6.1–6.3                 |
Oct 25   | Discrete Memoryless Channel. Channel Capacity Theorem. Joint Typicality. Achievability Proof. | Ch. 7.1–7.7              |
Nov 1    | Midterm (9–11am, HA 403)                                                                   |                             |
Nov 7–11 | Fall study break (no lectures/tutorials)                                                   |                             |
Nov 15   | Converse of the Channel Capacity Theorem. Fano's Inequality. Source–Channel Separation.    | Ch. 7.9–7.10, 7.13          | HW #3
Nov 22   | Differential Entropy. Gaussian Channel. Maximum Entropy Distribution.                      | Ch. 8.1–8.6, Ch. 12.1–12.2  |
Nov 29   | Discrete-Time Gaussian Channel. Gaussian Vector Channels. Waterfilling. Bandlimited Gaussian Channel. | Ch. 9.1–9.5      | HW #4
Dec 6    | No class.                                                                                  |                             |
         | Final Exam                                                                                 |                             |
Grades:
· Homework: 10%
· Midterm: 25%
· Final Exam: 65%. Both exams are open-book, open-notes.
References:
• C. E. Shannon: A Mathematical Theory of Communication, Bell System Tech. J., vol. 27, pp. 379–423, 623–656, 1948
• R. G. Gallager: Information Theory and Reliable Communication, John Wiley, 1968
• I. Csiszár and J. Körner: Information Theory: Coding Theorems for Discrete Memoryless Systems. Academic Press, New York, 1981
• Review articles in IEEE Transactions on Information Theory vol. 44, 1998.