ECE1502 Information Theory (Fall 2020)
Instructor:
· Prof. Wei Yu < weiyu@ece.utoronto.ca >
Teaching Assistant:
· Reza Farsani < reza.farsani@utoronto.ca >
Lectures: (starting Sept 10)
· Lectures are recorded asynchronously
· Class discussions are held synchronously on Thursdays 9:30–11:00
· TA discussions are held synchronously on Mondays 9:30–11:00
Synchronous sessions are on Zoom. Asynchronous lectures are available by streaming.
No lectures/tutorials during Fall study break: Nov 9–13. Last day of the session: Dec 9.
Course Description:
"Information theory answers two fundamental questions in communication theory: what is the ultimate limit of data compression (answer: the entropy H), and what is the ultimate limit of the transmission rate of communication (answer: the channel capacity C)?" (Cover & Thomas)
This course is a one-term introduction to information theory at a first-year graduate level. The aim is to provide comprehensive coverage of the following major areas of information theory:
• Entropy and mutual information
• Data compression
• Channel capacity
This is a fundamental course for students interested in digital communications, data compression, and signal processing. It is also of interest to students in Computer Science, Statistics, and related disciplines.
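As a concrete preview of the two limits named in the course description, here is a minimal sketch (the function names `entropy` and `bsc_capacity` are illustrative, not from the textbook) computing the entropy H of a discrete source and the capacity C = 1 − H(ε) of a binary symmetric channel with crossover probability ε, results the course derives in its source-coding and channel-capacity units:

```python
import math

def entropy(p):
    """Shannon entropy H(X), in bits, of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel: C = 1 - H(eps) bits/use."""
    return 1.0 - entropy([eps, 1.0 - eps])

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(bsc_capacity(0.11))    # about 0.5 bit per channel use
```

A noiseless channel (ε = 0) attains the full 1 bit per use, while ε = 0.5 destroys all information and gives C = 0.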
Prerequisite: An undergraduate course in Probability. A fair level of mathematical maturity is expected.
Textbook: Elements of Information Theory
by Thomas M. Cover and Joy A. Thomas, John Wiley, 2nd Edition, 2006.
Course Schedule:
Date | Class discussion | Text references | TA discussion
Sept 10 | Introduction | Ch. 1.1 |
Sept 14 | | | Probability refresher
Sept 17 | Entropy. Joint Entropy. Relative Entropy. | Ch. 2.1–2.3 |
Sept 21 | | | HW #1
Sept 24 | Mutual Information. Jensen's Inequality. Conditional Entropy. Conditional Mutual Information. Data Processing Inequality. Entropy Rate of a Stochastic Process. | Ch. 2.5–2.6, 2.8; Ch. 4.1–4.2 |
Sept 28 | | | HW #1
Oct 1 | Asymptotic Equipartition Property (AEP) | Ch. 3 |
Oct 5 | | | HW #2
Oct 8 | Data Compression. Kraft's Inequality. Shannon Code. Huffman Code. | Ch. 5.1–5.9 |
Oct 12 | | |
Oct 15 | Arithmetic Code. Lempel–Ziv Code. | Ch. 13.3–13.4 |
Oct 19 | | | HW #2
Oct 22 | Gambling on Horse Races. | Ch. 6.1–7.3 |
Oct 26 | | | Midterm
Oct 29 | Discrete Memoryless Channel. Channel Capacity Theorem. | Ch. 7.1–7.5 |
Nov 2 | | | HW #3
Nov 5 | Joint Typicality. Achievability Proof. Converse of the Channel Capacity Theorem. Fano's Inequality. | Ch. 7.6–7.7, 7.9–7.10, 7.13 |
Nov 9–13 | Fall study break | |
Nov 16 | | | HW #3
Nov 19 | Source-Channel Separation. Differential Entropy. Gaussian Channel. | Ch. 7.13, 8.1–8.6 |
Nov 23 | | | HW #4
Nov 26 | Maximum Entropy Distribution. Discrete-Time Gaussian Channel. | Ch. 12.1–12.2; Ch. 9.1–9.2 |
Nov 30 | | | HW #4
Dec 3 | Gaussian Vector Channels. Waterfilling. Bandlimited Gaussian Channel. Complex Gaussian Channels. | Ch. 9.3–9.5 |
Dec 7 | Multiple-Access Channel. Rate-Distortion Theory. | Ch. 15.1–15.3, 10.1–10.5 | HW #5
Dec 17 | Final Exam | |
Grades:
· Midterm: 35%
· Final Exam: 65%. All exams are open-book, take-home exams.
References:
• C. E. Shannon, "A Mathematical Theory of Communication," Bell System Tech. J., vol. 27, pp. 379–423, 623–656, 1948.
• R. G. Gallager, Information Theory and Reliable Communication, John Wiley, 1968.
• I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Academic Press, New York, 1981.
• Review articles in IEEE Transactions on Information Theory vol. 44, 1998.