ECE1502 Information Theory (Fall 2020)

 

Instructor:

·      Prof. Wei Yu <weiyu@ece.utoronto.ca>

 

Teaching Assistant:

·      Reza Farsani <reza.farsani@utoronto.ca>

 

Lectures: (starting Sept 10)

·      Lectures are pre-recorded and can be viewed asynchronously

·      Class discussions are held synchronously on Thursdays 9:30-11:00

·      TA discussions are held synchronously on Mondays 9:30-11:00

Synchronous sessions are on Zoom. Asynchronous lectures are available by streaming.

  

No lectures/tutorials during Fall study break: Nov 9-13. Last day of the session: Dec 9.

 

Course Description:

"Information theory answers two fundamental questions in communication theory: what is the ultimate limit of data compression (answer: the entropy H), and what is the ultimate limit of the transmission rate of communication (answer: the channel capacity C)?" - Cover & Thomas
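In the standard notation of the textbook, these two limits are

H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x),        C = \max_{p(x)} I(X; Y),

where I(X; Y) = H(X) - H(X | Y) is the mutual information between the channel input X and output Y.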

 

This course is a one-term introduction to information theory at a first-year graduate level. The aim is to provide comprehensive coverage of the following major areas of information theory (a brief computational illustration follows the list):

•          Entropy and mutual information

•          Data compression

•          Channel capacity
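As a small taste of the first topic, here is a minimal sketch (not part of the course materials; the joint distribution and helper functions are illustrative) that computes entropy and mutual information for a toy joint distribution:

import numpy as np

def entropy(p):
    # Shannon entropy in bits; p is a 1-D array of probabilities summing to 1.
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), computed from the joint pmf matrix pxy.
    px = pxy.sum(axis=1)   # marginal distribution of X (rows)
    py = pxy.sum(axis=0)   # marginal distribution of Y (columns)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

# Toy joint pmf: uniform input over a binary symmetric channel with crossover 0.1.
pxy = np.array([[0.45, 0.05],
                [0.05, 0.45]])

print(entropy(pxy.sum(axis=1)))   # H(X) = 1.0 bit
print(mutual_information(pxy))    # I(X;Y) = 1 - H(0.1) ≈ 0.531 bits

For this particular pxy, the printed mutual information also equals the capacity of the binary symmetric channel, C = 1 - H(0.1) ≈ 0.531 bits, since the uniform input happens to be optimal; this connection is developed in Ch. 7 of the textbook.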

 

This is a fundamental course for students interested in digital communications, data compression and signal processing. It is also of interest to students in Computer Science, Statistics and related disciplines.

 

Prerequisite: An undergraduate course in probability. A fair level of mathematical maturity is expected.

 

Textbook: Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd edition, John Wiley & Sons, 2006.

 

Course Schedule:

 

Date | Class discussion | Text References | TA discussion
Sept 10 | Introduction | Ch. 1.1 |
Sept 14 | | | Probability refresher
Sept 17 | Entropy. Joint Entropy. Relative Entropy. | Ch. 2.1-2.3 |
Sept 21 | | | HW #1
Sept 24 | Mutual Information. Jensen's Inequality. Conditional Entropy. Conditional Mutual Information. Data Processing Inequality. Entropy Rate of Stochastic Process. | Ch. 2.5-2.6, 2.8; Ch. 4.1-4.2 |
Sept 28 | | | HW #1
Oct 1 | Asymptotic Equipartition Property (AEP) | Ch. 3 |
Oct 5 | | | HW #2
Oct 8 | Data Compression. Kraft's Inequality. Shannon Code. Huffman Code. | Ch. 5.1-5.9 |
Oct 12 | | | --
Oct 15 | Arithmetic Code. Lempel-Ziv Code. | Ch. 13.3-13.4 |
Oct 19 | | | HW #2
Oct 22 | Gambling on Horse Races. | Ch. 6.1-7.3 |
Oct 26 | | | Midterm
Oct 29 | Discrete Memoryless Channel. Channel Capacity Theorem. | Ch. 7.1-7.5 |
Nov 2 | | | HW #3
Nov 5 | Joint Typicality. Achievability Proof. Converse of the Channel Capacity Theorem. Fano's Inequality. | Ch. 7.6-7.7, 7.9-7.10, 7.13 |
Nov 9-13 | Fall study break | |
Nov 16 | | | HW #3
Nov 19 | Source-Channel Separation. Differential Entropy. Gaussian Channel. | Ch. 7.13, 8.1-8.6 |
Nov 23 | | | HW #4
Nov 26 | Maximum Entropy Distribution. Discrete-Time Gaussian Channel. | Ch. 12.1-12.2; Ch. 9.1-9.2 |
Nov 30 | | | HW #4
Dec 3 | Gaussian Vector Channels. Water-filling. Band-limited Gaussian Channel. Complex Gaussian Channels. | Ch. 9.3-9.5 |
Dec 7 | Multiple-Access Channel. Rate-Distortion Theory. | Ch. 15.1-15.3, 10.1-10.5 | HW #5
Dec 17 | Final Exam | | --

 

Grades:

·      Midterm: 35%

·      Final Exam: 65%. Both exams are open-book, take-home exams.

 

References:

      C. E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, 1948.

      R. G. Gallager, Information Theory and Reliable Communication, John Wiley & Sons, 1968.

      I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Academic Press, New York, 1981.

      Review articles in IEEE Transactions on Information Theory, vol. 44, 1998.