
ECE1502 Information Theory (Fall 2022)

Instructor:

·     Prof. Wei Yu < weiyu@ece.utoronto.ca >

Teaching Assistant:

·     Ryan Song < r.song@mail.utoronto.ca >

·     Tao Jiang < taoca.jiang@mail.utoronto.ca >

Lectures:

·     Tuesdays 9-11am and Thursdays 9-10am, in BA 1200. (First lecture: Sept 8.)

·     No lectures/tutorials during Fall study break: Nov 7-11. (Last Lecture: Dec 1).

Course Description:

"Information theory answers two fundamental questions in communication theory: what is the ultimate limit of data compression (answer: the entropy H), and what is the ultimate limit on the transmission rate of communication (answer: the channel capacity C)?" - Cover & Thomas

This course is a one-term introduction to information theory at a first-year graduate level. The aim is to provide a comprehensive coverage of the following major areas of information theory:

•          Entropy and mutual information

•          Data compression

•          Channel capacity

This is a fundamental course for students interested in digital communications, data compression, and signal processing. It is also of interest to students in Computer Science, Statistics, and related disciplines.
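As a small preview of the course's central quantity, the entropy H is easy to compute for a finite distribution. The following is an illustrative sketch only (the function name and example distributions are mine, not from the course materials):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))  # 1.0: a fair coin carries exactly one bit
print(entropy([0.9, 0.1]))  # ~0.469: a biased coin carries less
```

By the source coding theorem covered in the data compression unit, a memoryless source with entropy H can be compressed to about H bits per symbol on average, but no further.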

Prerequisite: An undergraduate course in probability. A fair level of mathematical maturity is expected.

Textbook: Elements of Information Theory

by Thomas M. Cover and Joy A. Thomas, John Wiley, 2nd Edition, 2006.

Course Schedule:

·     Sept 8: Introduction. Entropy. (Ch. 1.1; probability refresher)

·     Sept 13: Joint Entropy. Conditional Entropy. Relative Entropy. (Ch. 2.1-2.3)

·     Sept 20: Mutual Information. Jensen's Inequality. Conditional Mutual Information. Data Processing Inequality. (Ch. 2.5-2.6, 2.8) [HW #1]

·     Sept 27: Entropy Rate of a Stochastic Process. Asymptotic Equipartition Property (AEP). (Ch. 4.1-4.2, Ch. 3)

·     Oct 4: Data Compression. Kraft's Inequality. Shannon Code. Huffman Code. (Ch. 5.1-5.9)

·     Oct 11: Arithmetic Code. Lempel-Ziv Code. (Ch. 13.3-13.4) [HW #2]

·     Oct 18: Gambling on Horse Races. (Ch. 6.1-6.3)

·     Oct 25: Discrete Memoryless Channel. Channel Capacity Theorem. Joint Typicality. Achievability Proof. (Ch. 7.1-7.7)

·     Nov 1: Midterm, 9-11am, HA403.

·     Nov 7-11: Fall study break. No lectures.

·     Nov 15: Converse of the Channel Capacity Theorem. Fano's Inequality. Source-Channel Separation. (Ch. 7.9-7.10, 7.13) [HW #3]

·     Nov 22: Differential Entropy. Gaussian Channel. Maximum Entropy Distribution. (Ch. 8.1-8.6, Ch. 12.1-12.2)

·     Nov 29: Discrete-Time Gaussian Channel. Gaussian Vector Channels. Water-filling. Band-limited Gaussian Channel. (Ch. 9.1-9.5) [HW #4]

·     Dec 6: No class. Final exam.
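To connect the channel capacity topics in the schedule to something concrete: for the binary symmetric channel treated in Ch. 7, the capacity has the closed form C = 1 - H(p), where H is the binary entropy function and p the crossover probability. A minimal sketch of this standard textbook formula (function names are mine, not course-supplied code):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel
    with crossover probability p, in bits per channel use."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # 1.0: a noiseless bit pipe
print(bsc_capacity(0.11))  # ~0.5: about half a bit per use survives
```

The achievability and converse lectures show, respectively, that rates below C are attainable with vanishing error and that rates above C are not.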

Grading:

·      Homework: 10%

·      Midterm: 25%

·      Final Exam: 65%. Both the midterm and the final exam are open-book and open-notes.

References:

C. E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, 1948.

R. G. Gallager, Information Theory and Reliable Communication, John Wiley, 1968.

I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Academic Press, New York, 1981.

Review articles in IEEE Transactions on Information Theory, vol. 44, 1998.