ECE368 Probabilistic Reasoning (Winter 2020)

 

Instructor:

·      Prof. Wei Yu <weiyu@ece.utoronto.ca>

·      Office hour: Monday 3-4pm, BA 4114

 

Teaching Assistants:

·      Kareem Attiah <kareemattiah@gmail.com>

·      Wei Cui <william.cui@mail.utoronto.ca>

·      Reza Farsani <reza.farsani@utoronto.ca>

·      Haider Al-Lawati <haider.al.lawati@mail.utoronto.ca>

·      Tao Jiang <taoca.jiang@mail.utoronto.ca>

·      Justin Kang <j96kang@hotmail.com>

 

Lectures: (starting Jan 6)

·      Lecture 0101 Thursday 10:00-12:00 & Monday 16:00-17:00 GB 248

·      Lecture 0102 Monday 10:00-12:00 & Thursday 13:00-14:00 GB 244

Starting March 16, all lectures will be posted online.

 

Tutorials: (starting Jan 14)

·      Tutorial 0101 Tuesday 15:00-16:00 BA 1230 (Kareem Attiah)

·      Tutorial 0102 Tuesday 09:00-10:00 SF 2202 (Wei Cui)

·      Tutorial 0103 Tuesday 09:00-10:00 BA 2185 (Haider Al-Lawati)

·      If you are in Tutorial 0104, please attend any of the other three tutorial sections.

 

No lectures/tutorials during reading week: Feb 17-21. Last day of the session: April 9.

 

Calendar Description:

This course will focus on different classes of probabilistic models and on how, based on those models, one deduces actionable information from data. The course will cover probabilistic models for vector data (e.g., multivariate Gaussians), temporal data (e.g., hidden Markov models), and graph-structured data (e.g., factor graphs). On the inference side, topics such as hypothesis testing, marginalization, estimation, and message passing will be covered. Applications of these tools span a wide range of data-processing problems. Prerequisite: STA286 or ECE302.

IV-AECPEBASC; IV-AEELEBASC

 

Learning Objectives:

1.     Understand joint distributions, marginal distributions, conditionals, and Bayes' rule (summarized below).

2.     Understand vector-based probabilistic models, e.g., jointly Gaussian vectors and linear inference.

3.     Understand graphical models, message-passing inference, and the MAP rule.

4.     Understand hidden Markov models (HMMs), the forward-backward algorithm, and the Viterbi algorithm.

5.     Be able to model engineering decision problems as probabilistic reasoning problems.

6.     Be able to identify which type of probabilistic model (vector, graphical, temporal) suits a given engineering application.
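
For quick reference, the identity behind Objective 1, written in one standard notation (the sum becomes an integral for continuous parameters):

```latex
p(\theta \mid x) = \frac{p(x \mid \theta)\, p(\theta)}{p(x)},
\qquad
p(x) = \sum_{\theta'} p(x \mid \theta')\, p(\theta')
```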

 

Recommended Textbooks:

[1]  Dimitri P. Bertsekas and John N. Tsitsiklis, Introduction to Probability, Second Edition, Athena Scientific, 2008.

[2]  Christopher Bishop, Pattern Recognition and Machine Learning, Springer, 2006.

[3]  Kevin P. Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, 2012.

 

Course Schedule:

 

Week of Jan 6
Topics: Joint and conditional probability distributions, Bayes' rule, conditional expectation, correlation, independence.
References: Bishop 1.2; Bertsekas Ch. 1-5; Murphy 2.2

Week of Jan 13
Topics: Maximum likelihood (ML) estimation. Binomial distribution, Laplace smoothing. Bayesian vs. frequentist statistics.
References: Bertsekas 9.1; Bishop 2.1; Murphy 2.3
Assessment: Homework #1 (Probability)

Week of Jan 20
Topics: Maximum a posteriori (MAP) estimation, Beta distribution, conjugate priors.
References: Bertsekas 8.1-8.2; Bishop 2.1; Murphy 3.2-3.3
Assessment: Homework #2 (ML Estimation)

Week of Jan 27
Topics: ML vs. Bayesian statistical inference. Least mean square (LMS) estimation.
References: Bertsekas 8.3
Assessment: Homework #3 (MAP Rule)

Week of Feb 3
Topics: Bayesian hypothesis testing. Naïve Bayes classifier. Multivariate Gaussian distributions.
References: Bertsekas 8.2, 9.3; Murphy 3.5
Assessment: Homework #4 (MAP and LMS Estimator)

Week of Feb 10
Topics: Linear and quadratic discriminant analysis.
References: Bishop 2.3; Murphy 4.1-4.2
Assessment: Computer Lab #1, due Feb 24

Week of Feb 17
(Reading week: no lectures or tutorials.)

Week of Feb 24
Topics: Bayesian inference in the Gaussian vector model: MAP and MMSE estimation of Gaussian vectors. Matched filtering.
References: Bishop 2.3.2-2.3.3; Murphy 4.3-4.4; Bertsekas 8.1-8.2
Assessment: Midterm, Feb 28, 6-8pm, EX 310 & EX 320

Week of March 2
Topics: Bayesian linear regression. Ridge regression.
References: Bishop 3.1, 3.3; Murphy 7.1-7.3, 7.5-7.6
Assessment: Homework #5 (Gaussian Estimation)

Week of March 9
Topics: Markov chains. Transition matrix. Stationary distribution. Eigenvalues of a stochastic matrix. PageRank.
References: Bertsekas 7.1-7.3; Bishop 13.1; Murphy 17.1-17.2
Assessment: Midterm return. Computer Lab #2, due March 19

Week of March 16
Topics: Directed graphical models. Bayes nets. Undirected graphical models. Markov random fields.
References: Bishop 8.1-8.3; Murphy 10.1-10.3
Assessment: Homework #6 (Bayesian Regression)

Week of March 23
Topics: Inference on graphical models. Message-passing algorithms. Factor graphs and belief propagation.
References: Bishop 8.4; Murphy 20.1-20.3
Assessment: Homework #7 (Markov Chains)

Week of March 30
Topics: Hidden Markov models (HMMs). Forward-backward algorithm. Viterbi algorithm.
References: Bishop 13.2; Murphy 17.3-17.5
Assessment: Homework #8 (Graphical Models)

Week of April 6
Topics: Expectation-maximization (EM) algorithm for HMMs.
References: Bishop 13.2
Assessment: Computer Lab #3, due April 9

Final Exam: take-home; details to be announced.

 

 

Homework/Tutorial Schedule:

 

#1: Probability Review (Jan 14)

 

Q1. Murphy p.61, Exercise 2.1.

Q2. Murphy p.61, Exercise 2.2.

Q3. Murphy p.62, Exercise 2.4.

Q4. Bertsekas p.191, Ch. 3, Problem 23.

Q5. Bertsekas p.191, Ch. 3, Problem 25.

Q6. Bertsekas p.194, Ch. 3, Problem 34.

 

#2: Maximum Likelihood Estimation (Jan 21)

 

Q1. Bertsekas p.445, Ch. 8, Problem 1.

Q2. Bertsekas p.445, Ch. 8, Problem 2.

Q3. Bertsekas p.507, Ch. 9, Problem 1.

Q4. Bertsekas p.507, Ch. 9, Problem 2.

Q5. Bertsekas p.509, Ch. 9, Problem 8.

Q6. Bertsekas p.509, Ch. 9, Problem 9.

 

#3: MAP Estimation (Jan 28)

 

Q1. Bertsekas p.445, Ch. 8, Problem 3.

Q2. Bertsekas p.445, Ch. 8, Problem 4.

Q3. Bertsekas p.446, Ch. 8, Problem 5.

Q4. Bertsekas p.446, Ch. 8, Problem 7.

Q5. Bertsekas p.447, Ch. 8, Problem 8.

 

#4: Least Mean Square Estimation (Feb 4)

 

Q1. Bertsekas p.447, Ch. 8, Problem 10.

Q2. Bertsekas p.447, Ch. 8, Problem 11.

Q3. Bertsekas p.448, Ch. 8, Problem 12(a).

Q4. Murphy p.94, Exercise 3.11.

Q5. Bertsekas p.448, Ch. 8, Problem 13(a)(b).

 

Feb 11: Computer Lab #1, TA discussion.

 

Feb 25: Questions from the 2019 midterm.

  

#5: Gaussian Vectors (March 3)

 

Q1. Murphy p.144, Exercise 4.7.

Q2. Murphy p.144, Exercise 4.9.

Q3. Murphy p.144, Exercise 4.14.

Q4. Murphy p.144, Exercise 4.19.

 

March 10: Take up the midterm.

Computer Lab #2, TA discussion.

 

#6: Bayesian Linear Regression (March 17)

 

Q1. Murphy p.241, Exercise 7.2.

Q2. Murphy p.242, Exercise 7.4.

Q3. Murphy p.242, Exercise 7.5.

 

#7: Markov Chains (March 24)

 

Q1. Bertsekas p.385, Ch. 7, Problem 11.

Q2. Bertsekas p.385, Ch. 7, Problem 12.

Q3. Bertsekas p.386, Ch. 7, Problem 15.

Q4. Bertsekas p.386, Ch. 7, Problem 16.

Q5. Bertsekas p.391, Ch. 7, Problem 22.

 

#8: Graphical Models (March 31)

 

Q1. Bishop p.420, Problem 8.11.

Q2. Bishop p.421, Problem 8.16.

Q3. Murphy p.335, Exercise 10.5.

Q4. Murphy p.335, Exercise 10.6.

 

April 7: Computer Lab #3, TA discussion.

 

Grades:

·      Computer Labs: 35%

·      Midterm: 35% (Type C. One double-sided aid sheet. Non-programmable calculator.)

·      Final Exam: 30% (Take-home. Details to be announced.)

 

Computer Labs:

Python programming with data sets.

1.     Naïve Bayes Classifier and Linear/Quadratic Discriminant Analysis (10%)

(a)   Spam email filter and document classification based on the occurrence of words

(b)  Linear discriminant analysis for height/weight data

 

2.     Bayesian linear regression (5%)

(a) Gaussian estimation; prediction error of regression

 

3.     Inference on a Hidden Markov Model (20%)

(a)   Positioning of a robot using the forward-backward algorithm

(b)  Tracking of a robot using the Viterbi algorithm (a toy sketch of both algorithms follows below)
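
To give a concrete feel for what Computer Lab #3 asks for, here is a minimal, self-contained sketch of the forward-backward and Viterbi recursions on a made-up five-position robot chain. The transition matrix, sensor model, and observation sequence below are invented for illustration only; they are not the lab's model or data.

```python
import numpy as np

# Toy HMM: a robot on 5 positions; at each step it stays put or moves one step.
# All numbers here are made up for illustration -- NOT the lab's model or data.
n_states = 5
A = np.zeros((n_states, n_states))      # transitions: A[i, j] = p(z_t = j | z_{t-1} = i)
B = np.zeros((n_states, n_states))      # emissions:   B[i, k] = p(x_t = k | z_t = i)
for i in range(n_states):
    for j in (i - 1, i, i + 1):         # robot stays or moves to an adjacent position;
        if 0 <= j < n_states:           # sensor reports the true or an adjacent position
            A[i, j] = 1.0
            B[i, j] = 1.0
A /= A.sum(axis=1, keepdims=True)       # normalize rows into probability distributions
B /= B.sum(axis=1, keepdims=True)
pi = np.full(n_states, 1.0 / n_states)  # uniform initial distribution

obs = [0, 1, 1, 2, 3]                   # an example sequence of noisy sensor readings

def forward_backward(obs):
    """Posterior marginals p(z_t | x_1..x_T) via normalized alpha-beta recursions."""
    T = len(obs)
    alpha = np.zeros((T, n_states))
    beta = np.ones((T, n_states))
    alpha[0] = pi * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()          # renormalize each step for numerical stability
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        beta[t] /= beta[t].sum()
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

def viterbi(obs):
    """Most likely state sequence argmax p(z_1..z_T | x_1..x_T), computed in log-space."""
    T = len(obs)
    logA, logB = np.log(A + 1e-12), np.log(B + 1e-12)
    delta = np.log(pi) + logB[:, obs[0]]
    back = np.zeros((T, n_states), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + logA          # scores[i, j]: best path ending i -> j
        back[t] = scores.argmax(axis=0)         # best predecessor of each state j
        delta = scores.max(axis=0) + logB[:, obs[t]]
    path = [int(delta.argmax())]                # backtrack from the best final state
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

print(forward_backward(obs).round(3))   # positioning: per-step posterior over positions
print(viterbi(obs))                     # tracking: single most likely trajectory
```

Both recursions run in O(T n^2) time for T observations and n states. The forward-backward pass returns per-step posteriors, which is what positioning asks for, while Viterbi returns the single most likely trajectory, which is what tracking asks for.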