ECE367H1: Matrix Algebra and Optimization (Fall 2023)
Instructor:
· Prof. Wei Yu <weiyu@ece.utoronto.ca>
· Office hour: Wednesday 11am–12noon, BA4114
Teaching Assistants:
· Faeze Moradi Kalarde <faeze.moradi@mail.utoronto.ca> (Tutorial HA410)
· Buu Phan <truong.phan@mail.utoronto.ca> (Tutorial HA401)
· Susanna Rumsey <s.rumsey@mail.utoronto.ca> (Tutorial BA1240)
· Kareem Attiah <kareem.attiah@mail.utoronto.ca> (Grading)
Lectures: (Starting Sept 12)
· Tuesday 12:00–13:00 MC254
· Wednesday 9:00–11:00 MC254
Tutorials: (Starting Sept 15)
· Friday 16:00–18:00 (HA410, HA401, BA1240)
No lectures/tutorials during study break: Nov 7–11. Last day of the session: Dec 7.
Calendar Description:
This course will provide students with a grounding in optimization methods and the matrix algebra upon which they are based. The first part of the course focuses on fundamental building blocks in linear algebra and their geometric interpretation: matrices, their use to represent data and as linear operators, and the matrix decompositions (such as eigen-, spectral, and singular-value decompositions) that reveal structural and geometric insight. The second part of the course focuses on optimization, both unconstrained and constrained, linear and nonlinear, as well as convex and nonconvex; conditions for local and global optimality, as well as basic classes of optimization problems, are discussed. Applications from machine learning, signal processing, and engineering are used to illustrate the techniques developed.
Textbooks:
[1] Giuseppe Calafiore and Laurent El Ghaoui, Optimization Models, Cambridge University Press, 2014. (Main textbook)
[2] Stephen Boyd and Lieven Vandenberghe, Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares, Cambridge University Press, 2018. (PDF available at authors’ website. Some homework problems are taken from this textbook.)
Course Schedule:
| Week       | Topics                                                                                                                      | Text References        | Assessment                                                               | Tutorial                  |
|------------|-----------------------------------------------------------------------------------------------------------------------------|------------------------|--------------------------------------------------------------------------|---------------------------|
| Sept 12–13 | Vectors, Norms, Inner Products, Orthogonal Decomposition.                                                                   | Ch. 2.1–2.2            | Homework #1: Due Sept 27, 11:59pm (Word Vector, Fourier Series)          | Homework #1: Theory       |
| Sept 19–20 | Projection onto Subspaces, Fourier Series. Gram–Schmidt and QR Decomposition. Hyperplanes and Half-Spaces. Non-Euclidean Projection. | Ch. 2.3         |                                                                          | Homework #1: Applications |
| Sept 26–27 | Projection onto Affine Sets. Functions, Gradients and Hessians.                                                             | Ch. 2.3–2.4            | Homework #2: Due Oct 11, 11:59pm (Function Approximation, PageRank)      | Homework #2: Theory       |
| Oct 3–4    | Matrices, Range, Null Space, Eigenvalues and Eigenvectors. Matrix Diagonalization.                                          | Ch. 3.1–3.5            |                                                                          | Homework #2: Applications |
| Oct 10–11  | Symmetric Matrices. Orthogonal Matrices. Spectral Decomposition. Positive Semidefinite Matrices. Ellipsoids.                | Ch. 4.1–4.4            | Homework #3: Due Oct 28, 11:59pm (Latent Semantic Indexing, EigenFace)   | Homework #3: Theory       |
| Oct 17–18  | Singular Value Decomposition. Principal Component Analysis.                                                                 | Ch. 5.1, 5.3.2         |                                                                          | Homework #3: Applications |
| Oct 24–25  | Interpretation of SVD. Low-Rank Approximation.                                                                              | Ch. 5.2–5.3.1          |                                                                          | Previous Midterm          |
| Oct 31     | Midterm Review.                                                                                                             |                        | Midterm: Nov 1, 9:10–11am                                                | Midterm Return            |
| Nov 7–11   | Study Break.                                                                                                                |                        |                                                                          |                           |
| Nov 14–15  | Least Squares, Overdetermined and Underdetermined Linear Equations.                                                         | Ch. 6.1–6.4            | Homework #4: Due Nov 25, 11:59pm (Optimal Control, CAT Scan)             | Homework #4:              |
| Nov 21–22  | Regularized Least-Squares. Convex Sets and Convex Functions.                                                                | Ch. 6.7.3, Ch. 8.1–8.4 |                                                                          | Homework #5: Theory       |
| Nov 28–29  | Lagrangian Method for Constrained Optimization. Linear Programming and Quadratic Programming.                               | Ch. 8.5, Ch. 9.1–9.6   | Homework #5: Due Dec 6, 11:59pm (Portfolio Design, Sparse Coding of Image) | Homework #5: Applications |
| Dec 5–6    | Numerical Algorithms for Unconstrained and Constrained Optimization.                                                        | Ch. 12.1–12.3          |                                                                          |                           |
|            |                                                                                                                             |                        | Final Exam: TBD                                                          |                           |
Grades:
· Homework: 15% (Graded for completeness only)
· Midterm: 30% (Type C, one aid-sheet, calculator allowed)
· Final Exam: 55% (Type C, one aid-sheet, calculator allowed)