ECE367H1: Matrix Algebra and Optimization (Fall 2023)



·      Prof. Wei Yu

·      Office hour: Wednesday 11am-12noon BA4114


Teaching Assistants:

·      Faeze Moradi Kalarde (Tutorial HA410)

·      Buu Phan (Tutorial HA401)

·      Susanna Rumsey (Tutorial BA1240)

·      Kareem Attiah (Grading)


Lectures: (Starting Sept 12)

·      Tuesday 12:00-13:00 MC254

·      Wednesday 9:00-11:00 MC254


Tutorials: (Starting Sept 15)

·      Friday 16:00-18:00 (HA410, HA401, BA1240)


No lectures/tutorials during study break: Nov 7-11. Last day of the session: Dec 7.


Calendar Description:

This course provides students with a grounding in optimization methods and the matrix algebra on which they are based. The first part of the course focuses on fundamental building blocks of linear algebra and their geometric interpretation: matrices, their use to represent data and to act as linear operators, and the matrix decompositions (such as the eigen-, spectral, and singular-value decompositions) that reveal structural and geometric insight. The second part of the course focuses on optimization, both unconstrained and constrained, linear and nonlinear, convex and nonconvex; conditions for local and global optimality are discussed, along with the basic classes of optimization problems. Applications from machine learning, signal processing, and engineering are used to illustrate the techniques developed.
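As a small illustration of the kind of decomposition covered in the first part of the course (not part of the official course materials), the sketch below uses NumPy to build the best rank-k approximation of a matrix from its singular value decomposition; the matrix and the choice k = 2 are arbitrary examples.

```python
import numpy as np

# Hypothetical example: rank-k approximation via the SVD.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))   # arbitrary data matrix

# Thin SVD: A = U diag(s) Vt, singular values s sorted descending.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2  # illustrative target rank
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# By the Eckart-Young theorem, A_k is the best rank-k approximation
# in the spectral norm, and the error equals the (k+1)-th singular value.
err = np.linalg.norm(A - A_k, ord=2)
print(np.isclose(err, s[k]))
```

The same truncated SVD underlies principal component analysis and the low-rank approximation topics listed in the schedule below.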



Textbooks:

[1]  Giuseppe Calafiore and Laurent El Ghaoui, Optimization Models, Cambridge University Press, 2014. (Main textbook)

[2]  Stephen Boyd and Lieven Vandenberghe, Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares, Cambridge University Press, 2018. (PDF available at authors’ website. Some homework problems are taken from this textbook.)


Course Schedule:

Dates | Topics | Text References | Homework
Sept 12-13 | Vectors, Norms, Inner Products, Orthogonal Decomposition. | Ch. 2.1-2.2 | Homework #1: Due Sept 27, 11:59pm (Word Vector, Fourier Series)
Sept 19-20 | Projection onto Subspaces, Fourier Series. Gram-Schmidt and QR Decomposition. Hyperplanes and Half-Spaces. Non-Euclidean Projection. | Ch. 2.3 | Homework #1: Applications
Sept 26-27 | Projection onto Affine Sets. Functions, Gradients and Hessians. | Ch. 2.3-2.4 | Homework #2: Due Oct 11, 11:59pm (Function Approximation, PageRank)
Oct 3-4 | Matrices, Range, Null Space, Eigenvalues and Eigenvectors. Matrix Diagonalization. | Ch. 3.1-3.5 | Homework #2: Applications
Oct 10-11 | Symmetric Matrices. Orthogonal Matrices. Spectral Decomposition. Positive Semidefinite Matrices. Ellipsoids. | Ch. 4.1-4.4 | Homework #3: Due Oct 28, 11:59pm (Latent Semantic Indexing, EigenFace)
Oct 17-18 | Singular Value Decomposition. Principal Component Analysis. | Ch. 5.1, 5.3.2 | Homework #3: Applications
Oct 24-25 | Interpretation of SVD. Low-Rank Approximation. | Ch. 5.2-5.3.1 | Previous Midterm
Oct 31 | Midterm Review. | |
Nov 1, 9:10am-11am | Midterm Return. | |
Nov 7-11 | Study Break (no lectures/tutorials). | |
Nov 14-15 | Least Squares, Overdetermined and Underdetermined Linear Equations. | Ch. 6.1-6.4 | Homework #4: Due Nov 25, 11:59pm (Optimal Control, CAT Scan)
Nov 21-22 | Regularized Least Squares. Convex Sets and Convex Functions. | Ch. 6.7.3, Ch. 8.1-8.4 |
Nov 28-29 | Lagrangian Method for Constrained Optimization. Linear Programming and Quadratic Programming. | Ch. 8.5, Ch. 9.1-9.6 | Homework #5: Due Dec 6, 11:59pm (Portfolio Design, Sparse Coding of Image)
Dec 5-6 | Numerical Algorithms for Unconstrained and Constrained Optimization. | Ch. 12.1-12.3 | Homework #5: Applications
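As a hypothetical illustration of the least-squares material in the Nov 14-15 week (Ch. 6 of [1]; not part of the official course materials), the sketch below solves an overdetermined linear system with NumPy and checks the result against the normal-equations solution; the matrix and right-hand side are made-up examples.

```python
import numpy as np

# Hypothetical example: least-squares solution of an overdetermined
# system A x = b with more equations than unknowns.
rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))    # 8 equations, 3 unknowns
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true                      # consistent system, for illustration

# lstsq minimizes ||A x - b||_2; for full-column-rank A this agrees
# with the normal-equations solution x = (A^T A)^{-1} A^T b.
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
x_ne = np.linalg.solve(A.T @ A, A.T @ b)

print(np.allclose(x_hat, x_true), np.allclose(x_hat, x_ne))
```

In practice, solvers based on the QR decomposition (covered in the Sept 19-20 week) are preferred over forming A^T A explicitly, since the normal equations square the condition number.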





Final Exam: TBD




Grading:

·      Homework: 15% (graded for completeness only)

·      Midterm: 30% (Type C; one aid-sheet and calculator allowed)

·      Final Exam: 55% (Type C; one aid-sheet and calculator allowed)