How to Do Matrix Calculus in Data Science and Machine Learning
Derive matrix and vector derivatives for linear and quadratic forms
Solve common optimization problems (least squares, Gaussian, financial portfolio)
Understand and implement Gradient Descent and Newton's method
Learn to use the Matrix Cookbook
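As a taste of the objectives above, the least squares problem has a closed-form solution that drops out of exactly the kind of matrix derivative this course teaches: setting the gradient of ||Xw - y||^2 to zero gives the normal equations X^T X w = X^T y. A minimal sketch (not course code; the data, seed, and variable names here are illustrative):

```python
import numpy as np

# Illustrative sketch: synthetic regression data with known weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=100)

# Normal equations X^T X w = X^T y, from zeroing the gradient of ||Xw - y||^2.
# np.linalg.solve is preferred over forming the explicit inverse.
w_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(w_hat)  # close to w_true
```

Solving the linear system directly is both faster and numerically safer than computing `(X.T @ X)` and inverting it explicitly.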
Lecture 2 How to succeed in this course
Lecture 3 Where to get the code
Section 2: Matrix and Vector Derivatives
Lecture 4 Derivatives - Section Introduction
Lecture 5 Linear Form
Lecture 6 Quadratic Form (pt 1)
Lecture 7 Quadratic Form (pt 2)
Lecture 8 Exercise: Quadratic
Lecture 9 Exercise: Least Squares
Lecture 10 Exercise: Gaussian
Lecture 11 Chain Rule
Lecture 12 Chain Rule in Matrix Form
Lecture 13 Chain Rule Generalized
Lecture 14 Exercise: Quadratic with Constraints
Lecture 15 Left and Right Inverse as Optimization Problems
Lecture 16 Derivative of Determinant
Lecture 17 Derivatives - Section Summary
Lecture 18 Suggestion Box
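For reference, the two core identities from this section (the linear form of Lecture 5 and the quadratic form of Lectures 6-7) can be stated compactly. These are standard results in Matrix Cookbook notation; the symbols a, A, x are generic, not tied to any particular lecture:

```latex
\frac{\partial}{\partial \mathbf{x}}\left(\mathbf{a}^\top \mathbf{x}\right) = \mathbf{a},
\qquad
\frac{\partial}{\partial \mathbf{x}}\left(\mathbf{x}^\top A \mathbf{x}\right) = \left(A + A^\top\right)\mathbf{x}
```

When A is symmetric, the quadratic-form gradient reduces to 2Ax, which is the form that shows up in the least squares and Gaussian exercises.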
Section 3: Optimization Techniques
Lecture 19 Optimization - Section Introduction
Lecture 20 Second Derivative Test in Multiple Dimensions
Lecture 21 Gradient Descent (One Dimension)
Lecture 22 Gradient Descent (Multiple Dimensions)
Lecture 23 Newton's Method (One Dimension)
Lecture 24 Newton's Method (Multiple Dimensions)
Lecture 25 Exercise: Newton's Method for Least Squares
Lecture 26 Exercise: Code Preparation
Lecture 27 Gradient Descent and Newton's Method in Python
Lecture 28 Optimization - Section Summary
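The two optimizers covered in this section can be sketched side by side on a quadratic objective f(x) = (1/2) x^T A x - b^T x, whose gradient is Ax - b and whose Hessian is A. This is a hedged sketch under my own choice of A, b, and step size, not the course's implementation; note that Newton's method converges in a single step on a quadratic:

```python
import numpy as np

# Symmetric positive definite A, so f has a unique minimizer A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)  # exact minimizer, for comparison

# Gradient descent: x <- x - lr * grad f(x), with grad f(x) = A x - b.
x_gd = np.zeros(2)
lr = 0.1  # small enough for this A (lr < 2 / lambda_max)
for _ in range(200):
    x_gd = x_gd - lr * (A @ x_gd - b)

# Newton's method: x <- x - H^{-1} grad f(x); for a quadratic, one step suffices.
x_newton = np.zeros(2)
x_newton = x_newton - np.linalg.solve(A, A @ x_newton - b)

print(x_gd, x_newton, x_star)
```

The contrast is the point of the section: gradient descent needs many cheap first-order steps, while Newton's method uses second-order (Hessian) information to take far fewer, more expensive steps.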
Students and professionals interested in the math behind AI, data science, and machine learning