FOUNDATION CLASSICS: This course (1 of 2) covers the mathematical foundation classics of Artificial Intelligence (AI). It begins with a quick review of calculus, linear algebra, and probability, then delves into linear regression and logistic regression. It continues with constrained optimization, including Lagrange multipliers, the Karush-Kuhn-Tucker conditions, Lagrangian duality, and Support Vector Machines. Next it covers Fourier analysis, including Fourier series, the Fourier Transform, the Discrete Fourier Transform, and the Fast Fourier Transform. Finally, it covers Eigenvalue Decomposition, Singular Value Decomposition, and Principal Component Analysis.
DEEP LEARNING: This course (2 of 2) provides rigorous, in-depth coverage of the mathematics of Deep Learning. It starts with neural network basics. Next it delves into deep reinforcement learning, covering Monte Carlo Tree Search, AlphaGo, AlphaZero, and AlphaTensor. It then covers Generative AI, in particular Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and Diffusion Models. Finally, it covers generative language models, including Word2Vec, the Attention Mechanism, the Transformer, Large Language Models (LLMs), and Contrastive Language-Image Pretraining (CLIP).