NYU CS-GY 6923 Machine Learning

Professor Pavel Izmailov

Friday 2:00-4:30pm, 6 Metrotech, Room 315
Virtual lectures: Zoom.

Pavel Izmailov, Professor
Majid Daliri, Course Assistant
Harsha Mupparaju, Course Assistant
Shubham Rastogi, Course Assistant

Course Description

This course provides a graduate-level introduction to machine learning through a mixture of hands-on exercises and theoretical foundations. We will cover the fundamentals of machine learning: regression, classification, linear models, neural networks, numerical optimization methods (gradient descent, backpropagation), unsupervised learning, and a number of other topics. We will also cover the basics of language modeling and see how systems such as ChatGPT operate. Hands-on exercises throughout the course apply these methods to a broad range of applications.

Professor Contact

Email: pi390@nyu.edu
Office Hours: Mondays, 5:30-7:00pm, on Zoom

Course Assistants

Majid Daliri: daliri.majid@nyu.edu
Harsha Mupparaju: sm12754@nyu.edu
Shubham Rastogi: sr7421@nyu.edu
Office Hours: TBD

Lectures

Each entry below lists the lecture date, slides and recording, topics covered, optional reading, and homework.
09/05/2025
Slides
Recording
Topics:
• Introduction to Machine Learning
• Course logistics
• Supervised learning
• Regression and classification
• Linear regression (see the sketch after this entry)
Optional reading:
• Probability review
• Linear algebra review
• Deep learning book, part I
• The Matrix Cookbook
• Probabilistic Machine Learning: An Introduction, chapter 7.8 (covers matrix calculus)
• 2024 CS-GY 6923 lecture 1 slides by Prof. Chris Musco
Homework:
• NumPy demo: demo_numpy.ipynb (not turned in)
• Simple linear regression demo: demo_auto_mpg.ipynb (not turned in)
• Lab 1: lab1.ipynb, due 11:59pm, Monday 9/15
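
As a quick preview of the last topic above, the block below is a minimal NumPy sketch of least-squares linear regression on synthetic data. It is not one of the course demos; the data and numbers are made up for illustration.

```python
import numpy as np

# Synthetic data: y = 2*x + 1 plus noise (made up for illustration).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 2 * x + 1 + rng.normal(scale=0.5, size=100)

# Design matrix with a column of ones for the intercept term.
X = np.column_stack([np.ones_like(x), x])

# Least-squares fit: minimize ||Xw - y||^2 over w.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, slope:", w)       # should be roughly [1, 2]
print("fitted values:", X[:3] @ w)  # predictions for the first 3 points
```
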
09/12/2025
Lecture by Majid Daliri
Slides by Prof. Musco
Topics:
• Data encoding
• Generalized linear models
• Model selection (see the sketch after this entry)
• Generalization error
• Statistical learning model
Optional reading:
• Probabilistic Machine Learning: An Introduction, chapters 4.5.4-4.5.7
• ISLP: Cross-Validation and the Bootstrap
• 2024 CS-GY 6923 lecture 2 slides by Prof. Chris Musco
Homework:
• Demo 3: demo_diabetes.ipynb (not turned in)
• Demo 4: demo_polyfit.ipynb (not turned in)
• Lab 2: lab2.ipynb, due 11:59pm, Monday 9/22
• Complete written Homework 1, due 11:59pm, Monday 9/29. 10% bonus if you typeset your solutions in Markdown or LaTeX!
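
As a preview of model selection, the block below is a minimal NumPy sketch that chooses a polynomial degree by held-out validation error on synthetic data. It is illustrative only and not part of the course demos or labs.

```python
import numpy as np

# Synthetic 1-D regression data (made up for illustration).
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=120)
y = np.sin(3 * x) + rng.normal(scale=0.2, size=120)

# Split into training and validation sets for model selection.
x_tr, x_val = x[:80], x[80:]
y_tr, y_val = y[:80], y[80:]

def poly_features(x, degree):
    """Feature map [1, x, x^2, ..., x^degree]."""
    return np.vander(x, degree + 1, increasing=True)

# Fit each candidate degree on the training set, score it on the validation set.
errors = {}
for d in range(1, 8):
    w, *_ = np.linalg.lstsq(poly_features(x_tr, d), y_tr, rcond=None)
    errors[d] = np.mean((poly_features(x_val, d) @ w - y_val) ** 2)
    print(f"degree {d}: validation MSE = {errors[d]:.4f}")

# Select the degree with the lowest validation error.
print("selected degree:", min(errors, key=errors.get))
```
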
09/19/25
Slides
Recording
Topics:
• Underfitting and overfitting
• Linear equations and the pseudo-inverse
• Regularization (see the sketch after this entry)
• Classification and logistic regression
Optional reading:
• Moore–Penrose inverse (Wikipedia)
• Stanford CS229 lecture notes, chapter 2
• Deep learning book, chapter 5.5
• The Elements of Statistical Learning, chapters 3.4, 4.4
• Probabilistic Machine Learning: An Introduction, chapters 10.1-10.2.3
Homework:
• Demo on underfitting, overfitting, and regularization: demo-overfitting-underfitting-regularization.ipynb (not turned in)
• Demo on classification and logistic regression: demo-logistic-regression.ipynb (not turned in)
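
The block below is a minimal NumPy sketch contrasting ordinary least squares via the pseudo-inverse with ridge regularization on synthetic, nearly collinear data. It is illustrative only and not course material.

```python
import numpy as np

# Synthetic data with a near-duplicate feature column (made up for illustration).
rng = np.random.default_rng(2)
n, d = 50, 10
X = rng.normal(size=(n, d))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=n)   # nearly collinear with column 0
y = X[:, 0] + rng.normal(scale=0.1, size=n)

lam = 1.0  # regularization strength

# Ordinary least squares via the Moore-Penrose pseudo-inverse: w = X^+ y.
w_ols = np.linalg.pinv(X) @ y

# Ridge regression: w = (X^T X + lam * I)^{-1} X^T y.
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# OLS can put large opposite-signed weights on the two collinear columns;
# ridge keeps both small.
print("OLS weights (first 3):  ", np.round(w_ols[:3], 2))
print("ridge weights (first 3):", np.round(w_ridge[:3], 2))
```
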
09/26/25
Lecture by Dr. Timur Garipov
Slides
Recording
Topics:
• (Regularized) empirical risk minimization
• (Stochastic) gradient descent
• Non-convex loss landscapes
• GD variants: momentum, Adam (see the sketch after this entry)
• Beyond optimization: game-theoretic formulations; dynamical systems
Optional reading:
• Deep learning book, chapters 4, 8-8.1.3, 8.3, 8.5
• Nocedal & Wright, Numerical Optimization (2nd ed., 2006); a classic textbook on optimization
• Kingma & Ba (2015), Adam: A Method for Stochastic Optimization
• Garipov et al. (2018), Loss Surfaces, Mode Connectivity and Fast Ensembling
• Zhang et al. (2016), Understanding deep learning requires rethinking generalization
• losslandscape.com: loss surface visualizations
Homework:
• Lab 3: lab3-regularization-logreg-gd.ipynb, due 11:59pm, Monday 10/06
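
The block below is a minimal NumPy sketch of two gradient-descent variants discussed in this lecture, momentum and Adam, applied to a toy quadratic objective. It is illustrative only and not the course lab code.

```python
import numpy as np

# Toy objective: f(w) = 0.5 * w^T A w - b^T w, with an ill-conditioned A.
A = np.array([[3.0, 0.0], [0.0, 0.1]])
b = np.array([1.0, 1.0])
grad = lambda w: A @ w - b            # gradient of f

def gd_momentum(lr=0.1, beta=0.9, steps=500):
    """Gradient descent with (heavy-ball) momentum."""
    w, v = np.zeros(2), np.zeros(2)
    for _ in range(steps):
        v = beta * v + grad(w)
        w = w - lr * v
    return w

def adam(lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=500):
    """Adam: momentum plus a per-coordinate adaptive step size."""
    w, m, v = np.zeros(2), np.zeros(2), np.zeros(2)
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g          # first-moment estimate
        v = b2 * v + (1 - b2) * g**2       # second-moment estimate
        m_hat, v_hat = m / (1 - b1**t), v / (1 - b2**t)  # bias correction
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

# Both runs should end up close to the exact minimizer A^{-1} b = [1/3, 10]
# (Adam may hover slightly around it at this fixed step size).
print("exact minimizer:", np.linalg.solve(A, b))
print("momentum GD:    ", gd_momentum())
print("Adam:           ", adam())
```
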
10/03/25
Slides
Recording
Topics:
• Neural networks
• Universal approximation
• Backpropagation (see the sketch after this entry)
• Autograd
Optional reading:
• 3Blue1Brown videos on neural nets: video 1, video 2
• Deep learning book, chapter 6
• Neural Networks and Deep Learning by Michael Nielsen, chapter 4
Homework:
• Demo on simple MLP training: demo-mlp.ipynb (not turned in)
• Lab 4: lab4-mlp.ipynb, due 11:59pm, Monday 10/13
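
The block below is a minimal NumPy sketch of a one-hidden-layer network trained with hand-written backpropagation on synthetic 1-D regression data. It is illustrative only; the course labs and demos use their own setups.

```python
import numpy as np

# Synthetic data: y = |x| plus noise (made up for illustration).
rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(200, 1))
y = np.abs(X) + 0.1 * rng.normal(size=(200, 1))

H = 16                                   # hidden width
W1 = rng.normal(scale=0.5, size=(1, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)
lr = 0.02

for step in range(3001):
    # Forward pass.
    z1 = X @ W1 + b1                     # pre-activations, shape (n, H)
    a1 = np.maximum(z1, 0)               # ReLU
    pred = a1 @ W2 + b2                  # outputs, shape (n, 1)
    loss = np.mean((pred - y) ** 2)
    if step % 1000 == 0:
        print(f"step {step:4d}  MSE {loss:.4f}")   # should decrease over training

    # Backward pass (chain rule, averaged over the batch).
    n = X.shape[0]
    d_pred = 2 * (pred - y) / n          # dLoss/dpred
    dW2 = a1.T @ d_pred;  db2 = d_pred.sum(axis=0)
    d_a1 = d_pred @ W2.T
    d_z1 = d_a1 * (z1 > 0)               # ReLU derivative
    dW1 = X.T @ d_z1;     db1 = d_z1.sum(axis=0)

    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```
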
10/10/25
Slides
Topics:
• More backpropagation
• Regularization in neural networks
• Effect of depth
• Initialization
• Activation functions
• Residual connections
• Normalization layers
• Convolution (see the sketch after this entry)
Optional reading:
• Deep learning book, chapters 7, 9
Homework:
• Demo on simple MLP regularization: demo-mlp-regularization.ipynb (not turned in)
• Demo on MLP depth, activations, skip connections: demo-mlp-depth.ipynb (not turned in)
• Demo on CNNs and CUDA: demo-cnn.ipynb (not turned in)
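
The block below is a minimal NumPy sketch of the 2-D convolution (strictly, cross-correlation) that convolutional layers compute, written with explicit loops for clarity. It is illustrative only; real layers add channels, batching, padding, and optimized kernels.

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' cross-correlation of a 2-D image with a 2-D kernel."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Elementwise product of the kernel with the patch under it.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A 5x5 ramp image that increases by 1 per column, and a horizontal-gradient filter.
image = np.arange(25, dtype=float).reshape(5, 5)
sobel_x = np.array([[-1.0, 0.0, 1.0],
                    [-2.0, 0.0, 2.0],
                    [-1.0, 0.0, 1.0]])

# Every output entry is 8.0: the filter responds to the constant left-to-right ramp.
print(conv2d(image, sobel_x))
```
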
10/17/25
Midterm

Acknowledgement

This class (lectures, demos, and course materials) is based on the previous iteration of CS-GY 6923 taught by Professor Christopher Musco.