Date | Topic | Reading | Items due
week 1 (Tue: 01/30)
|
Introduction to Machine Learning
Day00 Notes: Introduction
Lecture Slides 1
Notebook#0 (released)
|
|
|
week 1 (Thu: 02/01)
|
Google Colab and Python Review
Day01 Notes: Python Refresher
Lecture Slides 2
|
Reading: |
Notebook#0 (due on 02/01) |
week 2 (Tue: 02/06)
|
Pandas Tutorial
Day02 Notes: Pandas Tutorial
Lecture Slides 3
Notebook#1 (released)
|
Reading: |
|
week 2 (Thu: 02/08)
|
Practicing with Pandas
Day03 Notes: Pandas Practice
Lecture Slides 4
|
Reading: |
|
week 3 (Tue: 02/13)
|
k-Nearest Neighbor (kNN)
Day04 Notes: k-Nearest Neighbor
Lecture Slides 5
Notebook#2 (released)
|
|
Notebook#1 (due on 02/13) |
week 3 (Thu: 02/15)
|
kNN code, Missing Data and Normalization
Day05 Notes: kNN code, Missing Data and Normalization
Lecture Slides 6
|
Reading:
The relationship between ML and other fields
isna()
notna()
any()
value_counts()
dropna()
fillna()
|
|
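A minimal sketch of the pandas missing-data calls listed in this week's reading; the DataFrame and column names are made up for illustration:

    import pandas as pd
    import numpy as np

    # toy DataFrame with a few missing entries (illustrative only)
    df = pd.DataFrame({"age": [22, np.nan, 35], "city": ["NYC", "LA", None]})

    df.isna()                            # boolean mask of missing entries
    df.notna()                           # the complementary mask
    df.isna().any()                      # per column: does it contain any NaN?
    df["city"].value_counts()            # counts of each non-missing value
    df.dropna()                          # drop rows containing any NaN
    df["age"].fillna(df["age"].mean())   # replace missing ages with the column mean
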
week 4 (Tue: 02/20)
|
Normalization and Weighted k Nearest Neighbor (kNN)
Day06 Notes: Normalization and Weighted k-Nearest Neighbor
Lecture Slides 7
Q1 (released on Blackboard)
|
Reading:
sklearn: StandardScaler
|
Notebook#2 (due on 02/20) |
week 4 (Thu: 02/22)
|
Weighted k-NN, Graph Plot, and Metrics
Day07 Notes: Weighted k-NN, Graph, and Metrics
Lecture Slides 8
Notebook#3 (released)
|
Reading:
Pandas DataFrame: sample
matplotlib: Pyplot
|
|
week 5 (Tue: 02/27)
|
Evaluation Metrics and Introduction to Decision Trees
Day08 Notes: Metrics and Testing
Lecture Slides 9
|
Reading:
sklearn.metrics: accuracy_score
sklearn.metrics: confusion_matrix
sklearn.metrics: mean_absolute_error
sklearn.metrics: mean_squared_error
|
Q1 (due on 02/29)
Notebook#3 (due on 02/29)
|
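A short sketch of the sklearn.metrics functions above on hand-made labels (the arrays are illustrative, not course data):

    from sklearn.metrics import (accuracy_score, confusion_matrix,
                                 mean_absolute_error, mean_squared_error)

    # classification metrics on toy labels
    y_true = [0, 1, 1, 0, 1]
    y_pred = [0, 1, 0, 0, 1]
    print(accuracy_score(y_true, y_pred))    # 0.8
    print(confusion_matrix(y_true, y_pred))  # rows = true class, columns = predicted

    # regression metrics on toy targets
    t = [2.0, 3.5, 5.0]
    p = [2.5, 3.0, 5.0]
    print(mean_absolute_error(t, p))         # ~0.33
    print(mean_squared_error(t, p))          # ~0.17
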
week 5 (Thu: 02/29)
|
Decision Trees
Lecture Slides 10
Paper-based in-class activity on decision trees
|
Reading:
|
|
week 6 (Tue: 03/05)
|
Introduction to Scikit Learn
Day10 Notes: introduction to sklearn
Video lectures on Blackboard
Day10 Notes: Decision Trees
Notebook#4 (released)
|
Reading:
sklearn: train and test split
sklearn: Decision Tree classifier
sklearn: k-Nearest-Neighbor Classifier
sklearn: evaluation metrics
sklearn: confusion matrix
sklearn: StandardScaler
|
|
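A minimal end-to-end sketch tying together the sklearn pieces in this week's reading list; the iris dataset here is only a stand-in for whatever data the notebooks use:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import accuracy_score, confusion_matrix

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    scaler = StandardScaler().fit(X_train)   # fit the scaler on training data only
    X_train_s = scaler.transform(X_train)
    X_test_s = scaler.transform(X_test)

    for model in (KNeighborsClassifier(n_neighbors=5), DecisionTreeClassifier(max_depth=3)):
        model.fit(X_train_s, y_train)
        y_pred = model.predict(X_test_s)
        print(type(model).__name__, accuracy_score(y_test, y_pred))
        print(confusion_matrix(y_test, y_pred))
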
week 6 (Thu: 03/07)
|
Scikit Learn Practice
Day11 Notes: sklearn practice
|
Reading:
|
|
week 7 (Tue: 03/12)
|
Spring Break (no class)
|
|
|
week 7 (Thu: 03/14)
|
Spring Break (no class)
|
|
|
week 8 (Tue: 03/19)
|
Random Forests
Day12 Notes: Random Forests
Lecture Slides 11
|
Reading:
sklearn: Random Forest Classifier
Random Forests: Leo Breiman and Adele Cutler
Application of Random Forest: A Computer Vision research paper ICCV'2007
|
Notebook#4 (due on 03/19) |
week 8 (Thu: 03/21)
|
Principal Component Analysis (PCA)
Day13 Notes: Dimensionality Reduction Techniques
Lecture Slides 12
Project#1 (released)
Day13 Notes: Project#1
|
Reading:
sklearn: k-Nearest-Neighbor Classifier
sklearn: k-Nearest-Neighbor Regressor
sklearn: Decision Tree Classifier
sklearn: Decision Tree Regressor
sklearn: Random Forest Classifier
sklearn: Random Forest Regressor
sklearn: Feature Selection: SelectKBest
sklearn: Principal Component Analysis (PCA)
|
|
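A brief sketch of the feature-selection and PCA tools listed above, again on iris purely as a placeholder dataset:

    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA
    from sklearn.feature_selection import SelectKBest, f_classif

    X, y = load_iris(return_X_y=True)

    pca = PCA(n_components=2)                 # project 4 features onto 2 principal components
    X_pca = pca.fit_transform(X)
    print(pca.explained_variance_ratio_)      # variance captured by each component

    selector = SelectKBest(score_func=f_classif, k=2)   # keep the 2 highest-scoring features
    X_best = selector.fit_transform(X, y)
    print(X_pca.shape, X_best.shape)          # (150, 2) (150, 2)
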
week 9 (Tue: 03/26)
|
Perceptron
Perceptron Update Rule
Day14 Notes: Perceptron Code
In-class activity (linear model, perceptron)
Lecture Slides 13
|
Reading:
The Perceptron: A Perceiving and Recognizing Automaton (Rosenblatt - 1957)
|
|
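A tiny NumPy sketch of the perceptron update rule (on a mistake, w <- w + lr * y * x and b <- b + lr * y); the data, labels, and learning rate are made up:

    import numpy as np

    X = np.array([[2.0, 1.0], [1.0, -1.0], [-1.0, -2.0], [-2.0, 1.0]])
    y = np.array([1, 1, -1, -1])                # labels in {-1, +1}
    w, b, lr = np.zeros(2), 0.0, 1.0

    for _ in range(10):                         # a few passes over the data
        for xi, yi in zip(X, y):
            if yi * (np.dot(w, xi) + b) <= 0:   # misclassified (or on the boundary)
                w += lr * yi * xi               # perceptron update rule
                b += lr * yi

    print(w, b)                                 # a separating line for this toy set
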
week 9 (Thu: 03/28)
|
Neuron Model and Loss Surface
Lecture Slides 14
Day15 Notes: Loss Surface Visualization
|
Reading:
|
|
week 10 (Tue: 04/02)
|
Optimization: Gradient Descent, Stochastic Gradient Descent (SGD)
Lecture Slides 15
Day16 Notes: Stochastic Gradient Descent (code)
|
Reading:
|
|
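A compact sketch of stochastic gradient descent on a one-parameter least-squares problem; the synthetic data and step size are illustrative only:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, size=100)
    y = 3.0 * x + rng.normal(scale=0.1, size=100)   # true slope is 3

    w, lr = 0.0, 0.1
    for epoch in range(20):
        for i in rng.permutation(len(x)):           # SGD: one sample at a time, shuffled
            grad = 2 * (w * x[i] - y[i]) * x[i]     # d/dw of (w * x_i - y_i)^2
            w -= lr * grad
    print(w)                                        # should end up close to 3
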
week 10 (Thu: 04/04)
|
Introduction to Neural Networks
Multilayer Perceptron (MLP)
Lecture Slides 16
|
Reading:
Multilayer Perceptron (MLP)
The Backpropagation Algorithm (Rumelhart-1985)
|
Q2 (due by 04/04)
Project #1 (deadline extended; now due by 04/05)
|
week 11 (Tue: 04/09)
|
PyTorch Basics
Lecture Slides 17
Day18 Notes: PyTorch Basics in-class activity
Day18 Notes: Building a very simple MLP using PyTorch
|
Reading:
PyTorch
PyTorch: matmul()
PyTorch: transpose()
|
|
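A few lines sketching the torch calls listed above; the tensor shapes are arbitrary:

    import torch

    A = torch.randn(2, 3)
    B = torch.randn(3, 4)

    C = torch.matmul(A, B)           # (2, 3) @ (3, 4) -> (2, 4)
    At = torch.transpose(A, 0, 1)    # swap dims 0 and 1 -> (3, 2)

    print(C.shape, At.shape)
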
week 11 (Thu: 04/11)
|
Multilayer Perceptron (MLP) - (Modular PyTorch Implementation)
Lecture Slides 18
Day 19 Notes: Building Modular MLP with PyTorch
|
Reading:
PyTorch: nn.Linear()
PyTorch: nn.Sigmoid()
PyTorch: nn.ReLU()
PyTorch: nn.Softmax()
PyTorch: nn.Flatten()
PyTorch: nn.CrossEntropyLoss()
PyTorch: nn.MSELoss()
PyTorch: optim.SGD()
PyTorch: optim.Adam()
PyTorch: optim.RMSprop()
PyTorch: torchvision.datasets
|
|
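A minimal modular MLP built from the nn/optim pieces in this reading list; the layer sizes and the fake batch are placeholders, not the architecture used in class:

    import torch
    from torch import nn, optim

    class MLP(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Flatten(),               # (N, 1, 28, 28) -> (N, 784)
                nn.Linear(28 * 28, 128),
                nn.ReLU(),
                nn.Linear(128, 10),         # 10 output classes
            )

        def forward(self, x):
            return self.net(x)

    model = MLP()
    loss_fn = nn.CrossEntropyLoss()         # expects raw logits and integer labels
    optimizer = optim.SGD(model.parameters(), lr=0.01)

    x = torch.randn(32, 1, 28, 28)          # fake batch of 32 image-shaped inputs
    y = torch.randint(0, 10, (32,))
    loss = loss_fn(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(loss.item())
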
week 12 (Tue: 04/16)
|
Convolutional Neural Network (CNN)
Lecture Slides 19
|
Reading:
Earliest CNN: LeNet (LeCun-1989)
|
|
week 12 (Thu: 04/18)
|
Convolutional Neural Network (CNN) - (PyTorch Implementation)
Lecture Slides 20
Day 21 Notes
Notebook #5 (released)
|
Reading:
PyTorch: nn.Conv2d()
PyTorch: nn.Flatten()
PyTorch: nn.MaxPool2d()
PyTorch: nn.AvgPool2d()
|
|
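A small CNN sketch using the layers listed above; the channel counts and MNIST-style input size are made-up placeholders:

    import torch
    from torch import nn

    cnn = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1),   # (N, 1, 28, 28) -> (N, 16, 28, 28)
        nn.ReLU(),
        nn.MaxPool2d(2),                              # -> (N, 16, 14, 14)
        nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> (N, 32, 14, 14)
        nn.ReLU(),
        nn.AvgPool2d(2),                              # -> (N, 32, 7, 7)
        nn.Flatten(),                                 # -> (N, 32 * 7 * 7)
        nn.Linear(32 * 7 * 7, 10),
    )

    x = torch.randn(8, 1, 28, 28)
    print(cnn(x).shape)                               # torch.Size([8, 10])
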
week 13 (Tue: 04/23)
|
Fine-tuning vs. Training
Fine-tuning CNNs
Lecture Slides 21
Day 22 Notes
|
Reading:
Popular CNN: AlexNet (Krizhevsky-2012)
Popular CNN: VGGNet (Simonyan-2015)
Popular CNN: ResNet (He-2015)
|
|
week 13 (Thu: 04/25)
|
Recurrent Neural Network (RNN)
RNN for Natural Language Processing (NLP)
Lecture Slides 22
Day 23 Notes
Project #2 (released)
|
Reading:
RNN: Serial Order (Jordan-1986)
Finding Structure in Time (Elman-1990)
LSTM: Long Short-Term Memory (Hochreiter-1997)
GRU: Gated Recurrent Unit (Cho-2014)
The Unreasonable Effectiveness of Recurrent Neural Networks (RNNs)
|
Notebook #5 (due by 04/25) |
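A bare-bones sketch of running an LSTM over a toy batch of sequences (all sizes are arbitrary, not tied to the course project):

    import torch
    from torch import nn

    lstm = nn.LSTM(input_size=10, hidden_size=20, batch_first=True)
    x = torch.randn(4, 7, 10)        # 4 sequences, length 7, 10 features per step
    out, (h_n, c_n) = lstm(x)
    print(out.shape, h_n.shape)      # torch.Size([4, 7, 20]) torch.Size([1, 4, 20])
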
week 14 (Tue: 04/30)
|
Transformer
Lecture Slides 23
|
Reading:
Transformer: Attention is All You Need (2017)
The Illustrated Transformer
Harvard University NLP: The Annotated Transformer
LSTM is dead. Long Live Transformers! - Leo Dirac talk
|
|
week 14 (Thu: 05/02)
|
Transformer implementation with application
Course Evaluation
Lecture Slides 24
|
Reading:
|
|
week 15 (Tue: 05/07)
|
Large Language Model (LLM)
LLM Application for NLP Tasks using PyTorch
Lecture Slides 25
Day 26 Notes: LLM for sentiment classification
Day 26 Notes: LLM for paraphrase detection
Day 26 Notes: LLM for question answering
Q3 (released on Blackboard)
|
Reading:
GPT-1 (100M parameters) - Radford et al. 2018 (OpenAI)
BERT (300M parameters) - Devlin et al. 2018
GPT-2 (1.5B parameters) - Radford et al. 2019 (OpenAI)
Megatron-LM (8.0B parameters) - Shoeybi et al. 2019 (NVidia)
T5 (11.0B parameters) - Raffel et al. 2020 (Google)
T-NLG (17.0B parameters) - Microsoft Corporation 2020 (Microsoft)
GPT-3 (175.0B parameters) - Brown et al. 2020 (OpenAI)
ChatGPT (based on GPT-3.5, ~175B parameters) - OpenAI 2022 (OpenAI)
|
|
week 15 (Thu: 05/09)
|
Vision and Language Model (CLIP, BLIP)
Lecture Slides 26
Day 27 Notes
|
Reading:
Interacting with CLIP
|
|
week 16 (Tue: 05/12)
|
No final exam!
|
Reading:
|
Q3 (due on 05/12)
Project #2 (due by 05/12)
|